There’s a lot of junk analytics around. Some analyses look sophisticated, but are simply incorrect. Others may be mathematically correct, but of little practical value. Early in my career, as the lone statistician at a large company, I was often asked to review statistical analysis done by consultants and coworkers. Most of it was plain and simple garbage. Though I run in more analytically sophisticated circles today, I still frequently encounter examples of glaringly bad data analysis.

Don’t assume that the bad stuff only comes from obvious amateurs. Yes, rookie mistakes are made. And there will always be some naughty people determined to make the analysis support the answers they want. But there are also well-meaning PhDs, even college professors, spouting nonsense, often paid quite handsomely to do so.

You can cut through the crap.

Do you doubt yourself? Are you thinking, “If professors are getting it wrong, how could little old me hope to get it right?” Have no fear. Most of the junk is easy to spot, and I’m going to teach you how to do it, right here, right now.

How do you recognize good, healthy food when you see it? You use your senses – do I recognize this food? Is it familiar? Have I read or experienced something that tells me whether this is a healthful food? How about the ingredients – what’s in this stuff? Are these ingredients that I should eat? You might sometimes misjudge, and you might give in to temptation and eat something you know is unhealthy, but in general you can do a pretty good job of identifying which foods are healthy choices, and which are not.

You don’t have as much education and experience with analytics as you have with food, but you can learn to assess analytics in the same way – using simple, reasonable questions.

Here are three questions that will carry you a long way:

What are you assuming in this analysis?

Why did you choose this approach?

How does this analysis relate to action?

What are you assuming in this analysis?

Assumptions are the most basic ingredients of classical statistical analysis. They set the stage, defining what analysis methods are appropriate.

If the analyst is willing to discuss assumptions but admits that some of them are not entirely realistic, what you have is a conscientious analyst stuck with a real-life problem. Ask how the assumptions differ from reality, and how that will affect the results. Some analysis methods are quite sensitive to such imperfections; others are not. This is a tough discussion even for many pretty good analysts. If the analyst can give you a clear explanation of how deviations from assumptions can affect the outcome of an analysis (hint: “That’s not important” is not an explanation), you have found yourself a very good analyst.

Written reports should always include discussion of assumptions. It can be in the Appendix, but it’s gotta be there.

This issue must be approached differently when working with data miners. The central idea of data mining is empowering businesspeople to make discoveries without slogging through theory, so data miners should not be expected to explain the underlying assumptions of every analytic tool they use. Instead, ask how you might go about validating discoveries. Since you don’t have theory to support data mining results, the proof is found in testing – on new data and in the field. Even so, the data miner should not be offended or resistant when you ask these questions.
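Validation of this kind can be as simple as holding out part of the data before mining, then re-checking any discovered rule against the held-out portion. Here is a minimal sketch in Python; the record layout and the “over 40” rule are hypothetical, invented purely for illustration:

```python
import random

def holdout_split(records, test_fraction=0.3, seed=42):
    """Shuffle and split records, so a rule mined on the training
    portion can be validated on data it has never seen."""
    rng = random.Random(seed)
    shuffled = list(records)
    rng.shuffle(shuffled)
    cut = int(len(shuffled) * (1 - test_fraction))
    return shuffled[:cut], shuffled[cut:]

def response_rate(records, rule):
    """Response rate among the records that match a mined rule."""
    matches = [r for r in records if rule(r)]
    if not matches:
        return 0.0
    return sum(r["responded"] for r in matches) / len(matches)

# Hypothetical customer records: age and whether each responded.
customers = [{"age": 20 + i % 50, "responded": i % 3 == 0}
             for i in range(1000)]
train, test = holdout_split(customers)

# A rule "discovered" on the training data...
over_40 = lambda r: r["age"] > 40

# ...should hold up at a similar response rate on the held-out data.
print(response_rate(train, over_40), response_rate(test, over_40))
```

If the rule’s lift largely evaporates on the held-out data, the “discovery” was probably just noise – exactly the kind of field test a good data miner should welcome.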

Why did you choose this approach?

When you ask an analyst “why,” the response had better include a little thoughtful discussion of the business problem – what you want to find out, how you intend to use the information, and so forth. If you don’t hear a credible description of your own business concerns, it’s a pretty safe bet the analyst does not understand them. A good analyst may also point out issues that you had not thought of yourself, and help clarify, simplify or prioritize them.

Of course, you are already aware of your own problems. You brought in an analyst for analytics expertise. So the analyst should also discuss a variety of analytic methods and the reasons they might or might not fit your application – how each technique meshes (or not) with your business problem, the nature of your data, and, if classical statistics are involved, the assumptions behind the techniques.

It’s appropriate to ask about the resources the analyst uses – which may include written materials, colleagues and software, and why the analyst finds those to be relevant, credible and practical. There is no magic best book or piece of software, no one theory to beat all the rest, but there are good reasons why some fit your project better than others.

Here are some examples from my own experience that show the value of asking why:

A consultant using sampling in a statistical analysis did not clearly explain the reasons for the particular sample size selected. Asked why that sample size was used and what resources supported the choice, the consultant admitted to having simply reused a guideline learned on earlier projects, without knowing how to properly determine sample sizes for statistical analysis.

A writer criticized published research on the grounds that the sample size was too large. Asked to point out any statistical theory supporting his objection, the critic admitted there was none.

These analysts were confident. On the surface, both sounded good, but both were making serious errors. Indeed, both were making the same error – determining sample sizes without understanding the proper methods. The same flaw showed up in two different ways, yet you need only one tool to unearth it: ask “why,” and keep asking until you have either satisfied yourself that the analyst is doing a good job, or discovered issues that call for further investigation.
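For reference, proper sample-size determination is not guesswork: it follows from the effect size you want to detect, the significance level, and the power you require. Here is a minimal sketch in Python using the standard normal-approximation formula for comparing two proportions; the conversion rates and the 5%-level, 80%-power critical values are assumptions chosen for illustration:

```python
from math import sqrt, ceil

def sample_size_two_proportions(p1, p2, z_alpha=1.959964, z_beta=0.841621):
    """Per-group sample size to detect a difference between rates p1 and p2.

    Normal-approximation formula. Defaults: z_alpha is the two-sided
    5% critical value, z_beta corresponds to 80% power.
    """
    p_bar = (p1 + p2) / 2
    numerator = (z_alpha * sqrt(2 * p_bar * (1 - p_bar))
                 + z_beta * sqrt(p1 * (1 - p1) + p2 * (1 - p2))) ** 2
    return ceil(numerator / (p1 - p2) ** 2)

# Detecting a lift from a 10% to a 12% conversion rate takes
# several thousand observations per group, not a rule of thumb:
n = sample_size_two_proportions(0.10, 0.12)
print(n)
```

Note how the required sample size explodes as the difference to be detected shrinks – one reason a hand-me-down guideline from an earlier project is unlikely to be right for yours.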

How does this analysis relate to action?

What good is a brilliant insight if nobody gets it but the analyst? No good at all. So it’s time to ask the analyst how you can put the analysis into action to benefit your business.

This is where you find out whether you are working with a serious data analyst, or just a report jockey, code monkey or math whiz. An analyst who understands business does not stop with telling you that the results were significant, or which model fit best. What should you expect? First, results in plain business English – like “The evidence suggests that the new coupon generates no more revenue than the old coupon,” or “Landing page tests indicate that the test design will draw 20% more conversions than the control design.” The analyst should be able to tell you there’s no reason to change your coupons and good reason to change your landing page, in words you understand completely.
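Behind a plain-English statement like the landing-page example usually sits a significance test. Here is a minimal sketch of a two-proportion z-test in Python; the visitor and conversion counts are hypothetical, chosen to mirror a control page converting at 10% against a test page converting at 12% (a 20% relative lift):

```python
from math import sqrt, erfc

def two_proportion_z_test(successes_a, n_a, successes_b, n_b):
    """Two-sided z-test for a difference between two conversion rates.

    Returns the absolute lift (rate_b - rate_a) and the two-sided
    p-value from the pooled normal approximation.
    """
    p_a, p_b = successes_a / n_a, successes_b / n_b
    p_pool = (successes_a + successes_b) / (n_a + n_b)
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    p_value = erfc(abs(z) / sqrt(2))  # two-sided tail probability
    return p_b - p_a, p_value

# Hypothetical landing-page test: 500 of 5,000 control visitors
# convert (10%) versus 600 of 5,000 test visitors (12%).
lift, p = two_proportion_z_test(500, 5000, 600, 5000)
print(lift, p)
```

The number the analyst owes you is not the p-value itself but the sentence it supports: the lift is real and worth acting on, or it isn’t.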

Summary

Even if you don’t know much about analytics yourself, you have the power to separate the good analytics from the bad. Just remember three simple questions:

What are you assuming in this analysis?

Why did you choose this approach?

How does this analysis relate to action?

And listen carefully to the answers!

Want to know more about working effectively with analysts? Read these articles:

Analytics, Schmanalytics! How to Evaluate an Analyst

No Smokescreen Area: Tips for Hiring Analysts