SmartData Collective
© 2008-25 SmartData Collective. All Rights Reserved.

Analytics BS: 3 Questions to Spot It

metabrown
11 Min Read

There’s a lot of junk analytics around. Some analyses look sophisticated, but are simply incorrect. Others may be mathematically correct, but of little practical value. Early in my career, as the lone statistician at a large company, I was often asked to review statistical analysis done by consultants and coworkers. Most of it was plain and simple garbage. Though I run in more analytically sophisticated circles today, I still frequently encounter examples of glaringly bad data analysis.

Don’t assume that the bad stuff only comes from obvious amateurs. Yes, rookie mistakes are made. And there will always be some naughty people determined to make the analysis support the answers they want. But there are also well-meaning PhDs, even college professors, spouting nonsense, often paid quite handsomely to do so.


You can cut through the crap.

Do you doubt yourself? Are you thinking, “If professors are getting it wrong, how could little old me hope to get it right?” Have no fear. Most of the junk is easy to spot, and I’m going to teach you how to do it, right here, right now.

How do you recognize good, healthy food when you see it? You use your senses – do I recognize this food? Is it familiar? Have I read or experienced something that tells me whether this is a healthful food? How about the ingredients – what’s in this stuff? Are these ingredients that I should eat? You might sometimes misjudge, and you might give in to temptation and eat something you know is unhealthy, but in general you can do a pretty good job of identifying which foods are healthy choices, and which are not.

You don’t have as much education and experience with analytics as you have with food, but you can learn to assess analytics in the same way – using simple, reasonable questions.

Here are three questions that will carry you a long way:

What are you assuming in this analysis?

Why did you choose this approach?

How does this analysis relate to action?

Let’s explore each of these and how it can help you.

 

What are you assuming in this analysis?

Assumptions are the most basic ingredients of classical statistical analysis. They set the stage, defining what analysis methods are appropriate.

If the analyst claims to understand statistics, this question should be nonthreatening, and answering it should not be difficult. If you are speaking to the analyst in person, look for signs of nervousness, resistance to responding, or difficulty in responding; all are bad signs. If you sense hesitation, ask about it. If you find that the analyst objects to discussing assumptions, claims they are unimportant, doesn’t know them, or, heaven forbid, does anything to suggest that it is not your business to go down that line of questioning, that’s bad, very bad. If the analyst appears totally confident but laughs off your questions, or uses any other tactic to avoid giving you clear answers, that’s also very, very bad.

However, if the analyst is willing to discuss assumptions, but admits that some of the assumptions are not utterly realistic, what you have is a conscientious analyst stuck with a real-life problem. Ask how the assumptions differ from reality, and how that will affect the results. Some analysis methods are quite sensitive to such imperfections; others are not. This is a tough discussion even for many pretty good analysts. If you find that the analyst can give you a clear explanation (Hint: “That’s not important” is not an explanation.) of how deviations from assumptions can affect the outcome of an analysis, you have found yourself a very good analyst.

Written reports should always include discussion of assumptions. It can be in the Appendix, but it’s gotta be there.

This issue must be approached differently when working with data miners. The central idea of data mining is empowering businesspeople to make discoveries without slogging through theory, so data miners should not be expected to explain the underlying assumptions of all the analytic tools they use. Instead, ask how you might go about validating discoveries. Since you don’t have theory to support data mining results, the proof is found in testing – on new data and in the field. But again, the data miner should not be offended or resistant about answering your questions.
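To make "testing on new data" concrete, here is a minimal holdout-validation sketch in Python. The records, the discovered rule, and the 70/30 split are all hypothetical; the point is simply that a pattern found on one slice of data should be rechecked on data that was not used to find it:

```python
import random

def holdout_validate(records, rule, train_frac=0.7, seed=42):
    """Split records into train/test portions and report how often a
    discovered rule holds on each; a rule that holds only on the
    training split has probably been overfit."""
    rng = random.Random(seed)
    shuffled = records[:]
    rng.shuffle(shuffled)
    cut = int(len(shuffled) * train_frac)
    train, test = shuffled[:cut], shuffled[cut:]

    def hit_rate(part):
        return sum(rule(r) for r in part) / len(part)

    return hit_rate(train), hit_rate(test)

# Hypothetical discovery: "customers over 40 respond to the offer"
records = [{"age": a, "responded": a > 40} for a in range(20, 70)]
rule = lambda r: r["responded"] == (r["age"] > 40)
train_rate, test_rate = holdout_validate(records, rule)
print(train_rate, test_rate)  # the rule holds on both splits here
```

In practice the test split stands in for "new data," and a field trial is the stronger follow-up.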

Why did you choose this approach?

When you ask an analyst “why,” the response had better include a little thoughtful discussion of the business problem – what you want to find out, how you intend to use the information and so forth. If you don’t hear a credible description of your own business concerns, it’s a pretty safe bet the analyst does not understand your business concerns. A good analyst may also point out issues that you had not thought of yourself, and help clarify, simplify or prioritize issues.

Of course, you are already aware of your own problems. You brought in an analyst for analytics expertise. So the analyst should also discuss a variety of analytics methods and reasons why they might or might not fit your application. The analyst should be able to discuss how different analytic techniques might mesh (or not) with your application, the nature of your data, and if using classical statistics, the assumptions behind the techniques.

It’s appropriate to ask about the resources the analyst uses – which may include written materials, colleagues and software, and why the analyst finds those to be relevant, credible and practical. There is no magic best book or piece of software, no one theory to beat all the rest, but there are good reasons why some fit your project better than others.

Here are some examples from my own experience that show the value of asking why:

A consultant using sampling in a statistical analysis did not clearly explain the reasons for the particular sample size selected. Asking why that sample size was used and what resources supported that choice would reveal that the consultant had simply used a guideline learned during earlier projects, and did not know how to properly determine sample sizes for statistical analysis.

A writer criticized published research on the grounds that the sample size was too large. Asked to point out any statistical theory supporting his objection, the critic admitted there was none.

These analysts were confident. On the surface, both sounded good, but both were making serious errors. Indeed, both were making the same error – determining sample sizes without understanding the proper methods. The same flaw showed up in two different ways. Yet you only need one way to unearth evidence of the problem – ask “why” and keep asking until you have either satisfied yourself that the analyst is doing a good job, or discovered issues that call for further investigation.
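For contrast, here is one textbook way to determine a sample size properly: the minimum number of observations needed to estimate a proportion within a chosen margin of error. This is a sketch of a standard formula, not necessarily what either analyst above should have used; the z-value of 1.96 (95% confidence) and worst-case p = 0.5 are conventional conservative defaults:

```python
import math

def sample_size_for_proportion(margin_of_error, confidence_z=1.96, p=0.5):
    """Minimum n to estimate a proportion within +/- margin_of_error.
    p = 0.5 maximizes p*(1-p), giving the worst-case (largest) n."""
    return math.ceil(confidence_z**2 * p * (1 - p) / margin_of_error**2)

print(sample_size_for_proportion(0.05))  # 385 for +/-5% at 95% confidence
```

The key point for the "why" question is that the number comes out of a stated formula with stated inputs, not a guideline remembered from an earlier project.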

How does this analysis relate to action?

What good is a brilliant insight if nobody gets it but the analyst? No good at all. So, it’s time to ask the analyst how you can put analysis into action to benefit your business.

This is where you find out whether you are working with a serious data analyst, or just a report jockey, code monkey or math whiz. An analyst who understands business does not stop with telling you that the results were significant, or which model fit best. What should you expect? First, results in plain business English – like “The evidence suggests that the new coupon generates no more revenue than the old coupon,” or “Landing page tests indicate that the test design will draw 20% more conversions than the control design.” The analyst should be able to tell you there’s no reason to change your coupons and good reason to change your landing page, in words you understand completely.
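A landing-page comparison like the one above is commonly evaluated with a two-proportion z-test; the sketch below, with made-up conversion counts, shows how the statistical result can be translated into a plain-English verdict:

```python
import math

def two_proportion_test(conv_a, n_a, conv_b, n_b):
    """Two-sided z-test comparing two conversion rates.
    Returns the z statistic and its p-value."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_a - p_b) / se
    # Normal CDF via erf; two-sided p-value
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

# Hypothetical test: 240/2000 conversions (test) vs 200/2000 (control)
z, p = two_proportion_test(240, 2000, 200, 2000)
verdict = ("good reason to change the landing page"
           if p < 0.05 and z > 0 else "no evidence of a difference")
print(f"z = {z:.2f}, p = {p:.3f}: {verdict}")
```

The last line is the part that matters to the businessperson: the verdict, not the z statistic.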

Summary

Even if you don’t know much about analytics yourself, you have the power to separate the good analytics from the bad. Just remember three simple questions:

What are you assuming in this analysis?

Why did you choose this approach?

How does this analysis relate to action?

And listen carefully to the answers!

 

Want to know more about working effectively with analysts? Read these articles:

Analytics, Schmanalytics! How to Evaluate an Analyst

No Smokescreen Area: Tips for Hiring Analysts

 

©2011 Meta S. Brown

 
