SmartData Collective
© 2008-23 SmartData Collective. All Rights Reserved.
Data Quality

David Binkley: Data and the Reasonable Test

Gayle Nixon
Last updated: 2012/08/21 at 2:11 PM

David Binkley, Senior Technical Account Manager, Harte-Hanks Trillium Software

In all of the projects I’ve been involved with over the years, I can’t recall ever seeing a customer requirement specifying that the overall results comply with what I consider the final data requirement: do the results pass the “reasonable” test?  Sure, there have been plenty of projects where the customer has come up with target percentages or variance levels for the test cases, but rarely do I recall a customer specifically asking us to check whether the end results appear reasonable.

The reason I’m bringing this up is that I’ve seen plenty of projects where we’ve been called in late in the implementation phase, or shortly after go-live, with the customer in a panic and making statements to the effect of “these results don’t appear reasonable.”  So is this panic a byproduct of not setting proper expectations, or is it because the reality of the data’s quality has changed those expectations?  I’d venture that, in most cases, it’s been due to both.

Most of you will agree that one of the first goals of data quality is to provide “due diligence” for your projects and applications, and that to perform that due diligence effectively, you have to have at least an idea of what reasonable results look like to begin with.  Defining “reasonable” can be tricky, but you typically start by asking what facts are known about the data and what factors led the customer to their assumptions and conclusions about what is right and wrong with it.  After gathering those assumptions, facts, and conclusions, and doing some analysis of the unknowns, you gain a better perspective from which to formulate some initial expectations.

But that’s only the start of the exercise.  Remember that those initial reasonable expectations are just that: expectations.  Expectations don’t become reality until we perform our due diligence processes to validate the assumptions, gather the facts, and implement the remediation processes to fulfill them.  We have to learn to trust what the customer tells us, but we also need to verify as much as possible, or we may be setting ourselves up for failure somewhere down the road.  Performing those due diligence processes helps you avoid situations where the live data doesn’t support the business case, or perhaps supports it too much, like suddenly surprising management with the fact that 40% of the customers are duplicates when the expectation was that only 5% were.
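
To make the duplicate example concrete, here is a minimal sketch of that kind of due-diligence check: measure the actual duplicate rate and compare it to the rate the customer assumed.  The field names, sample records, and the 5% figure are all illustrative, not from any real project.

```python
# Hypothetical due-diligence check: verify the customer's stated
# duplicate rate against what the data actually shows before go-live.

def duplicate_rate(records, key_fields):
    """Fraction of records whose normalized key fields repeat an earlier record."""
    seen, dupes = set(), 0
    for rec in records:
        key = tuple(rec.get(f, "").strip().lower() for f in key_fields)
        if key in seen:
            dupes += 1
        else:
            seen.add(key)
    return dupes / len(records) if records else 0.0

customers = [
    {"name": "Ann Lee", "email": "ann@example.com"},
    {"name": "ann lee", "email": "ann@example.com"},   # same person, different casing
    {"name": "Bob Roy", "email": "bob@example.com"},
]

rate = duplicate_rate(customers, ["name", "email"])
expected = 0.05  # the customer's stated assumption
if rate > expected * 2:  # simple tolerance before raising a flag
    print(f"Duplicate rate {rate:.0%} far exceeds the expected {expected:.0%}")
```

Running a check like this well before go-live is what turns “we assume 5% duplicates” into a verified fact, or into an early warning.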

After you have performed your due diligence, I advocate applying “the reasonable test” to the end results as well.  No matter what the due diligence results were, you always need to ask yourself: do the results make sense?  Ask questions like: given that all the test cases passed, is it still reasonable that 25% of your billing records don’t have any addresses?
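
One way to picture the reasonable test in code is a comparison of each result metric against the range the business agreed was plausible, flagging anything outside it.  The metric names and ranges below are invented for illustration; they stand in for whatever expectations your own due diligence produced.

```python
# A minimal sketch of a "reasonable test" applied to end results:
# compare each observed metric against its agreed plausible range
# and report anything that falls outside it (or is missing entirely).

def reasonable_test(results, expectations):
    """Return the metrics whose observed value falls outside (lo, hi)."""
    flags = {}
    for metric, (lo, hi) in expectations.items():
        value = results.get(metric)
        if value is None or not (lo <= value <= hi):
            flags[metric] = value
    return flags

expectations = {
    "pct_missing_address": (0.0, 0.05),  # up to 5% is believable
    "pct_duplicates":      (0.0, 0.05),
}
results = {"pct_missing_address": 0.25, "pct_duplicates": 0.02}

for metric, value in reasonable_test(results, expectations).items():
    print(f"Unreasonable result: {metric} = {value:.0%}")
```

The point is not the mechanics but the habit: even when every test case passes, the aggregate numbers still have to survive this sanity check.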

Granted, results outside the reasonable expectations do occur, but if we’ve done our due diligence correctly, those results at least shouldn’t surprise you the day after your project goes live.  You should have known about them well in advance of the implementation date, giving you time to inform management that they were coming and to make plans to deal with them.

So, make sure you have some reasonable expectations to start with, and don’t be afraid to question unreasonable results: what seems unreasonable may also be incorrect.  In the worst case, by questioning them and gathering the facts, you’ll be prepared to defend those results when they are questioned by management or the end users.

Here’s one quick example for anyone still questioning this test.  Do you know how Bernie Madoff’s Ponzi scheme was finally exposed?  A couple of people simply challenged the returns on investment as unreasonable compared with everyone else’s.  Imagine how much pain could have been avoided if someone had performed the reasonable test years earlier.
