
Why Data Should Be a Business Asset: The 1-10-100 Rule

martindoyle
Last updated: 2014/08/05 at 1:42 PM
8 Min Read

Contents

  • The Data Decay Challenge
  • Why It Matters
  • 1-10-100
  • The Action Plan

If you run a business, or even just run a household, you’ll understand that chaos creates waste. How many times have you rushed to find an important document, such as a birth certificate, only to have to pay out for an emergency replacement? How many hours have you wasted searching for lost keys, lost passports and lost letters? How many times have you wasted money on an emergency callout for a problem that could have been located and fixed?

Just as chaos at home is disruptive, disorganisation and chaos can have huge effects on business profitability. If employees don’t work to conventions, they risk creating problems for themselves, and for other employees down the line. Use of IT systems is strictly controlled for precisely this reason, and businesses work hard to ensure that data is stored in a controlled way. Otherwise, the chaos spreads like a virus.

But what happens when data is captured correctly, then naturally decays? Even the best laid plans cannot protect against this inevitable, and costly, data quality challenge.

The Data Decay Challenge

Poor data quality is one of the key causes of waste in the enterprise. It creates rework and disorder for everyone who comes into contact with that data. Businesses that understand data quality try to capture very high quality data to avoid contaminating a healthy database with junk.

But remember: data is linked to living beings, and to an ever-changing world.

Let’s say a business captured your personal data on January 1st, 2004. At the point of capture, the member of staff you spoke to ensured the accuracy of that data. They may have run through checklists and used automated systems to support the data entry, resulting in a record that was completely clean.

On January 1st, 2014, would that record still be accurate? Even if it had not been edited, its contents would almost certainly have decayed. Jobs, children, houses and names all change over the years, affecting the overall integrity of data.
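The compounding nature of this decay can be sketched with a toy calculation. The 2% monthly decay rate below is an illustrative assumption, not a measured benchmark:

```python
# Toy model of data decay: if each record has a small chance of going stale
# every month, compound decay erodes even a perfectly clean database.
# The 2% monthly rate is an illustrative assumption, not a measured figure.

def fraction_still_accurate(monthly_decay_rate: float, months: int) -> float:
    """Fraction of records still accurate after `months` of unattended decay."""
    return (1 - monthly_decay_rate) ** months

ten_years = 12 * 10
print(f"{fraction_still_accurate(0.02, ten_years):.1%}")  # roughly 9% remain clean
```

Even a modest monthly decay rate, left unattended for a decade, leaves only a small fraction of records intact.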

Why It Matters

Modern businesses are utterly reliant on data held in databases. From customer data to sales data, stock control to invoicing, the business builds its profits upon data. High quality data is directly linked to healthy profits. It should therefore be considered a business asset and kept up to standards with the help of data quality tools.

As we’ve demonstrated, healthy data doesn’t occur naturally. While verification is important at the point of entry, it cannot guarantee data quality long term. Likewise, spot checks or one-off data cleansing initiatives can only deal with the database on that day, in that moment. The day after the check is complete, the decay starts again.

1-10-100

To measure the effect of ageing on a database, we can look to the 1-10-100 rule. Developed by George Labovitz and Yu Sang Chang in 1992, it is widely used to describe the escalating cost of quality failures.

The 1-10-100 rule is applied in many scenarios concerned with quality and the cost of correction. It is expressed in US dollars in our example, but it can be understood to mean any number of ‘units’, measured in financial terms. We can also use 1-10-100 to measure ‘cost’ in terms of resources or time. For our purposes, the 1-10-100 rule can be applied to data quality challenges at various stages in the database lifecycle, since it illustrates the importance of maintaining a high standard of data quality continually rather than occasionally.

The rule applied to data is as follows:

  • Verifying the quality of a record costs the business $1. This is known as the prevention cost.
  • Cleansing and deduplicating a record costs the business $10. This is the correction cost.
  • Working with a record that’s never cleansed costs $100. This is the failure cost.

We can break this down into a simple example.

If a record is added to the database on 1st January 2004, it costs the business around $1 to verify it on that date (or, to measure it another way, it might take 1 minute to complete). This is its prevention cost. It may involve checking the address against the postcode, ensuring the customer’s name is spelled correctly, and ensuring that the customer is not already listed in the database.
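Those point-of-entry checks can be sketched as a small validation routine. This is a minimal sketch under stated assumptions: the field names, the simplified UK-style postcode pattern, and email-based deduplication are all hypothetical illustrations, not the article's method:

```python
import re

# Simplified UK-style postcode pattern -- an illustrative check only,
# far looser than a real address-verification service.
UK_POSTCODE = re.compile(r"^[A-Z]{1,2}\d[A-Z\d]? ?\d[A-Z]{2}$", re.I)

def verify_record(record: dict, existing_emails: set) -> list:
    """Return a list of problems found; an empty list means the record passes."""
    problems = []
    if not record.get("name", "").strip():
        problems.append("missing name")
    if not UK_POSTCODE.match(record.get("postcode", "")):
        problems.append("invalid postcode")
    if record.get("email", "").lower() in existing_emails:
        problems.append("duplicate of an existing record")
    return problems

clean = verify_record({"name": "A. Smith", "postcode": "SW1A 1AA", "email": "a@x.com"}, set())
print(clean)  # [] -- the record passes all prevention checks
```

Running checks like these at the point of capture is what keeps the cost at the $1 prevention tier rather than the $10 or $100 tiers.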

During a data quality initiative, a record is updated and restored to its former quality. This is the correction cost. We understand this to be ten times the effort, and ten times the resource implication, but it is often a necessary part of data maintenance.

However, if that record is never cleansed, never deduplicated and never restored to its former quality, it costs the business 100 times the initial outlay. Why? It represents a failure.

Failure to maintain data is a failure to maintain acceptable quality standards. It creates waste. Its effects ripple through the business as the erroneous data causes rework and chaos. The business is disorganised and cannot operate efficiently.
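The three cost tiers translate into a quick back-of-the-envelope comparison. The record counts and the 10% neglect figure below are illustrative assumptions, not figures from the article:

```python
# The 1-10-100 rule as unit costs per record, in dollars.
PREVENTION, CORRECTION, FAILURE = 1, 10, 100

def database_cost(verified: int, cleansed: int, neglected: int) -> int:
    """Total cost when records fall into the three 1-10-100 buckets."""
    return verified * PREVENTION + cleansed * CORRECTION + neglected * FAILURE

# 10,000 records: verify everything up front, versus letting 10% decay uncorrected.
upfront = database_cost(verified=10_000, cleansed=0, neglected=0)
neglect = database_cost(verified=9_000, cleansed=0, neglected=1_000)
print(upfront, neglect)  # 10000 vs 109000 -- neglect is roughly 11x the cost
```

Even with 90% of records kept clean, the failure cost of the neglected 10% dominates the total.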

And, most importantly of all, the effects of this waste eat into profits.

The 1-10-100 rule cannot take into account the ricochet effect of poor data quality, such as customers who become irritated, or staff who find the database tiresome and difficult to work with. Over time, dissatisfied customers and staff can create additional costs that are more difficult to measure. These might include customer acquisition, conversion rates, reputation repair or staff churn.

The Action Plan

The 1-10-100 rule shows that sooner is better: prevention costs a fraction of correction, and correction a fraction of failure.

Data quality is not a problem that can be tackled once and forgotten, and tackling it soon and often is better than late or not at all.

Continual improvement and maintenance is the key to keeping waste to a minimum in business. The concept applies to data quality, too. Continual management of data is the only way to ensure waste does not become endemic.

Data quality is meaningless unless we attach some kind of cost implication. Using the 1-10-100 rule, we can achieve buy-in from senior managers who are most interested in protecting profit. From there, we can catch problems early and insulate core business functions against decay.
