How “Dirty Data” Derails Your Company’s Data Analytics and ROI

March 29, 2011

How much of what your company knows is useful? Which data is current? In a recent webinar about the dangers of “Dirty Data,” Jay Hidalgo of The Annuitas Group said many companies simply don’t know. He estimated that 30 percent of companies have no strategy for “data hygiene” – removing duplicate or obsolete information. He said 34 percent of companies ask the front-end sales team to update customer and prospect files, but even that can break down when multiple products or different “views” describe the same company, client or transaction.
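As a rough illustration of what basic “data hygiene” might look like in practice, the sketch below deduplicates a hypothetical customer table and drops records that haven’t been touched in a year. The column names and the 12-month staleness cutoff are assumptions for the example, not anything prescribed in the webinar.

```python
# Minimal data-hygiene sketch: deduplicate customer records and drop stale ones.
# Column names and the 12-month cutoff are illustrative assumptions.
import pandas as pd

def clean_customers(df: pd.DataFrame, as_of: pd.Timestamp) -> pd.DataFrame:
    # Normalize the fields we match on so "Acme Corp." and "acme corp" collapse together.
    df = df.copy()
    df["company_key"] = df["company_name"].str.strip().str.lower()
    df["email_key"] = df["email"].str.strip().str.lower()

    # Keep the most recently updated record for each company/email pair.
    df = df.sort_values("last_updated", ascending=False)
    df = df.drop_duplicates(subset=["company_key", "email_key"], keep="first")

    # Treat anything untouched for 12+ months as obsolete.
    cutoff = as_of - pd.DateOffset(months=12)
    df = df[df["last_updated"] >= cutoff]

    return df.drop(columns=["company_key", "email_key"])
```

Real deduplication is usually fuzzier than an exact key match, but even a simple pass like this exposes how many “views” of the same customer a file really contains.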

For sales and marketing especially, any data analytics and business intelligence based on “bad data” will yield misleading or incorrect results. In fact, eight in ten companies indicated that dirty data is hindering their lead generation campaigns. Instead of cleaning and clarifying, companies pile ever-increasing amounts of data into storage with a “we’ll sort it all out later” approach, particularly in sales and marketing organizations. But real-time data changes things: the sheer volume of information, and how quickly it makes previous knowledge obsolete, requires changing habits.

Enter a new support industry to recover, clean and manage that information: Data Service Providers, companies that analyze, double-check and connect the dots when you have partial details and need more. Hoovers and ZoomInfo are two examples for finding business people and companies; ThinkorSwim takes every transaction from the financial markets and analyzes patterns, trends and activity levels. News aggregators, auction pricing guides and search engines are just a few other sources of meta-data (data ABOUT your data).

“The problem of data decay is that it’s faster than it’s ever been. Seventy-one percent of business cards you collect have at least one change within 12 months,” said Sam Zales, president of ZoomInfo. An estimated 600,000 small businesses are created and vaporize in any five-year period. Knowing which is which can be critical if that is your marketplace.

No human-powered research effort could possibly keep pace, so a regular schedule and program is an important first step. Think of a health inspector: you can’t check every restaurant daily, but responding to complaints and maintaining a rigorous plan for oversight is a good policy. Testing data for accuracy on a regular schedule answers questions such as “What do you know? And how do you know?”
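As one hedged example of what a scheduled accuracy check might look like, the snippet below runs a handful of basic validations (missing emails, malformed emails, records past a staleness threshold) and reports the failure rate for each rule. The field names, rules and threshold are illustrative assumptions, not a prescribed method.

```python
# Illustrative scheduled data-quality audit; rules and field names are assumptions.
import re
from datetime import datetime, timedelta

STALE_AFTER = timedelta(days=365)
EMAIL_RE = re.compile(r"^[^@\s]+@[^@\s]+\.[^@\s]+$")

def audit_records(records, now=None):
    """Return the fraction of records failing each basic accuracy check."""
    now = now or datetime.utcnow()
    failures = {"missing_email": 0, "bad_email": 0, "stale": 0}
    for rec in records:
        email = rec.get("email")
        if not email:
            failures["missing_email"] += 1
        elif not EMAIL_RE.match(email):
            failures["bad_email"] += 1
        if now - rec.get("last_updated", now) > STALE_AFTER:
            failures["stale"] += 1
    total = len(records) or 1
    return {rule: count / total for rule, count in failures.items()}

if __name__ == "__main__":
    sample = [
        {"email": "jane@example.com", "last_updated": datetime(2010, 1, 4)},
        {"email": "not-an-email", "last_updated": datetime(2011, 3, 1)},
    ]
    print(audit_records(sample, now=datetime(2011, 3, 29)))
```

A job like this might run weekly from a scheduler such as cron, with the failure rates tracked over time so you can see whether your data hygiene is improving or decaying.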

Then you can confidently answer the question “Are you sure?”

Subscribe to our blog to stay informed on how to improve your company’s data quality and other data analytics topics.

David Wallace
Spotfire Blogging Team
