How “Dirty Data” Derails Your Company’s Data Analytics and ROI
How much of what your company knows is useful? Which data is current? In a recent webinar about the dangers of “Dirty Data,” Jay Hidalgo of The Annuitas Group said many companies simply don’t know. He estimated that 30 percent of companies have no strategy for “data hygiene” – removing duplicate or obsolete information. He said 34 percent of companies ask the front-end sales team to update customer and prospect files, but even that approach can be flawed when multiple products or business units hold different “views” of the same company, client or transaction.
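To make “data hygiene” concrete, here is a minimal sketch of one of its most basic steps: removing duplicate contact records. The field names and the matching rule (normalized email plus company) are illustrative assumptions, not any vendor’s actual method.

```python
# Minimal data-hygiene sketch: deduplicate contact records.
# Field names and the matching key are hypothetical assumptions.

def normalize(record):
    """Normalize the fields that most often vary between duplicates."""
    return (
        record["email"].strip().lower(),
        record["company"].strip().lower(),
    )

def dedupe(records):
    """Keep the first record seen for each normalized (email, company) key."""
    seen = set()
    unique = []
    for record in records:
        key = normalize(record)
        if key not in seen:
            seen.add(key)
            unique.append(record)
    return unique

contacts = [
    {"email": "Pat@Example.com", "company": "Acme Corp"},
    {"email": "pat@example.com ", "company": "acme corp"},
    {"email": "lee@example.com", "company": "Acme Corp"},
]
print(len(dedupe(contacts)))  # the first two records collapse into one
```

Real-world matching is far messier (typos, renamed companies, merged accounts), which is exactly why the data-service providers discussed below exist.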
For sales and marketing especially, any data analytics and business intelligence based on bad data will yield misleading or incorrect results. In fact, eight out of ten companies indicated that dirty data is hindering their lead-generation campaigns. Instead of cleaning and clarifying, companies pile ever-increasing amounts of data into storage with a “we’ll sort it all out later” approach, particularly in sales- and marketing-driven companies. But real-time data changes things: the sheer volume of information, and how quickly it makes previous knowledge obsolete, requires changing those habits.
Enter a new support industry to recover, clean and manage that information: data service providers, companies that analyze, double-check and connect the dots when you have partial details and need more. Hoovers and ZoomInfo are two examples for finding business people and companies; ThinkorSwim takes every transaction from financial markets to analyze patterns, trends and activity levels. News aggregators, auction pricing guides and search engines are just a few other sources of metadata (data ABOUT your data).
“The problem of data decay is that it’s faster than it’s ever been. Seventy-one percent of business cards you collect have at least one change within 12 months,” said Sam Zales, president of ZoomInfo. An estimated 600,000 small businesses are created and vanish within a five-year period. Knowing which ones survive can be critical if that is your marketplace.
No human-powered research effort could possibly keep pace, so a regular schedule and program is an important first step. Like a health inspector, you can’t check every restaurant daily, but responding to complaints and maintaining a rigorous plan for oversight is a good policy. Testing data for accuracy on a regular schedule answers questions such as “What do you know? And how do you know it?”
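A scheduled accuracy test can be as simple as flagging records that haven’t been verified recently. The sketch below assumes a hypothetical `last_verified` field and uses a 12-month staleness window, echoing the business-card statistic above; both are illustrative choices, not a prescribed policy.

```python
# Hedged sketch of a scheduled data-quality audit: flag records whose
# last verification date is older than a chosen threshold.
# The record fields and the 12-month window are illustrative assumptions.

from datetime import date, timedelta

STALE_AFTER = timedelta(days=365)  # most contact details change within a year

def stale_records(records, today):
    """Return records not verified within the staleness window."""
    return [r for r in records if today - r["last_verified"] > STALE_AFTER]

crm = [
    {"name": "Acme Corp", "last_verified": date(2024, 1, 15)},
    {"name": "Globex", "last_verified": date(2025, 6, 1)},
]
for record in stale_records(crm, today=date(2025, 7, 1)):
    print("Re-verify:", record["name"])  # only Acme Corp is flagged
```

Running a check like this on a fixed schedule, then routing the flagged records to a verification workflow, is the kind of rigorous oversight plan the health-inspector analogy describes.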
Then you can confidently answer the question “Are you sure?”
Subscribe to our blog to stay informed on how to improve your company’s data quality and other data analytics topics.
Spotfire Blogging Team