SmartData Collective
© 2008-25 SmartData Collective. All Rights Reserved.
Business Intelligence, Data Quality, Data Warehousing

Be Prepared to Duel with Data Quality

Rick Sherman
10 Min Read

Plenty of business intelligence (BI) or data warehouse projects have been blindsided by complications related to data quality. Sometimes these issues aren’t apparent until business users start testing the systems just before going live with the projects. What causes BI project teams to get caught off guard by data quality issues? Why do these problems surface so late in the projects?

There are two common pitfalls: defining data quality too narrowly and assuming data quality is the responsibility of the source systems.

People often assume that data quality simply means eliminating bad data – data that is missing, inaccurate or incorrect. Bad data is certainly a problem, but it isn’t the only problem. Good data quality programs also ensure that data is comprehensive, consistent, relevant and timely.
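To make the broader definition concrete, the sketch below checks a single record for problems beyond "bad values": missing required fields (comprehensiveness), an out-of-vocabulary code (consistency), and staleness (timeliness). The field names, region codes, and thresholds are invented for illustration; a real program would draw them from the business requirements.

```python
from datetime import datetime, timedelta

# Hypothetical record from a customer feed; field names are illustrative.
record = {
    "customer_id": "C-1001",
    "email": "pat@example.com",
    "region": "EMEA",
    "last_updated": datetime(2024, 1, 2),
}

REQUIRED_FIELDS = {"customer_id", "email", "region"}   # comprehensiveness
VALID_REGIONS = {"AMER", "EMEA", "APAC"}               # consistency
MAX_AGE = timedelta(days=30)                           # timeliness

def quality_issues(rec, now):
    """Return a list of quality problems beyond simple 'bad values'."""
    issues = []
    # Required fields that are absent or empty.
    missing = REQUIRED_FIELDS - {k for k, v in rec.items() if v}
    if missing:
        issues.append(f"missing fields: {sorted(missing)}")
    # Codes must come from the agreed vocabulary.
    if rec.get("region") not in VALID_REGIONS:
        issues.append(f"unknown region: {rec.get('region')}")
    # Data older than the agreed freshness window is flagged.
    if now - rec["last_updated"] > MAX_AGE:
        issues.append("stale record")
    return issues
```

A record can pass a simple "is the value correct?" test and still fail the timeliness or consistency checks, which is exactly why the narrow definition is not enough.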


Don’t Blame the Source Systems

Defining data quality too narrowly often leads people to assume that source transactional systems – through data entry or systemic errors – cause the bad data. They may be the source of some errors, but the more likely culprits are inconsistent dimensions across source systems (such as customer or product identifiers) or inconsistent definitions for derived data across organizations. Conforming dimensions – developing consistent customer or product identifiers – is essential for accessing and analyzing data across a company. The source systems do not own the data quality issues that span other systems; the BI project team does. Each source system needs to ensure that the data within its own silo is correct, but the BI project team is responsible for providing the business with data that is consistent across the enterprise.
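Conforming a dimension usually comes down to mapping each source system's local key to one enterprise-wide key. The sketch below shows the idea with a cross-reference table; the system names, keys, and fields are made up for illustration, and in practice the mapping is often maintained by the BI team or a master data management tool.

```python
# Two source systems that identify the same customer differently.
crm_customers = [
    {"crm_id": "A17", "name": "Acme Corp"},
]
billing_customers = [
    {"acct_no": "0042", "name": "ACME CORPORATION"},
]

# Cross-reference table: (system, local key) -> conformed enterprise key.
xref = {
    ("crm", "A17"): "CUST-001",
    ("billing", "0042"): "CUST-001",
}

def conform(source, rec, key_field):
    """Attach the enterprise-wide customer key to a source record."""
    enterprise_id = xref.get((source, rec[key_field]))
    if enterprise_id is None:
        # An unmapped key is itself a data quality finding to raise early.
        raise KeyError(f"unmapped {source} key: {rec[key_field]}")
    return {**rec, "enterprise_customer_id": enterprise_id}
```

Once both records carry the same `enterprise_customer_id`, analysis across the two systems becomes possible; the hard, ongoing work is keeping the cross-reference complete as new keys appear.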

Similarly, each organization within the enterprise may have valid business reasons to derive data differently from the others; for example, its position in a set of business processes may shape how it views its data. The individual organizations aren’t tasked with developing common definitions for derived data, but the BI project team is. Many BI project teams try to claim that data quality issues aren’t their responsibility. From a practical viewpoint, however, the team needs to make these issues its own, because its job is to ensure the highest data quality possible. The BI project team packages the data for consumption by business users, and it will be held accountable for the data quality. That may not seem fair, but the project’s success depends on it.
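A toy example of legitimately divergent derivations: two departments computing "revenue" from the same orders. The figures and rules below are invented; the point is that neither derivation is wrong, so the BI team must publish both under distinct, agreed names rather than one ambiguous measure.

```python
# The same order data, derived differently by two organizations.
orders = [
    {"amount": 100.0, "status": "shipped", "returned": False},
    {"amount": 50.0,  "status": "booked",  "returned": False},
    {"amount": 30.0,  "status": "shipped", "returned": True},
]

def sales_revenue(rows):
    # Sales counts every booked order, returns included.
    return sum(r["amount"] for r in rows)

def finance_revenue(rows):
    # Finance counts only shipped, non-returned orders.
    return sum(r["amount"] for r in rows
               if r["status"] == "shipped" and not r["returned"])
```

Running both against the same rows yields different totals, and a report that simply labels either one "revenue" will look like a data quality defect to the other organization.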

Don’t Shortchange the Pilot

Surprises happen when the project does an initial pilot or release involving only a small subset of source systems. While there may be many good reasons to have a narrow scope for a pilot, you won’t get an appreciation for the effort necessary to conform these dimensions as the number of source systems expands.
Sometimes pilots are run with a single organization, using only that organization’s definitions for derived data. Once again, the tough issue is often how to accommodate the differences in derivation definitions between organizations. In both cases, the real challenges are encountered when dealing with multiple systems and organizations. Business users need to look at the big picture, and that is only possible when they can access and analyze data across the enterprise.

Steps to Address Data Quality

To ensure data quality, the BI project team needs to address it from the very beginning. Here are several significant steps to consider:

  • Require the business to define data quality in a broad sense, establish metrics to monitor and measure it, and determine what should be done if the data fails to meet these metrics.
  • Undertake a comprehensive data profiling effort when performing the source systems analysis. Data anomalies across source systems and over time (historical data does not always age well!) need to be identified so that the team can address them with the business early.
  • Incorporate data quality into all data integration and business intelligence processes from data sourcing to information consumption by the business user. Data quality issues need to be detected as early in the processes as possible and dealt with as defined in the business requirements.
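The first two steps – agreed metrics and data profiling – can be sketched as a small profiling routine that measures per-column null rates and flags columns breaching a threshold the business has signed off on. The threshold and sample data are illustrative only.

```python
# Minimal data-profiling sketch: per-column null rates plus a pass/fail
# check against a business-agreed threshold (5% here, purely illustrative).

def profile(rows, max_null_rate=0.05):
    """Return ({column: null_rate}, sorted columns breaching the threshold)."""
    columns = {c for r in rows for c in r}
    total = len(rows)
    null_rate = {
        c: sum(1 for r in rows if r.get(c) in (None, "")) / total
        for c in columns
    }
    failing = sorted(c for c, rate in null_rate.items()
                     if rate > max_null_rate)
    return null_rate, failing
```

Running this during source systems analysis – rather than during user acceptance testing – is what lets the team take the findings back to the business early, as the steps above require.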

Enterprises must present data that meets very stringent data quality levels, especially in light of recent compliance regulations and demands. The level of data transparency needed can only result from establishing a strong commitment to data quality and building the processes to ensure it.
