Need for a Robust Data Quality Framework for Big Data

koolhits

The challenges associated with data quality, and the corresponding accountability across business domains and research areas, have been a longstanding concern. Among the key data quality problems are:

  • Non-interoperability – Data collected in one system are not electronically transmittable to other systems. Re-entering the same data in multiple systems consumes resources and increases the potential for data-entry errors.
  • Non-standardized data definitions – Different data providers use different definitions for the same elements. Passed on to the district or state level, non-comparable data are aggregated inappropriately and produce inaccurate results.
  • Unavailability of data – The required data do not exist or are not readily accessible because of one or another quality issue. In some cases, data providers may take a “just fill something in” approach to satisfy distant data collectors, thus creating errors.
  • Inconsistent item response – Not all data providers report the same data elements. Idiosyncratic reporting of different types of information from different sources creates gaps and errors in macro-level data aggregation.
  • Inconsistency over time – The same data element is calculated, defined, and/or reported differently from year to year. Longitudinal inconsistency creates the potential for inaccurate analysis of trends over time.
  • Data entry errors – Inaccurate data are entered into a data collection instrument. Errors in reporting information can occur at any point in the process, from the student’s assessment answer sheet to the state’s report to the federal government.
  • Lack of timeliness – Data are reported too late. Late reporting can jeopardize the completeness of macro-level reporting.

We need serious thought and a readily implementable approach in which data quality rules can be defined just like other business rules, ensuring proactive reporting of quality issues, checkpoints on newly inserted data, and so on.
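
As one possible illustration (a hypothetical sketch in Python, not the article’s implementation), such rules could be registered centrally, like other business rules, and evaluated as a checkpoint whenever new records are inserted, with violations reported proactively. The rule names and field names below are assumptions.

```python
# Hypothetical sketch: data quality rules registered like ordinary business
# rules and run as a checkpoint on every insert, with proactive reporting.
from typing import Callable, Dict, List

QUALITY_RULES: Dict[str, Callable[[dict], bool]] = {}

def quality_rule(name: str):
    """Register a data quality rule alongside other business rules."""
    def decorator(fn: Callable[[dict], bool]) -> Callable[[dict], bool]:
        QUALITY_RULES[name] = fn
        return fn
    return decorator

@quality_rule("age_in_valid_range")
def age_in_valid_range(record: dict) -> bool:
    # Assumed field name and range; adjust to the real schema.
    return 0 <= record.get("age", -1) <= 120

@quality_rule("email_present")
def email_present(record: dict) -> bool:
    return bool(record.get("email"))

def insert_with_checkpoint(store: List[dict], record: dict) -> List[str]:
    """Checkpoint on new data: run every registered rule, report violations."""
    violations = [name for name, rule in QUALITY_RULES.items() if not rule(record)]
    if violations:
        print(f"Data quality alert for {record}: {violations}")
    store.append(record)
    return violations

if __name__ == "__main__":
    rows: List[dict] = []
    insert_with_checkpoint(rows, {"age": 34, "email": "a@example.com"})  # clean
    insert_with_checkpoint(rows, {"age": 212, "email": ""})              # flags both rules
```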

Imagine a framework that could enforce some of the following validation rules (an illustrative sketch follows the list):

  1. Range Check – This checks that the data lies within a specified range of values
  2. Presence Check – This checks that required data are not missing
  3. Domain Check – This checks that only certain values are accepted
  4. Cross-Field Check – This checks that multiple fields in combination are valid
  5. Cross-Table Check – This checks that multiple tables in combination are valid
  6. Uniqueness Validation – Ensure that the values in a column are unique
  7. Referential Integrity Validation – Validate values between tables in a relational database model
  8. Duplicate Identification – Identify rows that are unwanted duplicate records
  9. Format Validation – Ensure data values conform to a preset mask pattern
  10. Business Rule Compliance – Ensure that data satisfy domain-specific business rules
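
A minimal, purely illustrative sketch of what a few of these checks might look like in code; the field names, ranges, allowed values, and mask pattern are assumptions, not part of any existing framework.

```python
# Illustrative implementations of some of the validation rules listed above.
import re
from typing import Iterable, List, Set

def range_check(value: float, low: float, high: float) -> bool:
    """Range Check: the value lies within [low, high]."""
    return low <= value <= high

def presence_check(row: dict, field: str) -> bool:
    """Presence Check: the required field is not missing or empty."""
    return row.get(field) not in (None, "")

def domain_check(value: str, allowed: Set[str]) -> bool:
    """Domain Check: only certain values are accepted."""
    return value in allowed

def cross_field_check(row: dict) -> bool:
    """Cross-Field Check: e.g. an end date must not precede the start date."""
    return row["end_date"] >= row["start_date"]

def uniqueness_check(values: Iterable) -> bool:
    """Uniqueness Validation: no value in the column repeats."""
    values = list(values)
    return len(values) == len(set(values))

def referential_integrity_check(child_keys: Iterable, parent_keys: Set) -> List:
    """Referential Integrity: every foreign key must exist in the parent table."""
    return [k for k in child_keys if k not in parent_keys]

def duplicate_rows(rows: List[dict], key_fields: List[str]) -> List[dict]:
    """Duplicate Identification: flag rows whose key fields repeat."""
    seen, dupes = set(), []
    for row in rows:
        key = tuple(row[f] for f in key_fields)
        if key in seen:
            dupes.append(row)
        else:
            seen.add(key)
    return dupes

def format_check(value: str, mask: str = r"\d{4}-\d{2}-\d{2}") -> bool:
    """Format Validation: the value matches a preset mask (ISO-style date here)."""
    return re.fullmatch(mask, value) is not None

if __name__ == "__main__":
    print(range_check(42, 0, 120))                          # True
    print(presence_check({"email": ""}, "email"))           # False
    print(domain_check("ACTIVE", {"ACTIVE", "INACTIVE"}))   # True
    print(uniqueness_check([1, 2, 2]))                      # False
    print(referential_integrity_check([10, 99], {10, 20}))  # [99]
    print(format_check("2024-01-31"))                       # True
```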

