MDM Can Challenge Traditional Development Paradigms

Evan Levy
Last updated: 2010/03/30 at 1:40 PM

How Dare You Challenge My Paradigm mug (via cafepress.com)

I’ve been making the point for the past several years that master data management (MDM) development projects are different and are accompanied by unique challenges. Because of the “newness” of MDM and its unique value proposition, MDM development can challenge traditional IT development assumptions.

MDM is very much a transactional processing system; it receives application requests, processes them, and returns a result. The complexities of transaction management, near real-time processing, and the details associated with security, logging, and application interfaces are a handful. Most OLTP applications assume that the provided data is usable; if the data is unacceptable, the application simply returns an error. Most OLTP developers are accustomed to addressing these types of functional requirements. Dealing with imperfect data has traditionally been unacceptable because it slowed down processing; ignoring it or returning an error was a best practice.
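To make that contrast concrete, here is a minimal sketch (in Python, with hypothetical field names not taken from the post) of the traditional OLTP stance: validate the incoming request and return an error if the data is unacceptable, rather than trying to repair or interpret it.

```python
# A minimal sketch of the traditional OLTP stance described above:
# validate the request and fail fast if the data is unacceptable.
# Field names and rules are illustrative assumptions, not from the post.

class ValidationError(Exception):
    """Raised when a request carries data the application won't process."""

def create_customer(record: dict) -> dict:
    required = ("customer_id", "name", "country_code")
    missing = [f for f in required if not record.get(f)]
    if missing:
        # Traditional behavior: return an error and move on --
        # repairing or interpreting the bad values is not the app's job.
        raise ValidationError(f"missing or empty fields: {missing}")
    if len(record["country_code"]) != 2:
        raise ValidationError("country_code must be a 2-letter ISO code")
    # ... persist the transaction and return a result ...
    return {"status": "created", "customer_id": record["customer_id"]}
```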

What makes MDM development different is the focus on data content (and value-based) processing. The whole purpose of MDM is to deal with all data, including the unacceptable stuff. It doesn’t assume the data is good enough; MDM code assumes the data is complex and “unacceptable” and focuses on figuring out the values. The development methods associated with deciphering, interpreting, or decoding unacceptable data to make it usable are very different. They require a deep understanding of a different type of business rule: the rules associated with data content. Because most business processes have data inputs and data outputs, there can be dozens of data content rules associated with each business process. Traditionally, OLTP developers didn’t focus on the business content rules; they were focused on automating business processes.
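As an illustration of what a data content rule can look like in practice, here is a small, hedged sketch of decoding one attribute. The attribute, code values, and mapping are assumptions made for the example, not the author’s rules.

```python
# A hypothetical data-content rule: instead of rejecting a messy value,
# the MDM code tries to interpret it and derive a usable standard form.
from typing import Optional

GENDER_CODES = {
    "m": "M", "male": "M", "1": "M",
    "f": "F", "female": "F", "2": "F",
}

def standardize_gender(raw: Optional[str]) -> str:
    """Decode the many ways source systems express the same fact."""
    if raw is None:
        return "U"  # unknown, but still usable downstream
    return GENDER_CODES.get(raw.strip().lower(), "U")

# Messy inputs are interpreted rather than bounced back as errors.
assert standardize_gender(" Female ") == "F"
assert standardize_gender("1") == "M"
assert standardize_gender("??") == "U"
```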

MDM developers need to be comfortable with addressing the various data content processing issues (identification, matching, survivorship, etc.) along with the well-understood issues of OLTP development (transaction management, high performance, etc.). We’ve learned that the best MDM development environments invest heavily in data analysis and data management during the initial design and development stages. They invest in profiling and analyzing each system of creation. They also differentiate hub development from source on-boarding and hub administration. The team that focuses on application interfaces, CRUD processing, and transaction and bulk processing requires different skills from the developers focused on match processing rules, application on-boarding, and hub administration. The developers focused on hub construction are different from the team members focused on the data changes and value questions coming from data stewards and application developers. This isn’t about differentiating development from maintenance; it’s about differentiating the skills associated with the various development activities.
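For readers unfamiliar with those terms, the sketch below shows, in deliberately simplified and hypothetical form, what matching and survivorship rules can look like. It illustrates the concepts only; the thresholds, fields, and precedence rules are assumptions, not the author’s implementation or any particular hub’s logic.

```python
# Hypothetical sketches of two hub concerns mentioned above:
# matching (do two records describe the same party?) and survivorship
# (which value wins when matched records disagree?).

def normalize(name: str) -> str:
    return " ".join(name.lower().split())

def is_match(a: dict, b: dict) -> bool:
    # Match if normalized names agree and either email or phone agrees.
    same_name = normalize(a["name"]) == normalize(b["name"])
    same_contact = (a.get("email") and a.get("email") == b.get("email")) or \
                   (a.get("phone") and a.get("phone") == b.get("phone"))
    return same_name and bool(same_contact)

def survive(records: list) -> dict:
    # Survivorship rule for this sketch: per attribute, prefer the most
    # recently updated non-empty value (records carry an "updated_at" key).
    golden = {}
    for rec in sorted(records, key=lambda r: r["updated_at"]):
        for field, value in rec.items():
            if value not in (None, ""):
                golden[field] = value  # later (newer) records overwrite
    return golden
```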

If the MDM team does its job right, it can dramatically reduce the data errors that cause application processing and reporting problems. It can also identify and quantify data problems so that other development teams can recognize them, too. This is why MDM development is critical to creating the single version of truth.

Image via cafepress.com.

Link to original post

TAGGED: data quality
