
MDM Can Challenge Traditional Development Paradigms

Evan Levy
5 Min Read

How Dare You Challenge My Paradigm mug (via cafepress.com)

I’ve been making the point for the past several years that master data management (MDM) development projects are different and come with their own unique challenges. Because of the “newness” of MDM and its unique value proposition, MDM development can challenge traditional IT development assumptions.

MDM is very much a transactional processing system; it receives application requests, processes them, and returns a result. The complexities of transaction management, near real-time processing, and the details associated with security, logging, and application interfaces are a handful. Most OLTP applications assume that the provided data is usable; if the data is unacceptable, the application simply returns an error. Most OLTP developers are accustomed to addressing these types of functional requirements. Dealing with imperfect data has traditionally been unacceptable because it slowed down processing; ignoring it or returning an error was a best practice.
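
As a rough illustration (my field names and rules here are hypothetical, sketched in Python), the classic OLTP stance looks something like this: validate the incoming request and return an error the moment the data is unacceptable, rather than trying to repair it.

    class ValidationError(Exception):
        """Raised when an incoming record fails basic checks."""

    def process_order(record: dict) -> str:
        # Typical OLTP behavior: if the data is unacceptable, return an error
        # immediately rather than trying to repair or interpret it.
        if not record.get("customer_id"):
            raise ValidationError("missing customer_id")
        if record.get("quantity", 0) <= 0:
            raise ValidationError("quantity must be positive")
        # ... commit the transaction and return a confirmation ...
        return f"order accepted for customer {record['customer_id']}"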

What’s different about MDM development is the focus on data content (and value-based) processing. The whole purpose of MDM is to deal with all data, including the unacceptable stuff. Where a traditional OLTP application assumes the data is good enough, MDM code assumes the data is complex and “unacceptable” and focuses on figuring out the values. The development methods associated with deciphering, interpreting, or decoding unacceptable data to make it usable are very different. They require a deep understanding of a different type of business rule: those associated with data content. Because most business processes have data inputs and data outputs, there can be dozens of data content rules associated with each business process. Traditionally, OLTP developers didn’t focus on the business content rules; they were focused on automating business processes.
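
A hypothetical sketch of one such data content rule (the field and the alias mapping are illustrative assumptions, not rules from any actual hub): instead of rejecting a messy value, the code tries to decipher and standardize it.

    # Illustrative data content rule: interpret a free-text state value instead
    # of rejecting it. Real hubs carry dozens of these rules per business process.
    US_STATE_ALIASES = {
        "ca": "CA", "calif": "CA", "california": "CA",
        "tx": "TX", "tex": "TX", "texas": "TX",
    }

    def standardize_state(raw: str):
        """Return a standard code, or None only when the value cannot be deciphered."""
        cleaned = raw.strip().lower().rstrip(".")
        return US_STATE_ALIASES.get(cleaned)

    print(standardize_state("  Calif. "))  # -> "CA"
    print(standardize_state("Texas"))      # -> "TX"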

MDM developers need to be comfortable addressing the various data content processing issues (identification, matching, survivorship, etc.) along with the well-understood issues of OLTP development (transaction management, high performance, etc.). We’ve learned that the best MDM development environments invest heavily in data analysis and data management during the initial design and development stages. They invest in profiling and analyzing each system of creation. They also differentiate hub development from source on-boarding and hub administration. The team that focuses on application interfaces, CRUD processing, and transaction and bulk processing requires different skills from the developers focused on match processing rules, application on-boarding, and hub administration. The developers focused on hub construction are different from the team members focused on the data changes and value questions coming from data stewards and application developers. This isn’t about differentiating development from maintenance; it’s about differentiating the skills associated with the various development activities.
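
To give a flavor of what matching and survivorship mean in code, here is a deliberately simplified sketch; the matching key, source ranking, and record layout are assumptions for illustration, and real hubs rely on much richer fuzzy-matching and survivorship rules.

    # Identification/matching: records that share a normalized email are treated
    # as the same party. Survivorship: prefer values from the more trusted source.
    RECORDS = [
        {"source": "crm",     "email": "pat@example.com", "name": "Pat Smith", "phone": None},
        {"source": "billing", "email": "PAT@example.com", "name": "P. Smith",  "phone": "555-0100"},
    ]

    SOURCE_RANK = {"crm": 1, "billing": 2}  # lower rank = more trusted

    def match_key(record):
        return record["email"].strip().lower()

    def survive(records):
        ranked = sorted(records, key=lambda r: SOURCE_RANK.get(r["source"], 99))
        return {field: next((r[field] for r in ranked if r[field] is not None), None)
                for field in ("email", "name", "phone")}

    groups = {}
    for rec in RECORDS:
        groups.setdefault(match_key(rec), []).append(rec)

    for key, recs in groups.items():
        print(key, "->", survive(recs))
    # pat@example.com -> {'email': 'pat@example.com', 'name': 'Pat Smith', 'phone': '555-0100'}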

If the MDM team does its job right, it can dramatically reduce the data errors that cause application processing and reporting problems. It can also identify and quantify data problems so that other development teams can recognize them, too. This is why MDM development is critical to creating the single version of truth.

Image via cafepress.com.

Link to original post

Tagged: data quality