Responding to a Follower’s Question: Why Keep Data Replication to a Minimum?

I got an e-mail from one of my followers (I say this in the hopes there are many more!).  In my blog post “What will you tolerate?” I provided a sample of guiding principles.  One of them was the suggestion that data replication be kept to a minimum.  The reader wanted to get a bit more depth to that point.

Going with the theory that if one person in the audience has that question, there may be several more with the same thought, I wanted to clarify and expand on that point.  I will give a shout-out to Jon and thank him for the question (as well as for correcting some of my typos).

Editor’s note: Rob Armstrong is an employee of Teradata. Teradata is a sponsor of The Smart Data Collective.

So why do I suggest that data replication be minimized?  There are several reasons beyond the very obvious ones of disk storage and the cost of maintenance.

The main point of this guiding principle is that once the data has been cleansed, transformed, and integrated into the core data warehouse, access should be against that data directly (or through views).  There is very little reason to then extract the data to another database or platform for analytics, yet many people extract the data into data marts, Excel, or other applications.

Oftentimes this is justified by claims of performance factors, IT barriers, or a variety of other issues.
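
To make the "access it in place" point concrete, here is a minimal sketch in Python.  It uses the standard-library sqlite3 module as a stand-in for the warehouse platform, and the table and view names are hypothetical; the idea is simply that analysts query a view over the cleansed data rather than carrying an extract off to a spreadsheet.

```python
import sqlite3

# Stand-in for the core warehouse (hypothetical schema, for illustration only).
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE sales (
        region    TEXT,
        sale_date TEXT,
        amount    REAL
    )
""")
conn.executemany(
    "INSERT INTO sales VALUES (?, ?, ?)",
    [("East", "2010-01-15", 120.00),
     ("West", "2010-01-16", 340.00),
     ("East", "2010-02-02", 75.50)],
)

# A view over the cleansed, integrated data.  Users query this directly
# instead of extracting rows into a data mart or spreadsheet.
conn.execute("""
    CREATE VIEW v_sales_by_region AS
    SELECT region, SUM(amount) AS total_sales
    FROM sales
    GROUP BY region
""")

# Every query sees the warehouse as it is right now -- there is no
# stale copy to drift out of sync with the source.
for region, total in conn.execute("SELECT * FROM v_sales_by_region"):
    print(region, total)
```

The same pattern applies on any warehouse platform: because the view lives with the warehouse, the business rules travel with it, and every consumer gets the same audited answer.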

Whatever the reason, this duplication of data is a problem and should be avoided whenever possible.  When data is replicated out, the data rules, data quality checks, and audit trails very rarely accompany the extract.  This leads to users taking the data, possibly transforming it in their own spreadsheets, and then sharing that extract with others.  Now the data in the data warehouse no longer matches any reports or analytics built from the extracts, which leads to confusion and finger-pointing about where answers are coming from and whose answers are correct.  Added to this is the problem of a user who decides to "drill down" from manipulated data, only to find that the underlying data in the warehouse no longer matches the report.

Now this is not to say there is never a time when replicating data is justified.  Clearly, you will need to replicate data for disaster recovery systems.  You may also want to replicate data into a test environment so new applications can be developed and tested against "real data".  These cases are reasonable because the data is audited for consistency and does not become the source of new analytics.

You may also have to work around a real technical issue, such as a business-critical application (with proven value) that requires data to be co-located with its processing.  In that case, care should be taken to document exactly what the technical issue is and how it will be resolved.  Finally, there must be an understanding that the application will be pointed back to the core warehouse once the issue is resolved.  This of course leads us to the whole role of governance, but that is another blog.
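
One lightweight way to honor that commitment is to keep the application's data source as a single named configuration value, so repointing it at the warehouse later is a one-line change.  A minimal sketch, assuming hypothetical connection strings:

```python
# Hypothetical connection strings -- for illustration only.
WAREHOUSE_DSN = "warehouse.example.com/core"     # the system of record
COLOCATED_DSN = "appserver.example.com/replica"  # the documented exception

# While the technical issue stands, the exception is in force.  Once it is
# resolved, pointing the application back to the core warehouse is a
# one-line change -- exactly the commitment governance should track.
ACTIVE_DSN = COLOCATED_DSN
```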

That help?
