Delivering Quality – Where it Counts, When it Counts

Considering a data quality program? What’s the best way to implement it? One of the decisions organizations must make is where data quality (Trillium’s focus) fits within their overall approach to technology architecture and business solutions. Is data quality technology a “solution” unto itself, or is it a service delivered to other solutions? Let’s look to some industry commentary for guidance.

In a March 2016 Gartner report, The State of Data Quality: Current Practices and Evolving Trends, analysts Saul Judah and Ted Friedman cited a survey of 390 organizations in which the leading use case for data quality, named by more than 50% of respondents, was support for the “ongoing operation of business applications.” Gartner noted that this reflected “increased activity in [CRM and ERP] application renovation,” which suggests that a key aspect of modernizing an organization’s business applications is proactively addressing the quality of the data those applications use. Makes sense; you can’t enhance your application portfolio without considering the data that drives it.

Gartner is not alone. TDWI’s Philip Russom has stated in a Checklist Report that “failing to ensure high-quality operational data may put many worthwhile business goals for operational excellence at risk.” That report characterizes operational data quality as “largely about the same practices and techniques found in any data quality initiative but focuses on continuous improvement for operational data and the operational business processes that depend on such data.” As the word “continuous” makes clear, this perspective treats data quality as an ongoing process, not a one-time or standalone project.

Another perspective comes from Forrester Research, which has written about “fast data,” a characterization of data that is “in the moment; it’s dynamic, agile, consumable, and intelligent so that it meets your data consumers’ real-time, self-service needs in both analytical and operational environments.” In terms of data quality, this speaks to the notion of “fit for purpose”: data needs to be suited to the context of the operational applications it serves. Data that is incomplete or poorly structured for those applications is, by definition, not fit for purpose. For example, a marketer investing in a direct mail campaign needs confidence in the addresses of the targets on the list. Pursuing an email campaign? The same goes for email addresses. Want to accelerate pipeline development by assigning certain leads directly to your account reps? You’ll quickly sabotage your efforts if contact phone numbers are wrong.
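To make “fit for purpose” concrete, here is a minimal sketch in Python of the kind of per-record checks implied above. The field names and regular expressions are illustrative assumptions, not Trillium’s implementation; real data quality tools rely on parsing and reference data such as postal files rather than simple pattern matching.

```python
import re

# Illustrative patterns only; production-grade address, email, and phone
# validation relies on reference data, not simple regexes.
EMAIL_RE = re.compile(r"^[^@\s]+@[^@\s]+\.[^@\s]+$")
PHONE_RE = re.compile(r"^\+?[\d\s\-().]{7,15}$")

def fit_for_direct_mail(record: dict) -> bool:
    """Mailable only if every postal address field is present and non-empty."""
    return all(record.get(f, "").strip() for f in ("street", "city", "postal_code"))

def fit_for_email_campaign(record: dict) -> bool:
    """Usable for an email campaign only if the email address looks plausible."""
    return bool(EMAIL_RE.match(record.get("email", "")))

def fit_for_sales_followup(record: dict) -> bool:
    """Assignable to a rep only if the phone number looks dialable."""
    return bool(PHONE_RE.match(record.get("phone", "")))

if __name__ == "__main__":
    lead = {
        "name": "Pat Example",
        "email": "pat@example.com",
        "phone": "n/a",          # placeholder, not a dialable number
        "street": "1 Main St",
        "city": "Springfield",
        "postal_code": "",       # missing: not mailable
    }
    print(fit_for_direct_mail(lead))     # False: incomplete address
    print(fit_for_email_campaign(lead))  # True
    print(fit_for_sales_followup(lead))  # False: bad phone
```

Note that the same record can be fit for one purpose (the email campaign) and unfit for another (the mailing), which is exactly the point: quality is judged against the application consuming the data.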

Let’s simplify things. It all comes down to “when”: when you need the data is when you need the assurance of its quality. If you’re assembling many disparate data sources as part of an analytics effort, then you need assurance that the data is fit for that purpose, and your data quality focus should be concentrated on data preparation in support of that effort. But if you’re supporting an operational application (like a CRM system), then you need your data quality efforts operating as a service to that solution. And since those solutions operate continuously, your data quality efforts must serve those continuous operations as well, ideally as part of the natural processing of the applications, and ideally unobtrusive enough that the quality checks never get in the way.
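As a rough sketch of what data quality “as a service” to an operational application might look like (hypothetical names throughout; this is not Trillium’s API), the snippet below runs checks inline at the moment a record enters a CRM-like store, flagging issues rather than blocking saves so that the quality step stays unobtrusive.

```python
from typing import Callable, Optional

# A check returns a description of the issue, or None if the record passes.
QualityCheck = Callable[[dict], Optional[str]]

def has_email(record: dict) -> Optional[str]:
    return None if record.get("email", "").strip() else "missing email"

def has_phone(record: dict) -> Optional[str]:
    return None if record.get("phone", "").strip() else "missing phone"

class QualityGate:
    """Runs quality checks as part of the normal save path, annotating
    records with any issues instead of rejecting them outright."""

    def __init__(self, checks: list[QualityCheck]) -> None:
        self.checks = checks

    def save(self, record: dict, store: list[dict]) -> None:
        issues = [msg for check in self.checks if (msg := check(record)) is not None]
        record["quality_issues"] = issues  # flag, don't block
        store.append(record)

if __name__ == "__main__":
    crm: list[dict] = []
    gate = QualityGate([has_email, has_phone])
    gate.save({"name": "Pat", "email": "pat@example.com"}, crm)
    print(crm[0]["quality_issues"])  # ['missing phone']
```

Whether to flag, reject, or correct in place is a design choice; the constant is that the check runs at the point of execution rather than after the fact.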

After all, the goal of any quality effort, whether it involves data, processes, people, or anything else, is not to explain why things went wrong. It’s to ensure that they don’t go wrong in the first place. And that means implementing quality practices at the point of execution.
