An occasional series in which a review of recent posts on SmartData Collective reveals the following nuggets:
One of the most common obstacles organizations face with data quality initiatives is that many initial attempts end in failure. Some fail because of lofty expectations, unmanaged scope creep, and the unrealistic perspective that data quality problems can be permanently “fixed” by a one-time project as opposed to needing a sustained program. However, regardless of the reason for the failure, it can negatively affect morale and cause employees to resist participating in the next data quality effort.
Innovation, continuous… and discontinuous
Examples of continuous innovation are everywhere in tech, but discontinuous innovation is rare. Discontinuous innovation has given us things like the PC, the CD and DVD, MP3 players, SOA, and relational databases, and each of these innovations has itself been improved upon through continuous innovation. The MacBook Pro I’m typing this blog post on today is a product of continuous innovation. You get the idea… So when I say that the mega-vendors are in fact drivers of innovation, it’s continuous innovation for the most part, of course, but innovation nonetheless.
…a “score,” no matter how well designed or well intentioned, can and will be misused by those who don’t understand it. Equally, of course, decisions that don’t use analytics have problems, too. People’s snap judgments, based on superficial cues such as how someone dresses or the color of their skin, can be inaccurate, overriding more valuable information. A score does not suffer from these problems.
The same can be said for process design and problem solving sessions – remain aware of your level of knowledge debt and budget time to document your findings. I like to call these chunks of captured knowledge “white papers.” Calling them “white papers” helps folks understand the purpose and value of such a document: reasonably short and idea-complete. The sweet spot seems to be two to four pages, well-organized, not too wordy, but clear enough that it remains effective months after the design or process rework sessions took place.
Operational vs. analytical skills
It becomes fairly clear that the role of a DBA is very different when comparing the work activities of analytical and operational systems. I’m not suggesting that working in one environment is more complex or difficult than the other—they’re just different. Thus the activities and their associated skills are very different, which is why we often caution that a single individual may be hard-pressed to support both operational and analytical environments.
Integrating cross-business data
Many systems handle transactions, some provide statistical or predictive analytics, and some provide fast reporting. But few are truly integrated and utilize the cross-business data that is a foundation for success. Executives who allocate resources need to understand the value of this integration: accessibility, scalability, and decreased response time in reaching and viewing the critical information that drives business processes and customer profits.
Big Brother or…?
Of course, there are strong privacy considerations with the advent of these services. How does one opt in or opt out? What information is shared, how much is shared, and with whom? Arguably, on the marketing side, more detailed information (including location-based data), collected and analyzed by your wireless carrier, could help them tailor and personalize specific offers, raising marketing effectiveness. And mapping your social network could help you share information more easily (think: favorite-five plans, on steroids). But there is a fine line between “benefit” and “big brother.”