#10: Here’s a thought…

An occasional series in which a review of recent posts on SmartData Collective reveals the following nuggets:

“Intelligent” decision-making
If a predictive analytic model turns uncertainty about how a particular customer (or supplier or partner) will behave in the future into a usable probability, then you can act based in part on that probability. In other words, you can specify some rules that use the probability in deciding what action to take next. This kind of “intelligent” decision-making by systems is, I believe, the future. I think that many folks overestimate the value of making more information available – even if that information is predictive. No matter how easy you make it to consume the information, you still assume that the person is able to put it in context and use it. I call this the “so what” problem.

—James Taylor: “Beyond Predictive BI”
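To make the idea concrete, here is a minimal sketch of such a rule, assuming a churn model that emits a probability; the thresholds and action names are invented for illustration, not taken from Taylor’s post:

```python
def next_action(churn_probability: float) -> str:
    """Map a model's churn probability to a next-best action.

    Thresholds and action names are hypothetical placeholders.
    """
    if churn_probability >= 0.8:
        return "route to a retention specialist"
    if churn_probability >= 0.5:
        return "offer a targeted discount"
    return "take no action"

print(next_action(0.72))  # -> offer a targeted discount
```

The point of the quote is precisely this last step: the system itself consumes the probability and picks the action, rather than handing a score to a person and hoping they know what to do with it.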

Data gazers
All enterprise information initiatives are complex endeavors, and data quality projects are certainly no exception. Success requires people taking on the challenge united by collaboration, guided by an effective methodology, and implementing a solution using powerful technology. But the complexity of the project can sometimes work against your best intentions. It is easy to get pulled into the mechanics of documenting the business requirements and functional specifications and then to charge ahead on the common mantra: “We planned the work, now we work the plan.”

—Jim Harris: “Data Gazers”

More hurry, less progress
When executives try to move too fast, such as attempting to define and cascade a scorecard of key performance indicators (KPIs) from the executive team down to front-line employees in three months, the implementation is doomed to failure. The reason is that organizations require a managed rate of learning and buy-in. The major impediments to implementing Performance Management methodologies are not technical, such as data availability or quality; they are social. For example, KPIs should be gradually and carefully defined and cascaded downward to the KPIs of middle managers, which in turn influence their higher-level managers’ KPIs. Conflict and tension in organizations are natural, and it takes time to rationalize what to measure and how driver KPIs correlate – or not – with other influenced measures.
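On that last sentence: checking whether a driver KPI actually tracks an influenced measure is a small computation once the figures exist. A minimal sketch, with hypothetical KPI names and invented monthly values:

```python
from statistics import correlation  # Python 3.10+

# Hypothetical monthly figures: a middle manager's driver KPI
# (on-time delivery, %) against a higher-level outcome KPI
# (customer retention, %). All values are invented.
on_time_delivery = [88, 91, 85, 93, 90, 95]
customer_retention = [80, 83, 79, 86, 84, 88]

r = correlation(on_time_delivery, customer_retention)
print(f"Pearson r = {r:.2f}")  # a value near 1.0 suggests the driver tracks the outcome
```

The hard part, as the quote says, is not this arithmetic but the time it takes an organization to agree on what to measure in the first place.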

The 9th layer of hell
For many data quality analysts, I would imagine, looking at the data from a call centre is like being sentenced to the 9th layer of Hell; it’s just not a fun place to be. Why? Because, let’s face it, trying to correct bad data from the front line can be a cumbersome task. You have multiple systems to work through and lineage to deal with, and when you want data corrected or preventative safeguards put in place, there’s no one to call.
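One reading of “preventative safeguards” is a validation rule applied where records enter the system, before bad values propagate downstream. A minimal sketch, with a hypothetical record layout and a deliberately simple phone-number check:

```python
import re

# Accept 10-15 digits, optionally prefixed with "+" (illustrative rule only).
PHONE_PATTERN = re.compile(r"^\+?\d{10,15}$")

def validate_call_record(record: dict) -> list[str]:
    """Return the data quality issues found in one call-centre record."""
    issues = []
    phone = record.get("phone", "").replace(" ", "").replace("-", "")
    if not PHONE_PATTERN.match(phone):
        issues.append(f"invalid phone number: {record.get('phone')!r}")
    if not record.get("customer_id"):
        issues.append("missing customer_id")
    return issues

# A seven-digit number fails the length rule and is flagged at entry.
print(validate_call_record({"phone": "555-0100", "customer_id": "C42"}))
```

Catching the record here is far cheaper than chasing it later across multiple systems and their lineage.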

The beauty of clouds
Cloud-based services provide several advantages for analytics. Perhaps the most important is elastic capacity — if 25 processors are needed for one job for a single hour, then these can be used for just the single hour and no more. This ability of clouds to handle surge capacity is important for many groups that do analytics. With the appropriate surge capacity provided by clouds, modelers can be more productive, and this can be accomplished in many cases without requiring any capital expense.
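The economics behind that claim reduce to simple arithmetic. A back-of-the-envelope sketch, with placeholder prices that stand in for whatever a provider actually charges:

```python
# Hypothetical prices, for illustration only.
HOURLY_RATE = 0.10    # assumed cost per processor-hour of rented capacity
SERVER_COST = 2000.0  # assumed purchase price of one dedicated machine

surge_cost = 25 * 1 * HOURLY_RATE  # rent 25 processors for one hour
owned_cost = 25 * SERVER_COST      # buy 25 machines that sit idle afterwards

print(f"elastic surge: ${surge_cost:.2f}")    # $2.50
print(f"owned capacity: ${owned_cost:,.2f}")  # $50,000.00
```

The gap between the two figures is the capital expense the quote says modelers can avoid.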

Is there a data-crunching career in your future?
This brings up an interesting perspective on the analyst community. While there are certainly the math and stats majors, along with master’s and PhD candidates, many of today’s analysts in corporations are self-taught and landed in a data-crunching career by accident. There aren’t many who went to college and said, “Gee, I’d like to be a statistician.” But, somehow, many analysts have found an affinity for analyzing data and putting it into context for gaining insight and making business decisions.

—Michele Goetz: “Analyst Skills Are Hot”

The medium, not the message
We’re bombarded with data all the time. Depending on the way that data flows to us, it often can be unmanageable and turn into noise. Part of the problem, in my opinion, is using the wrong vehicle for some types of information. Sure, you can blast out everything in email newsletters or post it on corporate intranet sites – the one a push method, the other a pull approach. I suggest there’s a much more effective way to share much of the same info. Why not use company blogs with RSS capabilities to allow employees to opt into the information they find useful? Granted, some compliance issues would require certain types of information to be handled differently, but for the bulk of data the RSS feed / internal blog combination could work as a more effective channel.
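For the opt-in side, the real feedparser library is enough to follow an internal blog’s feed; the intranet URL below is hypothetical:

```python
import feedparser  # pip install feedparser

# Hypothetical feed for an internal blog an employee has opted into.
FEED_URL = "https://intranet.example.com/blogs/data-team/rss"

feed = feedparser.parse(FEED_URL)
for entry in feed.entries[:5]:
    # Each entry carries the post title and a link back to the blog,
    # so readers pull only the streams they chose to subscribe to.
    print(entry.title, "->", entry.link)
```

That is the pull model in practice: nothing arrives unless the reader subscribed to the feed.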
