
Comments by James Taylor

On The Road to Operational Analytics


Thanks for the article. I completely agree with your thoughts on the value and approach as well as the availability of technology. I have been writing about and working with these kinds of systems for the last decade or so, and the degree to which the technology has improved, the cost has fallen and the complexity has collapsed is really impressive. If you look at the available platform technologies you can see dozens of highly functional products that are easier to adopt and use than ever before.

That said, I believe there is already a name for these kinds of systems - Decision Management Systems. It's what I started calling them back in 2002; what FICO, TransUnion, Experian and the rest of the Financial Services industry have called them since then; and what IBM and even SAS are calling them today. Operational analytics (and prescriptive analytics) are great terms for some of what is going on, but analytics are not and never can be 100% of these systems, so I still prefer a broader, more holistic label like Decision Management.

May 15, 2013

On From Decision Support Systems to Decision Management Systems


Great question. There is an increasing array of adaptive analytic engines that do exactly this: they take traditional analytic models as well as expert rules as "input" and then adapt them based on what works and what does not. Typically these are used mostly in personalization/dynamic content and marketing rather than, say, risk management, because they can be hard to explain to someone.


September 28, 2012

On From Decision Support Systems to Decision Management Systems

I don't think the two will ever merge, but I have certainly seen plenty of blended systems, as I noted in the article. The biggest challenge remains getting organizations to agree on the degree of automation that will be developed: too many systems projects give up on automating decisions because the top-level decision must be left in the hands of a person, when in fact many sub-decisions could be automated and the results pushed into a decision support environment for the top-level one. Similarly, I see projects trying to over-automate and ending up with a system that has no users.


September 27, 2012

On The Pathetic State of Dashboards


Great comments. The lack of thinking, in most dashboard designs, about the decisions that must be made and the actions that can be taken is breathtaking. Without some real concept of who is making which decision, when, and with what objective, it seems to me impossible to design a productive dashboard.


August 28, 2012

On Data Mining Book Review: Decision Management Systems


Thanks for taking the time to read and review Decision Management Systems. I appreciate your feedback, and I am glad you found the chapters from chapter 6 (Design and Implement Decision Services) onward useful. There is a trade-off between the length of a book and its completeness, and there are a variety of free resources on the Decision Management Solutions website that can provide additional detail.

The heart of decision management is filling the gap between analytics and action. While some of the techniques and examples are clearly focused on business rules rather than analytics, the overall framework works for both. Business rules are a technology of real value to data miners and analytic professionals because they are such a useful platform for rapidly deploying models.

We’ve had great success on projects using the decision management approach, with data miners very pleased with its usefulness in building a consistent understanding of the problem with the business and in getting models quickly deployed. Decision Discovery makes it easier to tell, in concert with the business team, where analytics will make a difference and what analytics will be required. The construction of Decision Services to automate these decisions (using analytics and business rules in particular) and ongoing Decision Analysis ensure that models (and rules) don't go stale and that the value of analytics grows over time rather than declining. I’ve been told it makes for “business friendly” data mining, and I liked that title so much that it’s what I’ve called my upcoming workshop at Predictive Analytics World SF in March.
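To make the idea of a Decision Service concrete, here is a minimal sketch of rules wrapping an analytic score. All names, fields and thresholds are invented for illustration; they are not taken from the book or from any real scorecard.

```python
def credit_score(applicant):
    """Stand-in for a deployed predictive model (higher = lower risk).
    A real service would call a scoring engine here."""
    score = 600
    score += 5 * min(applicant["years_at_job"], 10)
    score -= 30 * applicant["missed_payments"]
    return score

def decide(applicant):
    """Decision service: business rules wrap the analytic score."""
    # Rule: eligibility checks come before any analytics.
    if applicant["age"] < 18:
        return "refer"           # cannot auto-decide minors
    score = credit_score(applicant)
    # Rules turn the score into an action.
    if score >= 640:
        return "approve"
    if score >= 600:
        return "refer"           # borderline cases go to a person
    return "decline"
```

The point of the sketch is the division of labor: the model predicts, the rules decide, and either side can be updated independently without redeploying the other.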


January 17, 2012

On Black Box Analytics

Thanks for bringing up the issue of black box models.

The challenge of black box analytics is a real one. Without business owner commitment it is hard to get models adopted, no matter how predictive they are. Explicability, an appreciation of the business inputs, the use of best-to-worst deciles, the ability to simulate the impact of a model - all these are important tools for letting business people get comfortable enough to adopt a model.
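One of the tools mentioned above, best-to-worst deciles, can be sketched in a few lines: rank the scored records, cut them into ten buckets, and show the business the actual outcome rate in each. This is a generic illustration assuming `scored` is a list of `(score, outcome)` pairs, not code from any particular product.

```python
def decile_table(scored):
    """Split (score, outcome) records into 10 equal buckets from best
    (highest score) to worst and report the outcome rate per bucket."""
    ranked = sorted(scored, key=lambda r: r[0], reverse=True)
    n = len(ranked)
    table = []
    for d in range(10):
        bucket = ranked[d * n // 10:(d + 1) * n // 10]
        rate = sum(outcome for _, outcome in bucket) / len(bucket)
        table.append((d + 1, round(rate, 2)))
    return table
```

A model that is working shows a clear downward slope from decile 1 to decile 10 - something a business owner can inspect without understanding the model internals.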

The only results that matter are business results. Data miners who produce really predictive models that don't get adopted have failed just as badly as those who cannot produce a predictive model. We are not done when we have a "black box" model but when we have persuaded the business to adopt the model and change their decision-making.


November 15, 2011

On Guest Post: Can Database Developers do Data Mining?

While there are some good points here there remain challenges for database developers in this scenario:

  • They tend to see operational databases as distinct from analytic databases, keeping summary data in the analytic database to drive reporting while archiving operational data as it ages. For data mining we need the details, but we need them for an extended period, and often neither database meets the need.
  • They don't always understand leaks from the future, and so allow records to be updated and changed without keeping track of when and what changes were made. This can make it very hard to use the records for data mining, as it is not possible to recreate records as they were in the past.
  • They tend to focus on the data they own instead of beginning with the decision in mind and then working with the data that will change the decision - whether they own it or not.
  • Data quality and integration efforts often eliminate the records that are the most interesting for data miners - those that are outliers and those that have problems. Cleaning these up is not always what data mining needs.
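The "leaks from the future" point can be made concrete with a tiny sketch: if every change to a record is logged with its effective date, the record can be replayed as it was on any past date, which is exactly what building a training set without leakage requires. The shape of the change log here is invented for illustration.

```python
from datetime import date

def as_of(initial, changes, cutoff):
    """Reconstruct a record as it looked on `cutoff` by replaying
    logged changes: each change is (effective_date, field, new_value).
    Changes after the cutoff are ignored - no leaks from the future."""
    record = dict(initial)
    for when, field, value in sorted(changes, key=lambda c: c[0]):
        if when <= cutoff:
            record[field] = value
    return record
```

Without such a log, an updated row silently carries future information back in time, and any model trained on it will look better in testing than it ever will in production.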

There's more, but that's the key list I think. Database developers could do data mining and could be allies of data miners, but mostly they aren't. Time to improve the training and default behaviors of database developers so that they think about analytics, not just OLTP and reporting.


May 11, 2011

On In-database analytics and Decision Management


Thanks for the comment. There's lots of potential with in-database analytics, especially when folks stay focused on the decisions they are trying to improve with analytics.


May 4, 2011

On 5 Ways Predictive Analytics Cuts Enterprise Risk


One of the most frustrating things I see is good risk management approaches - for fraud detection, say - being abandoned because the budget for them is cut! How can something that shows a great return (far more fraud detected and prevented than money spent) be budgeted like a cost center? But as you say, the budget often rules.

Risk management cannot drive every decision but it should be part of many decisions at the strategic, tactical and operational levels.


March 21, 2011

On 5 Ways Predictive Analytics Cuts Enterprise Risk


Agreed - I like the old insurance saying: "There is no such thing as a bad risk, only a bad price." Sales, marketing and risk management must all play nicely together so that we acquire the customers we will sell to at a price that makes good risk and business sense.


March 21, 2011

On Walled Gardens and the Value of Innovation: Questions for Bill Franks

I think the culture of experimentation must extend to how customers are actually treated - we must be able to run an experiment where we are trying something new with real customers (not many, just some) while understanding that this may be WORSE than what we do today. Unless we can experiment in real life (after experimenting in our sandbox) we can never be sure how the "Carbon-Based Lifeforms" in our data will actually behave. Getting a culture that allows this level of experimentation is tricky to say the least!

McKinsey identified experimentation - a test-and-learn mentality - as a key trend for the future and I have to agree.


March 21, 2011

On What does IBM Watson mean for Decision Management and Analytics?


Great point - analytics of any kind cannot always be a black box. In some circumstances, such as fraud detection, it may be OK for a model simply to make a prediction without being able to explain it. In many others, however, it is essential to understand what the drivers for the prediction were and what its sources were. Many analytic techniques are good at this - some because they generate rules that can be individually reviewed and logged when they are used in a decision, some because they explicitly generate "reason codes". This is why, for instance, you can be told what the biggest drivers of your credit score are ("too much revolving debt and some missed payments", say) without being told how the score is actually calculated - the model knows the reason codes for the biggest contributors to your score.
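The reason-code idea can be sketched with a toy additive scorecard: each characteristic contributes points, and the reason codes are simply the characteristics that pulled the score down the most, reported without revealing the full formula. The characteristics and point values below are invented for illustration and bear no relation to any real scoring model.

```python
# Hypothetical additive scorecard: characteristic -> points function.
POINTS = {
    "revolving_debt_ratio": lambda v: -100 * v,    # high ratio hurts
    "missed_payments":      lambda v: -25 * v,
    "years_of_history":     lambda v: 3 * min(v, 15),
}

def score_with_reasons(applicant, base=700, n_reasons=2):
    """Return the score plus the top negative contributors as reason
    codes - the 'why' without the 'how'."""
    contributions = {k: f(applicant[k]) for k, f in POINTS.items()}
    score = base + sum(contributions.values())
    reasons = sorted((k for k, c in contributions.items() if c < 0),
                     key=lambda k: contributions[k])[:n_reasons]
    return score, reasons
```

Because the scorecard is additive, the attribution is exact: each reason code corresponds to a real, auditable term in the score, which is what makes this style of model so much easier to explain to a business owner or a regulator than a true black box.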

In fact, this kind of explicability is one of the reasons I really like a business rules approach to implementing analytics. I recently wrote a brief on this topic as part of my work on the faculty of the International Institute for Analytics, and it is the topic of my presentation at Predictive Analytics World San Francisco.


February 26, 2011