
The Perils of Forecasting Benchmarks

mvgilliland
3 Min Read

Benchmarks of forecasting performance are available from several sources, including professional organizations and journals, academic research, and private consulting/benchmarking organizations. But there are several reasons why industry forecasting benchmarks should not be used for setting your own forecasting performance objectives.

1) Can you trust the data?

Are the numbers based on rigorous audits of company data or responses to a survey? If they are based on unaudited survey responses, do the respondents actually know the answers or are they just guessing?

2) Is measurement consistent across the respondents?

Are all organizations forecasting at the same level of granularity, such as by product, customer or region? Are they forecasting in the same time interval, such as weekly or monthly? Are they forecasting by the same lead time offset, such as three weeks or three months in advance? Are they using the same metric? It is important to note that even metrics as similar sounding as MAPE, weighted MAPE, and symmetric MAPE can deliver very different values from the same data.
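To see how much those similar-sounding metrics can disagree, here is a minimal sketch computing MAPE, weighted MAPE, and symmetric MAPE on the same (made-up) demand history:

```python
# Three similar-sounding percentage-error metrics computed on the same
# illustrative data, to show they can deliver very different values.

def mape(actuals, forecasts):
    """Mean Absolute Percentage Error: average of per-period |error| / |actual|."""
    return 100 * sum(abs(a - f) / abs(a) for a, f in zip(actuals, forecasts)) / len(actuals)

def wmape(actuals, forecasts):
    """Weighted MAPE: total |error| over total |actual| (volume-weighted)."""
    return 100 * sum(abs(a - f) for a, f in zip(actuals, forecasts)) / sum(abs(a) for a in actuals)

def smape(actuals, forecasts):
    """Symmetric MAPE: |error| scaled by the average of |actual| and |forecast|."""
    return 100 * sum(2 * abs(a - f) / (abs(a) + abs(f))
                     for a, f in zip(actuals, forecasts)) / len(actuals)

actuals = [100, 50, 10]    # one low-volume item drags plain MAPE up
forecasts = [90, 60, 15]

print(f"MAPE:  {mape(actuals, forecasts):.1f}%")   # 26.7%
print(f"WMAPE: {wmape(actuals, forecasts):.1f}%")  # 15.6%
print(f"sMAPE: {smape(actuals, forecasts):.1f}%")  # 22.9%
```

Three metrics, one dataset, three different answers — so a benchmark reporting "forecast error" without naming its metric tells you very little.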

3) Finally, and most important, is the comparison relevant?

Does the benchmark company have equally forecastable data?

Consider this worst-case example:

Suppose a benchmark study shows that Company X has the lowest forecast error. Consultants and academics then converge on Company X to study its forecasting process and publish reports touting Company X’s best practices. You read these reports and begin to copy Company X’s best practices at your own organization.

However, upon further review using forecast value added (FVA) analysis, it is discovered that Company X had very easy-to-forecast demand, and it would have had even lower error if it had just used a naive forecast. In other words, Company X's so-called best practices just made the forecast worse.
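The FVA comparison above can be sketched in a few lines. The numbers here are illustrative (stable demand, a hypothetical "best practice" process forecast): FVA is the naive forecast's error minus the process forecast's error, and a negative FVA means the process destroyed value.

```python
# A minimal FVA (forecast value added) sketch under illustrative numbers:
# compare the process forecast's error against a naive forecast
# (carry forward the last period's actual).

def mae(actuals, forecasts):
    """Mean absolute error."""
    return sum(abs(a - f) for a, f in zip(actuals, forecasts)) / len(actuals)

history = [100, 102, 98, 101, 99, 100]   # stable, easy-to-forecast demand
actuals = history[1:]                    # periods 2..6
naive = history[:-1]                     # naive forecast: last period's actual
process = [110, 90, 112, 88, 111]        # hypothetical "best practice" forecasts

naive_mae = mae(actuals, naive)          # 2.4
process_mae = mae(actuals, process)      # 9.8
fva = naive_mae - process_mae            # -7.4: the process made things worse

print(f"naive MAE {naive_mae:.1f}, process MAE {process_mae:.1f}, FVA {fva:+.1f}")
```

On data this stable, doing nothing beats the elaborate process — exactly the Company X trap.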

This example is not far-fetched. Organizations at the top of the benchmark lists are probably there because they have the easiest-to-forecast demand. Many organizational practices, even purported best practices, may only make the forecast worse.

Benchmarks tell you the accuracy that best-in-class companies are able to achieve. But they do not tell you whether their forecasting environment is similar to yours, or worthy of your admiration. Without that information, industry benchmarks are largely irrelevant and should not be used to evaluate your performance or set performance objectives.

 

© 2008-25 SmartData Collective. All Rights Reserved.