The Perils of Forecasting Benchmarks

mvgilliland

Benchmarks of forecasting performance are available from several sources, including professional organizations and journals, academic research, and private consulting/benchmarking organizations. But there are several reasons why industry forecasting benchmarks should not be used for setting your own forecasting performance objectives.

1) Can you trust the data?

Are the numbers based on rigorous audits of company data or responses to a survey? If they are based on unaudited survey responses, do the respondents actually know the answers or are they just guessing?

2) Is measurement consistent across the respondents?

Are all organizations forecasting at the same level of granularity, such as by product, customer, or region? Are they forecasting in the same time interval, such as weekly or monthly? Are they forecasting with the same lead time offset, such as three weeks or three months in advance? Are they using the same metric? It is important to note that even similar-sounding metrics such as MAPE, weighted MAPE, and symmetric MAPE can deliver very different values from the same data.
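
To see how much the metric definition alone can move the headline number, here is a minimal Python sketch (the actuals and forecasts are made up purely for illustration) that computes plain MAPE, weighted MAPE, and symmetric MAPE from the same errors:

import numpy as np

# Illustrative only: a small, made-up set of actuals and forecasts.
actual   = np.array([100.0, 20.0, 250.0, 5.0, 80.0])
forecast = np.array([ 90.0, 30.0, 230.0, 10.0, 95.0])

abs_err = np.abs(actual - forecast)

# Plain MAPE: average of the item-level percentage errors.
mape = np.mean(abs_err / actual) * 100

# Weighted MAPE: total absolute error over total actuals,
# so high-volume items dominate the result.
wmape = abs_err.sum() / actual.sum() * 100

# Symmetric MAPE: error divided by the average of actual and forecast.
smape = np.mean(abs_err / ((actual + forecast) / 2)) * 100

print(f"MAPE:  {mape:.1f}%")
print(f"wMAPE: {wmape:.1f}%")
print(f"sMAPE: {smape:.1f}%")

On these made-up numbers the three metrics come out around 37%, 13%, and 29% respectively, even though the underlying errors are identical. A benchmark quoting a single "MAPE" figure therefore means little unless you know exactly which variant was used and how it was aggregated.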

3) Finally, and most important, is the comparison relevant?

Does the benchmark company have equally forecastable data?

Consider this worst-case example:

Suppose a benchmark study shows that Company X has the lowest forecast error. Consultants and academics then converge on Company X to study its forecasting process and publish reports touting Company X’s best practices. You read these reports and begin to copy Company X’s best practices at your own organization.

However, upon further review using FVA (forecast value added) analysis, it is discovered that Company X had very easy-to-forecast demand, and it would have had even lower error if it had just used a naive forecast. In other words, Company X’s so-called best practices just made the forecast worse.
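
FVA analysis comes down to a simple comparison: measure the error of the process forecast and the error of a naive forecast over the same history, and see whether the process actually reduced it. A minimal Python sketch of that comparison, using a made-up and deliberately easy-to-forecast demand series, might look like this:

import numpy as np

# Illustrative only: a made-up, easy-to-forecast demand history and the
# forecasts produced by an elaborate forecasting process.
actual  = np.array([100, 102, 101, 103, 102, 104, 103, 105], dtype=float)
process = np.array([ 95, 110,  96, 112,  97, 114,  98, 115], dtype=float)

# Naive (random-walk) forecast: each period's forecast is the prior actual.
naive = np.roll(actual, 1)
actual, process, naive = actual[1:], process[1:], naive[1:]  # drop the first period

def mape(a, f):
    return np.mean(np.abs(a - f) / a) * 100

mape_process = mape(actual, process)
mape_naive   = mape(actual, naive)

# Forecast value added: error reduction relative to the naive forecast.
# A negative value means the process made the forecast worse.
fva = mape_naive - mape_process

print(f"Process MAPE: {mape_process:.1f}%")
print(f"Naive MAPE:   {mape_naive:.1f}%")
print(f"FVA:          {fva:+.1f} percentage points")

On this series the naive forecast's MAPE is about 1.5% while the process forecast's is about 7%, so the FVA is roughly minus six percentage points: the elaborate process is subtracting value, which is exactly the Company X scenario described above.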

This example is not far-fetched. Organizations at the top of the benchmark lists are probably there because they have the easiest-to-forecast demand. Many organizational practices, even purported best practices, may only make the forecast worse.

Benchmarks tell you the accuracy that best-in-class companies are able to achieve. But they do not tell you whether their forecasting environment is similar to yours, or whether their process is worthy of your admiration. Without that information, industry benchmarks are largely irrelevant and should not be used to evaluate your performance or set performance objectives.

 
