The Perils of Forecasting Benchmarks

mvgilliland

Benchmarks of forecasting performance are available from several sources, including professional organizations and journals, academic research, and private consulting/benchmarking organizations. But there are several reasons why industry forecasting benchmarks should not be used for setting your own forecasting performance objectives.


1) Can you trust the data?

Are the numbers based on rigorous audits of company data or responses to a survey? If they are based on unaudited survey responses, do the respondents actually know the answers or are they just guessing?

2) Is measurement consistent across the respondents?

Are all organizations forecasting at the same level of granularity, such as by product, customer or region? Are they forecasting in the same time interval, such as weekly or monthly? Are they forecasting by the same lead time offset, such as three weeks or three months in advance? Are they using the same metric? It is important to note that even metrics as similar-sounding as MAPE, weighted MAPE, and symmetric MAPE can deliver very different values from the same data.
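
To make that concrete, here is a minimal sketch (in Python, with made-up numbers) of how MAPE, weighted MAPE, and symmetric MAPE can diverge on the exact same actuals and forecasts. The formulas shown are common textbook variants; the software your benchmark respondents use may define them differently.

```python
import numpy as np

# Made-up actuals and forecasts for five periods (illustration only)
actual   = np.array([100.0, 80.0, 5.0, 120.0, 60.0])
forecast = np.array([ 90.0, 95.0, 15.0, 100.0, 70.0])

abs_err = np.abs(actual - forecast)

# MAPE: average of per-period percentage errors (low-volume periods dominate)
mape = np.mean(abs_err / actual) * 100

# Weighted MAPE: total absolute error over total actuals (volume-weighted)
wmape = abs_err.sum() / actual.sum() * 100

# Symmetric MAPE: denominator is the average of actual and forecast
smape = np.mean(abs_err / ((actual + forecast) / 2)) * 100

print(f"MAPE:  {mape:.1f}%")   # ~52%
print(f"WMAPE: {wmape:.1f}%")  # ~18%
print(f"sMAPE: {smape:.1f}%")  # ~32%
```

A single low-volume period (actual of 5, forecast of 15) is enough to roughly triple the headline MAPE relative to the weighted version, so two respondents both reporting "MAPE" can be describing very different things.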

3) Finally, and most important, is the comparison relevant?

Does the benchmark company have equally forecastable data?

Consider this worst-case example:

Suppose a benchmark study shows that Company X has the lowest forecast error. Consultants and academics then converge on Company X to study its forecasting process and publish reports touting Company X’s best practices. You read these reports and begin to copy Company X’s best practices at your own organization.

However, upon further review using FVA (forecast value added) analysis, it is discovered that Company X had very easy-to-forecast demand, and it would have had even lower error if it had just used a naive forecast. In other words, Company X’s so-called best practices just made the forecast worse.

This example is not far-fetched. Organizations at the top of the benchmark lists are probably there because they have the easiest-to-forecast demand. Many organizational practices, even purported best practices, may only make the forecast worse.
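
As a rough illustration of what FVA analysis does, the Python sketch below (entirely synthetic data, not Company X's) compares the MAPE of a naive "repeat last period" forecast against a hypothetical process forecast on a very stable demand series. A negative FVA means the process forecast is worse than doing nothing.

```python
import numpy as np

# A minimal FVA (forecast value added) sketch on a synthetic, very stable
# demand series; the "process" forecast is hypothetical, for illustration only.
rng = np.random.default_rng(42)
demand = 100 + rng.normal(0, 2, size=24)        # easy-to-forecast demand

actual = demand[1:]
naive = demand[:-1]                             # naive forecast: repeat last period's actual

# Hypothetical process forecast: the naive number plus judgmental overrides (noise)
process = naive + rng.normal(0, 5, size=actual.size)

def mape(a, f):
    """Mean absolute percentage error, in percent."""
    return np.mean(np.abs(a - f) / a) * 100

fva = mape(actual, naive) - mape(actual, process)   # positive = the process adds value

print(f"Naive MAPE:   {mape(actual, naive):.1f}%")
print(f"Process MAPE: {mape(actual, process):.1f}%")
print(f"FVA:          {fva:+.1f} points")           # likely negative: the overrides only add noise
```

The point of the comparison is not the specific numbers but the discipline: before copying anyone's "best practice," check whether their process actually beats a naive forecast on their own data.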

Benchmarks tell you the accuracy that best-in-class companies are able to achieve. But they do not tell you whether their forecasting environment is similar to yours or worthy of your admiration. Without that information, industry benchmarks are largely irrelevant and should not be used to evaluate your performance or set performance objectives.

 
