Fudging Data: The How, the Why, and Catching It




An often encountered problem in data management as well as reporting is data inaccuracy. I was tempted to write about this while poring through reams of data that I had specifically been told to investigate for veracity.

Why data is fudged

Some data problems are due to bad data-gathering systems, some are due to wrong specifications, and some come from plain bad or simplistic assumptions.

Data fudging, on the other hand, is outright inventing data to fit a curve or trend; it is deliberate and thus harder to catch.

It can also take the form of presenting confusing rather than inaccurate data, simply to avoid greater scrutiny.

Sometimes it may be termed over-fitting, but over-fitting is generally due to statistical and programmatic reasons rather than human ones.


Note that fudging data, or even talking about it, is not really politically correct in the data world, yet it exists at all levels, from students preparing survey samples to budgetary requests.

I am outlining some ways to recognize data fudging, because to catch a fudger, you sometimes have to think like one.

How data is often fudged

  1. Factors: This starts by recognizing all the factors that can positively or negatively impact the final numbers being presented. Note that the list can be padded with many more factors than needed, simply to divert attention from the main causal ones.
  2. Sensitivity: This gives the range of answers obtained by tweaking individual factors within a certain band, say +/- 10%, and noting the final figures. Assumptions can be made conservative or aggressive in weighting the causal factors so as to suit the desired final numbers (a minimal sketch of this appears after the list).
  3. Causal Equation: This means recognizing the interplay between the various factors (through correlation) and their contribution to the final numbers (through the variance they cause). The causal equation can then be tweaked by playing with weights, the powers in a polynomial expression, and the correlations between factors.
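
To make the sensitivity and causal-equation steps concrete, here is a minimal Python sketch. The factor names, baseline values, the toy revenue equation and the +/- 10% band are all hypothetical, purely for illustration; the point is that by sweeping each assumption within a plausible-looking range, a presenter can report whichever combination lands on the number they want.

```python
import itertools

# Hypothetical causal equation: revenue driven by three made-up factors.
# All names, baseline values and the equation itself are for illustration only.
baseline = {"volume": 10_000, "price": 25.0, "churn": 0.08}

def final_number(factors):
    """Toy causal equation: revenue = volume * price * (1 - churn)."""
    return factors["volume"] * factors["price"] * (1 - factors["churn"])

def sensitivity(base, band=0.10):
    """Sweep every factor within +/- band and collect the resulting final figures."""
    results = []
    for deltas in itertools.product([-band, 0.0, band], repeat=len(base)):
        tweaked = {name: value * (1 + d)
                   for (name, value), d in zip(base.items(), deltas)}
        results.append((deltas, final_number(tweaked)))
    return results

outcomes = [figure for _, figure in sensitivity(baseline)]
print(f"Baseline figure : {final_number(baseline):,.0f}")
print(f"Range of figures: {min(outcomes):,.0f} to {max(outcomes):,.0f}")
# A fudger simply presents whichever combination of "assumptions" hits the target.
```

When reviewing a presentation, run the same kind of sweep yourself and ask which end of the range the reported assumptions conveniently sit at.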

How data fudging is often caught

  1. Sampling: Use a random or holdout sample and check whether the final answer converges to what is already known to happen. The validation-sample technique is powerful for recognizing data-modeling inaccuracies (see the first sketch after this list).
  2. Checking assumptions: For reasons of risk management, always consider conservative or worst-case scenarios first and then build up your analysis. Similarly, when checking an analysis, look for over-optimism and examine the period of history over which the growth factors and sensitivities are assumed.
  3. Missing Value and Central Value Movements: If a portion of data is missing, check the mean as well as the median for both the reported and the overall data. You can also resample by repeatedly taking random samples from the data and checking whether these values hold firm (see the second sketch after this list).
  4. Early Warning Indicators: Ask the question (loudly): if this analysis were totally wrong, which indicator would give us the first sign of it? That indicator can then be incorporated into a metric-tracking early-warning system.
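
For the sampling check, a rough sketch under assumed data: fit the claimed relationship on one random part of the data and see whether it still holds on a holdout the presenter never touched. The spend/sales columns, the linear model and the split sizes here are assumptions for illustration, not anyone's actual method.

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical data: a driver (say, marketing spend) and a reported outcome (sales).
spend = rng.uniform(10, 100, size=200)
sales = 3.0 * spend + rng.normal(0, 15, size=200)

# Random holdout split: fit on one part, validate on the part nobody tuned against.
idx = rng.permutation(len(spend))
train, hold = idx[:150], idx[150:]

slope, intercept = np.polyfit(spend[train], sales[train], deg=1)

def rmse(x, y):
    """Root-mean-square error of the fitted line on the given subset."""
    return np.sqrt(np.mean((slope * x + intercept - y) ** 2))

# If the presented fit was massaged to match the full dataset, the holdout error
# will typically be far worse than the in-sample error.
print(f"Train RMSE  : {rmse(spend[train], sales[train]):.1f}")
print(f"Holdout RMSE: {rmse(spend[hold], sales[hold]):.1f}")
```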
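
And for the missing-value and central-value check, a minimal sketch using only the standard library: repeatedly draw random subsamples of the reported figures and watch whether the mean and median stay put. The synthetic numbers and sample sizes are purely illustrative; on real data, central values that swing sharply from one subsample to the next are a flag worth chasing.

```python
import random
import statistics

random.seed(7)

# Hypothetical reported figures (synthetic, for illustration only).
reported = [random.gauss(100, 12) for _ in range(500)]

def central_values(sample):
    """Return the mean and median of a sample."""
    return statistics.mean(sample), statistics.median(sample)

overall_mean, overall_median = central_values(reported)
print(f"Overall : mean={overall_mean:.1f}, median={overall_median:.1f}")

# Repeatedly take random subsamples and check whether the central values hold firm.
for trial in range(1, 6):
    mean, median = central_values(random.sample(reported, k=300))
    print(f"Trial {trial}: mean={mean:.1f}, median={median:.1f}")
# Large swings between trials, or between a subsample and the overall data,
# suggest the reported portion is not representative of the whole.
```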

Note that the above are simplified accounts of numbers I have seen being presented wrongly, or being fudged. They are based on my own experience, so feel free to add your share of data anecdotes.

Using these simple techniques could have helped many people in financial and other decision making, including budgetary and even strategic areas.

As the saying goes: In God we trust; everybody else has to bring data (which we will have to check before trusting it).

