The prevalent tendency to underweight, or ignore, distributional information is perhaps the major source of error of intuitive prediction…The analyst should therefore make every effort to frame the forecasting problem so as to facilitate utilizing all the distributional information that is available to the expert. [1]
Most people involved in risky enterprises forecast costs or benefits as a single estimate for each time period, rather than as a range that reflects the unknowable nature of the future. That’s why examples like the ones below are so easy to compile: deterministic forecasts (expressed as a point or line estimate rather than a range) that missed the mark. In some cases we have the benefit of hindsight over many years of forecasts, or across many independent experts, so we can see just how difficult forecasting can be.
We hope they will serve as a cautionary tale and motivate you to explicitly consider uncertainty in your forecasts and project plans. Better yet, perform your own historical accounting and inform your current forecasts with information from similar projects of the past. Whether you call it benchmarking or reference class forecasting, a dose of history will help you improve the accuracy of your forecasts. [Aside: this post is a companion to a post on using historical data to improve your forecasts; check it out if you didn’t come from there yourself!]
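To make the idea concrete, here is a minimal sketch of a reference-class-style adjustment in Python. The overrun ratios and the base estimate are made-up numbers for illustration only; the point is that the output is a range of plausible outcomes drawn from the history of comparable projects, rather than a single point estimate.

```python
# A minimal sketch of reference class forecasting (hypothetical numbers).
# Given actual/forecast cost ratios from comparable past projects, derive
# an uplift for the current point estimate at several confidence levels.

import numpy as np

# Hypothetical overrun ratios (actual cost / forecast cost) from past projects
reference_class_ratios = np.array(
    [1.05, 1.20, 1.45, 0.95, 1.80, 1.30, 1.10, 2.10, 1.25, 1.60]
)

current_point_estimate = 50.0  # e.g., a $50M base forecast for the new project

# Instead of reporting one number, report a range based on the reference class
for p in (50, 80, 95):
    uplift = np.percentile(reference_class_ratios, p)
    print(f"P{p} forecast: {current_point_estimate * uplift:.1f}")
```

Whatever the specific mechanics, the output you communicate becomes "somewhere between X and Y, most likely around Z" instead of a single figure that history suggests you are unlikely to hit.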
UK Government forecasts of water usage in the 1970s predicted massive increases in demand, but actual usage never kept pace. At least the later forecasts seem to have caught up with the actual trend.
Here is an example of the highest and lowest U.S. GDP growth forecasts from a group of more than fifty economists surveyed by Businessweek magazine. Each year, the magazine asked the economists for growth estimates for the next four quarters only. In five of the fourteen years, actual growth fell outside the range spanned by every economist’s forecast. It is possible that the high growth from 1996 to 2000, which accounts for four of those years, seemed unlikely because the economists underestimated the structural economic changes (and speculative activity) spurred by the internet revolution.
The Intel Itanium was a series of 64-bit server CPUs that Intel released beginning in 2001. Even though the new architecture required recompiling all code originally written for the x86 platform, Intel had high hopes given Itanium’s anticipated performance relative to competing designs. Year after year, their forecasts reflected that optimism, in defiance of the reality of paltry sales; competitors were able to match and even exceed Itanium’s performance. In the graph below, actual sales are represented by the orange line at the bottom. (We first showcased this set of forecasts, and the oil price forecasts below, in an earlier blog post here.)
In 2005, three researchers in Denmark investigated 30 railroad and 180 road infrastructure projects’ estimates of eventual usage. They found that a majority of the rail projects’ passenger forecasts were overestimated by more than two-thirds. Road projects fared better, though half of them still showed a gap of more than 20% between forecast and actual demand. A box plot below summarizes these results.
Oil prices are another area that has repeatedly defied prediction. The graph below superimposes a decade’s worth of crude oil price forecasts on the actual prices. The cruel irony here is that from 1985 to 2000 or so, forecasts presaged a price hike that never came. Then, perhaps once forecasters had ‘learned the lessons’ of the past decade and tempered their forecasts, prices really did soar.
Perhaps the most famous incorrect forecast of all time, at least here in the States: relying on early polls and one confident veteran political reporter based in Washington, DC, the Chicago Daily Tribune called the 1948 Truman/Dewey presidential race for Dewey, going to press with the infamous “DEWEY DEFEATS TRUMAN” headline before the results proved otherwise.
[1] Kahneman, D., & Tversky, A., “Intuitive Prediction: Biases and Corrective Procedures,” working paper jointly published by Decision Research and DARPA, December 1977, pp. 2-4.