Page added on July 10, 2007
Napping on the railroad tracks sounds risky on its face. But it may not feel that way if you don’t know you’re napping on the tracks.
Humans seem programmed to believe that the future will look pretty much like the past. But the narrative of history is the narrative of unexpected events. So it is surprising that when it comes to resource depletion, cornucopian thinkers love to refer to history. Daniel Yergin, chairman of Cambridge Energy Research Associates, likes to say, “This is not the first time the world has run out of oil. It is more like the fifth.” But even though Yergin admits that oil is a finite resource (and that its total quantity is therefore declining), he invites us to snooze with him on the railroad tracks because history has shown that, so far, it has been safe to do so.
Yergin’s faith (and that of many others) is founded on the forecasts of his own firm and those of the U.S. Energy Information Administration (which takes its data from the U.S. Geological Survey’s World Petroleum Assessment). But what drives us to make such forecasts, even to create a whole forecasting industry? In his latest book, The Black Swan: The Impact of the Highly Improbable, Nassim Nicholas Taleb argues that we do so because we are planning animals. This behavior may be a successful evolutionary adaptation: we are able to imagine situations that might risk injury or death rather than simply experiment and see what happens. “Used correctly and in place of more visceral reactions, the ability to project effectively frees us from immediate, first-order natural selection…,” he writes.
But imagining the future is not the same as correctly predicting it. Taleb outlines the problems with forecasts as follows. First, variability matters. Most forecasts don’t include an error rate, often expressed as a range of possibilities. In other words, how wide of the mark might a forecast be? (The U.S. EIA forecast is an exception, though it is not clear how its error rate is calculated or whether the data on which it is based can be justified.) Very often, the “error rate is so large that it is far more significant than the projection itself!” (The EIA doesn’t seem to grasp this point.) Taleb gives this example: if you knew the place you are flying to is expected to be 70 degrees, you would pack much differently if you also knew the range was plus or minus 40 degrees rather than plus or minus 5 degrees.
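Taleb’s packing example can be made concrete with a small sketch. The function below is purely illustrative (it does not appear in the book or the post): it turns a point forecast and a symmetric error bar into the interval a planner actually has to prepare for.

```python
def forecast_range(point, error):
    """Return the (low, high) interval implied by a point forecast
    and a symmetric plus-or-minus error bar."""
    return (point - error, point + error)

# A 70-degree forecast with a ±5 error bar: pack for mild weather.
print(forecast_range(70, 5))   # (65, 75)

# The same 70-degree forecast with a ±40 error bar: pack for
# anything from near-freezing to sweltering heat.
print(forecast_range(70, 40))  # (30, 110)
```

The point forecast is identical in both calls; only the error bar changes, and it is the error bar, not the projection, that determines what you put in the suitcase.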