Climate Change: A Tale of Three Computer Models
The Boeing 777 was the first airliner designed entirely on computers. It has flown for more than a decade; it is safe, reliable and comfortable. Before passengers flew on the 777, test pilots flew it, and arduous stress tests supplemented those flights. Boeing's computer models rested on vast empirical knowledge of how a thin aluminum tube the length of a football field, hurtling through the atmosphere at high subsonic speeds at altitudes of up to eight miles, would respond to the stresses encountered. The models used high-density, uniform data (knowledge of how every point on the fuselage is affected) plus mathematical algorithms that precisely mimic the environment in which airliners operate. Extensive empirical validation over time supplemented the data.
We found out last year that the financial models dreamed up by the wizards of Wall Street worked far better when asset values were rising than when they were falling. Those models were based on relatively recent data and had been tested over only a brief period. The hyper-leveraged investing built on their presumed soundness nearly crashed the global economy. Fortunately for fliers, Boeing's computer models work just as well when planes are descending as when they are ascending.
So the multi-trillion-dollar Climate Change Question is: Which computer models, Boeing's or Wall Street's, do climate models more closely resemble? When today's models are fed yesterday's climate data, they do not yield today's climate. Suppose test pilots flying the 777 had crashed: Would Boeing have rolled out the plane anyway? Would airlines have purchased it? Would passengers have boarded it?
Climate models lack dense, uniform data; much of the data is interpolated by supercomputers because the raw empirical record is too sparse. We still do not know how the global climate engine works, which is why scientists argue over whether solar phenomena, ocean currents, greenhouse gas emissions or other factors are the prime cause of the recent planetary warming (which stopped in 1998). Because temperature projections looking decades out are built upon each year's calculation, modeling errors accumulate over time; that is, the estimates grow more inaccurate with each passing year. As recently as twenty years ago, modelers (including the fabled James Hansen of NASA, Al Gore's hotshot 1988 star witness) predicted far larger increases than they now estimate. And a decade before that, we were warned that a new Ice Age was imminent.
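The compounding effect described above can be illustrated with a toy calculation. This is a deliberately simplified sketch, not any actual climate model; the growth rates and horizon are hypothetical numbers chosen only to show how a small per-year bias snowballs when each year's output feeds the next year's calculation.

```python
# Toy illustration of error accumulation in an iterative, year-by-year
# forecast. NOT a climate model: all rates and values are hypothetical.

def iterate_forecast(start, true_rate, model_rate, years):
    """Step a simple exponential-growth forecast forward one year at a
    time, tracking the 'true' trajectory alongside a slightly biased
    modeled trajectory."""
    true_val, model_val = start, start
    for _ in range(years):
        true_val *= 1 + true_rate    # what actually happens each year
        model_val *= 1 + model_rate  # the model's slightly biased step
    return true_val, model_val

# A half-percentage-point annual bias (1.5% modeled vs. 1.0% actual)...
true_val, model_val = iterate_forecast(100.0, 0.010, 0.015, 50)
error_pct = 100 * (model_val - true_val) / true_val
# ...compounds into an error of roughly 28% after 50 years.
```

The point of the sketch is structural: because each step multiplies the previous step's result, a bias too small to notice in any single year dominates the projection over a multi-decade horizon.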
As for the argument that we should play it safe and invest a few trillion just in case the climate modelers are right: that is classic bootstrapping. I might accept it if we were talking small change, but not for trillions, especially at a time when the global economy is, and may well remain for some time, precarious. I must add that the neo-McCarthyite anathematizing of climate skeptics as today's Holocaust deniers does not increase my confidence in their case.
Above all, remember philosopher Karl Popper's criterion for what makes science distinct from dogma: science is falsifiable in light of subsequent discovery and empirical validation. The debate, as such, is never fully over. When Einstein gave us E = mc², he was a majority of one. Most sole dissenters are wrong, and we should follow proven models like Boeing's 777 model unless and until experience reveals something better. Consensus is not science. Peer-reviewed climate models that do not, when fed earlier data, produce today's climate are worthless; a non-peer-reviewed model that, when fed prior years' data, did produce today's climate arguably would be worth more (though not enough to go forward without honest peer review, notably lacking in today's climate debate).
Ronald Reagan used to say there are three things one should never believe: "The check is in the mail!" "Of course I will respect you in the morning!" "Hi, I'm from the IRS and I am here to help you!" Given the lack of empirical validation of climate models, perhaps we should add a fourth: "This computer model is, despite never having been empirically validated, completely reliable." We should instead heed the rule Reagan applied when negotiating with Soviet leaders: "Trust, but verify."
John C. Wohlstetter is the founder of the issues blog "Letter From The Capitol," the author of "The Long War Ahead and the Short War Upon Us," and a senior fellow at the Discovery Institute. His articles and commentary can be followed on Twitter: JohnWohlstetter.