Of all the environmental issues we have ever confronted, only one—global warming—is driven solely by the imagination of a computer. All the policy proposals to “fight” this “threat,” including the notorious Kyoto Protocol to the United Nations Framework Convention on Climate Change, are based upon the output of a few silicon chips.
This situation is unique in human affairs. The Kyoto Protocol is the most expensive and intrusive treaty the United States has ever signed. And it is driven not by history (as most treaties are), but by a weather forecast. Remember that, beginning a mere 7.9 years from now, the protocol would require the United States to reduce its net emissions of carbon dioxide and methane to nearly 40 percent below where they would be if we just continued on as we have since 1990. The expense is enormous.
How reliable is the climate change forecast in question? That seemingly obvious question has spurred one of the most acrimonious scientific and political debates in the history of environmental protection. The answer lies in the history of the forecasts.
It is fairly easy to calculate the mean temperature of the Earth from a knowledge of basic physics and a measurement of the sun’s output. But that calculation is meaningless because it treats the planet as a point in space. Rather, climate is a local phenomenon, and any calculation of the myriad elements that make it up (such as temperature, rainfall, storminess, and the like) requires an ability to simulate what is happening over our heads, rather than simply treating the Earth as a uniform dot.
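For the curious, that point-in-space calculation looks something like the following. A minimal sketch, assuming the standard textbook values for the sun’s output at Earth’s distance (about 1,368 watts per square meter) and for the fraction of sunlight reflected away (about 30 percent):

```python
# Mean "effective" temperature of the Earth treated as a point in space:
# absorbed sunlight balanced against blackbody emission.
SOLAR_CONSTANT = 1368.0  # W/m^2 at Earth's distance (textbook value)
ALBEDO = 0.30            # fraction of sunlight reflected away (assumed)
SIGMA = 5.67e-8          # Stefan-Boltzmann constant, W/m^2/K^4

# Sunlight intercepted by the Earth's disk is spread over the whole
# sphere (hence the factor of 4), and the reflected share heats nothing.
absorbed = SOLAR_CONSTANT * (1.0 - ALBEDO) / 4.0
temp_kelvin = (absorbed / SIGMA) ** 0.25
temp_fahrenheit = temp_kelvin * 9.0 / 5.0 - 459.67
print(f"Effective temperature: {temp_kelvin:.0f} K ({temp_fahrenheit:.0f} F)")
# -> about 255 K, or roughly 0 F: the answer with no greenhouse effect
```

Easy enough, but that single number says nothing about jet streams, monsoons, or whether it will snow in Chicago.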
The only way we know how to approximate this dynamic is to simulate the behavior of the atmosphere and its attendant circulation systems—jet streams, trade winds, and so forth. These are what make the weather, and the sum of weather is what makes the climate. The computer tools we use are models of the climate’s general circulation—General Circulation Models, or GCMs.
GCMs take the known, and the merely guessed-at, physics of the atmosphere and attempt to simulate observed weather patterns. Exploring how much has been guessed versus how much is actually known is quite revealing.
Climate is determined by the differential heating of the Earth’s surface by the sun. To understand climate, you must know how much radiation shines on the planet and how much the planet absorbs vs. the amount it reflects away. Obviously snow and clouds, being bright white, reflect away radiation, while black dirt absorbs almost all of it.
Some interesting history:
- 25 years ago, we didn’t really know how much the sun heated the Earth. Textbooks at the time gave a value about 5 percent higher than it turned out to be.
- 20 years ago, we thought the Earth absorbed a whopping 40 percent more of the sun’s energy than it actually does.
- 10 years ago, we didn’t know whether the Earth’s clouds warmed or cooled the planet (it turns out they are net coolers).
- And to this day, we still don’t know what the “natural” amount of cloudiness is.
Things are even murkier when it comes to the greenhouse effect. This very real phenomenon arises because some of the atmosphere’s constituents (mainly water vapor and carbon dioxide) absorb some of the heat radiating from the Earth’s surface, preventing it from escaping directly to space at the rate it would if those gases weren’t there. As a result, they warm the bottom of the atmosphere (and cool the top).
The greenhouse effect keeps the surface of the planet about 60°F warmer than it would be without these gases. About 55°F of that warming comes from water vapor, and most of the rest from carbon dioxide.
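That 60°F figure squares with the point-in-space calculation sketched earlier: the no-greenhouse answer is about 255 K, while the observed mean surface temperature is about 288 K (59°F). A quick check of that gap, using those standard values:

```python
# Gap between the no-greenhouse effective temperature and the observed
# global mean surface temperature (standard textbook values assumed).
effective_k = 255.0  # blackbody answer from the earlier sketch
observed_k = 288.0   # observed global mean surface temperature
gap_fahrenheit = (observed_k - effective_k) * 9.0 / 5.0
print(f"Greenhouse warming: about {gap_fahrenheit:.0f} F")  # -> about 59 F
```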
Through the burning of fossil fuels, human beings have increased the CO2 content of the atmosphere by about one-third above its mean value since the last glaciation (about 11,000 years ago). Our other greenhouse emissions, such as methane and the chlorofluorocarbon refrigerants, nearly double the effect from CO2, resulting in an effective greenhouse change of around 60 percent of the “natural” carbon dioxide “background” level.
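The arithmetic behind that 60 percent figure is simple enough to check. A sketch, where the 1.8 multiplier for the non-CO2 gases is my reading of “nearly double”:

```python
# Effective greenhouse change relative to the "natural" CO2 background.
co2_increase = 1.0 / 3.0  # CO2 is up about one-third since the last glaciation
non_co2_multiplier = 1.8  # methane, CFCs, etc. "nearly double" the CO2 effect (assumed)
effective_change = co2_increase * non_co2_multiplier
print(f"Effective greenhouse change: {effective_change:.0%}")  # -> 60%
```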
Every forecast of how the climate changes as the greenhouse effect increases requires some knowledge of how those emissions will change in the future. Climate models used to increase their effective carbon dioxide concentration by 1 percent per year. But the United Nations assumes the most likely increase in the next 100 years will be around 0.63 percent, or nearly 40 percent lower than the number the models employ in their forecasts.
Recently, NASA scientist James Hansen demonstrated that the real increase in the last 15 years has been around 0.40 percent, or 60 percent less than previously assumed. His reasoning? The plants are absorbing carbon dioxide at increasing rates, resulting in a greener planet.
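Compounding those growth rates over a century shows how much is at stake in the choice. A minimal sketch, treating each quoted rate as a constant annual rate of increase (my simplification):

```python
# Effective CO2 level after 100 years of compound growth, relative to today,
# for the three annual growth rates quoted above.
scenarios = [
    ("GCM forecasts", 0.0100),         # 1 percent per year
    ("U.N. best guess", 0.0063),       # 0.63 percent per year
    ("observed, per Hansen", 0.0040),  # 0.40 percent per year
]
for label, rate in scenarios:
    factor = (1.0 + rate) ** 100
    print(f"{label}: {factor:.2f}x today's level")
# GCM forecasts: 2.70x (a doubling and more)
# U.N. best guess: 1.87x (short of a doubling)
# observed, per Hansen: 1.49x
```

On the models’ assumption, effective CO2 more than doubles by 2100; at the observed rate, it does not even come close.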
As a result of these (and other) missteps, most GCMs initially predicted way too much warming.
When the United Nations first began (somewhat arrogantly, some of us think) making pronouncements about the “consensus” of scientists, in 1990, the average GCM predicted global warming of 7.6°F for doubled CO2. This figure also roughly corresponds to the predicted warming by the year 2100. Scientists like the writer of this article howled in protest (and laughter), because these models also predicted that we should have already warmed 3.2°F by 1990, while the observed warming was a mere 0.9°F, or around one-quarter of what was forecast.
By 1992, in time for the Rio Earth Summit, the GCMs had been cooled a bit, because the rate of greenhouse increase was dropped a smidgen. According to the United Nations, the likely warming to 2100 was now 4.5°F. The same critics still objected, because far too much warming was still being predicted, compared to what was being observed. But it was these models that gave us the climate treaty that is the parent of the Kyoto Protocol.
By 1995, the U.N. dropped its predictions further, to 3.6°F, under the highly debatable notion that greenhouse warming is largely being “masked” by other industrial emissions, such as sulfate dust from the combustion of coal. The same critics pointed out repeatedly that this explanation didn’t stand up to some very simple logical tests.
In 2001, the United Nations is going to issue another forecast. Here’s where it stands as of this writing: Using the “average” guess for greenhouse and sulfate emissions in the next century, the warming stays down at 3.6°F. This is less than half the value predicted by the average GCM a mere 10 years earlier (corrected thanks, in part, to my colleagues’ and my relentless public criticism of the forecasts) and only about 80 percent of what served as the basis for this notorious treaty.
Further, it is now apparent that most of this warming is taking place in the coldest, deadliest air of winter. As a result, some of the blatant ideologues participating in the U.N. process—mainly (unfortunately) our own U.S. representatives—are rumored to be most upset because they know that this warming is too small and too benign to ever persuade the required 67 U.S. senators to ratify the Kyoto Protocol. Our gossip lines are squawking that some people are desperate to jimmy the future emissions scenarios so that the models predict more warming.
We hope this doesn’t shock our readers. But now you know the history of the world’s only computer-predicted environmental catastrophe. Needless to say, it does not inspire confidence in either our science or our leaders.
According to Nature magazine, University of Virginia environmental sciences professor Patrick J. Michaels is probably the nation’s most popular lecturer on the subject of climate change. Michaels is the author of Sound and Fury: The Science and Politics of Global Warming.