Who says you can’t “prove a negative”?
In a recent issue of the refereed scientific journal Geophysical Research Letters (GRL), several Environment & Climate News colleagues published a paper that comes pretty close to showing that the climate models used as the basis for gloom-and-doom projections are simply wrong over the balance of the lower atmosphere.
We obtained this result months ago but have been impatiently sitting on the news so as not to jeopardize publication in GRL, which, like most scientific journals, publishes only previously unreleased results. Truth be told, an oversight by a third party led to some of our results appearing on the Internet for a couple of days while our submission was in review for the journal Science. They rejected it, and rightly so, given their policy that a submission must “remain a privileged document and . . . not be released to the press or the public before publication.”
(We find it somewhat ironic that shortly after our paper was rejected on those grounds, Science ran a piece describing some of the contents in the draft version of the yet-to-be-released IPCC Third Assessment Report. Each and every page of the IPCC document was marked “Do not cite. Do not quote.” Go figure.)
Still, we understood. So we expanded the study and sent it to GRL.
Our paper is a bullet through the heart of the global warming scare, which requires that the computer models used to justify the United Nations’ climate treaty actually match reality.
The well-known problem we examine stems from satellite and weather-balloon data for the balance of the lower atmosphere, which appear to show very little warming during their period of concurrent coverage, the 21-plus years since January 1, 1979. The computer models all indicate there should have been a dramatic warming. Literally billions of our tax dollars have now gone to try to explain away that discrepancy.
Finally, last March, newspapers around the world trumpeted—some on the front page—that new research by federal climatologist Ben Santer had reconciled the difference. Along with several co-authors, he argued in Science that computer models could account for the lack of warming after all, mainly because of the cooling influence of the 1991 eruption of the Philippine volcano Mt. Pinatubo.
Two things troubled us: Pinatubo wasn’t the only big volcano in recent decades (El Chichón caused a cooling in the early 1980s of about half of the magnitude of Pinatubo), and the paper’s data ended precisely at a very hot point—the mega-El Niño of 1998. (To those who think history repeats itself here—you recall correctly that these same guys tried using a different, highly fortuitous dataset a few years ago to prove the models were OK. See sidebar.)
The recent Santer Science paper argued that including Mt. Pinatubo’s cooling effects reduced the difference between temperatures predicted by general circulation models (GCMs) and those measured by the satellites to only 0.045ºC per decade, a difference they found to be statistically insignificant. They concluded, therefore, that they had reconciled observed and modeled temperatures.
But when we add in all of the volcanic action (including the cooling from El Chichón) and allow for the fact that El Niños are pretty much random occurrences (in other words, a huge one happened to occur in 1998, which happened to produce this happy result), the difference between the models and the observed temperatures works out to a whopping 0.162ºC per decade, or 360 percent of the amount Santer and colleagues published in Science.
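The percentage quoted above is simple arithmetic on the two per-decade trend differences. A quick sanity check (the variable names are ours, not the paper’s):

```python
# Trend differences between modeled and observed lower-atmosphere
# temperatures, in degrees C per decade, as quoted in the text.
santer_difference = 0.045  # Santer et al., accounting for Pinatubo only
full_difference = 0.162    # with El Chichon and the 1998 El Nino also handled

# Express the larger difference as a percentage of the smaller one.
ratio_percent = full_difference / santer_difference * 100
print(round(ratio_percent))  # 360
```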
Interestingly, this is almost exactly the difference in warming between surface temperatures and those of the rest of the lower atmosphere—proving, as we have maintained for more than half a decade now, that recent warming is confined to the very lowest layers of the atmosphere and, further research confirms, that it is largely confined to the shallow, coldest air masses of winter that no one likes anyway.
In other words, the gloom-and-doom models don’t work. That effectively leaves us with no scientifically based projection of twenty-first-century climate at all, except the projection that follows from observed data and the laws of physics, which dictate that human-induced warming should proceed at a relatively constant rate rather than accelerate at an alarming exponential pace.
That leaves us about 0.65ºC of warming to “worry” about for the next 50 years. That’s the only conclusion we can take from the recent GRL paper. The models are wrong and nature has told us the answer.
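The 0.65ºC figure is a straight linear extrapolation. Working backward from the article’s own numbers (the implied per-decade rate below is our back-calculation, not a figure stated in the GRL paper):

```python
# Back-of-envelope linear extrapolation behind the ~0.65 C figure.
# The per-decade rate is inferred from the article's numbers
# (0.65 C over 5 decades), not taken from the paper itself.
rate_per_decade = 0.13  # degrees C per decade (implied)
decades = 5             # the next 50 years
warming = rate_per_decade * decades
print(round(warming, 2))  # 0.65
```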
Will we ever have a climate model that works? We think so, and we think we know how to “make” that happen: do what NASA scientist James Hansen recently did, and simply adjust the warming radiation in the models downward until they are consistent with reality.
But that’s called throwing in the towel, sending the champagne to Environment & Climate News, and finding something else to do for a living. Not very likely.
According to Nature magazine, University of Virginia environmental sciences professor Patrick J. Michaels is probably the nation’s most popular lecturer on the subject of climate change. Michaels is coauthor of The Satanic Gases: Clearing the Air About Global Warming.
Michaels, P.J., and P.C. Knappenberger, 2000. Natural Signals in the MSU Lower Tropospheric Temperature. Geophysical Research Letters, 27, 2905-2908.
Santer, B.D., et al., 2000. Interpreting Differential Temperature Trends at the Surface and in the Lower Troposphere. Science, 287, 2000.