Senator Refutes Global Warming Hypothesis: Part 2 in a Series

Published November 21, 2003

Managing Editor’s note: Senator James Inhofe (R-Oklahoma), chairman of the Senate Committee on Environment and Public Works, fired the opening salvo July 28 in Senate hearings regarding global warming theory.

The following is the second of a three-part series excerpting Inhofe’s remarks. Subheads have been added for the reader’s convenience.


The Science of Climate Change
Senate Floor Statement by
U.S. Sen. James M. Inhofe (R-Oklahoma)
Chairman, Committee on Environment and Public Works
July 28, 2003


One might pose the question: If we had the ability to set the global thermostat, what temperature would we pick? Would we set it colder or warmer than it is today? What would the optimal temperature be? The dawn of civilization occurred in a period climatologists call the “climatic optimum,” when the mean surface temperature was 1 to 2° Celsius warmer than today. Why not go 1 to 2 degrees Celsius higher? Or 1 to 2 degrees lower, for that matter?

The Kyoto emissions reduction targets are arbitrary, lacking in any real scientific basis. Kyoto therefore will have virtually no impact on global temperatures. This is not just my opinion, but the conclusion reached by the country’s top climate scientists.

Dr. Tom Wigley, a senior scientist at the National Center for Atmospheric Research, found that if the Kyoto Protocol were fully implemented by all signatories—and this assumes the alarmists’ science is correct, which of course it is not—it would reduce temperatures by a mere 0.07° Celsius by 2050, and 0.13° Celsius by 2100. What does this mean? Such an amount is so small that ground-based thermometers cannot reliably measure it.

Dr. Richard Lindzen, an MIT scientist and member of the National Academy of Sciences, who has specialized in climate issues for over 30 years, told the Committee on Environment and Public Works on May 2, 2001 that there is a “definitive disconnect between Kyoto and science. Should a catastrophic scenario prove correct, Kyoto would not prevent it.”

Similarly, Dr. James Hansen of NASA, considered the father of global warming theory, said the Kyoto Protocol “will have little effect” on global temperature in the twenty-first century. In a rather stunning follow-up, Hansen said it would take 30 Kyotos to reduce warming to an acceptable level. If one Kyoto devastates the American economy, what would 30 do?

So this leads to another question: If the provisions in the Protocol do little or nothing measurable to influence global temperatures, what does this tell us about the scientific basis of Kyoto?

Answering that question requires a thorough examination of the scientific work conducted by the U.N.’s Intergovernmental Panel on Climate Change, which provides the scientific basis for Kyoto, international climate negotiations, and the substance of claims made by alarmists.

IPCC Assessment Reports

The science of Kyoto is based on the “Assessment Reports” conducted by the Intergovernmental Panel on Climate Change, or IPCC. Over the past 13 years, the IPCC has published three assessments, each more alarmist than the last.

The first IPCC Assessment Report in 1990 found that the climate record of the past century was “broadly consistent” with the changes in Earth’s surface temperature, as calculated by climate models that incorporated the observed increase in greenhouse gases.

This conclusion, however, appears suspect considering the climate cooled between 1940 and 1975, even as industrial activity grew rapidly after World War II. It has been difficult to reconcile this cooling with the observed increase in greenhouse gases.

The IPCC’s Second Assessment Report in 1995 attracted widespread international attention, particularly among scientists who believed human activities were causing global warming, largely on the strength of its statement that “the balance of evidence suggests a discernible human influence on global climate.” Yet the report was replete with caveats and qualifications, providing little evidence to support anthropogenic theories of global warming. Indeed, the very paragraph in which the “balance of evidence” quote appears makes exactly that point.

Moreover, the IPCC report was quite explicit about the uncertainties surrounding a link between human actions and global warming. “Although these global mean results suggest that there is some anthropogenic component in the observed temperature record, they cannot be considered compelling evidence of a clear cause-and-effect link between anthropogenic forcing and changes in the Earth’s surface temperature.”

Remember, the IPCC provides the scientific basis for the alarmists’ conclusions about global warming. But even the IPCC concedes its own science cannot be considered compelling evidence.

IPCC Releases Third Assessment

Five years later, the IPCC was back again, this time with the Third Assessment Report on Climate Change. In October 2000, the IPCC “Summary for Policymakers” was leaked to the media, which once again accepted the IPCC’s conclusions as fact.

Based on the summary, the Washington Post wrote on October 30, 2000, “The consensus on global warming keeps strengthening.” In a similar vein, the New York Times confidently declared on October 28, “The international panel of climate scientists considered the most authoritative voice on global warming has now concluded that mankind’s contribution to the problem is greater than originally believed.”

Such reporting prompted testimony by Dr. Richard Lindzen before the Committee on Environment and Public Works, the committee I now chair, in May 2001. Lindzen said, “Nearly all reading and coverage of the IPCC is restricted to the highly publicized Summaries for Policymakers, which are written by representatives from governments, NGOs, and business; the full reports, written by participating scientists, are largely ignored.”

The predictions in the summary went far beyond those in the IPCC’s 1995 report. In the Second Assessment, the IPCC predicted the Earth could warm by 1 to 3.5° Celsius by the year 2100, with a “best estimate” of a 2° Celsius warming. Both figures are highly questionable at best.

In the Third Assessment, the IPCC dramatically increased that estimate to a range of 1.4 to 5.8° Celsius, even though no new evidence had come to light to justify such a dramatic change.

What changed? As it turned out, the new prediction was based on faulty, politically charged assumptions about trends in population growth, economic growth, and fossil fuel use.

The extreme-case scenario of a 5.8° warming, for instance, rests on the assumption that the whole world will raise its level of economic activity and per-capita energy use to that of the United States, and that energy use will be carbon-intensive. This scenario is simply ludicrous; it contradicts the experience of the industrialized world over the past 30 years. Yet the 5.8° figure featured prominently in news stories because it produced the biggest fear effect.

Even Dr. Stephen Schneider, an outspoken believer in catastrophic global warming, criticized the IPCC’s assumptions in the journal Nature on May 3, 2001. Schneider’s own calculations, which cast serious doubt on the IPCC’s extreme prediction, broadly agree with an MIT study published in April 2001. That study found the chance of temperatures rising to 5.8° C or higher is “far less” than 1 percent, while there is a 17 percent chance the temperature rise would be lower than 1.4°.

That point bears repeating: Even true believers think the lower number is at least 17 times more likely to be right than the higher number. Moreover, even if the Earth’s temperature increases by 1.4° Celsius, does it really matter? The IPCC doesn’t offer any credible science to explain what would happen.

Dr. David Wojick, an expert in climate science, recently wrote in Canada’s National Post, “The computer models (GCMs) cannot … decide among the variable drivers, like solar versus lunar change, or chaos versus ocean circulation versus greenhouse gas increases. Unless and until they can explain these things, the models cannot be taken seriously as a basis for public policy.”

Again, to reiterate in plain English, this means the models do not account for key variables that influence the climate system.

Despite this, the alarmists continue to use these models and all the other flimsy evidence I’ve cited to support their theories of man-made global warming.

Satellites, Weather Balloons, CO2, and Glaciers

Now I want to turn to temperature trends in the twentieth century. GCMs predict that rising atmospheric CO2 concentrations will cause temperatures in the troposphere, the layer from 5,000 to 30,000 feet, to rise faster than surface temperatures—a critical fact for the alarmist hypothesis.

But in fact, there is no meaningful warming trend in the troposphere, and weather satellites, widely considered the most accurate measure of global temperatures, have confirmed this.

Using NOAA satellite readings of temperatures in the lower atmosphere, scientists at The University of Alabama in Huntsville (UAH) produced a dataset showing global atmospheric warming at the rate of about 0.07° C (about 0.13° Fahrenheit) per decade since November 1978.

“That works out to a global warming trend of about one and a quarter degrees Fahrenheit over 100 years,” said Dr. John Christy, who compiled the comparison data. Christy concedes such a trend “is probably due in part to human influences,” but adds that “it’s substantially less than the warming forecast by most climate models, and”—here is the key point—“it isn’t entirely out of the range of climate change we might expect from natural causes.”
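Christy’s century figure is a straightforward unit conversion from the per-decade rate quoted above. A minimal sketch checking the arithmetic (the 0.07° C per decade figure comes from the UAH dataset cited in the text; the variable names are illustrative):

```python
# Sanity-check the conversion of the UAH satellite trend:
# 0.07 degrees C per decade, extrapolated linearly over a century.

trend_c_per_decade = 0.07

# A temperature *difference* converts to Fahrenheit with the 9/5 factor
# alone (no +32 offset, which applies only to absolute temperatures).
trend_f_per_decade = trend_c_per_decade * 9 / 5

# Ten decades in a century:
trend_f_per_century = trend_f_per_decade * 10

print(round(trend_f_per_decade, 3))   # 0.126 (the "about 0.13 F" above)
print(round(trend_f_per_century, 2))  # 1.26
```

The linear extrapolation reproduces Christy’s “about one and a quarter degrees Fahrenheit over 100 years.”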

To reiterate: The best data for testing the hypothesis of human-induced global warming from the release of CO2 into the atmosphere, satellite measurements validated by weather balloons, show no meaningful trend of increasing temperatures, even as the climate models exaggerated the warming that ought to have occurred from a build-up of CO2.

Some critics of satellite measurements contend they don’t square with the ground-based temperature record. But some of this difference is due to the so-called “urban heat island effect.” This occurs when concrete and asphalt in cities absorb—rather than reflect—the sun’s heat, causing surface temperatures and overall ambient temperatures to rise. Scientists have shown that this strongly influences the surface-based temperature record.

In a paper published in the Bulletin of the American Meteorological Society in 1989, Dr. Thomas R. Karl, senior scientist at the National Climatic Data Center, corrected the U.S. surface temperatures for the urban heat-island effect and found there has been a downward temperature trend since 1940. This suggests a strong warming bias in the surface-based temperature record.

When we look at the twentieth century as a whole, we see distinct phases that call anthropogenic theories of global warming into question. First, a strong warming trend of about 0.5° C began in the late nineteenth century and peaked around 1940. Next, temperatures decreased from 1940 until the late 1970s.

Why is that decrease significant? Because about 80 percent of the carbon dioxide from human activities was added to the air after 1940, meaning the early twentieth century warming trend had to be largely natural.

Yet the doomsayers, undeterred by these facts, just won’t quit. In February and March 2002, the New York Times and the Washington Post, among others, reported the collapse of the Larsen B ice shelf in the Antarctic Peninsula, causing quite a stir in the media and providing alarmists with more propaganda to scare the public.

Although there was no link to global warming, the Times couldn’t help but make that suggestion in its March 20 edition. “While it is too soon to say whether the changes there are related to a buildup of the ‘greenhouse’ gas emissions that scientists believe are warming the planet, many experts said it was getting harder to find any other explanation.”

The Times, however, simply ignored a recent study in the journal Nature, which found the Antarctic has been cooling since 1966. And another study in Science recently found the West Antarctic Ice Sheet has been thickening rather than thinning.

University of Illinois researchers also reported “a net cooling on the Antarctic continent between 1966 and 2000.” In some regions, like the McMurdo Dry Valleys, temperatures cooled between 1986 and 1999 by as much as two degrees Celsius per decade.

In perhaps the most devastating critique of glacier alarmism, the American Geophysical Union found that the Arctic was warmer in 1935 than it is now. “Two distinct warming periods from 1920 to 1945, and from 1975 to the present, are clearly evident … compared with the global and hemispheric temperature rise, the high-latitude temperature increase was stronger in the late 1930s to early 1940s than in recent decades.”

Again, that bears repeating: 80 percent of the carbon dioxide from human activities was added to the air after 1940—yet the Arctic was warmer in 1935 than it is today.

So, not only is glacier alarmism flawed, but there is no evidence, as shown by measurements from satellites and weather balloons, of any meaningful warming trends in the twentieth century.


For more information …

on the science of global warming and economics of measures proposed to address it, visit the ClimateSearch Web site at http://www.climatesearch.com and The Heartland Institute’s Environment Issue Suite at http://www.heartland.org.