ICCC-10 Panel 2: Climate Science and Accurate Data

Published June 16, 2015

The “Climate Science and Accurate Data” panel, which I moderated, was particularly timely because the week before the Tenth International Conference on Climate Change began, the National Oceanic and Atmospheric Administration (NOAA) released a paper using adjusted and selectively chosen temperature data from ocean buoys and ship engine intake channels to claim there has been no pause in global warming over the past 18 years.

NOAA’s report is an outlier that contradicts its own previous research, every other international data set, the Intergovernmental Panel on Climate Change’s own findings, and the world’s weather balloon and satellite data sets—all of which found an unexplained pause in Earth’s warming.

Watts Exposes Poor Data Sources

Anthony Watts opened the session by discussing “Government Gatekeepers and the True Temperature Record.”

Watts, Northern California’s resident meteorologist, started SurfaceStations.org in 2007, a website devoted to photographing and documenting the quality of weather stations across the United States. For Watts, the problem with assessing the case for human-caused global warming is twofold: The basic data from ground-based thermometers are fundamentally flawed, and the gatekeepers are biased, thwarting what is supposed to be an unbiased scientific process.

Watts defined “gatekeeping” as “the process through which information is filtered for dissemination.” Using climate gatekeepers’ own words, Watts showed they are subverting transparency and replication, two fundamental features of sound science. For instance, Watts quoted Phil Jones of the Climate Research Unit at the University of East Anglia, one of the most often-cited sources of climate data, saying, “‘I should warn you, some of the data we have we are not supposed to pass on to others. … Even if the [World Meteorological Organization] agrees, I will still not pass on the data. We have 25 years or so invested in the work. Why should I make the data available to you when your aim is to try and find something wrong with it.'”

If data is not shared, theories cannot be tested and results cannot be replicated.

Watts says even gatekeepers such as NASA’s Gavin Schmidt, who is wedded to the idea humans are causing dangerous global warming, recognize limitations in the data sets they use. Watts quoted Schmidt’s analysis of the weaknesses of the data sources, including “‘uneven spatial distribution, many missing data points, and a large number of non-climatic biases varying in time and space.'”

While there are a number of problems related to data sources, perhaps the most damaging, as Watts demonstrated, is the use of obviously bad data. For instance, using data from temperature gauges in the middle of concrete parking lots or near the heat outlets of air conditioning units makes results appear far different from what they actually are.

Watts says improving the temperature record requires throwing out all the data from bad temperature stations, meaning those that do not comply with NOAA’s stated standards for temperature recording locations.

Watts asks, “Why are we keeping all of these bad stations in the record? Let’s look at the best stations and use those as the metric to measure temperature change.”

Spencer on Satellite Updates

The next speaker was Roy Spencer, who works with John Christy at the University of Alabama-Huntsville, in conjunction with NASA’s Marshall Space Flight Center, to develop and manage NASA’s satellite temperature monitoring network. Spencer’s presentation detailed the updates they are undertaking to improve the functioning of global satellites and correct their measurement biases.

IPCC Uses Faulty Forecasting Methods

Forecasting expert J. Scott Armstrong, from the Wharton School at the University of Pennsylvania, closed the panel with a presentation titled “Global Warming? It’s a Forecasting Problem,” laying out the scientific principles necessary for producing the most accurate, evidence-based forecasts possible.

It turns out IPCC and other prominent supporters of the view humans are causing catastrophic climate change ignore or violate almost every one of these principles. Developing a rational climate policy requires three sets of evidence-based forecasts: a substantive long-term trend in global mean temperatures; major net harmful effects from changing temperatures; and net benefits from proposed policies relative to no action. IPCC projections provide none of the three.

In the IPCC’s own words, “[The] long-term prediction of future climate states is not possible.” Rather, it presents scenarios based on computer simulations using experts’ assumptions.

Armstrong says scenarios are just “stories about what happened in the future,” and forecasting research shows experts’ opinions on complex, uncertain situations are no better than assessments by non-experts.

In response to this finding, Armstrong developed the Seer-Sucker Theory: “No matter how much evidence there is that experts are unable to provide accurate forecasts, suckers will pay seers for forecasts.”

An audit by Armstrong and his Wharton colleague Kesten Green of the procedures IPCC used to develop its “business as usual” global mean temperature scenario found it violated 72 of the 89 relevant forecasting principles, including fully disclosing data and methods, ensuring forecasts are independent of politics, and designing forecasts conservatively when there is uncertainty.

Based on Armstrong’s assessment, the panel ended where it began, with Watts concluding IPCC’s projections are irredeemably compromised by bad data and poor procedures, resulting in untrustworthy climate policy projections.

H. Sterling Burnett, Ph.D. ([email protected]) is a research fellow with The Heartland Institute.