When Too Little CO2 Nearly Doomed Humanity

Published June 15, 2017

Aside from the expected protests of Al Gore and Leonardo DiCaprio, the public didn’t seem to raise its CO2 anguish much above the Russia-election frenzy the media already had going. Statistician Bjorn Lomborg has already pointed out that the Paris CO2 emissions cuts would make only a 0.05 degree C difference in earth’s 2100 AD temperature. And that’s in the unlikely event that all parties actually kept their pledges.

During the last Ice Age, however, too little CO2 in the air almost eradicated mankind. That’s when the much-colder water in the oceans sucked most of the CO2 from the air. There were only about 180 parts per million of CO2 in the atmosphere, compared to today’s 400 ppm.

The Ice Age’s combined horrors—intense cold, permanent drought, and CO2 starvation—killed most of the plants on earth. Only a few trees survived, in the mildest climates. Much of the planet’s grass turned to tundra, which is much less nourishing to the herbivores we depended on for food and fur. Cambridge University’s recent studies say that, worldwide, only about 100,000 humans were left alive when the current Interglacial warming mercifully began.

The few surviving prey animals had to keep migrating to get enough food. That forced our ancestors to migrate with them, in temperatures that routinely fell to 40 degrees below zero (both Fahrenheit and Celsius). The Neanderthals had been living in warm caves protected from predators by fires at the cave mouths. They had hunted their prey by sneaking through the trees—which no longer existed. They apparently couldn’t adapt, and starved. Cambridge finds no signs of genocidal warfare.


The most successful human survivors—who provided most of the DNA for modern Europeans—were nomads from the Black Sea region. The Gravettians had never had trees, so they invented mammoth-skin tents, held up by salvaged mammoth ribs. They also developed spear-throwers, to kill the huge mammoths from a safe distance. Most important, they tamed wolves into dogs, to protect their tents from marauders, to locate the game animals on the broad tundra, and to harry the prey into a defensive cluster. The scarcity of food in that Glacial Maximum intensified the wolves’ appreciation for the bones and bone marrow at the human camps.

Carbon dioxide truly is “the gas of life.” The plants that feed us can’t live without inhaling CO2, and then they breathe out the oxygen that lets humans keep breathing too.

Our crop plants evolved about 400 million years ago, when CO2 in the atmosphere was about 5000 parts per million! Our evergreen trees and shrubs evolved about 360 million years ago, with CO2 levels at about 4000 ppm. When our deciduous trees evolved about 160 million years ago, the CO2 level was about 2200 ppm—still five times the current level.

There’s little danger to humans of too much CO2 in the air they breathe. The EPA says 1000 ppm is the safe limit for lifetime human exposure. The CO2 alarm in nuclear submarines is set at 8,000 ppm, and space shuttle missions often reached 5,000 ppm.


If there’s little danger of humans having too much CO2 in their air, and a real danger to civilization from having too little, what’s the ideal level of atmospheric CO2? The answer? There’s a broad safe range—with far more risk of too little than too much. With no plants, there’d be no people or animals, let alone civilization.

Human numbers, moreover, expanded strongly during the Holocene Optimum, with temperatures 4 degrees C higher than today. Even today, the residents of the tropics keep demonstrating that humans can tolerate much higher temperatures than most of us experience. (As we utilize the new malaria vaccine, the tropics will prosper even more.) And even now we suffer far more human deaths from “too cold” than from “too warm.”

The crops continue to produce record yields in our “unprecedented” warming—and the extra CO2 in our air is credited with as much as 15 percent of that yield gain!

The atmosphere has a limited capacity to absorb CO2 and is already partially saturated. That’s why each additional ppm of CO2 produces radically less warming. In fact, the first surge of human-emitted CO2 after World War II should have produced a major surge of warming if CO2 is the control factor. Instead, temperatures went down from 1940–1975. Media outlets urgently predicted another Ice Age.


Remember that 1936 recorded the highest thermometer readings of any year in the last 5000. (NOAA these days reports only its “adjusted” temperatures.) That “adjustment” has given us all the recent hottest-year claims—but the “adjustments” conveniently lowered past temperature readings (such as 1936) and pushed up recent warmth. Our thermometers are also increasingly in Urban Heat Islands or at jet-fueled airports. Most of the rural measurement sites showed far less warming, but they have now been dropped from the “official” records.

Both sides of the debate calculate that a full doubling of CO2, by itself, will produce only about 1 degree C of additional warming. Another doubling would require twice as much added CO2 to yield the same effect, so each additional ppm counts for far less.

One degree C of warming was obviously not enough to frighten the public. It would be hardly noticed within our constant temperature variations. So, the computerized models cited by the Intergovernmental Panel on Climate Change made another assumption—that a hotter world would hold more moisture in its atmosphere. Since water vapor is the most effective greenhouse gas, the climate modelers claimed the earth might heat by 5 or even 10 degrees C. One scientist (who supposedly advises Pope Francis) recently claimed 12 degrees C of overheating! The awkward truth is that NASA has monitored moisture in the atmosphere since 1980, and water vapor hasn’t increased despite the higher levels of CO2 and our “unprecedented” temperatures.

Is that why the IPCC models have predicted more than twice as much warming as we’ve seen?

And why did the computer models fail to predict both the Pacific Oscillation’s current 20-year non-warming and the coming solar sunspot minimum? We have only one model that has verified itself by back-casting the weather of the past century. That model is from Nicola Scafetta at Duke University, and it’s based on solar, lunar and planetary cycling. The latest data from the CERN particle physics lab has also produced a cycle-based model that foresees no runaway warming; it sees the impending cold solar minimum instead.

Is the long, wrong-headed war against CO2 finally fading?

[Originally published at Townhall]