January 28, 2005

Disaster averted

Filed under: Climate History

Human activity over the last 8,000 years may have headed off the next ice age, new research suggests.

Humans may have averted the next ice age! That’s a research result that is sure to make global warming alarmists cringe. Ongoing human activities during the past 8,000 years likely have served to prevent us from falling into an ice age, reports a research team led by William Ruddiman, former chairman of the University of Virginia environmental sciences department. “Without any anthropogenic warming,” the team writes, “Earth’s climate would no longer be in a full-interglacial state [warm period] but be well on its way toward the colder temperatures typical of glaciations.”

Ruddiman’s team has been carefully studying carbon dioxide and methane trapped in ice cores extracted from Antarctica, piecing together a detailed history of the atmospheric concentrations of those two greenhouse gases. When analyzing the natural cycles in these concentrations over the past 400,000 years, the researchers noticed anomalous behavior in recent millennia. According to the authors, “Over the past 8,000 years, the CO2 concentration gradually rose to a level of 280–285 ppm during an interval when trends observed during the three previous interglaciations suggest that it should have fallen to 240–245 ppm.” They found similar behavior in the concentration of methane (Figure 1).
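Subtracting the expected natural trend from the observed values gives the size of the anomaly Ruddiman’s team attributes to early human activity (the subtraction is ours; the paper reports the two ranges):

$$\Delta \mathrm{CO_2} \approx (280\text{–}285\ \mathrm{ppm}) - (240\text{–}245\ \mathrm{ppm}) \approx 40\ \mathrm{ppm}.$$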

Figure 1. Anthropogenic effects (as suggested by Ruddiman et al., 2005) on atmospheric methane (top) and carbon dioxide (bottom) concentrations during the past 10,000 years. (Source: Ruddiman et al., 2005)

Ruddiman attributes the anomalous rise to the massive deforestation of Eurasia, to irrigation for rice farming in southeast Asia, and to emissions from biomass burning, livestock production, and other sources.

Based upon the anthropogenic increases in these greenhouse gases during the past 8,000 years, as well as climate cycles during the past 400,000 years, Ruddiman and colleagues estimate that without those human influences the global average temperature would be about 2ºC lower than it is now and “roughly one-third of the way toward full-glacial temperatures.”
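Taking the two figures together gives a quick consistency check (our arithmetic, not a number stated in the paper): if a 2ºC shortfall is roughly one-third of the way to full-glacial temperatures, the implied full glacial–interglacial temperature difference is

$$\Delta T_{\text{glacial}} \approx \frac{2\,\mathrm{^{\circ}C}}{1/3} = 6\,\mathrm{^{\circ}C},$$

which is broadly consistent with commonly cited estimates of global cooling at the last glacial maximum.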

This result puts global warmers in a difficult position. Their bedrock belief is that the earth’s climate was merrily chugging along the way Nature intended prior to the Industrial Revolution. Then all sorts of pernicious human activity started interfering with how the climate should “naturally” behave, ultimately leading to where we are now—on the brink of environmental catastrophe.

In a report issued on January 25, 2005, a group calling itself the International Climate Change Taskforce claimed that if temperatures rose more than 2ºC above pre-industrial levels, we would be flirting with disaster. “Beyond the 2ºC level, the risks to human societies and ecosystems grow significantly,” the task force warned. Their report stressed that “climate change represents one of the most serious and far-reaching challenges facing humankind in the 21st century.”

The message from the Ruddiman paper is basically the opposite: anthropogenic climate change to date has saved us from what would have been the most serious and far-reaching challenge facing humankind in the 21st century, namely a climate rapidly deteriorating into an ice age. After all, no matter what scary scenarios the global warming enthusiasts can dream up, they all pale in comparison to the actual conditions that ice ages have served up in the past. For instance, 21,000 years ago an ice sheet covered all of North America north of a line stretching from roughly Seattle to Indianapolis to New York City (Figure 2). Considering that the earth has spent about 90 percent of the past 1.8 million years in ice age conditions, and only about 10 percent in warm conditions, we should consider ourselves lucky to be living when we do. Actually, luck has little to do with it: the warmth of the last 10,000 years is more than likely the reason that humanity has flourished.

Figure 2. Map of the timing of the ice sheet retreat at the end of the last ice age. Numbers represent thousands of years before present. (Source: Ruddiman et al., 2005)

All this optimistic talk puts the alarmists in a bind. Either they must admit that the “natural” climate is an undesirable one and that the human influence on climate should be applauded, or they must dismiss the Ruddiman results. The problem with the latter option is that the Ruddiman results were derived from a complex climate model that incorporates not only atmospheric and oceanic components, but also vegetation, soils, snow, and sea ice. Models of the same kind are used to project the future course of climate; they are the essential tool that gets tweaked to produce scary climate scenarios for the 21st century. So, obviously, climate model results can’t simply be dismissed by the very people who rely on them the most.

What to do, what to do? Obviously, the best strategy for the alarmists is to get everyone to simply ignore the results and pretend they don’t exist. That seems to be the tack they’re taking: scores of newspapers from around the world have recently been filled with headlines like “Global Warming Called Time Bomb,” “Extinction Tied to Global Warming: Greenhouse Effect Cited in Mass Decline 250 Million Years Ago,” and “Global Warming Twice as Bad as Feared,” while headlines celebrating the human activities that headed off (at least temporarily) the next ice age are few and far between.


Ruddiman, W.F., Vavrus, S.J., and Kutzbach, J.E., 2005. A test of the overdue-glaciation hypothesis. Quaternary Science Reviews, 24, 1-10.
