May 23, 2011

Less Cooling Means Less Warming

Filed under: Aerosols, Climate Forcings

We occasionally highlight articles from the scientific literature showing that the cooling impact of aerosol emissions from human activities has been overestimated. Such findings are important because they mean that warming from greenhouse gases has been similarly overestimated.

Climate models rely on aerosol cooling to keep warming in check; otherwise they predict far more warming than has been observed. So if aerosols produce less cooling, the models must compensate by producing less warming from greenhouse gases than they currently do. If they don't, they will fail to replicate the observed temperature history.
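To see the arithmetic behind this trade-off, consider a toy energy-balance calculation. This is our own illustration with round, made-up (but plausible) numbers, not output from Cohen et al. or any actual climate model:

```python
# Toy energy-balance arithmetic: equilibrium warming dT = S * F_total / F_2x,
# where S is the climate sensitivity (warming per CO2 doubling) and
# F_2x (~3.7 W/m^2) is the forcing from doubled CO2. Numbers are illustrative.
F_2x   = 3.7   # W/m^2, forcing from a CO2 doubling
F_ghg  = 2.6   # W/m^2, illustrative greenhouse-gas forcing
dT_obs = 0.8   # deg C, illustrative observed warming

for F_aer in (-1.2, -0.6):  # stronger vs. weaker aerosol cooling, W/m^2
    # climate sensitivity required to match the observed warming
    S = dT_obs * F_2x / (F_ghg + F_aer)
    print(f"aerosol forcing {F_aer:+.1f} W/m^2 -> implied sensitivity {S:.1f} deg C")
```

In this cartoon, cutting the assumed aerosol cooling in half drops the sensitivity needed to match observations from about 2.1°C to about 1.5°C per CO2 doubling.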

In a paper soon to appear in Geophysical Research Letters, an M.I.T. research team led by Jason Cohen finds that when climate models include the aerosol-influencing processes that take place in urban environments, the total global-average negative forcing (i.e., cooling influence) from aerosols is significantly smaller than when those urban processes are left out, as is currently the case in all climate models.

The research published in GRL grew out of Cohen's Ph.D. dissertation, written under the supervision of M.I.T.'s high-profile atmospheric chemist, Dr. Ronald Prinn.

These urban processes are currently left out of climate models both because they are complicated and because they occur at scales too small to be captured by the rather coarse geographic resolution of current climate models.

As these urban processes are quite important in shaping aerosol behavior and characteristics, Cohen et al. devised a way to incorporate them into a global climate model. They developed a “metamodel” that reproduces the results of a complex, small-scale urban chemistry model at a much lower computational cost, and whose output can then feed the coarser-scale global climate model.

We won’t claim to fully understand the inner workings of this metamodel, as the authors’ prose is a bit technical. They say that it was “developed using the probabilistic collocation method, which in turn is based on a polynomial chaos expansion.” Sounds complicated. But the more straightforward statistics presented by the authors indicate that the metamodel works pretty well.
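For readers who want a flavor of what that means, here is a minimal sketch of the general polynomial-chaos surrogate technique: run an expensive model a modest number of times, fit a cheap polynomial expansion to the results, and use the polynomial in its place. The “expensive” model below is a stand-in of our own invention; the code illustrates the generic method, not Cohen et al.’s actual implementation:

```python
# Minimal sketch of a polynomial-chaos "metamodel": fit a cheap Hermite
# polynomial expansion to a few runs of an expensive model, then use the
# polynomial in its place. The expensive model here is a made-up stand-in.
import numpy as np
from numpy.polynomial.hermite_e import hermevander

def expensive_urban_model(x):
    # Stand-in for the detailed urban-scale chemistry model (hypothetical).
    return np.exp(-0.5 * x) + 0.1 * x**2

# 1. Run the expensive model at a modest number of collocation points
#    (inputs are assumed rescaled to standard normal variables).
rng = np.random.default_rng(0)
x_train = rng.standard_normal(50)
y_train = expensive_urban_model(x_train)

# 2. Fit the coefficients of a degree-4 expansion in probabilists'
#    Hermite polynomials by least squares.
V = hermevander(x_train, 4)                 # basis matrix He_0 ... He_4
coef, *_ = np.linalg.lstsq(V, y_train, rcond=None)

# 3. The metamodel is now just a polynomial: cheap enough to call
#    many times inside a coarse global model.
def metamodel(x):
    return hermevander(np.atleast_1d(x), 4) @ coef

print(metamodel(0.3)[0], expensive_urban_model(0.3))
```

The payoff is in step 3: evaluating a polynomial costs a few multiplications rather than a full urban chemistry simulation, which is what makes embedding urban-scale detail in a global model affordable.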

While there is still a lot more work to be done, Cohen et al.’s efforts yielded noteworthy results showing that current climate models exaggerate the cooling effect from aerosols. The authors put it this way:

“These results show that failure to consider urban scale processing leads to significantly more negative aerosol radiative forcing compared to when detailed urban scale processing is considered.”

This is very germane to the study of climate change, because climate models rely on aerosol cooling to offset warming caused by a build-up of greenhouse gases in the atmosphere. Without it, climate models predict too much warming from greenhouse gas increases. This means that the climate sensitivity—that is, how much warming should be expected from a doubling of the atmospheric carbon dioxide concentration—is too high.

To bring their models in line with reality, modelers tend to focus on aerosols rather than climate sensitivity. This is because the true cooling from aerosols, and their abundance in the atmosphere, are known only to within a broad range of error. Within this range, values can be chosen (a process called “tuning”) that allow the models to reproduce observed temperature changes. Rather than getting into the real nuts and bolts of the models (which is where the determination of climate sensitivity lies), it is much easier just to turn the big aerosol forcing knob until it produces the “right” results.
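Here is a schematic of that knob-turning, again with invented numbers. This is our caricature of the tuning process, not any modeling group’s actual procedure:

```python
# Caricature of the "aerosol knob": hold the model's (high) sensitivity
# fixed and choose an aerosol forcing, within its broad uncertainty range,
# that reproduces the observed warming. All numbers are invented.
import numpy as np

F_2x, F_ghg, dT_obs = 3.7, 2.6, 0.8   # same illustrative values as above
S_model = 3.0                          # deg C, an assumed high sensitivity

def predicted_warming(F_aer):
    return S_model * (F_ghg + F_aer) / F_2x

candidates = np.linspace(-2.0, 0.0, 201)   # plausible aerosol forcing range
F_tuned = candidates[np.argmin(np.abs(predicted_warming(candidates) - dT_obs))]
print(f"tuned aerosol forcing: {F_tuned:.2f} W/m^2")
# -> roughly -1.61 W/m^2: the high-sensitivity model "needs" strong
#    aerosol cooling to match the temperature record.
```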

The results reported by Cohen et al. mean that all the models have been getting the “right” answer for the wrong reason. To remedy this, climate modelers have two choices:

1) Increase the aerosol input to the models to bring the total aerosol forcing back to what was being used originally, or

2) Admit that the climate sensitivity is too high, that climate models need a major overhaul, and that they have repeatedly exaggerated future global warming.

As climate sensitivity is the Holy Grail of climate science, you can bet your bottom dollar that the modelers (and all their various supporters) will not let No. 2 happen. They are quite happy with the climate sensitivity that they have. If the climate sensitivity is lowered, then so too is the sense of urgency to “do something” (aka taxing carbon) about the climate “problem.” And dreaming up ways to do something about the problem is the preferred pastime of a fairly large number of individuals. The recent report from the National Academies, “America’s Climate Choices,” is a prime example, in which scientists argue for the imposition of taxes based precisely on the types of models that Cohen is showing to be wrong.

So despite ever-growing evidence that the climate sensitivity is near the low end of the IPCC range of 2°C to 4.5°C (evidence further enhanced by the new Cohen et al. results), we imagine that little will change, and that national and international assessment makers and consensus shapers will continue to cling to a high climate sensitivity. That is what it takes to manufacture a good climate crisis, without which a lot of people are going to look very foolish.

Reference:

Cohen, J. B., R. G. Prinn, and C. Wang, 2011. The impact of detailed urban-scale processing on the composition, distribution, and radiative forcing of anthropogenic aerosols. Geophysical Research Letters, in press.



