Study Concludes 10 More Years of Business-as-Usual CO2 Emissions Could Make Avoiding Dangerous Climate Change Impossible
An additional 10 years of continued rapid growth of CO2 emissions and infrastructure may make avoiding dangerous climate change “impractical, if not impossible”, according to a recently published study in the journal Atmospheric Chemistry and Physics.
The lead author of the study is James Hansen, Director of NASA’s Goddard Institute for Space Studies (GISS) in New York. Forty-six other researchers from organizations in the US and France also contributed to the work. The researchers used data from earlier warm periods in the Earth’s history to estimate climate impacts as a function of global temperature, climate models to simulate global warming, and satellite data to verify ongoing changes.
Although identifying “dangerous” effects is “partly subjective”, the report concludes that additional global warming of more than 1°C above the level in 2000 will have effects that may be highly disruptive, and push the climate past tipping points.
Tipping points can occur during climate change when the climate reaches a state such that strong amplifying feedbacks are activated by only moderate additional warming. This study finds that global warming of 0.6°C in the past 30 years has been driven mainly by increasing greenhouse gases, and that only moderate additional climate forcing is likely to set in motion disintegration of the West Antarctic ice sheet and Arctic sea ice. Amplifying feedbacks include increased absorption of sunlight as melting exposes darker surfaces, and speedup of iceberg discharge as the warming ocean melts ice shelves that otherwise inhibit ice flow.
Have we already passed a “tipping point” such that it is now impossible to avoid “dangerous” climate change? The authors estimate that we must be close to such a point, but may not have passed it yet: it is still feasible to achieve a scenario that keeps additional global warming under 1°C, yielding a degree of climate change that is quantitatively and qualitatively different from that under business-as-usual (BAU) scenarios.
The researchers conclude that a CO2 level exceeding about 450 ppm would be dangerous. The atmospheric concentration of CO2 is currently 383 ppm, up from 280 ppm at the start of the industrial age, and is increasing at about 2 ppm per year.
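Those three numbers imply a simple back-of-the-envelope deadline. Assuming the current growth rate of about 2 ppm per year simply held constant (under business-as-usual it would in fact accelerate, so this is an optimistic upper bound), the time remaining before the roughly 450 ppm threshold is crossed works out as follows:

```python
# Back-of-the-envelope estimate: years until atmospheric CO2 reaches the
# ~450 ppm level the study identifies as dangerous, assuming the ~2 ppm/year
# growth rate stays constant. Under business-as-usual the rate would
# accelerate, so this is an optimistic upper bound on the time remaining.
CURRENT_PPM = 383       # CO2 concentration at the time of the study
THRESHOLD_PPM = 450     # approximate "dangerous" level per the study
GROWTH_PPM_PER_YEAR = 2.0

years_remaining = (THRESHOLD_PPM - CURRENT_PPM) / GROWTH_PPM_PER_YEAR
print(f"~{years_remaining:.1f} years at a constant 2 ppm/year")  # ~33.5 years
```

A rough figure of three decades and change, consistent with the study's warning that another decade of accelerating emissions sharply narrows the remaining margin.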
The study also shows that the reduction of non-carbon dioxide forcings such as methane and black soot can offset some CO2 increase, but only to a limited extent.
The team used a computer model developed by the Goddard Institute to simulate climate from 1880 through today. The model included a more comprehensive set of natural and human-made climate forcings than previous studies, including changes in solar radiation, volcanic particles, human-made greenhouse gases, fine particles such as soot, the effect of those particles on clouds, and land use. Extensive evaluation of the model’s ability to simulate climate change is contained in a companion paper to be published in Climate Dynamics.
The authors use the model for climate simulations of the 21st century using both business-as-usual growth of greenhouse gas emissions and an alternative scenario in which emissions decrease slowly in the next few decades and then rapidly, achieving stabilization of the atmospheric CO2 amount by the end of the century. Climate changes under business-as-usual are so large, with additional global warming of 2-3°C (3.6-5.4°F), that Hansen concludes “business-as-usual would be a guarantee of global and regional disasters.”
However, the study finds much less severe climate change—one-quarter to one-third that of the business-as-usual scenario—when greenhouse gas emissions follow the alternative scenario.
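As a quick consistency check (illustrative arithmetic only, not a calculation from the paper), scaling the business-as-usual warming range of 2-3°C by the reported one-quarter-to-one-third factor puts the alternative scenario's additional warming at roughly 0.5-1.0°C, in line with the under-1°C target mentioned above:

```python
# Illustrative check: additional warming under the alternative scenario,
# obtained by scaling the BAU range (2-3 degC) by the reported
# one-quarter-to-one-third factor. Not a calculation from the paper itself.
bau_low_c, bau_high_c = 2.0, 3.0

alt_low_c = bau_low_c * 0.25    # best case: a quarter of the BAU low end
alt_high_c = bau_high_c / 3.0   # worst case: a third of the BAU high end
print(f"alternative scenario: {alt_low_c:.1f}-{alt_high_c:.1f} degC")  # 0.5-1.0 degC
```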
A scenario that avoids “dangerous” climate change still appears to be technically feasible.
The study’s conclusion that global temperature is nearing the level of dangerous climate effects implies that little time remains to achieve the international cooperation needed to avoid widespread undesirable consequences. CO2 emissions are the critical issue, because a substantial fraction of these emissions remains in the atmosphere “forever”, for practical purposes. The principal implication is that avoiding dangerous climate change requires the bulk of coal and unconventional fossil fuel resources to be exploited only under the condition that CO2 emissions are captured and sequestered.
A second inference is that remaining gas and oil resources must be husbanded, so that their role in critical functions such as mobile fuels can be stretched until acceptable alternatives are available, thus avoiding the need to squeeze such fuels from unconventional and environmentally damaging sources. The task is to achieve a transition to clean, carbon-free energy sources, which are essential in the long run, without pushing the climate system past a level where disastrous, irreversible effects become inevitable.
Separately, a different new study by NASA scientists suggests that business-as-usual greenhouse-gas warming may raise average summer temperatures in the eastern United States nearly 10° Fahrenheit by the 2080s. It also suggests that current climate models are underestimating surface temperature changes.
The research found that eastern US summer daily high temperatures that currently average in the low-to-mid-80s (degrees Fahrenheit) will most likely soar into the low-to-mid-90s during typical summers by the 2080s. In extreme seasons—when precipitation falls infrequently—July and August daily high temperatures could average between 100 and 110 degrees Fahrenheit in cities such as Chicago, Washington, and Atlanta.
To reach their conclusions, the researchers analyzed nearly 30 years of observational temperature and precipitation data and also used computer model simulations that considered soil, atmospheric, and oceanic conditions and projected changes in greenhouse gases. The simulations were produced using a widely used weather prediction model coupled to a global climate model developed by NASA’s Goddard Institute for Space Studies.
The global model, one of the models used in the recently issued climate report by the Intergovernmental Panel on Climate Change (IPCC), was used in this study to identify future changes in large-scale atmospheric circulation patterns due to the buildup of greenhouse gases. This information was then fed into the weather prediction model to forecast summer-to-summer temperature variability in the eastern United States during the 2080s. The weather model showed that extreme summertime surface temperatures developed when carbon dioxide emissions were assumed to continue increasing by about two percent a year, the “business as usual” scenario. These findings are too recent to have been included in the latest IPCC report.
The weather prediction model used in this research is advantageous because it resolves future climate at a smaller geographic scale than global models, providing reliable simulations not only of the amount of summer precipitation, but also of its frequency and timing. This is an important capability for predicting summer temperatures, because observed daily temperatures are usually higher on rainless days and when precipitation falls less frequently than normal.
The study determined that the global climate model (GCM) likely underestimates future air temperatures near the ground because it simulates too many rainy days, on which clouds block sunlight and the wet ground is additionally cooled by evaporation. Statistics of rain frequency inherently depend on the size of the area being monitored, since it rains more often somewhere within a large area than somewhere within a much smaller area. The weather prediction model’s smaller computational area therefore yields more realistic rain-frequency statistics.
However, even accounting for the relatively large area of the GCM's computational elements, the NASA researchers found that the GCM still overestimates precipitation frequency. By comparison, the corresponding percentage of rainy days predicted by the regional mesoscale model for the same summers was lower and much more realistic.
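The area-dependence of rain-frequency statistics can be illustrated with a toy calculation (the per-cell probability and region sizes below are invented for illustration, not taken from the study): if each small grid cell independently has probability p of rain on a given day, the chance that it rains somewhere within a region of n such cells is 1 − (1 − p)^n, which grows rapidly with n.

```python
# Toy illustration of why rain frequency depends on the monitored area:
# with per-cell daily rain probability p, the chance that rain falls
# *somewhere* in a region of n independent cells is 1 - (1 - p)**n.
# The probability and region sizes below are illustrative, not from the study.
def p_rain_somewhere(p_cell: float, n_cells: int) -> float:
    """Probability that at least one of n_cells sees rain on a given day."""
    return 1.0 - (1.0 - p_cell) ** n_cells

p = 0.10  # assumed 10% daily rain chance in one small grid cell
for n in (1, 4, 16, 64):  # region sizes, in number of cells
    print(f"{n:3d} cells: {p_rain_somewhere(p, n):.2f}")
```

This is why a coarse-gridded GCM, which effectively monitors a large area per grid box, reports "rain" far more often than a fine-gridded regional model does, even when both are statistically reasonable at their own scales.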
The scientific literature indicates that other GCMs also simulate precipitation too frequently and produce unrealistic showery morning precipitation. The authors are not aware of any other study that has documented the impact of these precipitation simulation imperfections on GCMs’ predictions of surface air temperature, but the ability of such flawed models to predict global warming and its extremes could be compromised. The study suggests that climate change will cause more extreme temperatures than implied by previous GCM studies.
(A hat-tip to Marcus!)
“Dangerous human-made interference with climate: a GISS modelE study”; J. Hansen, M. Sato, R. Ruedy, P. Kharecha, A. Lacis, R. Miller, L. Nazarenko, K. Lo, G. A. Schmidt, G. Russell, I. Aleinov, S. Bauer, E. Baum, B. Cairns, V. Canuto, M. Chandler, Y. Cheng, A. Cohen, A. Del Genio, G. Faluvegi, E. Fleming, A. Friend, T. Hall, C. Jackman, J. Jonas, M. Kelley, N. Y. Kiang, D. Koch, G. Labow, J. Lerner, S. Menon, T. Novakov, V. Oinas, Ja. Perlwitz, Ju. Perlwitz, D. Rind, A. Romanou, R. Schmunk, D. Shindell, P. Stone, S. Sun, D. Streets, N. Tausnev, D. Thresher, N. Unger, M. Yao, and S. Zhang; Atmos. Chem. Phys., 7, 2287–2312, 2007
“An analysis of the potential for extreme temperature change based on observations and model simulations”; Lynn, B.H., R. Healy, and L.M. Druyan; J. Climate, 20, 1539-1554, doi:10.1175/JCLI4219.1, 2007.
Precipitation and the Potential for Extreme Temperature Change (NASA Science Brief)