Climate sensitivity wrangles don’t change the big picture on emissions

The sources of data that scientists can use to determine climate sensitivity include ice cores, like the cylinders these researchers are holding at the Vostok station in Antarctica. Image credit: Todd Sowers, Columbia University

How much does the world warm up in response to a certain amount of greenhouse gases like CO2 in the atmosphere? It’s a simple question, but its answer depends on whether you mean short-term or long-term warming, and estimates vary according to the methods used. Scientists are currently intensively debating long-term ‘climate sensitivity’, which prompts the question: might we be pushing too hard to cut CO2 emissions, if this is uncertain?

The answer is no, according to Joeri Rogelj from the Swiss Federal Institute of Technology, ETH Zurich, and his coworkers. They looked at how a range of climate sensitivity values affected their 21st century warming projections in a paper published in Environmental Research Letters last week. “When taking into account all available evidence, the big picture doesn’t change,” Joeri told me. The ‘carbon budget’ of greenhouse gases we could still emit today and in the future is very limited whatever the climate sensitivity, he explained. “Keeping the so-called carbon budget in line with warming below 2°C still requires a decarbonisation of global society over the first half of this century.”

Climate sensitivity is the measure of how much the world will eventually warm when it reaches equilibrium after a doubling of CO2 in the air. Today, we have upset the normal equilibrium where the Sun’s energy flowing into the atmosphere matches the flow the Earth radiates back out of it. Now more is coming in than leaving, and that’s heating the planet up. Think of the atmosphere as a series of pipes, with energy flowing through them like a liquid. The Earth is a reservoir in the system, filled by an incoming pipe and drained by an outgoing one. CO2 acts like a blockage in the outgoing pipe – it slows the outward energy flow and causes a build-up in the reservoir. When the reservoir gets fuller it can put enough pressure on the blockage for the outward flow through it to again match the incoming flow. Then we’d be at equilibrium, but with a fuller reservoir – a warmer planet. The more CO2 we emit, the worse the blockage gets and the hotter we get before reaching equilibrium. Read the rest of this entry »
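
To put rough numbers on that analogy, here is a minimal sketch (my own illustration, not from Joeri’s paper) using the standard approximation that equilibrium warming grows with the logarithm of CO2 concentration, so a climate sensitivity S means S degrees of eventual warming per doubling. The 3°C sensitivity and 280 ppm pre-industrial baseline are assumed values for illustration only.

```python
import math

def equilibrium_warming(co2_ppm, baseline_ppm=280.0, sensitivity_c=3.0):
    """Eventual (equilibrium) surface warming in deg C for a given CO2 level.

    sensitivity_c is the assumed equilibrium climate sensitivity: the warming
    per doubling of CO2. The 3.0 deg C default is purely illustrative.
    """
    return sensitivity_c * math.log2(co2_ppm / baseline_ppm)

print(equilibrium_warming(560.0))             # doubling 280 -> 560 ppm gives exactly 3.0
print(round(equilibrium_warming(450.0), 2))   # about 2.05 for a 450 ppm level
```

Because the relationship is logarithmic, doubling CO2 from any starting point gives the same eventual warming, which is why sensitivity is quoted per doubling.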

Real-world grounding could cool 21st century outlook

The world’s surface air temperature change (“anomaly”), relative to the world’s mean temperature of 58° F or 14.5° C, averaged over land and oceans from 1975 to 2008. Inset are two periods of no warming or cooling within this overall warming trend. Copyright 2009 American Geophysical Union. Reproduced/modified by permission of Wiley/American Geophysical Union, see citation below.

Starting climate models from measured data helps simulate the early-2000s global warming hiatus better, and reduces projections for warming through to 2035. Jerry Meehl and Haiyan Teng have compared such ‘initialised’ model runs against more common ‘uninitialised’ ones that start without real-life conditions. The scientists, from the US National Center for Atmospheric Research (NCAR) in Boulder, Colorado, find that initialised runs get closer to modelling both that hiatus and the surprisingly rapid warming of the 1970s. Using the same approach, admittedly rough 30-year predictions of Earth’s surface air temperature initialised in 2006 project about one-sixth less warming than uninitialised ones. “We have evidence that if we would have had this methodology in the 1990s, we could have predicted the early-2000s hiatus,” Jerry told me.

The hiatus Jerry and Haiyan studied – an easing off in the rate of global warming since 1998 – is perhaps the most hotly debated aspect of climate change today. But hiatus is a slippery word, whose meaning depends on who is highlighting which points on which graph. Climate skeptics often claim it’s evidence that global warming is not a problem, or that it shows we know too little to act on climate change. The UN Intergovernmental Panel on Climate Change puts it in plain numbers: the rate of warming from 1998 to 2012 was 0.05°C per decade; from 1951 to 2012, it was 0.12°C per decade. “In addition to robust multi-decadal warming, global mean surface temperature exhibits substantial decadal and interannual variability,” it adds. “Due to natural variability, trends based on short records are very sensitive to the beginning and end dates and do not in general reflect long-term climate trends.”
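
The IPCC’s point about short records is easy to reproduce. Below is a hedged sketch using synthetic data (a steady 0.12°C-per-decade trend plus random year-to-year variability, not real measurements): the long-record trend recovers the underlying warming, while 15-year trends swing noticeably depending on the start year.

```python
import numpy as np

rng = np.random.default_rng(0)
years = np.arange(1951, 2013)
# Synthetic temperatures: 0.012 deg C/yr underlying trend plus interannual noise.
temps = 0.012 * (years - years[0]) + rng.normal(0.0, 0.1, size=years.size)

def decadal_trend(start, end):
    """Least-squares trend in deg C per decade over the years [start, end]."""
    mask = (years >= start) & (years <= end)
    return 10.0 * np.polyfit(years[mask], temps[mask], 1)[0]

print(round(decadal_trend(1951, 2012), 3))        # close to the underlying 0.12
for start in (1996, 1998, 2000):
    print(start, round(decadal_trend(start, 2012), 3))   # short trends vary a lot with the start year
```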

In a paper published online in the journal Geophysical Research Letters last week, Jerry and Haiyan touch on the current best explanations of the let-up. These include the chilling effect of recent volcano eruptions, but mostly focus on cooling in the Pacific as part of a natural cycle. Called the Interdecadal Pacific Oscillation (IPO), this regular wobble in sea surface temperatures has likely partly masked greenhouse-gas driven warming. The IPO has also been linked to a larger warming than might have been expected from greenhouse gases alone in the 1970s, the NCAR researchers add. Read the rest of this entry »

The warrior who gave his life to climate change

Steve Schneider wrestled with the dilemma of conveying the urgency of climate change while also explaining how we can be confident about it when the evidence is sometimes uncertain. Credit: L.A. Cicero, Stanford University News.

  • This is part three of this profile. You can also read parts one and two.

“Each of us has to decide what the right balance is between being effective and being honest.” Those are words – seemingly advising his colleagues to lie to shape public opinion – credited to climate scientist Stephen Schneider in a 1989 Discover magazine article. Because of that quote, Steve was told, a number of US congressional hearings chose not to invite him as an expert witness.

With extreme irony, it came from an interview where Steve was expressing frustration after having been misrepresented in the media. He later argued that he was trying to explain how to succeed faced with the “double ethical bind” of being effective and honest in conveying both uncertainty and urgency. Even in the original article he adds, “I hope that means being both” – something often overlooked in scandalised reactions to his words. But perhaps the trouble this caused Steve in fact reflects the strangeness of the idea he dedicated much of his career to getting across. How can scientific projections with an element of uncertainty lead to the conclusion that we must act on climate change urgently?

We’re instinctively uncomfortable with uncertainty, and so Steve wanted clearer ways to get it across in the second climate change assessment report of the UN Intergovernmental Panel on Climate Change (IPCC). At a meeting in 1994 he pushed for the report’s estimates to come as probability distributions, showing the odds attached to each value in a range. This could be done, he argued, for everything from the damage likely to be caused by global warming to numbers central to natural processes, like climate sensitivity.

With no-one backing Steve, ambiguity crept into the report. Did everyone think that saying they had “high confidence” in a statement meant the same thing? After the second report, returning to the post he had recently taken up at Stanford University in California, he was determined to hammer out any doubt. Together with the IPCC’s Richard Moss, Steve found 100 examples of inconsistent use of such terms. Armed with that shocking finding, they persuaded the IPCC’s working group I, which discusses the physics of climate change, to define clear scales.

The result would be a double strategy, with verbal scales both for the probability of a forecast and for the reliability of the underlying science. For probabilities, low confidence meant a less than 1-in-3 chance, medium between 1-in-3 and 2-in-3 and high 2-in-3 and above. Very high confidence meant a 19-in-20 chance, and very low confidence 1-in-20. There were four grades for the quality of science, ranging from ‘speculative’ to ‘well established’. Reading through thousands of draft pages ensuring consistency in the run-up to the IPCC’s third report, published in 2001, Richard and Steve became known as the ‘uncertainty police’. Read the rest of this entry »
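
As a rough illustration of how that verbal scale works (my own encoding of the thresholds described above, not an official IPCC tool), the mapping from a probability to a confidence term might look like this:

```python
def confidence_term(probability):
    """Map a probability to the verbal confidence scale described above.

    Thresholds follow the text: very high ~19-in-20, high 2-in-3 and above,
    medium 1-in-3 to 2-in-3, low below 1-in-3, very low ~1-in-20.
    """
    if probability >= 19 / 20:
        return "very high confidence"
    if probability >= 2 / 3:
        return "high confidence"
    if probability >= 1 / 3:
        return "medium confidence"
    if probability > 1 / 20:
        return "low confidence"
    return "very low confidence"

print(confidence_term(0.7))   # high confidence
print(confidence_term(0.2))   # low confidence
```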

When the climate change fight got ugly

  • Steve Schneider talks about climate and energy with Johnny Carson on the Tonight Show in 1977, early on in his efforts to bring human-caused climate change to the public's notice.

  • This is part two of this profile. Read part one here.

“How many of you think the world is cooling?” That’s what Steve Schneider asked the studio audience of the Tonight Show with Johnny Carson in September 1977. And when the majority put their hands up, he explained that the recent cooling trend had only been short-term. Though the unscripted poll meant Steve wasn’t invited back to the programme, through the summer of that year he had brought climate science to US national TV. The appearances typified Steve’s efforts to bring climate change to the world’s notice – efforts that would later draw attention of a less desirable sort.

Building on his earlier high-profile research, Steve had just published ‘The Genesis Strategy: Climate and Global Survival’, predicting ‘demonstrable climate change’ by the end of the century. Whether human pollution would end up causing warming or cooling, he argued, governments should copy the biblical story in which Joseph told Pharaoh to prepare for lean years ahead. In a decade already torn by rocketing food and oil prices, the advice resonated with many who wanted to head off any further crises.

Some scientists criticised Steve and those like him for speaking straight to the public. In particular, climate science uncertainties were so great that they feared confusion – like that over whether temperatures were rising or falling – was inevitable. That dispute grew from a basic question about science’s place in society. Should researchers concentrate on questions they can comfortably answer using their existing methods? Or should they tackle questions the world needs answered, even if the results that follow are less definite?

At a meeting to discuss climate and modelling research within the Global Atmospheric Research Programme (GARP) in 1974 near Stockholm, Sweden, Steve pushed for the second approach. Given the food problems the world was struggling with at the time, it seemed obvious that climate change impacts like droughts, floods and extreme temperatures would bring famines. “I stood alone in arguing that we had to consider the implications of what we were researching,” Steve later wrote. While some attacked him angrily, saying they weren’t ready to address these problems, conference organiser Bert Bolin agreed that socially important questions must be answered.

The suggestion was also controversial because it meant blurring the lines between climate science and other subjects, such as agriculture, ecology and even economics. The director at the US National Center for Atmospheric Research (NCAR) in Boulder, Colorado, where Steve worked, warned that crossing subject boundaries might cost him promotion. But he responded with characteristic wilfulness, founding a journal doing exactly what he was warned not to. Read the rest of this entry »

Carbon prices alone won’t keep warming below 2°C

Annela Anger-Kraavi from the University of East Anglia. Image credit: University of East Anglia

It’s possible to cut emissions of the greenhouse gas CO2 enough to keep global warming below 2°C, while maintaining growth in the world’s industrial production. But the ‘decarbonisation scenario’ developed by UK researchers needs more than just the carbon price relied on by most economists looking at how to reduce our fossil fuel use. “There are no magic measures,” Annela Anger-Kraavi from the University of East Anglia told me. “You need regulations, a carbon price, and investment. And all industries need to be considered simultaneously – you cannot leave anything out if you want to achieve the best results.”

Seeing temperatures 2°C higher than the ‘pre-industrial’ 1850-1899 average as the danger level for climate change, governments have agreed to keep warming below this limit. Scientists have calculated that CO2 levels must stay below about 450 molecules in every million air molecules to give us better than a 3-in-5 chance of keeping temperature rises below 2°C. But few economists have modelled scenarios that might keep CO2 levels below this 450 parts per million (ppm) threshold, as many think it’s clearly too expensive. In the report released by the United Nations Intergovernmental Panel on Climate Change in 2007, for example, just six of the 177 studies summarised looked at this range.

Since then there have been more studies, still finding the costs too high. But they use ‘neoclassical’ economic theories – something Annela and her teammates wanted to avoid, because models based on such theories failed to predict the recent financial crisis. They also mainly rely on charging for the right to emit CO2 to cut emissions, through a carbon tax or emission permits that companies must pay for. In the New Economics of Decarbonising the Global Economy (NewEDGE) project, the University of Cambridge’s Terry Barker developed a model that links 43 sectors across the world economy. It avoids some central neoclassical assumptions, such as that prices adjust everywhere at the same time, and that everything supplied is always bought. Read the rest of this entry »

Stark conclusions seek to empower young to sue for climate justice

Jim Hansen (bottom left) and his family. For their benefit, and for the next generation as a whole, he is pushing for more urgent action on global warming. Credit: James Hansen

Even limiting human-made global warming to 2°C above preindustrial temperatures would subject young people, future generations and nature to irreparable harm, leading scientists said on Tuesday. The team led by pioneering climate researcher Jim Hansen, now at Columbia University in New York, calls aiming for this internationally-recognised threshold ‘foolhardy’. In a paper published in PLOS ONE, they outline a case for aiming for 1°C instead, one that supports efforts to sue the US government for not doing enough.

“Governments are blatantly failing to do their job,” Jim told me. “They know that human-caused climate change is beginning and poses a huge risk to young people and future generations, and they understand that we must phase out fossil fuel emissions. Yet they go right ahead encouraging companies to go after every fossil fuel that can be found!”

As one of the first climate modellers, Jim has long warned about the greenhouse effect caused by the CO2 we emit from burning fossil fuels. On a sweltering June 23, 1988, he famously testified to the Energy and Natural Resources Committee of the US Senate on the dangers of global warming. “It’s time to stop waffling so much and say that the evidence is pretty strong that the greenhouse effect is here,” he told reporters at the time.

Yet Jim remains frustrated at the slow pace of action, and regularly voices it. In 2006 Mary Wood from the University of Oregon Law School saw one of his articles in the New York Review of Books and contacted him. Her work inspired the formation of a team of lawyers who are suing the US federal government, highlighting the principle that US citizens, young and old, have ‘equal protection of the laws’. “I agreed specifically to write a paper that would provide the scientific basis for legal actions against governments for not doing their job of protecting the rights of young people,” Jim recalled. Read the rest of this entry »

Fighting for useful climate models

  • This is part two of a two-part post. Read part one here.

Princeton University’s Suki Manabe published his latest paper in March this year, 58 years after his first one. Credit: Princeton University

When Princeton University’s Syukuro Manabe first studied global warming with general circulation models (GCMs), few other researchers approved. It was the 1970s, computing power was scarce, and the GCMs had grown out of mathematical weather forecasting to become the most complex models available. “Most people thought that it was premature to use a GCM,” ‘Suki’ Manabe told interviewer Paul Edwards in 1998. But over the following decades Suki would exploit GCMs widely to examine climate changes ancient and modern, helping make them the vital research tool they are today.

In the 1970s, the world’s weather and climate scientists were building international research links, meeting up to share the latest knowledge and plan their next experiments. Suki’s computer modelling work at Princeton’s Geophysical Fluid Dynamics Laboratory (GFDL) had made his mark on this community, thanks in part to two notably big steps. He had used dramatically simplified GCMs to simulate the greenhouse effect for the first time, and developed the first such models linking the atmosphere and ocean. And when pioneering climate research organiser Bert Bolin invited Suki to a meeting in Stockholm, Sweden, in 1974, he had already brought these successes together.

Suki and his GFDL teammate Richard Wetherald had worked out how to extend their global warming studies to whole world-scale, ocean-coupled GCMs. They could now consider geographical differences and indirect effects, for example those due to changes in the distribution of snow and sea ice. Though the oceans in the world they simulated resembled a swamp, shallow and unmoving, they got a reasonably realistic picture of the difference between land and sea temperatures. Their model predicted the Earth’s surface would warm 2.9°C if the amount of CO2 in the air doubled, a figure known as climate sensitivity. That’s right in the middle of today’s very latest 1.5-4.5°C range estimate.

Comparison between the measured sea surface temperature (in degrees C) and that calculated by the GFDL ocean-coupled GCM, from a 1975 GARP report chapter Suki wrote – see below for reference.

At the time no-one else had the computer facilities to run this GCM, and so other researchers focussed on simpler models, and on fine details within them. Scientists model climate by splitting Earth’s surface into 3D grids reaching up into the air. They can then calculate what happens inside each cube and how it affects the surrounding cubes. But some processes are too complex, or happen on scales too small, to simulate completely, and must be replaced by ‘parameterisations’ based on measured data. To get his GCMs to work Suki had made some very simple parameterisations, and that was another worry for other scientists. Read the rest of this entry »
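
For a feel of the grid idea, here is a toy sketch (nothing like a real GCM, and entirely my own simplification): the surface is carved into cells, each holding a temperature, and every time step each cell exchanges a little energy with its neighbours. Anything happening inside a cell that the grid cannot resolve is where parameterisations would have to come in.

```python
import numpy as np

n_lat, n_lon = 18, 36                      # a very coarse 10-degree grid
temps = np.full((n_lat, n_lon), 14.0)      # start everywhere near 14 deg C
temps[:3, :] -= 30.0                       # a crudely cold polar band

def step(field, mixing=0.1):
    """One toy time step: each cell relaxes towards the mean of its four
    neighbours. Both axes wrap around here for simplicity, which a real
    model would only do in longitude."""
    neighbours = (np.roll(field, 1, 0) + np.roll(field, -1, 0) +
                  np.roll(field, 1, 1) + np.roll(field, -1, 1)) / 4.0
    return field + mixing * (neighbours - field)

for _ in range(100):
    temps = step(temps)
print(temps.min(), temps.mean(), temps.max())   # the cold band smears out over time
```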

The man who got the world to agree on climate

  • This is part two of a two-part post. Read part one here.

When not tackling climate science or negotiations Bert Bolin liked nothing more than a little choir singing. Credit: KVA

In 1975, advised by Bert Bolin, the Swedish government drafted a bill on future energy policy containing a conclusion that elsewhere might be controversial even today. “It is likely that climatic concerns will limit the burning of fossil fuels rather than the size of the natural resources,” it foresaw. Produced thanks to Bert’s early role tackling environmental issues, it was one of the first times humans’ effect on climate, and the risk it poses to us, was noted officially. For more than two decades afterward the Stockholm University researcher would further strengthen that case, both through his research and by putting climate science firmly on the political agenda. And those tireless efforts would help the United Nations’ Intergovernmental Panel on Climate Change (UN IPCC) to consistently achieve what otherwise might have been impossible agreements.

The Swedish bill was a bold statement, given that average air temperatures were only just about to reverse a slight cooling that had gone on since 1940. Bert and scientists like Dave Keeling had shown that CO2 levels in the atmosphere were rising. Basic science established by Svante Arrhenius 80 years before had shown this should warm Earth’s surface. So why was it cooling? The way scientists found the answer was typical of the progress in climate science Bert was overseeing. They would use the latest tools, including computers and satellites, bringing theory and measurement together to improve our understanding.

Climate models in the early 1970s were still simple by today’s standards, but had advanced from the early computerised weather predictions Bert had previously pioneered. And when Columbia University’s Stephen Schneider and S. Ichtiaque Rasool added aerosols of floating dust to CO2 in a model for the first time, they found a possible explanation for the temperature drop. The aerosols, particularly human pollution, created a cooling effect that swamped the warming – so much so they warned it could trigger an ice age. Though Stephen and Ichtiaque soon realised that their model overestimated the cooling, aerosols obviously deserved a closer look.

To clear up such murky problems, the Global Atmospheric Research Programme (GARP) that Bert jointly set up would bring together scientists from around the world, despite the cold war. As GARP’s first experiments, looking at heat and moisture flow between the atmosphere and ocean, started in 1974, Bert organised a meeting in Stockholm on climate physics and modelling. GARP had two goals – improving 6-10 day weather forecasts first, and climate change predictions second. As it gradually became clear how hard the first was, climate forecasting became more important.

Diplomacy was needed among the gathered scientists as arguments flared over how ambitious they should be. Should they strive for satellites that could collect the high resolution data scientists and models needed, even though that was beyond their capabilities at the time? And significantly for later climate work – should they seek to produce results so society could respond to change, even when results were uncertain? Bert was clear on that one: scientists had to answer socially important questions, though he was in a very small minority prepared to say so openly. Read the rest of this entry »

The underprepared figurehead who led climate science from calculation to negotiation

Bert Bolin discussing weather maps in Stockholm circa 1955. Image copyright Tellus B, used via Creative Commons license, see Rodhe paper referenced below.

In 1957, at the young age of 32 and just one year after completing his PhD, Bert Bolin officially gained a new skill: leadership. Taking over the International Meteorological Institute (IMI) in Stockholm, Sweden, after his mentor Carl-Gustaf Rossby’s sudden death must have been a huge shock. But Bert gained responsibility after responsibility over the next 40 years, ultimately becoming the first chairman of the United Nations’ Intergovernmental Panel on Climate Change (UN IPCC). And though it’s hard to beat setting up a Nobel-prize winning organisation, Bert was not just an administrator – his research helped build the foundations of climate science too.

Growing up in Nyköping, south of Stockholm, Bert recorded the weather with encouragement from a schoolteacher father who had studied meteorology at university. After the pair met the Swedish Meteorological and Hydrological Institute’s deputy director when Bert was 17, he moved north to study maths, physics and meteorology at the University of Uppsala. Immediately after graduating in 1946 he went to Stockholm to do military service, where he first saw Carl-Gustaf giving a series of lectures.

By that time Carl-Gustaf had been living in the US for 21 years, pioneering mathematical and physical analysis of the atmosphere and becoming the country’s foremost meteorologist. He had set up meteorology departments at the Massachusetts Institute of Technology in the 1930s, and at the University of Chicago, Illinois, in the 1940s. He had also modernised the US Weather Bureau and by 1946 wanted to help improve meteorology’s status in his native Sweden. As Carl-Gustaf’s renowned organisational prowess gradually pulled together the IMI, Bert came to study with him, gaining his Master’s degree in 1950.

Carl-Gustaf was collaborating with leading scientists of his time, and through some of these links Bert spent a year working in the US after his Master’s. Perhaps the most notable such relationship was with John von Neumann, who had helped develop the hydrogen bomb, at the Institute for Advanced Study in Princeton, New Jersey. John and his team had made history using arguably the world’s first computer, ENIAC, to predict weather mathematically. But when errors emerged, Carl-Gustaf asked Bert to help analyse why, using his understanding of the atmosphere to prevent such forecasts being ‘mathematical fantasy’. Read the rest of this entry »

Enhanced fingerprinting strengthens evidence for human warming role

Microwave sounding units, like the AMSU units on the Aqua satellite, shown here, can be used to take temperature measurements from different layers in the atmosphere. Ben Santer and his colleagues use this information to find a ‘fingerprint’ of human impact on recent climate changes. Credit: NASA

We have left a clear climate change ‘fingerprint’ in the atmosphere, through CO2 emissions that have made air near the Earth’s surface warmer and caused cooling higher up. That’s according to Ben Santer from Lawrence Livermore National Laboratory (LLNL) in California, who started studying this fingerprint in the mid-1990s, and his expert team. They have strengthened the case by comparing satellite-recorded temperature data against the latest climate models including natural variations within Earth’s climate system, and from the sun and volcanic eruptions. Ben hopes that in the process their results will finally answer ill-tempered criticism his earlier work attracted, and lingering doubts over what causes global warming.

“There are folks out there even today that posit that the entire observed surface warming since 1950 is due to a slight uptick in the Sun’s energy output,” Ben told me. “That’s a testable hypothesis.  In this paper we look at whether changes in the sun plausibly explain the observed changes that we’ve monitored from space since 1979. The very clear answer is that they cannot. Natural influences alone, the sun, volcanoes, internal variability, either individually or in combination, cannot explain this very distinctive pattern of warming.”

That pattern emerged when scientists in the 1960s did some of the first computer modelling experiments looking at what would happen on an Earth with higher CO2 levels in the air. “They got back this very curious warming in the lower atmosphere and cooling of the upper levels of the atmosphere,” Ben explained. The effect happens because most of the gas molecules in the atmosphere, including CO2, sit relatively near to Earth’s surface. CO2’s greenhouse effect lets heat energy from the Sun reach the Earth, but interrupts some of it on its way back to the upper atmosphere and outer space. Adding more CO2 by burning fossil fuels therefore warms the lower atmosphere, or troposphere, and cools the stratosphere, 6-30 miles above the Earth’s surface. Read the rest of this entry »
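
The fingerprint logic can be sketched in a few lines, heavily hedged: the numbers below are synthetic and the real analysis is far more sophisticated, but the idea is to project observed changes onto a model-predicted pattern (warming low down, cooling high up) and ask whether the match stands out above what natural variability alone could produce.

```python
import numpy as np

rng = np.random.default_rng(1)
n_layers = 20                                          # surface up to the stratosphere
fingerprint = np.linspace(1.0, -1.0, n_layers)         # warm below, cool aloft
fingerprint /= np.linalg.norm(fingerprint)

# Synthetic "observations": the fingerprint pattern plus natural variability.
observed = 0.8 * fingerprint + rng.normal(0.0, 0.1, n_layers)

signal = observed @ fingerprint                        # projection onto the fingerprint

# How big do such projections get from natural variability (noise) alone?
noise = np.array([rng.normal(0.0, 0.1, n_layers) @ fingerprint for _ in range(1000)])
print(round(signal, 2), round(noise.std(), 2))         # the signal far exceeds the noise spread
```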
