IPCC: Millions of words on climate change are not enough

The latest IPCC report has highlighted that it’s dead certain that the world has warmed, and that it’s extremely likely that humans are the main cause. Credit: IPCC

The most recent UN Intergovernmental Panel on Climate Change (IPCC) report saw perhaps the most severe conflict between scientists and politicians in the organisation’s existence. As its name suggests, governments take an active part in the IPCC process, whose latest main findings appeared between September 2013 and May 2014. Debate over what information makes it into the high-profile ‘Summaries for Policymakers’ is usually intense, but this time three graphs were dropped at politicians’ insistence. I show these graphs later in this blog entry.

At the Transformational Climate Science conference in my home town of Exeter, UK, earlier this month, senior IPCC author Ottmar Edenhofer discussed the ‘battle’ with governments over his part of the report. Another scientist who worked on the report told me in confidence how unusual the omission was.

To me, it’s more surprising that this hasn’t happened more often, especially when you look more closely at the latest report’s findings. It says it’s beyond doubt that warming is happening, and extremely likely that humans are the dominant cause. Governments have even – in some cases begrudgingly – already signed up to temperature and CO2 emission targets reflecting this fact.

The inadequacy of those words is becoming ever more starkly obvious. Ottmar stressed that, starting from the emissions levels agreed at the United Nations’ Climate Change Conference in Cancún, Mexico, in November 2010, avoiding dangerous climate change would likely demand later cuts on a scale we’ve never seen before. The latest IPCC report shines a floodlight on that inertia, which understandably cranks up the tension between researchers and politicians.

Ottmar was one of the co-chairs who led the ‘working group three’ (WGIII) section of the IPCC report, which looks at how to cut greenhouse gas emissions. He stressed that the need to make these cuts comes from a fundamental difference between the risks of climate change and the risks of mitigation. We can heal economic damage arising from cutting emissions – reversing sea level rise isn’t so easy.

The witness who collided with government on climate

  • This is part two of this profile. Read part one here.

Jim Hansen giving testimony at a US Congressional hearing in 1988, where he’d declare 99% certainty that humans are changing the climate. Image credit: NASA

“It’s time to stop waffling so much and say that the evidence is pretty strong that the greenhouse effect is here.” It’s a comment that wouldn’t sound out of place today, but Jim Hansen made it 26 years ago, on June 23, 1988, amid record 38°C temperatures in Washington DC. Jim said it to reporters after telling a Congressional hearing he was 99% certain the world was getting warmer thanks to human-made greenhouse gases.

Jim’s 1980s media bombshells led journalist Robert Pool to liken him to a religious ‘witness’, ‘someone who believes he has information so important that he cannot keep silent’. Yet Jim still felt shy and awkward, preferring to immerse himself in pure science, and so would turn down almost all invitations to speak out for another decade. His efforts during that period would help build even stronger evidence on global warming. And with extra motivation provided by clashes with the US government and the arrival of his grandchildren, he would return to bear witness more forcefully than ever.

Before his self-imposed media ban, Jim would make headlines one more time in 1989, after giving written evidence to a hearing convened by then US senator Al Gore. The testimony reaching the hearing had been altered by the White House to make his conclusions about the dangers of global warming seem less certain. When Jim sent the future vice-president a note telling him this, Gore alerted the media, turning the White House’s scheming into the lead story across all TV networks that evening. John Sununu, aide to then president George H. W. Bush, would then try to get Jim fired for his troubles. But Republican senator John Heinz intervened on Jim’s behalf, and he kept his job.

The reputation Jim had built up as a warming witness went ahead of him in December 1989, as he walked into a ‘roundtable’ meeting held by senators Al Gore and Barbara Mikulski. On the coldest day of the year, in a building whose heating system had failed, Al noticed Jim enter and said, “Hey, aren’t you the guy who…” Despite such jibes, Jim was becoming firmer in his convictions. In April 1990 he offered a group of climatologists an even-money bet that one of the next three years would be the warmest in a century. He’d be proven right by the end of the year.

Climate change-boosted disease could endanger China’s food supply

Wheat ear infected with Fusarium ear blight (FEB), giving the ear a pinkish color. The disease could be set to increase in countries like China and the UK with climate change, Bruce Fitt and his teammates have found, suggesting resistant varieties should be developed. Photo credit: CIMMYT.

As the planet warms, China’s wheat crops will be threatened by more frequent epidemics of ‘fusarium ear blight’ (FEB), scientists in the UK and China have projected. Bruce Fitt from the University of Hertfordshire in Hatfield, UK, and his teammates forecast levels of the disease in the Anhui and Hubei provinces for 2021-2050. In the worst-affected regions in 2001-2010, around one-sixth of all ears were infected – yet that was the lowest disease level the researchers found in their future scenario. In the worst-hit areas, FEB infected more than a third of all ears. “This has implications for crop breeding because it takes 10-15 years to breed a new cultivar,” Bruce told me. “If you know the disease is going to become more important then you need to get on and start breeding now rather than waiting until the disease hits you.”

Today, over a billion people don’t have enough to eat, and further population growth and climate change are set to put the world’s food supplies under even greater strain. To help ease that pressure, Bruce and other scientists are working to understand and help improve control of crop diseases like FEB. While some crop diseases will worsen in the future, not all will, he stressed. “For example, you might have a disease that is spread by rainsplash in summer and then it’s predicted that there will be far less rainfall in summer,” he explained. “Then you would expect that with climate change the importance of that disease would diminish.” If governments, farmers and seed suppliers know which diseases are likely to get worse, they can prioritise developing strategies to control them, like breeding disease-resistant varieties.

To make useful forecasts of which diseases will worsen, scientists build models that combine weather data, how crops grow and how the disease pathogen spreads through the crop. “In this particular instance the wheat is susceptible only at flowering,” Bruce said. “It may be in flower for a few days. If it doesn’t get the pathogen inoculum and the right weather conditions at that time it will not get the disease.” Climate change can alter both flowering times and the chances of the warm, wet weather that makes infection more likely. When wheat gets infected, even if it can be harvested it is more likely to contain poisonous mycotoxins. “If it’s full of mycotoxins then it can’t be eaten by man or beast, so it’s just wasted,” Bruce added.
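
To make that idea concrete, here is a minimal sketch of the kind of calculation involved – my own illustration, not the researchers’ published model. The Day class, the risk_of_feb() function and the temperature and rainfall thresholds are all hypothetical stand-ins:

```python
# A toy version of the idea Bruce describes: wheat is only susceptible
# during its short flowering window, so infection risk depends on whether
# warm, wet weather coincides with it. All thresholds are illustrative
# assumptions, not values from the published study.
from dataclasses import dataclass

@dataclass
class Day:
    temperature_c: float  # daily mean temperature
    rainfall_mm: float    # daily rainfall

def risk_of_feb(days: list[Day], flowering: range,
                warm_c: float = 15.0, wet_mm: float = 1.0) -> float:
    """Fraction of flowering days both warm and wet enough for infection,
    used as a crude 0-to-1 risk score."""
    risky = sum(1 for i in flowering
                if days[i].temperature_c >= warm_c
                and days[i].rainfall_mm >= wet_mm)
    return risky / len(flowering)

# Example: a 10-day flowering window inside a 30-day weather series.
weather = [Day(14 + 0.3 * i, 2.0 if i % 3 == 0 else 0.0) for i in range(30)]
print(risk_of_feb(weather, flowering=range(12, 22)))  # 0.4 for this series
```

The real models are far richer – simulating crop growth and pathogen spread, not just weather coincidence – but the dependence on conditions during the short flowering window is the core of both.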

Climate sensitivity wrangles don’t change the big picture on emissions

The sources of data that scientists can use to determine climate sensitivity include ice cores, the cylinders these researchers are holding at the Vostok station in Antarctica. Image credit: Todd Sowers, Columbia University

How much does the world warm up in response to a certain amount of greenhouse gases like CO2 in the atmosphere? It’s a simple question, but its answer depends on whether you mean short-term or long-term warming, and estimates vary according to the methods used. Scientists are currently intensively debating long-term ‘climate sensitivity’, which prompts the question: if this is so uncertain, might we be pushing too hard to cut CO2 emissions?

The answer is no, according to Joeri Rogelj from the Swiss Federal Institute of Technology, ETH Zurich, and his coworkers. They looked at how a range of climate sensitivity values affected their 21st century warming projections in a paper published in Environmental Research Letters last week. “When taking into account all available evidence, the big picture doesn’t change,” Joeri told me. The ‘carbon budget’ of greenhouse gases we could still emit today and in the future is very limited whatever the climate sensitivity, he explained. “Keeping the so-called carbon budget in line with warming below 2°C still requires a decarbonisation of global society over the first half of this century.”

Climate sensitivity is the measure of how much the world will eventually warm when it reaches equilibrium after a doubling of CO2 in the air. Today, we have upset the normal equilibrium where the Sun’s energy flowing into the atmosphere matches the flow the Earth radiates back out of it. Now more is coming in than leaving, and that’s heating the planet up. Think of the atmosphere as a series of pipes, with energy flowing through them like a liquid. The Earth is a reservoir in the system, filled by an incoming pipe and drained by an outgoing one. CO2 acts like a blockage in the outgoing pipe – it slows the outward energy flow and causes a build-up in the reservoir. When the reservoir gets fuller it can put enough pressure on the blockage for the outward flow through it to again match the incoming flow. Then we’d be at equilibrium, but with a fuller reservoir – a warmer planet. The more CO2 we emit, the worse the blockage gets and the hotter we get before reaching equilibrium.
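
To put rough numbers on the analogy, here is a short sketch using the standard textbook approximation that equilibrium warming scales with the logarithm of the CO2 ratio. This is a general relationship, not a calculation from Joeri’s paper; the function name and example values are my own:

```python
# Equilibrium warming under the common approximation
# dT = S * log2(C / C0), where S is the climate sensitivity
# (warming per doubling of CO2) and C0 is the pre-industrial level.
import math

def equilibrium_warming(c_ppm: float, c0_ppm: float = 280.0,
                        sensitivity_c: float = 3.0) -> float:
    """Eventual warming in °C once the 'reservoir' refills to equilibrium."""
    return sensitivity_c * math.log2(c_ppm / c0_ppm)

print(equilibrium_warming(560))  # a doubling returns S itself: 3.0 °C
# The same CO2 level looks very different across the sensitivity range:
for s in (1.5, 3.0, 4.5):
    print(s, round(equilibrium_warming(450.0, sensitivity_c=s), 2))
```

Because the relationship is logarithmic, a doubling from any starting point produces the same eventual warming – which is why sensitivity is defined per doubling in the first place.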

Real-world grounding could cool 21st century outlook

The world’s surface air temperature change (“anomaly”), relative to the world’s mean temperature of 58° F or 14.5° C, averaged over land and oceans from 1975 to 2008. Inset are two periods of no warming or cooling within this overall warming trend. Copyright 2009 American Geophysical Union. Reproduced/modified by permission of Wiley/American Geophysical Union, see citation below.

Starting climate models from measured data helps simulate the early-2000s global warming hiatus better, and reduces projections for warming through to 2035. Jerry Meehl and Haiyan Teng have compared such ‘initialised’ model runs against more common ‘uninitialised’ ones that start without real-life conditions. The scientists, from the US National Center for Atmospheric Research (NCAR) in Boulder, Colorado, find initialised runs get closer to modelling both that hiatus and surprisingly rapid warming in the 1970s. Using the same approach, admittedly rough 30-year predictions for Earth’s surface air temperature initialised in 2006 show about one-sixth less warming than uninitialised projections. “We have evidence that if we would have had this methodology in the 1990s, we could have predicted the early-2000s hiatus,” Jerry told me.

The hiatus Jerry and Haiyan studied – an easing off in the rate of global warming since 1998 – is perhaps the most hotly debated aspect of climate change today. But hiatus is a slippery word, whose meaning depends on who is highlighting what points on which graph. Climate skeptics often claim it as evidence that global warming is not a problem, or that it shows we know too little to act on climate change. The UN Intergovernmental Panel on Climate Change puts it in plain numbers: the rate of warming from 1998-2012 was 0.05°C per decade; from 1951 to 2012, it was 0.12°C per decade. “In addition to robust multi-decadal warming, global mean surface temperature exhibits substantial decadal and interannual variability,” it adds. “Due to natural variability, trends based on short records are very sensitive to the beginning and end dates and do not in general reflect long-term climate trends.”
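
That endpoint sensitivity is easy to demonstrate with synthetic data. The sketch below – made-up numbers, not real temperature records – generates a steady 0.12°C-per-decade warming plus year-to-year noise, then fits linear trends over a long and a short window:

```python
# Demonstration that short-window trends are sensitive to start/end dates:
# a constant underlying warming plus interannual noise.
import random

random.seed(2)
years = list(range(1951, 2013))
temps = [0.012 * (y - 1951) + random.gauss(0, 0.1) for y in years]

def trend_per_decade(y0: int, y1: int) -> float:
    """Ordinary least-squares slope over [y0, y1], in °C per decade."""
    xs = [y for y in years if y0 <= y <= y1]
    ys = [temps[years.index(y)] for y in xs]
    mx, my = sum(xs) / len(xs), sum(ys) / len(ys)
    slope = (sum((x - mx) * (t - my) for x, t in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    return 10 * slope

print(round(trend_per_decade(1951, 2012), 3))  # close to the true 0.12
print(round(trend_per_decade(1998, 2012), 3))  # varies widely run to run
```

Re-running with different seeds leaves the long-window trend near 0.12°C per decade while the 15-year trend swings widely – exactly the IPCC’s point about short records.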

In a paper published online in the journal Geophysical Research Letters last week, Jerry and Haiyan touch on the current best explanations of the let-up. These include the chilling effect of recent volcano eruptions, but mostly focus on cooling in the Pacific as part of a natural cycle. Called the Interdecadal Pacific Oscillation (IPO), this regular wobble in sea surface temperatures has likely partly masked greenhouse-gas driven warming. The IPO has also been linked to a larger warming than might have been expected from greenhouse gases alone in the 1970s, the NCAR researchers add.

The warrior who gave his life to climate change

Steve Schneider grappled with the dilemma of conveying the urgency of climate change while explaining how we can be confident about it when the evidence is sometimes uncertain. Credit: L.A. Cicero, Stanford University News.

  • This is part three of this profile. You can also read parts one and two.

“Each of us has to decide what the right balance is between being effective and being honest.” Those are words – seemingly advising his colleagues to lie to shape public opinion – credited to climate scientist Stephen Schneider in a 1989 Discover magazine article. Because of that quote, Steve was told, a number of US congressional hearings chose not to invite him as an expert witness.

In a painful irony, the quote came from an interview where Steve was expressing frustration after having been misrepresented in the media. He later argued that he was trying to explain how to succeed when faced with the “double ethical bind” of being effective and honest in conveying both uncertainty and urgency. Even in the original article he adds, “I hope that means being both” – something often overlooked in scandalised reactions to his words. But perhaps the trouble this caused Steve in fact reflects the strangeness of the idea he dedicated much of his career to getting across: how can scientific projections with an element of uncertainty lead to the conclusion that we must act on climate change urgently?

We’re instinctively uncomfortable with uncertainty, and so Steve wanted clearer ways to get it across in the second assessment report of the UN Intergovernmental Panel on Climate Change (IPCC). At a meeting in 1994 he pushed for the report’s estimates to come as probability distributions, showing the odds for each value in a range. This could be done, he argued, for everything from the damage likely to be caused by global warming to numbers central to natural processes, like climate sensitivity.

With no-one backing Steve, ambiguity crept into the report. Did everyone think that saying they had “high confidence” in a statement meant the same thing? After the second report, back at the post he had recently taken up at Stanford University in California, he was determined to hammer out any doubt. Together with the IPCC’s Richard Moss, Steve found 100 examples of inconsistent use of such terms. Armed with that shocking finding, they persuaded the IPCC’s working group I, which covers the physics of climate change, to define clear scales.

The result would be a double strategy, with verbal scales both for the probability of a forecast and for the reliability of the underlying science. For probabilities, low confidence meant a less than 1-in-3 chance, medium between 1-in-3 and 2-in-3, and high between 2-in-3 and 19-in-20. Very high confidence meant at least a 19-in-20 chance, and very low confidence less than a 1-in-20 chance. There were four grades for the quality of science, ranging from ‘speculative’ to ‘well established’. Reading through thousands of draft pages to ensure consistency in the run-up to the IPCC’s third report, published in 2001, Richard and Steve became known as the ‘uncertainty police’.
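
As a small illustration of how such a verbal scale removes ambiguity, the sketch below maps a probability onto the cut-offs just described. The function is my own illustration; only the thresholds come from the scale outlined above:

```python
def confidence_label(probability: float) -> str:
    """Map a probability (0-1) onto the verbal scale described above."""
    if probability < 1 / 20:
        return "very low confidence"
    if probability < 1 / 3:
        return "low confidence"
    if probability < 2 / 3:
        return "medium confidence"
    if probability < 19 / 20:
        return "high confidence"
    return "very high confidence"

for p in (0.02, 0.2, 0.5, 0.8, 0.97):
    print(p, "->", confidence_label(p))
```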

When the climate change fight got ugly

Steve Schneider talks about climate and energy with Johnny Carson on the Tonight Show in 1977, early on in his efforts to bring human-caused climate change to the public’s notice.

  • This is part two of this profile. Read part one here.

“How many of you think the world is cooling?” That’s what Steve Schneider asked the studio audience of the Tonight Show with Johnny Carson in September 1977. And when the majority put their hands up, he explained that the recent cooling trend had only been short-term. Though the unscripted poll meant Steve wasn’t invited back to the programme, through the summer of that year he had brought climate science to US national TV. The appearances typified Steve’s efforts to bring climate change to the world’s notice – efforts that would later draw attention of a less desirable sort.

Building on his earlier high-profile research, Steve had just published ‘The Genesis Strategy: Climate and Global Survival’, predicting ‘demonstrable climate change’ by the end of the century. Whether human pollution would cause warming or cooling, he argued, governments should copy the biblical story in which Joseph told Pharaoh to prepare for lean years ahead. In a decade already torn by rocketing food and oil prices, the advice resonated with many who wanted to head off any further crises.

Some scientists criticised Steve and those like him for speaking straight to the public. In particular, climate science uncertainties were so great that they feared confusion – like that over whether temperatures were rising or falling – was inevitable. That dispute grew from a basic question about science’s place in society. Should researchers concentrate on questions they can comfortably answer using their existing methods? Or should they tackle questions the world needs answered, even if the results that follow are less definite?

At a meeting to discuss climate and modelling research within the Global Atmospheric Research Programme (GARP) in 1974 near Stockholm, Sweden, Steve pushed for the second approach. Given the food problems the world was struggling with at the time, it seemed obvious that climate change impacts like droughts, floods and extreme temperatures would bring famines. “I stood alone in arguing that we had to consider the implications of what we were researching,” Steve later wrote. While some attacked him angrily, saying they weren’t ready to address these problems, conference organiser Bert Bolin agreed that socially important questions must be answered.

The suggestion was also controversial because it meant blurring the lines between climate science and other subjects, such as agriculture, ecology and even economics. The director of the US National Center for Atmospheric Research (NCAR) in Boulder, Colorado, where Steve worked, warned that crossing subject boundaries might cost him promotion. But Steve responded with characteristic wilfulness, founding a journal that did exactly what he had been warned against.

Carbon prices alone won’t keep warming below 2°C

Annela Anger-Kraavi from the University of East Anglia. Image credit: University of East Anglia

It’s possible to cut emissions of the greenhouse gas CO2 enough to keep global warming below 2°C, while maintaining growth in the world’s industrial production. But the ‘decarbonisation scenario’ developed by UK researchers needs more than just the carbon price relied on by most economists looking at how to reduce our fossil fuel use. “There are no magic measures,” Annela Anger-Kraavi from the University of East Anglia told me. “You need regulations, a carbon price, and investment. And all industries need to be considered simultaneously – you cannot leave anything out if you want to achieve the best results.”

Seeing temperatures 2°C higher than the ‘pre-industrial’ 1850-1899 average as the danger level for climate change, governments have agreed to keep warming below this limit. Scientists have calculated that CO2 levels must stay below 450 molecules in every million air molecules to give us a better than 3-in-5 chance of keeping temperature rises below 2°C. But few economists have modelled scenarios that might keep CO2 levels below this 450 parts per million (ppm) threshold, as many think doing so is clearly too expensive. In the report released by the United Nations Intergovernmental Panel on Climate Change in 2007, for example, just six out of the 177 studies summarised looked at this range.

Since then there have been more studies, still finding the costs too high. But they use ‘neoclassical’ economic theories – something Annela and her teammates wanted to avoid, because models based on such theories failed to predict the recent financial crisis. They also mainly rely on charging for the right to emit CO2 to cut emissions, through a carbon tax or emission permits that companies must pay for. In the New Economics of Decarbonising the Global Economy (NewEDGE) project, the University of Cambridge’s Terry Barker developed a model that links 43 sectors across the world economy. It avoids some central neoclassical ideas, such as that prices always change everywhere at the same time, and that everything supplied is always bought.

Stark conclusions seek to empower young to sue for climate justice

Jim Hansen (bottom left) and his family. For their benefit, and for the next generation as a whole, he is pushing for more urgent action on global warming. Credit: James Hansen

Even limiting human-made global warming to 2°C above preindustrial temperatures would subject young people, future generations and nature to irreparable harm, leading scientists said on Tuesday. The team, led by pioneering climate researcher Jim Hansen, now at Columbia University in New York, calls aiming for this internationally recognised threshold ‘foolhardy’. In a paper published in PLOS ONE, they outline a case for aiming for 1°C instead, which supports efforts to sue the US government for not doing enough.

“Governments are blatantly failing to do their job,” Jim told me. “They know that human-caused climate change is beginning and poses a huge risk to young people and future generations, and they understand that we must phase out fossil fuel emissions. Yet they go right ahead encouraging companies to go after every fossil fuel that can be found!”

As one of the first climate modellers, Jim has long warned about the greenhouse effect caused by the CO2 we emit from burning fossil fuels. On a sweltering June 23, 1988, he famously testified to the Energy and Natural Resources Committee of the US Senate on the dangers of global warming. “It’s time to stop waffling so much and say that the evidence is pretty strong that the greenhouse effect is here,” he told reporters at the time.

Yet Jim remains frustrated at the slow pace of action, and regularly voices that frustration. In 2006 Mary Wood from the University of Oregon Law School saw one of his articles in the New York Review of Books and contacted him. Her work inspired the formation of a team of lawyers who are suing the US federal government, invoking the principle that US citizens, young and old, have ‘equal protection of the laws’. “I agreed specifically to write a paper that would provide the scientific basis for legal actions against governments for not doing their job of protecting the rights of young people,” Jim recalled.

Fighting for useful climate models

  • This is part two of a two-part post. Read part one here.

Princeton University’s Suki Manabe published his latest paper in March this year, 58 years after his first one. Credit: Princeton University

When Princeton University’s Syukuro Manabe first studied global warming with general circulation models (GCMs), few other researchers approved. It was the 1970s, computing power was scarce, and GCMs – which had grown out of mathematical weather forecasting – were the most complex models available. “Most people thought that it was premature to use a GCM,” ‘Suki’ Manabe told interviewer Paul Edwards in 1998. But over the following decades Suki would exploit GCMs widely to examine climate changes ancient and modern, helping make them the vital research tool they are today.

In the 1970s, the world’s weather and climate scientists were building international research links, meeting up to share the latest knowledge and plan their next experiments. Suki had made his mark on this community with his computer modelling work at Princeton’s Geophysical Fluid Dynamics Laboratory (GFDL), including two notably big steps. He had used dramatically simplified GCMs to simulate the greenhouse effect for the first time, and developed the first such models linking the atmosphere and ocean. By the time pioneering climate research organiser Bert Bolin invited Suki to a meeting in Stockholm, Sweden, in 1974, he had already brought these successes together.

Suki and his GFDL teammate Richard Wetherald had worked out how to push their global warming study onto whole-world-scale ocean-coupled GCMs. They could now consider geographical differences and indirect effects, for example those due to changes in the distribution of snow and sea ice. Though the oceans in the world they simulated resembled a swamp – shallow and unmoving – they got a reasonably realistic picture of the difference between land and sea temperatures. Their model predicted the Earth’s surface would warm 2.9°C if the amount of CO2 in the air doubled, a figure known as climate sensitivity. That’s right in the middle of today’s very latest 1.5-4.5°C range estimate.

Comparison between the measured sea surface temperature in degrees C and that calculated by the GFDL ocean-coupled GCM, from a 1975 GARP report chapter Suki wrote – see below for reference.

At the time no-one else had the computer facilities to run this GCM, and so they focussed on simpler models, and fine details within them. Scientists model climate by splitting Earth’s surface into a 3D grid of cubes reaching up into the air. They can then calculate what happens inside each cube and how it affects the surrounding cubes. But some processes are too complex, or happen on scales too small, to simulate completely, and must be replaced by ‘parameterisations’ based on measured data. To get his GCMs to work Suki had made some very simple parameterisations, and that was another worry for other scientists.
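
The grid-and-parameterisation idea is easier to see in miniature. Below is a toy sketch – my own illustration, orders of magnitude simpler than any real GCM, with every number made up – in which a small grid of surface cells exchanges heat with its neighbours while a crude ‘parameterisation’ stands in for sub-grid convection:

```python
# A toy 'climate model': resolved cell-to-cell heat exchange plus a
# parameterisation for a process too small for the grid to capture.
import math

N = 8  # an 8x8 grid of surface cells (real GCMs use far more, plus height)
temp = [[14.0 + 5 * math.sin(math.pi * i / (N - 1)) for _ in range(N)]
        for i in range(N)]  # warmer 'equator', cooler 'poles'

def convection_param(t: float) -> float:
    """Sub-grid convection, crudely modelled as extra cooling above 18 °C."""
    return -0.1 * max(0.0, t - 18.0)

def step(grid):
    """One time step: each cube mixes heat with its neighbours, then the
    parameterisation is applied."""
    new = [row[:] for row in grid]
    for i in range(N):
        for j in range(N):
            nbrs = [grid[x][y]
                    for x, y in ((i - 1, j), (i + 1, j), (i, j - 1), (i, j + 1))
                    if 0 <= x < N and 0 <= y < N]
            mixing = 0.2 * (sum(nbrs) / len(nbrs) - grid[i][j])
            new[i][j] = grid[i][j] + mixing + convection_param(grid[i][j])
    return new

for _ in range(10):
    temp = step(temp)
print(round(temp[N // 2][0], 2))  # one cell's temperature after ten steps
```

Real parameterisations are tuned against measured data, as the post describes; the point here is only the structure – resolved physics between cubes, plus a stand-in for what the grid cannot see.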
