Real-world grounding could cool 21st century outlook

The world’s surface air temperature change (“anomaly”), relative to the world’s mean temperature of 58° F or 14.5° C, averaged over land and oceans from 1975 to 2008. Inset are two periods of no warming or cooling within this overall warming trend. Copyright 2009 American Geophysical Union. Reproduced/modified by permission of Wiley/American Geophysical Union, see citation below.

Starting climate models from measured data helps simulate the early-2000s global warming hiatus better, and reduces projections for warming through to 2035. Jerry Meehl and Haiyan Teng have compared such ‘initialised’ model runs against the more common ‘uninitialised’ ones, which start without real-life conditions. The scientists, from the US National Center for Atmospheric Research (NCAR) in Boulder, Colorado, find initialised runs get closer to reproducing both that hiatus and the surprisingly rapid warming of the 1970s. Using the same approach, admittedly rough 30-year predictions for Earth’s surface air temperature initialised in 2006 come out about one-sixth lower than uninitialised projections. “We have evidence that if we would have had this methodology in the 1990s, we could have predicted the early-2000s hiatus,” Jerry told me.
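
To see why starting from observed conditions matters, here is a minimal toy sketch, not NCAR’s model: a system with a forced warming trend plus internal variability that has memory, forecast either from the observed state (‘initialised’) or from the long-term average (‘uninitialised’). All the numbers and names are illustrative assumptions.

```python
# Minimal sketch of why initialisation matters for near-term forecasts.
# The "climate" here is a forced warming trend plus internal variability
# with memory, loosely standing in for something like the IPO. All the
# numbers are illustrative assumptions - this is not NCAR's model.

PERSISTENCE = 0.9  # fraction of this year's internal anomaly carried over
TREND = 0.02       # forced warming per year, degrees C (illustrative)

def forecast(x0, years=15):
    """Forecast: decaying internal anomaly plus accumulating forced trend."""
    anomalies, x, forced = [], x0, 0.0
    for _ in range(years):
        x *= PERSISTENCE   # internal variability decays toward zero
        forced += TREND    # forced warming keeps accumulating
        anomalies.append(x + forced)
    return anomalies

# Initialised: start from an observed cold phase of internal variability.
# Uninitialised: start from the climatological mean state (zero anomaly).
initialised = forecast(-0.3)
uninitialised = forecast(0.0)

# The initialised forecast runs cooler for the first several years - a
# hiatus-like signal - then converges on the uninitialised one as the
# memory of the initial state fades.
for year, (a, b) in enumerate(zip(initialised, uninitialised), start=1):
    print(f"year {year:2d}: initialised {a:+.3f}  uninitialised {b:+.3f}")
```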

The hiatus Jerry and Haiyan studied – an easing off in the rate of global warming since 1998 – is perhaps the most hotly debated aspect of climate change today. But hiatus is a slippery word, whose meaning depends on who is highlighting which points on which graph. Climate skeptics often point to it as evidence that global warming is not a problem, or that we know too little to act on climate change. The UN Intergovernmental Panel on Climate Change puts it in plain numbers: the rate of warming from 1998 to 2012 was 0.05°C per decade; from 1951 to 2012, it was 0.12°C per decade. “In addition to robust multi-decadal warming, global mean surface temperature exhibits substantial decadal and interannual variability,” it adds. “Due to natural variability, trends based on short records are very sensitive to the beginning and end dates and do not in general reflect long-term climate trends.”
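
The IPCC’s point about start and end dates is easy to demonstrate. Here is a sketch using synthetic data – a steady warming of about 0.12°C per decade plus year-to-year noise, not real temperature records – showing how much a short window’s trend swings compared with the long one:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic "temperatures": a steady 0.012 C/yr trend (about 0.12 C/decade)
# plus interannual noise. Illustrative only - not observed data.
years = np.arange(1951, 2013)
temps = 0.012 * (years - years[0]) + rng.normal(0, 0.1, years.size)

def decadal_trend(y0, y1):
    """Least-squares trend over [y0, y1], in degrees C per decade."""
    mask = (years >= y0) & (years <= y1)
    return 10 * np.polyfit(years[mask], temps[mask], 1)[0]

print(f"1951-2012 trend: {decadal_trend(1951, 2012):+.3f} C/decade")
print(f"1998-2012 trend: {decadal_trend(1998, 2012):+.3f} C/decade")
# Shift the short window's start by a single year and the answer changes a lot:
print(f"1999-2012 trend: {decadal_trend(1999, 2012):+.3f} C/decade")
```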

In a paper published online in the journal Geophysical Research Letters last week, Jerry and Haiyan touch on the current best explanations of the let-up. These include the chilling effect of recent volcanic eruptions, but mostly focus on cooling in the Pacific as part of a natural cycle. Called the Interdecadal Pacific Oscillation (IPO), this regular wobble in sea surface temperatures has likely partly masked greenhouse-gas-driven warming. The IPO has also been linked to a larger warming in the 1970s than might have been expected from greenhouse gases alone, the NCAR researchers add.

Fighting for useful climate models

  • This is part two of a two-part post. Read part one here.

Princeton University’s Suki Manabe published his latest paper in March this year, 58 years after his first one. Credit: Princeton University

When Princeton University’s Syukuro Manabe first studied global warming with general circulation models (GCMs), few other researchers approved. It was the 1970s, computing power was scarce, and the GCMs had grown out of mathematical weather forecasting to become the most complex models available. “Most people thought that it was premature to use a GCM,” ‘Suki’ Manabe told interviewer Paul Edwards in 1998. But over the following decades Suki would exploit GCMs widely to examine climate changes ancient and modern, helping make them the vital research tool they are today.

In the 1970s, the world’s weather and climate scientists were building international research links, meeting up to share the latest knowledge and plan their next experiments. Suki’s computer modelling work at Princeton’s Geophysical Fluid Dynamics Laboratory (GFDL) had made his mark on this community, including two notably big steps. He had used dramatically simplified GCMs to simulate the greenhouse effect for the first time, and developed the first such models linking the atmosphere and ocean. And when pioneering climate research organiser Bert Bolin invited Suki to a meeting in Stockholm, Sweden, in 1974, he had already brought these successes together.

Suki and his GFDL teammate Richard Wetherald had worked out how to extend their global warming studies to whole-world-scale ocean-coupled GCMs. They could now consider geographical differences and indirect effects, for example those due to changes in the distribution of snow and sea ice. Though the oceans in the world they simulated resembled a swamp, shallow and unmoving, they got a reasonably realistic picture of the difference between land and sea temperatures. Their model predicted the Earth’s surface would warm 2.9°C if the amount of CO2 in the air doubled, a figure known as climate sensitivity. That sits right in the middle of today’s latest estimated range of 1.5-4.5°C.
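
As a back-of-envelope illustration of what a climate sensitivity number implies – a sketch of the textbook logarithmic relationship, not the GFDL model itself – warming scales roughly with the logarithm of CO2 concentration, so a sensitivity S per doubling gives ΔT ≈ S × log2(C/C0):

```python
import math

def warming(c_ratio, sensitivity=2.9):
    """Rough equilibrium warming in degrees C for a given CO2 ratio C/C0.

    Uses the standard rule of thumb that forcing, and hence equilibrium
    warming, grows with the logarithm of CO2 concentration. `sensitivity`
    is the warming per doubling; 2.9 C is Manabe and Wetherald's value.
    """
    return sensitivity * math.log2(c_ratio)

print(warming(2.0))        # a doubling: 2.9 C, by definition
print(warming(560 / 280))  # the same doubling, written as concentrations
print(warming(400 / 280))  # ~1.5 C for a rise from 280 to ~400 ppm
```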

Comparison between the measured sea surface temperature in degrees C and that calculated by the GFDL ocean-coupled GCM, from a 1975 GARP report chapter Suki wrote – see below for reference.

At the time no-one else had the computer facilities to run this GCM, so other researchers focused on simpler models, and on fine details within them. Scientists model climate by splitting Earth’s surface into a 3D grid reaching up into the air. They can then calculate what happens inside each cube and how it affects the surrounding cubes. But some processes are too complex, or happen on scales too small, to simulate completely, and must be replaced by ‘parameterisations’ based on measured data. To get his GCMs to work Suki had made some very simple parameterisations, and that was another worry for other scientists.
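
A minimal sketch of the grid-plus-parameterisation idea – far simpler than any real GCM, with made-up numbers throughout: resolved dynamics move heat between neighbouring cells, while a sub-grid process, here convection, is replaced by a simple parameterised rule.

```python
import numpy as np

# Toy "atmosphere": a 1-D column of grid cells holding temperature.
# Real GCMs use 3-D grids and full fluid dynamics; this only sketches
# the split between resolved transport and parameterised processes.
temps = np.array([30.0, 25.0, 18.0, 10.0, 0.0])  # surface to top, degrees C

def resolved_transport(t, k=0.1):
    """Resolved part: diffuse heat between neighbouring cells."""
    t = t.copy()
    flux = k * (t[:-1] - t[1:])  # heat flowing from each cell to the next
    t[:-1] -= flux
    t[1:] += flux
    return t

def convection_parameterisation(t, lapse_limit=8.0):
    """Sub-grid part: if a layer is too much warmer than the one above,
    mix the two - a stand-in for convection the grid cannot resolve."""
    t = t.copy()
    for i in range(len(t) - 1):
        if t[i] - t[i + 1] > lapse_limit:
            t[i] = t[i + 1] = 0.5 * (t[i] + t[i + 1])
    return t

for step in range(5):
    temps = resolved_transport(temps)
    temps = convection_parameterisation(temps)
    print(step, np.round(temps, 2))
```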

The model scientist who fixed the greenhouse effect

Syukuro ("Suki") Manabe in the 1960s at Princeton University, New Jersey, where he taught from 1968-1997. He was working on weather prediction in Tokyo during the difficult postwar years when he was invited to come to the US Weather Bureau's unit working on the general circulation of the atmosphere. He was assigned programmers to write computer code so he could concentrate on the physical concepts and mathematics. Image copyright: AIP Emilio Segrè Visual Archives, used with permission.

Syukuro (“Suki”) Manabe in the 1960s at Princeton University, New Jersey, where he taught from 1968-1997. He was working on weather prediction in Tokyo during the difficult postwar years when he was invited to come to the US Weather Bureau’s unit working on the general circulation of the atmosphere. He was assigned programmers to write computer code so he could concentrate on the physical concepts and mathematics. Image copyright: AIP Emilio Segrè Visual Archives, used with permission.

In 1963, using one of the world’s first transistor-based supercomputers, Syukuro Manabe was supposed to be simulating how Earth’s atmosphere behaves in more detail than ever before. Instead, the young US Weather Bureau scientist felt the frustration, far more common today, of a crashed system. But resolving that problem would lead ‘Suki’ Manabe to produce the first computerised greenhouse effect simulations, and lay the foundations for some of today’s most widely used climate models.

After growing up during the Second World War, studying in bomb shelters, Suki entered the University of Tokyo in 1949 to become a doctor like his father and grandfather. That same year Japanese physicist Hideki Yukawa won a Nobel Prize, which helped drive many students, including Suki, into physics. “I gradually realized, ‘Oh my God, I despise biology,’” he told interviewer Paul Edwards in 1998. But to start with, he wasn’t very successful in his new subject. “At the beginning my physics grade was miserable – straight C,” he recalled.

Those grades came about because Suki’s main interest was in the mathematical parts of his subjects, without thinking about what the maths really meant. When he realised this he concentrated on the physics he found most interesting, in subjects related to the atmosphere and oceans, and his grades started to improve. “By the time I graduated from geophysics and went on to a Master’s course at the University of Tokyo, I was getting a pretty solid way of thinking about the issues,” he said.

Suki went on to get a PhD, but when he finished, the kinds of jobs in meteorology he was qualified for were hard to find in Japan. He had, though, applied his interests to rainfall, using the approach known as numerical weather prediction, pioneered by scientists like John von Neumann, Carl-Gustaf Rossby and Bert Bolin. Another leader in the field, Joe Smagorinsky, was looking at rainfall in a similar way, and had read Suki’s research. Joe was setting up a numerical weather prediction team at the US Weather Bureau in Washington, DC, and in 1958 invited Suki to join him.

Their early models split the world into grids reaching into the air and across its surface, calculating what happens within and between each cube as today’s versions still do. But Joe wanted Suki to go further in preparation for the arrival of a transistorised IBM ‘Stretch’ computer in 1963. He aimed to develop complex system models that included the role of water movements, the structure of the atmosphere, and heat from the Sun. In particular, he wanted to push from simulating two layers in the atmosphere to nine.

Twin rainfall effects strengthen human climate impact case

While existing studies of rainfall changes rely on data collected on land, by switching to satellite data LLNL’s Kate Marvel and Céline Bonfils could include changes in rainfall at sea. Image copyright snoboard1010 used via Flickr Creative Commons license.

The way we humans are affecting the climate is changing rainfall patterns over land and sea, scientists at Lawrence Livermore National Laboratory (LLNL) in California have found. Kate Marvel and Céline Bonfils compared precipitation ‘fingerprints’ in satellite data against what climate models showed would result from actions like adding greenhouse gases to the atmosphere. “Everyone knows that temperatures are rising, but figuring out how that affects other aspects of the climate is tricky,” Kate told me. “We’ve shown that global precipitation is changing in the way climate scientists expect it to. The odds of the observed trends being due to natural climate variability are very low.”

Changes to rain, snow and all the other forms of falling wetness collectively known as precipitation are undeniably important, given their power to bring floods and droughts. Scientists have already shown that, over land, wet areas are getting wetter and dry areas are getting drier. These studies rely on data measured directly on land, reaching back almost a century. The long record gives scientists a lot of data to test, making it easier to tell human influences from the many natural rainfall patterns. Yet Kate and Céline wanted to use satellite data instead. Though these have only been recorded since 1979, each measurement is more reliable, and the satellites also cover the oceans.

“With such a short record, it’s often difficult to identify the ‘signal’ of climate change against the background of completely natural variability,” Kate explained. For example, the wet-gets-wetter, dry-gets-drier strengthening of the Earth’s water cycle happens because warmer air can hold more water vapour. But that strengthening can be caused by the El Niño climate pattern, as well as by increasing greenhouse gases. Our activities can also change how air circulates above the planet, pushing dry regions and storm tracks toward the poles – but so can the La Niña pattern.
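
A heavily simplified sketch of the fingerprinting logic – synthetic data and an invented pattern, not LLNL’s actual method: project each year’s observed map onto a model-predicted pattern of change, then ask whether the resulting trend could plausibly arise from natural variability alone.

```python
import numpy as np

rng = np.random.default_rng(1)

# Illustrative setup: 10 regions observed for 35 years (like 1979-2013).
n_regions, n_years = 10, 35
fingerprint = np.linspace(-1, 1, n_regions)  # model pattern: wet-wetter,
fingerprint /= np.linalg.norm(fingerprint)   # dry-drier (normalised)

# Synthetic observations: the fingerprint growing over time, plus noise.
years = np.arange(n_years)
obs = (0.03 * np.outer(years, fingerprint)
       + rng.normal(0, 0.2, (n_years, n_regions)))

# Project each year's map onto the fingerprint, then fit a trend.
signal = obs @ fingerprint
obs_trend = np.polyfit(years, signal, 1)[0]

# Natural variability: trends from many noise-only "control runs".
control_trends = np.array([
    np.polyfit(years, rng.normal(0, 0.2, (n_years, n_regions)) @ fingerprint, 1)[0]
    for _ in range(1000)
])

# A signal-to-noise ratio well above ~2 means the observed trend is very
# unlikely to be natural variability alone.
print(f"signal-to-noise ratio: {obs_trend / control_trends.std():.1f}")
```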

Braving African piracy reveals abrupt rainfall shifts

Woods Hole Oceanographic Institution’s Jessica Tierney has patiently produced a record of rainfall in East Africa reaching back 40,000 years, from sediment collected from pirate- and extremist-infested waters. Image copyright: Tom Kleindinst, Woods Hole Oceanographic Institution

Having dodged pirates and extremists, and slogged for two years to interpret the record collected, US scientists have shown how abruptly rainy climates in East Africa come and go. Jessica Tierney puzzled out a rainfall record back to the last ice age from mud collected in one of the last research cruises to brave the Horn of Africa. “The region goes from being pretty humid to very arid in hundreds of years,” Jessica, who works at Woods Hole Oceanographic Institution (WHOI) in Massachusetts, told me. “That’s important because there’s a threshold behaviour in its rainfall. We need to better understand what drives those thresholds, and when we’d expect to be pushed over one, as it has huge implications for predicting drought and famine in the region.”

Long interested in ancient East African climate, Jessica wanted to study the Horn of Africa area, which includes Ethiopia and Somalia, because the climate there is very sensitive and variable. But its dry conditions rule out many options scientists use to build historical records from ice, cave deposits, sediments from lake beds or tree rings. So in 2010, she started working with Peter deMenocal at Lamont-Doherty Earth Observatory in New York, who collected sea bed sediments from the area in April and May 2001.

“We boarded ship in Dar es Salaam in Tanzania and our cruise was to end in Port Said, in Egypt,” Peter told me. That took the team down the Somali coast and into the Gulf of Aden, where a few months earlier suicide bombers had killed 17 sailors aboard the USS Cole. Though the scientists were worried, the captain of their Dutch research ship, R/V Pelagia, was vigilant. “He had ordered radio silence, and we actually turned off all our lights on the ship at night, even navigation lights,” Peter recalled. “He had also put in orders for us to train on what to do in case we were boarded.”

How ocean data helped reveal the climate beast

Wally Broecker’s famous quote on display at California Academy of Sciences. Image copyright: Jinx McCombs, used via Flickr Creative Commons license

  • This is part two of a two-part post. Read part one here.

On the wall of Wally Broecker’s building at the Lamont-Doherty Earth Observatory hangs a 16-foot-long terry-cloth snake, blue with pink spots, that he calls the ‘climate beast’. Left in his office as a surprise by his workmates, its name refers to one of Wally’s most powerful quotes about the climate: “If you’re living with an angry beast, you shouldn’t poke it with a sharp stick.”

Today, the sharp stick is the CO2 we’re emitting by burning fossil fuels, which Wally was warning about by 1975. By that time he had also helped confirm that throughout history, changes in Earth’s orbit have given the climate beast regular kicks, triggering rapid exits from ice ages. He became obsessed with the idea that climate had changed abruptly in the past, and that we could provoke the ‘angry beast’ into doing it again.

Among the many samples that Wally was carbon dating, from the late 1950s onwards he was getting treasure from the oceans. Pouring sulphuric acid into seawater, he could convert dissolved carbonate back into CO2 gas that he could then carbon date. And though nuclear weapon tests had previously messed with Wally’s results, they actually turned out to help improve our knowledge of the oceans. The H-bomb tests produced more of the radioactive carbon-14 his technique counts, and as that spike moved through the oceans, Wally could track how fast they absorbed that CO2.
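
Carbon dating itself rests on simple exponential decay – the textbook formula, not Wally’s lab procedure: comparing how much carbon-14 remains in a sample with the atmosphere’s level gives its age.

```python
import math

# Conventional radiocarbon age from the fraction of carbon-14 remaining,
# using the standard Libby mean life of 8033 years (textbook value).
def radiocarbon_age(fraction_remaining):
    return -8033 * math.log(fraction_remaining)

print(radiocarbon_age(0.5))   # ~5568 years: the Libby half-life
print(radiocarbon_age(0.25))  # two half-lives, roughly 11,140 years
```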

In the 1970s, as Wally and a large team of other scientists sailed on RV Melville and RV Knorr tracking such chemicals across the planet’s oceans, a debate raged. Was cutting down forests releasing more CO2 than burning fossil fuels? Dave Keeling’s measurements showed the amount of CO2 building up in the air was about half the amount produced by fossil fuels. But plants and the oceans could be taking up huge amounts, scientists argued. Thanks to the H-bomb carbon, Wally’s team found the CO2 going into the oceans was just one-third of what fossil fuels had emitted. Faster-growing plants therefore seemed to be balancing out the impact of deforestation, and taking up the remaining one-sixth of the fossil fuel emissions.
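
The budget arithmetic behind that conclusion is simple enough to spell out. Taking total fossil fuel emissions as 1, with the airborne and ocean fractions from the measurements above, the land term is whatever closes the budget:

```latex
\underbrace{1}_{\text{fossil fuel CO}_2\ \text{emitted}}
= \underbrace{\tfrac{1}{2}}_{\text{stays in the air}}
+ \underbrace{\tfrac{1}{3}}_{\text{absorbed by oceans}}
+ \underbrace{\tfrac{1}{6}}_{\text{net uptake by land plants}}
```

That final sixth is a net figure: for it to be positive, growing plants had to be soaking up deforestation’s emissions first, and a sixth of the fossil fuel total besides.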

Continuing the fight for CO2 monitoring

  • This is part two of a two-part post. Read part one here.

Dave Keeling had to balance his work measuring CO2’s rise in the air and tracking its movements through the Earth’s systems with fighting to get the money to fund his work. Credit: Scripps Institution of Oceanography

By 1963, having directly measured a steady increase in CO2 levels over five years, Dave Keeling felt he had clearly shown the value of such non-stop monitoring. But that message hadn’t reached government decision makers. And so Dave swung into the first battle of a war – over continued tracking of the key greenhouse gas – that has flared up repeatedly in the decades since.

Thanks to four new instruments called spectrophotometers, Dave had been able to measure CO2 using the same molecular movements that allow it to absorb heat. Though his most famous site was at Mauna Loa in Hawaii, one instrument was also installed in Antarctica. Another sailed on a ship, and the final one stayed at Dave’s lab at Scripps Institution of Oceanography, analysing samples collected in evacuated five-litre flasks from aircraft and elsewhere. Thanks to funds from 1957-1958’s International Geophysical Year, a team of scientists was busy collecting a “snapshot” of CO2 data that Dave’s boss at Scripps, Roger Revelle, wanted.
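
The measuring principle here is textbook physics – the Beer-Lambert law, rather than anything specific to Dave’s instruments: infrared light passing through a gas cell dims exponentially with the amount of absorbing CO2 in its path.

```latex
I = I_0 \, e^{-\epsilon c \ell}
```

Here \(I_0\) is the light entering the cell, \(c\) the CO2 concentration, \(\ell\) the path length and \(\epsilon\) the absorption coefficient; measuring how much light survives therefore reveals \(c\).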

So in 1961, Dave moved his family to Sweden for a year to work out exactly what the measurements were showing. He took a fellowship at the Meteorological Institute, University of Stockholm, working with its new director Bert Bolin, who had earlier worked on the first computerised weather forecast. With measurements ongoing, annual ‘breathing’ cycles of rising and falling CO2, and the increasing trend underlying them, became ever clearer.

Together, Dave and Bert found CO2 concentrations were going up by 0.06 ppm per month on average. Bert also undertook a series of complex calculations by hand to work out CO2’s movements and the cycles in its levels. In doing so he was showing how oceans, plants on land, and human fossil fuel burning contributed to the patterns that would later need computer models for fuller analysis. This, Dave felt, clearly showed why non-stop CO2 monitoring was needed rather than just snapshots. But by 1963 the shipboard spectrophotometer had come home, and Dave had also called back the one in Antarctica. With funding cuts biting at the Weather Bureau – today part of the National Oceanic and Atmospheric Administration (NOAA) – the staff at Mauna Loa fell from eight to three. Soon afterwards, a problem with Dave’s equipment proved too much for the overstretched team to fix.
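
The pattern Dave and Bert were separating – an annual ‘breathing’ cycle superimposed on a steady rise – can be sketched with a simple least-squares fit. The numbers below are synthetic, roughly matching the ~0.06 ppm per month rise, not the actual Scripps record:

```python
import numpy as np

rng = np.random.default_rng(7)

# Synthetic monthly CO2 readings: a steady rise of 0.06 ppm/month plus a
# seasonal "breathing" cycle, loosely mimicking the early Keeling curve.
months = np.arange(60)  # five years of monthly data
co2 = (315.0 + 0.06 * months                     # underlying trend
       + 3.0 * np.sin(2 * np.pi * months / 12)   # annual cycle, ~3 ppm swing
       + rng.normal(0, 0.2, months.size))        # measurement scatter

# Fit trend and annual cycle together by least squares.
design = np.column_stack([
    np.ones_like(months, dtype=float),
    months,
    np.sin(2 * np.pi * months / 12),
    np.cos(2 * np.pi * months / 12),
])
baseline, trend, amp_sin, amp_cos = np.linalg.lstsq(design, co2, rcond=None)[0]

print(f"trend: {trend:.3f} ppm/month (~{12 * trend:.2f} ppm/year)")
print(f"seasonal amplitude: {np.hypot(amp_sin, amp_cos):.2f} ppm")
```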

“Suddenly there were no precise measurements being made of atmospheric CO2 anywhere,” he recalled. “I had seen the budget cut coming early in 1963 and had tried to prevent its terminating the CO2 program at Mauna Loa Observatory. I even went to Washington to plead for supplemental funding. This had no tangible effect, however, until the cessation of measurements actually occurred. The National Science Foundation (NSF) then found funds to pay for an additional technician at Mauna Loa. I learned a lesson that environmental time-series programs have no particular priority in the funding world, even if their main value lies in maintaining long-term continuity of measurements.”

What Dave Keeling did ahead of his curve

Dave Keeling in front of the pier at Scripps Institution of Oceanography in San Diego, which houses a variety of measuring equipment. Credit: Scripps Institution of Oceanography

On 18 May 1955, Charles David Keeling – Dave to most – set up camp near a footbridge over a river in Big Sur State Park in California. Armed with a set of five-litre flasks containing nothing but vacuum, he planned to suck up air samples regularly over 24 hours. At the time it may have seemed the latest uncertain step of a young man unsure how best to combine his interest in science with his love of the outdoors. But instead it became the start of a lifelong quest to accurately measure the main gas through which humanity is changing the world’s climate: CO2.

“At the age of 27, the prospect of spending more time at Big Sur State Park to take suites of air and water samples instead of just a few didn’t seem objectionable, even if I had to get out of a sleeping bag several times in the night,” Dave wrote in his autobiography. “I did not anticipate that the procedures established in this first experiment would be the basis for much of the research that I would pursue over the next forty-odd years.”

Growing up in the US Midwest near Chicago, Dave found his interest in science kindled at age five, when his economist father introduced him to the wonders of astronomy. To show Dave how the seasons came about, together in their living room they circled a globe around a lamp serving as the sun. Going through school during the Second World War, Dave took a special class in preflight aeronautics as well as the conventional sciences.

He then enrolled in the University of Illinois early, during the summer, to fit in a year of study before he reached the conscription age of 17. With limited science options available at this time of year, he chose to major in chemistry. “I didn’t particularly like chemistry and repeatedly doubted that I had made the right choice,” he recalled. But before the year – 1945 – was out, the war was over, and so Dave could continue his course. Chemistry students were expected to study economics, but Dave felt that he’d had enough economics at home. So he opted out of chemistry, ultimately getting a general liberal arts degree.

Yet he was still offered a place to study for a chemistry PhD at nearby Northwestern University with a friend of his mother’s. He took it without applying for any others, but later realised his previous studies had left him unprepared. “Accepting so soon was probably a mistake,” he wrote. Required to take a minor subject as part of his studies, Dave chose geology. His supervisor even suggested he might like to make this his major, though Dave declined, graduating in chemistry after a gruelling five years. And while his skills were in great demand from the post-war chemical industry, Dave wanted a job that would let him work outside. So he applied for geology roles at universities, managing to find one at the California Institute of Technology.

Extra climate targets urge faster CO2 cuts

University of Bern’s Marco Steinacher has helped show that setting limits on different aspects of damage from climate change will likely restrict CO2 emissions more than a temperature target alone. Credit: University of Bern

To give the world a chance of restricting damage caused by climate change, we need more than just a single temperature target, Swiss researchers have found. Marco Steinacher and his teammates at the University of Bern worked out the chances of keeping climate change within safe limits in six different areas. “Considering multiple targets reduces the allowable carbon emissions compared to temperature targets alone, and thus CO2 emissions have to be reduced more quickly and strongly,” Marco told me.

In December 2009, world leaders agreed the non-binding Copenhagen Accord, which ‘recognises’ scientists’ view that world temperature increases beyond 2°C above the pre-industrial 1850-1899 average would be dangerous. It also mentions sea level rise, protecting ecosystems and food production. And as climate talks have continued since the 1990s, specific new dangers of CO2 emissions have been found. One serious impact realised in the last decade stems from the fact that oceans absorb CO2 from the air, making the seas more acidic. That can make it harder for sea creatures to form their shells and, together with warmer seas, can damage coral, in turn reducing the fish numbers available for food. “Traditional climate targets have not addressed this effect,” Marco said.
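
The chemistry behind that acidification is standard carbonate chemistry – shown here as the textbook equilibria, not anything specific to the Bern paper: dissolved CO2 forms carbonic acid, which releases hydrogen ions, and those ions in turn convert carbonate, the building block of shells, into bicarbonate.

```latex
\mathrm{CO_2 + H_2O \rightleftharpoons H_2CO_3 \rightleftharpoons H^+ + HCO_3^-}
\qquad
\mathrm{H^+ + CO_3^{2-} \rightleftharpoons HCO_3^-}
```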

It might seem reasonable to assume that negotiating climate deals on temperature limits alone could protect against the other dangers. But until recently only very simple ‘Earth system’ models were available to test this assumption against the idea of having several targets. They couldn’t simulate regional effects on quantities such as ocean acidification or farming productivity, Marco said. “Climate targets that aim at limiting such regional changes can only be investigated with a model that has a certain amount of complexity,” he explained.

Missed Greenland melt cause raises sea level concern

A team of scientists (the tiny figures in the foreground) at the southwest margin of the Greenland Ice Sheet. Almost all of the surface of the sheet melted in July 2012 – and there’s a lot more ice beneath the surface, which is why faster melting could be so serious for sea level rise. Credit: University of Sheffield

The almost complete melting of Greenland’s ice sheet surface on 11-12 July 2012 was caused by climate processes not projected by models. That’s according to an international team led by Edward Hanna from the University of Sheffield, UK, that has looked at what might have driven the melt. “The models used to predict future climate change are clearly deficient in simulating some of the recent jet stream changes that we have shown to be responsible for enhanced warming and ice melt over Greenland,” Edward told me. And the world needs to pay attention when Greenland defrosts, as the water it produces is a major contributor to sea level rise.

Having long studied the Greenland ice sheet, or GrIS, Edward was perhaps one of the people least amazed by 2012’s events. “Last year’s record melt was a bit of a surprise, but perhaps not so startling in retrospect, given strong recent warming of the ice sheet area since the early-mid 1990s,” he said. With other scientists, he has also found a clear change since 2007 in early-summer Arctic wind patterns, relative to previous decades, that has led to warm Greenland summers. In particular, jet stream patterns of winds weaving north and south in drunken circles around the pole have changed to drive warm winds over Greenland. That study also linked these changes to cool, wet summers in the UK since 2007, whose unusual wetness in 2012 is seemingly the other face of the GrIS melt coin.
