How lessons from space put the greenhouse effect on the front page

Normally during a total lunar eclipse, like this one on April 15, 2014, you can still see the moon, but in 1963 Jim Hansen saw it disappear completely. Explaining why would send him on a scientific journey to Venus, before coming back down to Earth. Image credit: NASA

Jim Hansen’s life changed on the evening the moon disappeared completely. In a building in a cornfield Jim and fellow University of Iowa students Andy Lacis and John Zink, and their professor Satoshi Matsushima, peered in surprise through a small telescope into the wintry sky. It was December 1963, and they had seen the moon replaced by a black, starless circle during a lunar eclipse. The moon always passes into Earth’s shadow during such eclipses, but usually you can still see it.

At first they were confused, but then they remembered that in March there had been a big volcanic eruption. Mount Agung in Indonesia had thrown tonnes of dust and chemicals into the air: perhaps that was blocking out the little light they’d normally have seen? With a spectrometer attached to their telescope they measured the moon’s brightness – data on which Jim would base his first scientific research. Using this record to work out the amount of ‘sulphate aerosol’ particles needed to make the moon disappear, Jim began a lifelong interest in planets’ atmospheres. That interest would lead him to become director of the NASA Goddard Institute for Space Studies (GISS), where he has led the way in exposing the threat from human CO2 emissions.
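The reasoning behind that first calculation can be illustrated with the Beer-Lambert law, which relates how much light gets through to the ‘optical depth’ of material in its path. This is only a sketch of the principle – the numbers below are invented for illustration, not Hansen’s measurements:

```python
import math

def optical_depth(measured_brightness, expected_brightness):
    """Beer-Lambert law: I = I0 * exp(-tau), so tau = -ln(I / I0).

    The optical depth tau summarises how much material the light
    passed through; a higher tau means more aerosol in the path.
    """
    return -math.log(measured_brightness / expected_brightness)

# Hypothetical figures: the eclipsed moon appears 100 times fainter
# than it would through an aerosol-free atmosphere
tau = optical_depth(measured_brightness=0.01, expected_brightness=1.0)
print(round(tau, 2))  # 4.61
```

From an optical depth like this, plus the known scattering properties of sulphate particles, one can estimate the total amount of aerosol the eruption injected.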

Jim was born in Iowa in 1941, the fifth of seven children of a farmer, who had left school at 14, and his wife. As he grew up they moved into the town of Denison, his father becoming a bartender and his mother a waitress, and Jim spending his time playing pool and basketball. Jim claims he wasn’t academic, but found maths and science the easiest subjects, always getting the best grades in them in his school. Though his parents divorced when he was young, public college wasn’t expensive at the time, meaning Jim could save enough money to go to the University of Iowa.

The university had an especially strong astronomy department, headed by James Van Allen, after whom the belts of space surrounding the Earth are named. These ‘Van Allen Belts’ are layers of particles that he discovered, held in place by the planet’s magnetic field. Satoshi Matsushima, a member of Van Allen’s department, could see Jim and Andy’s potential and convinced them to take the qualifying exams for PhD degrees a year early. Both passed, with Jim getting one of the highest scores, and both were offered NASA funding that covered all their costs.

A few months later, it was Satoshi who suggested measuring the eclipse’s brightness, feeding Jim’s interest in atmospheres on other planets. “Observing the lunar eclipse in 1963 forced me to think about aerosols in our atmosphere,” Jim told me. “That led to thinking about Venus aerosols.” In an undergraduate seminar course Jim had given a talk about the atmospheres of the outer planets, which James Van Allen attended. The elder scientist told him that recently measured data suggested Venus’ surface was very hot. Aerosols had stopped light reaching the Earth during the eclipse – could they be warming up Venus by stopping heat escaping, Jim wondered? That would become the subject of his PhD, with Satoshi and James Van Allen as his advisors.

When the climate change fight got ugly

Steve Schneider talks about climate and energy with Johnny Carson on the Tonight Show in 1977, early on in his efforts to bring human-caused climate change to the public’s notice.

This is part two of this profile. Read part one here.

“How many of you think the world is cooling?” That’s what Steve Schneider asked the studio audience of the Tonight Show with Johnny Carson in September 1977. And when the majority put their hands up, he explained that the recent cooling trend had only been short-term. Though the unscripted poll meant Steve wasn’t invited back to the programme, through the summer of that year he had brought climate science to US national TV. The appearances typified Steve’s efforts to bring climate change to the world’s notice – efforts that would later draw attention of a less desirable sort.

Building on his earlier high-profile research, Steve had just published ‘The Genesis Strategy: Climate and Global Survival’, predicting ‘demonstrable climate change’ by the end of the century. Whether human pollution would cause warming or cooling, he argued, governments should copy the biblical story in which Joseph told Pharaoh to prepare for lean years ahead. In a decade already torn by rocketing food and oil prices, the advice resonated with many who wanted to head off any further crises.

Some scientists criticised Steve and those like him for speaking straight to the public. In particular, climate science uncertainties were so great that they feared confusion – like that over whether temperatures were rising or falling – was inevitable. That dispute grew from a basic question about science’s place in society. Should researchers concentrate on questions they can comfortably answer using their existing methods? Or should they tackle questions the world needs answered, even if the results that follow are less definite?

At a meeting to discuss climate and modelling research within the Global Atmospheric Research Programme (GARP) in 1974 near Stockholm, Sweden, Steve pushed for the second approach. Given the food problems the world was struggling with at the time, it seemed obvious that climate change impacts like droughts, floods and extreme temperatures would bring famines. “I stood alone in arguing that we had to consider the implications of what we were researching,” Steve later wrote. While some attacked him angrily, saying they weren’t ready to address these problems, conference organiser Bert Bolin agreed that socially important questions must be answered.

The suggestion was also controversial because it meant blurring the lines between climate science and other subjects, such as agriculture, ecology and even economics. The director at the US National Center for Atmospheric Research (NCAR) in Boulder, Colorado, where Steve worked, warned that crossing subject boundaries might cost him promotion. But he responded with characteristic wilfulness, founding a journal doing exactly what he was warned not to.

The ice-age U-turn that set the stage for the climate debate

Steve Schneider (left), Jim Hansen (centre), and S. Ichtiaque Rasool (right) at NASA Goddard Institute for Space Studies in New York, circa 1971. Image copyright: Stephen H. Schneider

On 13 July 1971, world-leading researchers gathered in Stockholm, Sweden, concluded their presentations about human influence on climate, and opened the meeting to questions from the press. But rather than asking about the most important climate meeting yet, the assembled reporters first looked to the meeting’s 26-year-old secretary. “Where is Dr. Schneider? When is the ice age coming?” they asked.

The journalists sought out Stephen Schneider about a paper by him and his NASA Goddard Institute for Space Studies (GISS) boss, S. Ichtiaque Rasool, published just four days before. Using early computer models, the pair had warned of a scenario where dusty aerosol pollution could become ‘sufficient to trigger an ice age’. For Steve, this would be the first of many encounters with the media’s interest in climate, leading him ultimately to help define how scientists influence the wider world.

As a PhD student at Columbia University in New York in the late 1960s, Steve came into contact with some of the world’s leading experts on climate. Wally Broecker, who at that time was helping establish the timing of the ice ages, lectured him on oceanography. A talk by Joe Smagorinsky from the US National Oceanic and Atmospheric Administration (NOAA), who was establishing some of the first computer climate models with Suki Manabe, played on Steve’s childhood fascination with hurricanes. And when he took a seminar by Ichtiaque talking about planets’ atmospheres – why Mars was too cold, Venus too hot, and Earth just right – he was hooked.

While writing up his PhD thesis he got a part-time job with Ichtiaque, tackling a key question of the time. Burning fossil fuels creates two types of pollution that influence climate – warming CO2 and cooling aerosols. But which one would win out? On the advice of fellow GISS scientist Jim Hansen, Steve used a method partly developed by astronomer Carl Sagan to calculate the aerosol effect. He fed this into a model of CO2-driven warming that Ichtiaque gave him. They found that doubling CO2 levels in the air would raise temperatures by about 0.7°C – much lower than Suki’s earlier estimate of 2°C for this ‘climate sensitivity’ figure. But models where aerosols were spread everywhere experienced 3-5°C cooling, prompting Ichtiaque to write the ice age comment, echoing other controversial research of the time.
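The wide spread in early sensitivity estimates is easier to appreciate with the simplest possible energy-balance arithmetic. The sketch below is not Steve and Ichtiaque’s model – just a back-of-envelope, no-feedback (Planck-only) estimate, which lands near the low end much as their feedback-poor model did:

```python
SIGMA = 5.67e-8   # Stefan-Boltzmann constant, W m^-2 K^-4
T_EFF = 255.0     # Earth's effective emission temperature, K
F_2X = 3.7        # radiative forcing from doubled CO2, W m^-2 (a modern value)

# Outgoing flux F = sigma * T^4, so dF/dT = 4 * sigma * T^3.
# With no feedbacks, the warming needed to rebalance is forcing / (dF/dT).
planck_response = 4 * SIGMA * T_EFF ** 3   # about 3.8 W m^-2 per K
sensitivity = F_2X / planck_response
print(round(sensitivity, 2))  # 0.98 K per CO2 doubling
```

Feedbacks such as water vapour and melting ice amplify this bare response, which is why fuller models push the figure up towards the 1.5-4.5°C range quoted later.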

Ichtiaque had asked Steve to handle criticism of the study, but in the meantime Steve had managed to get an invite to the Stockholm gathering of leading climate scientists. Being a ‘rapporteur’ he was supposed to only be taking notes at the three week Study of Man’s Impact on Climate (SMIC) meeting, organised by Bert Bolin. But Steve couldn’t resist showing Suki some of his modelling work on clouds’ role in climate – and then the aerosol study was published. Ichtiaque had mischievously told a reporter that Steve was presenting the work at SMIC, forcing his young colleague to give a brief seminar, and face the press. Read the rest of this entry »

Fighting for useful climate models

This is part two of a two-part post. Read part one here.

Princeton University’s Suki Manabe published his latest paper in March this year, 58 years after his first one. Credit: Princeton University

When Princeton University’s Syukuro Manabe first studied global warming with general circulation models (GCMs), few other researchers approved. It was the 1970s, computing power was scarce, and the GCMs had grown out of mathematical weather forecasting to become the most complex models available. “Most people thought that it was premature to use a GCM,” ‘Suki’ Manabe told interviewer Paul Edwards in 1998. But over the following decades Suki would exploit GCMs widely to examine climate changes ancient and modern, helping make them the vital research tool they are today.

In the 1970s, the world’s weather and climate scientists were building international research links, meeting up to share the latest knowledge and plan their next experiments. Suki’s computer modelling work at Princeton’s Geophysical Fluid Dynamics Laboratory (GFDL) had made its mark on this community, including two notably big steps. He had used dramatically simplified GCMs to simulate the greenhouse effect for the first time, and developed the first such models linking the atmosphere and ocean. And when pioneering climate research organiser Bert Bolin invited Suki to a meeting in Stockholm, Sweden, in 1974, he had already brought these successes together.

Suki and his GFDL teammate Richard Wetherald had worked out how to extend their global warming study to whole-world, ocean-coupled GCMs. They could now consider geographical differences and indirect effects, for example those due to changes in the distribution of snow and sea ice. Though the oceans in the world they simulated resembled a swamp, shallow and unmoving, they got a reasonably realistic picture of the difference between land and sea temperatures. Their model predicted the Earth’s surface would warm 2.9°C if the amount of CO2 in the air doubled, a figure known as climate sensitivity. That’s right in the middle of today’s latest 1.5-4.5°C range estimate.

Comparison between the measured sea surface temperature (in °C) and that calculated by the GFDL ocean-coupled GCM, from a 1975 GARP report chapter Suki wrote – see below for reference.

At the time no-one else had the computer facilities to run this GCM, and so they focussed on simpler models, and fine details within them. Scientists model climate by splitting Earth’s surface into 3D grids reaching up into the air. They can then calculate what happens inside each cube and how it affects the surrounding cubes. But some processes are too complex, or happen on scales too small, to simulate completely, and must be replaced by ‘parameterisations’ based on measured data. To get his GCMs to work Suki had made some very simple parameterisations, and that was another worry for other scientists.
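The cube-by-cube bookkeeping, and the role of a parameterisation standing in for unresolved processes, can be sketched in a toy update step. Everything here is illustrative – a caricature of the idea, not GFDL’s actual scheme:

```python
import numpy as np

def step(temp, diffusivity=0.1, subgrid_cooling=0.01):
    """One toy time step on a 2D grid of cell temperatures.

    Each cell exchanges heat with its four neighbours (the resolved
    transport between cubes) and then loses a little heat through a
    crude 'parameterisation' standing in for sub-grid processes.
    """
    # resolved part: simple diffusion from the four neighbours
    neighbours = (np.roll(temp, 1, 0) + np.roll(temp, -1, 0) +
                  np.roll(temp, 1, 1) + np.roll(temp, -1, 1))
    new = temp + diffusivity * (neighbours - 4 * temp)
    # parameterised part: a bulk cooling tuned to measured data
    return new - subgrid_cooling

grid = np.zeros((4, 4))
grid[1, 1] = 10.0          # a single warm cell
after = step(grid)
print(after[1, 1], after[1, 2])
```

The warm cell cools as its heat spreads into the neighbouring cubes – and the quality of the whole simulation hinges on how well that last parameterised term reflects reality, which is exactly what worried Suki’s critics.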

The model scientist who fixed the greenhouse effect

Syukuro ("Suki") Manabe in the 1960s at Princeton University, New Jersey, where he taught from 1968-1997. He was working on weather prediction in Tokyo during the difficult postwar years when he was invited to come to the US Weather Bureau's unit working on the general circulation of the atmosphere. He was assigned programmers to write computer code so he could concentrate on the physical concepts and mathematics. Image copyright: AIP Emilio Segrè Visual Archives, used with permission.

In 1963, using one of the world’s first transistor-based supercomputers, Syukuro Manabe was supposed to be simulating how Earth’s atmosphere behaves in more detail than ever before. Instead, the young US Weather Bureau scientist felt the frustration, far more common today, of a crashed system. But resolving that problem would lead ‘Suki’ Manabe to produce the first computerised greenhouse effect simulations, and lay the foundations for some of today’s most widely used climate models.

After growing up during the Second World War, studying in bomb shelters, Suki entered the University of Tokyo in 1949 to become a doctor like his father and grandfather. That same year Japanese physicist Hideki Yukawa won a Nobel Prize, which helped draw many students, including Suki, into physics instead. “I gradually realized, ‘Oh my God, I despise biology,’” he told interviewer Paul Edwards in 1998. But to start with, he wasn’t very successful in his new subject. “At the beginning my physics grade was miserable – straight C,” he recalled.

Those grades came about because Suki’s main interest was in the mathematical parts of the subjects, but he hadn’t been thinking about what the maths really meant. When he realised this he concentrated on the physics he found most interesting, in subjects related to the atmosphere and oceans, and his grades started to improve. “By the time I graduated from geophysics and went on to a Master’s course at the University of Tokyo, I was getting a pretty solid way of thinking about the issues,” he said.

Suki went on to get a PhD, but when he finished, the kinds of meteorology jobs he was qualified for were hard to find in Japan. He had, however, applied his interests to rainfall, using the approach known as numerical weather prediction pioneered by scientists like John von Neumann, Carl-Gustaf Rossby and Bert Bolin. Another leader in the field, Joe Smagorinsky, was looking at rainfall in a similar way, and had read Suki’s research. Joe was setting up a numerical weather prediction team at the US Weather Bureau in Washington, DC, and in 1958 invited Suki to join him.

Their early models split the world into grids reaching into the air and across its surface, calculating what happens within and between each cube, as today’s versions still do. But Joe wanted Suki to go further in preparation for the arrival of a transistorised IBM ‘Stretch’ computer in 1963. He envisaged complex system models that included the role of water movements, the structure of the atmosphere, and heat from the Sun. In particular, he wanted to push from simulating two layers in the atmosphere to nine.
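One idea that became central to Suki’s multi-layer columns is ‘convective adjustment’: wherever temperature falls off with height faster than convection would allow, heat is mixed upward until the column is stable again. The snippet below is a schematic of that adjustment, not the Stretch-era code:

```python
def convective_adjustment(temps, dz_km=1.0, critical_lapse=6.5, sweeps=50):
    """Relax a column of layer temperatures (bottom layer first, in K)
    toward a critical lapse rate, conserving the column's total heat.

    Wherever one layer sits more than critical_lapse * dz_km kelvin
    above the layer over it, the pair exchanges heat until the drop
    equals the critical value -- a schematic Manabe-style adjustment.
    """
    temps = list(temps)
    max_drop = critical_lapse * dz_km
    for _ in range(sweeps):
        for i in range(len(temps) - 1):
            drop = temps[i] - temps[i + 1]
            if drop > max_drop:               # unstable pair: convect
                excess = (drop - max_drop) / 2.0
                temps[i] -= excess            # cool the lower layer
                temps[i + 1] += excess        # warm the upper layer
    return temps

# A 5-layer column with an unstable 10 K/km lapse rate
column = [288.0, 278.0, 268.0, 258.0, 248.0]
adjusted = convective_adjustment(column)
```

After the sweeps, every layer-to-layer drop sits at or below the critical 6.5 K per km while the column’s total heat is unchanged – the stabilising effect convection has in the real atmosphere.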

The man who got the world to agree on climate

This is part two of a two-part post. Read part one here.

When not tackling climate science or negotiations Bert Bolin liked nothing more than a little choir singing. Credit: KVA

In 1975, advised by Bert Bolin, the Swedish government drafted a bill on future energy policy containing a conclusion that elsewhere might be controversial even today. “It is likely that climatic concerns will limit the burning of fossil fuels rather than the size of the natural resources,” it foresaw. Produced thanks to Bert’s early role tackling environmental issues, it was one of the first times humans’ effect on climate and the risk it poses us was noted officially. For more than two decades afterward the Stockholm University researcher would further strengthen that case, both through his research and by putting climate science firmly on the political agenda. And those tireless efforts would help the United Nations’ Intergovernmental Panel on Climate Change (UN IPCC) to consistently achieve what otherwise might have been impossible agreements.

The Swedish bill was a bold statement, given that average air temperatures were only just about to reverse a slight cooling that had gone on since 1940. Bert and scientists like Dave Keeling had shown that CO2 levels in the atmosphere were rising. Basic science established by Svante Arrhenius 80 years before had shown this should warm Earth’s surface. So why was it cooling? The way scientists found the answer was typical of the progress in climate science Bert was overseeing: they used the latest tools, including computers and satellites, bringing theory and measurement together to improve our understanding.
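Arrhenius’s basic result is usually summarised today by a logarithmic rule: each doubling of CO2 adds roughly the same extra radiative forcing. A quick sketch of that rule (the 5.35 coefficient is a modern empirical fit, not Arrhenius’s own number):

```python
import math

def co2_forcing(c_ppm, c0_ppm=280.0):
    """Approximate radiative forcing (W per square metre) from raising
    CO2 from a pre-industrial c0_ppm to c_ppm, using the common
    logarithmic fit delta_F = 5.35 * ln(C / C0)."""
    return 5.35 * math.log(c_ppm / c0_ppm)

# Doubling CO2 (280 -> 560 ppm) gives roughly 3.7 W/m^2 of forcing
print(round(co2_forcing(560.0), 2))  # 3.71
```

The logarithm is why talk of climate sensitivity is framed per *doubling* of CO2: going from 280 to 560 ppm adds about as much forcing as going from 560 to 1120 ppm.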

Climate models in the early 1970s were still simple by today’s standards, but had advanced from the early computerised weather predictions Bert had previously pioneered. And when Columbia University’s Stephen Schneider and S. Ichtiaque Rasool added aerosols of floating dust to CO2 in a model for the first time, they found a possible explanation for the temperature drop. The aerosols, particularly human pollution, created a cooling effect that swamped the warming – so much so they warned it could trigger an ice age. Though Stephen and Ichtiaque soon realised that their model overestimated the cooling, aerosols obviously deserved a closer look.

To clear up such murky problems, the Global Atmospheric Research Programme (GARP) that Bert jointly set up would bring together scientists from around the world, despite the cold war. As GARP’s first experiments, looking at heat and moisture flow between the atmosphere and ocean, started in 1974, Bert organised a meeting in Stockholm on climate physics and modelling. GARP had two goals – improving 6-10 day weather forecasts first, and climate change predictions second. As it gradually became clear how hard the first was, climate forecasting became more important.

Diplomacy was needed among the gathered scientists as arguments flared over how ambitious they should be. Should they strive for satellites that could collect the high resolution data scientists and models needed, even though that was beyond their capabilities at the time? And significantly for later climate work – should they seek to produce results so society could respond to change, even when results were uncertain? Bert was clear on that one: scientists had to answer socially important questions, though he was in a very small minority prepared to say so openly.

The underprepared figurehead who led climate science from calculation to negotiation

Bert Bolin discussing weather maps in Stockholm circa 1955. Image copyright Tellus B, used via Creative Commons license, see Rodhe paper referenced below.

In 1957, at the young age of 32 and just one year after completing his PhD, Bert Bolin officially gained a new skill: leadership. Taking over the International Meteorological Institute (IMI) in Stockholm, Sweden, after his mentor Carl-Gustaf Rossby’s sudden death must have been a huge shock. But Bert gained responsibility after responsibility over the next 40 years, ultimately becoming the first chairman of the United Nations’ Intergovernmental Panel on Climate Change (UN IPCC). And though it’s hard to beat setting up a Nobel-prize winning organisation, Bert was not just an administrator – his research helped build the foundations of climate science too.

Growing up in Nyköping, south of Stockholm, Bert recorded the weather with encouragement from a schoolteacher father who had studied meteorology at university. After the pair met the Swedish Meteorological and Hydrological Institute’s deputy director when Bert was 17, he moved north to study maths, physics and meteorology at the University of Uppsala. Immediately after graduating in 1946 he went to Stockholm to do military service, where he first saw Carl-Gustaf giving a series of lectures.

By that time Carl-Gustaf had been living in the US for 21 years, pioneering mathematical and physical analysis of the atmosphere and becoming the country’s foremost meteorologist. He had set up meteorology departments at the Massachusetts Institute of Technology in the 1930s, and the University of Chicago, Illinois, in the 1940s. He had also modernised the US Weather Bureau, and by 1946 wanted to help improve meteorology’s status in his native Sweden. As Carl-Gustaf’s renowned organisational prowess gradually pulled together the IMI, Bert came to study with him, gaining his Master’s degree in 1950.

Carl-Gustaf was collaborating with leading scientists of his time, and through some of these links Bert spent a year working in the US after his Master’s. Perhaps the most notable such relationship was with John von Neumann at Princeton University in New Jersey, who had helped develop the hydrogen bomb. John and his team had made history using arguably the world’s first computer, ENIAC, to predict weather mathematically. But when errors emerged, Carl-Gustaf asked Bert to help analyse why, using his understanding of the atmosphere to prevent such forecasts being ‘mathematical fantasy’.