How lessons from space put the greenhouse effect on the front page

Normally during a total lunar eclipse, like this one on April 15, 2014, you can still see the moon, but in 1963 Jim Hansen saw it disappear completely. Explaining why would send him on a scientific journey to Venus, before coming back down to Earth. Image credit: NASA

Jim Hansen’s life changed on the evening the moon disappeared completely. In a building in a cornfield Jim and fellow University of Iowa students Andy Lacis and John Zink, and their professor Satoshi Matsushima, peered in surprise through a small telescope into the wintry sky. It was December 1963, and they had seen the moon replaced by a black, starless circle during a lunar eclipse. The moon always passes into Earth’s shadow during such eclipses, but usually you can still see it.

At first they were confused, but then they remembered that in March there had been a big volcanic eruption. Mount Agung in Indonesia had thrown tonnes of dust and chemicals into the air: perhaps that was blocking out the little light they’d normally have seen? With a spectrometer attached to their telescope they measured the moon’s brightness – data on which Jim would base his first piece of scientific research. Using this record to work out the amount of ‘sulphate aerosol’ particles needed to make the moon disappear, Jim began a lifelong interest in planets’ atmospheres. That would lead him to become director of the NASA Goddard Institute for Space Studies (GISS), where he has led the way in exposing the threat from human CO2 emissions.

Jim was born in Iowa in 1941, the fifth of seven children of a farmer, who had left school at 14, and his wife. As he grew up the family moved into the town of Denison, his father becoming a bartender and his mother a waitress, and Jim spending his time playing pool and basketball. Jim claims he wasn’t academic, but he found maths and science the easiest subjects and always got the best grades in them at his school. His parents divorced when he was young, but public college wasn’t expensive at the time, and Jim could save enough money to go to the University of Iowa.

The university had an especially strong astronomy department, headed by James Van Allen, after whom the radiation belts surrounding the Earth are named. These ‘Van Allen Belts’ are layers of charged particles that he discovered, held in place by the planet’s magnetic field. Satoshi Matsushima, a member of Van Allen’s department, could see Jim and Andy’s potential and convinced them to take the exams to qualify for PhD degrees a year early. Both passed, with Jim getting one of the highest scores, and were offered NASA funding that covered all their costs.

A few months later, it was Satoshi who suggested measuring the eclipse’s brightness, feeding Jim’s interest in atmospheres on other planets. “Observing the lunar eclipse in 1963 forced me to think about aerosols in our atmosphere,” Jim told me. “That led to thinking about Venus aerosols.” In an undergraduate seminar course Jim had given a talk about the atmospheres of outer planets, which James Van Allen had attended. The elder scientist told him that recently measured data was suggesting Venus’ surface was very hot. Aerosols stopped light reaching the Earth during the eclipse – could they be warming up Venus by stopping heat escaping, Jim wondered? That would become the subject of his PhD, and Satoshi and James Van Allen would be his advisors.

What Venus implies

NASA Goddard Institute for Space Studies’ Manhattan offices, above Tom’s Restaurant. Image credit: NASA/Robert B Schmunk

Working out how aerosols affected Venus’ climate would take detailed calculations, on early computers, of how heat energy passed through its atmosphere. One option, breaking the atmosphere down into thin layers and then adding them together, had been developed by Sueo Ueno at the University of Kyoto. Satoshi got a grant to study in Japan, and took Andy and Jim with him. While in Japan, Jim applied for a research post at NASA GISS, got it, and suddenly found himself with an imminent deadline for his thesis. Pressed for time, Jim turned to supercomputer-based models developed by the US National Oceanic and Atmospheric Administration (NOAA)’s Suki Manabe, which used single columns of gases and aerosols to represent a whole planet’s atmosphere. But even that was too complex and time consuming. To finish his PhD in time to start his new job, Jim was left to rely mainly on his own calculations and intense focus.
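To give a feel for the layer-adding idea, here is a minimal, textbook-style sketch (a standard two-stream simplification rather than Ueno’s or Hansen’s exact formulation). If an upper slice of atmosphere reflects a fraction $R_1$ of the light hitting it and transmits $T_1$, while the slice below reflects $R_2$ and transmits $T_2$, then summing the repeated bounces between the two (a geometric series) gives the combined pair:

$$R_{12} = R_1 + \frac{T_1^{2} R_2}{1 - R_1 R_2}, \qquad T_{12} = \frac{T_1 T_2}{1 - R_1 R_2}$$

Pairing slices up repeatedly in this way builds the reflectance and transmittance of a whole atmosphere from many thin layers.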

When Jim started at GISS in 1967 – located right above Tom’s Restaurant in Manhattan, made famous by Suzanne Vega and Seinfeld – he had access to a much bigger computer. That meant he could continue improving our understanding of Venus’ atmosphere, a topic he stuck with until the early 1980s. He had been hired to look at how light and other radiation warmed planets, including how scattering by aerosols interfered with this process. In 1969 that quest sent him on a year-long stint to Leiden, the Netherlands, where he met his wife Anniek. However, GISS director Robert Jastrow invited Jim back to lead the design of an instrument to go on a NASA Pioneer probe destined for Venus. After just six months in the Netherlands, and briefly honeymooning and watching an Apollo launch in Florida, Jim returned to New York.

Steve Schneider (left), Jim Hansen (centre), and S. Ichtiaque Rasool (right) at NASA Goddard Institute for Space Studies in New York, circa 1971. Image copyright Stephen H. Schneider, used with permission.

In 1971 Jim’s GISS coworkers Steve Schneider and S. Ichtiaque Rasool used the new methods to make a controversial and long-remembered prediction of global cooling. They’d wrongly assumed that man-made aerosol pollution would spread across the world, reflecting the Sun’s warmth back into space, while their oversimplified model underestimated CO2’s greenhouse effect. But Jim’s fellow eclipse observer Andy Lacis was now working at GISS, and as part of a larger team the pair devised a new model that would soon help underline the error. Built from scratch and perhaps reflecting Jim’s PhD struggles, it was at least ten times as fast as other methods.

Published in 1974, the new model found the amount of warming at Earth’s surface caused by a doubling of CO2, known as climate sensitivity, would be around 4°C. That was double the figure that came out of Suki Manabe’s NOAA model and so, in 1979, a US government-commissioned report split the difference between them. Predicting that CO2 levels in the air would double over the next century, it forecast temperatures would rise by 3°C, plus or minus 50%, giving a range of 1.5-4.5°C.
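As a rough, back-of-the-envelope illustration of what that sensitivity figure means (a standard logarithmic approximation, not the 1974 model itself): because the warming effect of extra CO2 grows roughly with the logarithm of its concentration, the expected surface warming can be written as

$$\Delta T \approx S \, \log_2\!\left(\frac{C}{C_0}\right)$$

where $S$ is the climate sensitivity, $C_0$ the starting CO2 concentration and $C$ the new one. A doubling ($C = 2C_0$) gives $\Delta T = S$ by definition, while taking the report’s central $S = 3$°C, a 50% rise in CO2 would give roughly $3 \times \log_2(1.5) \approx 1.8$°C.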

We can send a probe to Venus, but can we understand Earth’s climate?

The increasing power of climate models to explore and explain processes on Earth inspired Jim to return to the eclipse data he’d collected in Iowa. “I tried to use the Agung volcanic aerosols as a test of our understanding of global climate forcing and planetary response,” he told me. He found good agreement between modelled temperatures and observations, but was cautious about that result as at the time there was no temperature dataset that spanned the entire world.

With Jim’s interest in Earth’s climate growing, he came to the tough decision that he would have to resign from the Pioneer Venus mission. He felt obliged to ask the University of Arizona’s Donald Hunten, the mission’s ‘father’, for permission. The gruff response he got back was cryptic, but it became a career-defining moment. “Be true to yourself,” Donald had said.

The first reliable global measurements of temperature from NASA, published by Jim Hansen and his colleagues in 1981, showed a modest warming from 1880 to 1980, with a dip in temperatures from 1940 to 1970. Graph adapted from Hansen et al. 1981, copyright Science/AAAS, used with permission, see citation below.

Now free to look closely at Earth’s temperature data, Jim found that gaps in available records were not as much of a problem as people had thought. Measurements hundreds of miles apart were closely linked, and on that basis he and his colleagues built a global record showing 0.2°C warming from the mid-1960s to 1980 and around 0.4°C over the previous century. This was just what the GISS climate models predicted should have happened thanks to human CO2 emissions. On that basis Jim, Andy and their coworkers projected 3-4.5°C warming by 2100 if energy use grew rapidly. “The degree of warming will depend strongly on the energy growth rate and choice of fuels for the next century,” they wrote in the 1981 Science paper describing their findings. “Thus, CO2 effects on climate may make full exploitation of coal resources undesirable.”

As this work was unfolding, GISS director Robert Jastrow invited Jim for lunch with Walter Sullivan, the New York Times science reporter, at the Moon Palace, a Chinese restaurant opposite their offices. “He took me along to talk about a proposed mission to the outer planets,” Jim told me. “I thought that I should send him the thing that I was really working on.” Walter and his editors found our impact on the planet so striking that the story put the greenhouse effect on the front page for the first time. A follow-up editorial declared that radical energy policy change might become necessary.

Soon afterwards Jim would replace Robert as GISS director, but that didn’t mean he avoided a bad reaction. Ichtiaque Rasool, by then working in Paris, suggested Hansen had emphasised ‘the worst case in order to get the attention of the decision makers who control the funding’. “Somehow Ichtiaque got a burr under his saddle,” Jim recalled. “Several people asked me what was going on with him, and one suggested that he was jealous of the publicity the paper got – and he didn’t like the conclusion that greenhouse gases overwhelmed the effect of aerosols.” The accusation of using exaggeration to greedily grasp for cash is still directed at climate scientists today by people suspicious of their motives.

The headlines in fact had the opposite effect, driving the US Department of Energy to withdraw promised funding for GISS and forcing Jim to lay off five researchers. But despite this setback, and a shyness that often makes it hard for him to speak in public, Donald Hunten’s advice would stay with him. As his understanding of the threat from climate change grew, Jim would speak out with increasing force, becoming the loud and often controversial voice for action he remains today.

  • This is the first part of a profile of Jim Hansen. Read part two here.

In 1981, Jim Hansen, Andy Lacis and their colleagues predicted the impact of CO2 emissions on global temperatures between 1950 and 2100 based on different scenarios for energy growth rates and energy source. If energy use stayed constant at 1980 levels (scenario 3, bottom lines), temperatures were predicted to rise just over 1°C. If energy use grew moderately (scenario 2, middle lines), warming would be 1–2.5 °C. Fast growth (scenario 1, top lines) would cause 3–4°C of warming. In each scenario, the warming was predicted to be less if some of the energy was supplied by non-fossil (renewable) fuels instead of coal-based, synthetic fuels (synfuels). Image copyright Science/AAAS, adapted from Hansen et al 1981, used with permission, see reference below.

Further reading:

I’ve already written about the following pivotal climate scientists who came before Jim Hansen, or were around at the same time: Svante Arrhenius, Milutin Milanković, Guy Callendar part I, Guy Callendar part II, Hans Suess, Willi Dansgaard, Dave Keeling part I, Dave Keeling part II, Wally Broecker part I, Wally Broecker part II, Bert Bolin part I, Bert Bolin part II, Suki Manabe part I, Suki Manabe part II, Steve Schneider part I, Steve Schneider part II, Steve Schneider part III

I got the main outline of Jim’s life and inspiration from his book, ‘Storms of My Grandchildren’.

Most details of Jim’s early career come from Spencer Weart’s interviews with him, conducted in the year 2000 and now hosted by the American Institute of Physics’ Oral Histories Project. Others come from profiles by Elizabeth Kolbert in the New Yorker in 2009 and by David Herring on the NASA Earth Observatory website from 2007.

Matsushima, S., Zink, J., & Hansen, J. (1966). Atmospheric extinction by dust particles as determined from three-color photometry of the lunar eclipse of 19 December 1963. The Astronomical Journal, 71. DOI: 10.1086/109863
Hansen, J., & Matsushima, S. (1967). The Atmosphere and Surface Temperature of Venus: a Dust Insulation Model. The Astrophysical Journal, 150. DOI: 10.1086/149410
Somerville, R.C.J., P.H. Stone, M. Halem, J.E. Hansen, J.S. Hogan, L.M. Druyan, G. Russell, A.A. Lacis, W.J. Quirk, and J. Tenenbaum (1974). The GISS model of the global atmosphere. J. Atmos. Sci. DOI: 10.1007/BFb0019776
Hansen, J., Wang, W., & Lacis, A. (1978). Mount Agung Eruption Provides Test of a Global Climatic Perturbation. Science, 199 (4333), 1065-1068. DOI: 10.1126/science.199.4333.1065
Hansen, J., Johnson, D., Lacis, A., Lebedeff, S., Lee, P., Rind, D., & Russell, G. (1981). Climate Impact of Increasing Atmospheric Carbon Dioxide. Science, 213 (4511), 957-966. DOI: 10.1126/science.213.4511.957
Rasool, S. (1983). On predicting calamities. Climatic Change, 5 (2), 201-202. DOI: 10.1007/BF00141271

Spencer Weart’s book, ‘The Discovery of Global Warming’ has been the starting point for this series of blog posts on scientists who played leading roles in climate science.


28 Responses to “How lessons from space put the greenhouse effect on the front page”

  1. Richard Mallett Says:

    Of course, one of the arguments against anthropogenic global warming is that other planets have warmed as well since we have been monitoring them.

  2. Jim in IA Says:

    Hansen is a great person. Born in the same state where I live.

    • andyextance Says:

      And here was me thinking IA was Indiana – I liked how ‘Jim in Indiana’ scanned. Are you sure you’re not actually Jim Hansen? ;-)

      • Richard Mallett Says:

        IN is Indiana where they hold the Indianapolis 500:-)

      • Jim in IA Says:

        I’m sure. But, thanks for the suggestion. I like it.
        Have a good day.

        By the way…many people confuse Iowa with Idaho, Indiana, Ohio, etc. It’s better than all those. :-)

  3. Richard Mallett Says:

    James Hansen told the US Congress in 1988 that the world would warm by 0.5 C per decade till 2050.

    The IPCC projections have been :-
    1990: 0.20-0.50 C per decade
    1995: 0.10-0.35 C per decade
    2001: 0.13-0.43 C per decade
    2007: 0.11-0.64 C per decade
    2013: 0.13-0.33 (second draft) revised (final draft) to
    2013: 0.10-0.23 C per decade.

    The observed warming (HADCRUT4) 1950-2013 has been 0.108 C per decade, and (HADCRUT4) 1996-2013 has been 0.085 C per decade.

    • andyextance Says:

      Interesting points. Consistent warming then! For me the 1988 estimate isn’t too bad given the computers they were working with then – I think that was the year I got my Commodore 64. The estimates do look too high right now. For me that’s not a bad thing. If warming does turn out to be slower than the estimates then that gives us more time to make up for our sluggish response to the situation.

      • Richard Mallett Says:

        I doubt that NASA were using Commodore 64s to launch their space shuttles to the space station in 1988 :-) So we have about 5 times longer (based on observations) before we reach whatever the magic number is than Hansen thought.

      • andyextance Says:

        Even if five times as long is right, given that we look set to miss the 2C target we still need to act. http://simpleclimate.wordpress.com/2010/04/24/saturday-round-up-copenhagen-pledges-miss-2%C2%B0c-mark/
        Also, as I’ll mention in the later instalments, Jim’s revised his target down from 2C to 1C. We’re already at 0.76C, so even at around 0.085C/decade we’ve got less than 30 years. CO2 emissions are still growing
        http://www.uea.ac.uk/mac/comm/media/press/2013/November/global-carbon-budget-2013
        I know you think I’m alarmist. I guess in one sense I am keen to ‘raise the alarm’, tell people of a threat that needs action. In another, I don’t feel like there’s cause for alarm, because I believe humanity can get through this. But doing that takes looking at the facts and acting accordingly. With the right amount of effort we can bring warming under control, and adapt to the level of warming that we will experience. For me, that’s where the debate should be, how we do that. I deeply respect the extent that you engage with and try to understand the science – you’ve taught me a thing or two. But sometimes the evidence you and others bring up is quite dubious, and that’s when I find myself thinking most, ‘It’s time to move the debate on’.

        Have a nice rest of the weekend.

  4. Marco Says:

    Andy, Jim was born in 1941, not 1949 (3rd paragraph)

  5. Richard Mallett Says:

    In reply to Marco (who doesn’t have a ‘reply’ link next to his comment) the Rense article was pro-IPCC.

  6. Richard Mallett Says:

    Again this is getting complicated, because I can’t ‘reply’ to Andy’s last comment, but he talked about missing a 2 C target.

    As I have mentioned previously, from 1695-1733 the Central England annual average temperature increased from 7.25 C to 10.47 C (10 year running mean, as computed by the Met. Office Hadley Centre, 7.97 to 9.90 C) – so what’s so special about a 2 C rise in the future ?

    (Incidentally, since 2006, the annual average has gone down from 10.82 to 9.56 C and from 2004, the 10 year running mean has gone down from 10.45 to 9.70 C.)

    Why is 2 C a target, and what’s expected to happen if we miss it?
    Why 2 C in particular, and not 1.5 or 2.5 C ?
    Why is it detrimental ?
    Why might it not even be beneficial to some, particularly those in extreme latitudes, for example ?

    • Marco Says:

      Richard, a few things for you to think about:
      1. The Central England Temperature record is not nearly reliable enough to make such large claims about differences between years. For example, in-between 1703 and 1722 there is no local record. The temperatures in that time period come from Utrecht (The Netherlands) adjusted based on certain meteorological conditions. There are plenty of other issues, and you’d do well to read the original papers on the topic. There’s a good reason the Met Office doesn’t start the series before 1772.
      2. Central England is not the world, and we’re not talking about relatively short periods with a 2 degree higher temperature, but a sustained, decades-long higher temperature. If you know that 6 degrees is the difference between current conditions and an ice age, you just may reconsider what that 2 degrees means for the world. To give one example: 2 degrees is ultimately +6/7 meters higher sea levels. It may take a few centuries to get there…if we’re lucky.
      3. The 2 degrees is somewhat of an arbitrary target, mainly based on what can be achieved and when impacts are likely to become really, really bad.
      4. Why is it detrimental? Let’s start with sea level rise: +1 meter is already enough to create significant problems in several large cities around the world. The Netherlands will need to spend a few % of its GDP to adapt to that. Miami is already seeing trouble with the rising seas. New Orleans, especially with American approaches to adaptation, is doomed. Shanghai cannot handle a few meters, nor can the region east of Hanoi, south of Kolkata, and significant parts of Bangladesh. Try to look up how many people we’re talking about. Then there are regions that already have problems with sustained droughts. They’ll become essentially unlivable, unless you live like the current desert dwellers in Africa. The Midwest of the US may get another dustbowl…times two or three. That’s a whole lotta agricultural land that is in trouble. Australia has already seen some recent heatwaves that caused the people living there to reconsider their livelihood. Too hot for anything, move to the coast. And so we can continue.
      5. Could there be benefits? Depends on how you rank something as a benefit. I’m pretty sure the Inuit aren’t too happy to see their culture being slowly destroyed because the landscape is changed. You may not see that as bad: “not so cold! Less snow!” but for a people who have for centuries made this their home it’s not funny. There will be regions where the impact on culture may be more limited, but imagine the French having to adopt the Spanish siesta culture, because it is too warm in summer to do anything during the day. Skiing in the Alps? Likely very unreliable. There will be years where it’s excellent, and other years where there is no snow. If projections are even remotely accurate, with 2 degrees half of the Olympic Winter Games cities of the past 50 years would not be able to reliably host such games. The chance of having proper winter conditions would just not be there. Plenty more examples if needed!

      • Richard Mallett Says:

        Thank you for your comprehensive reply. As you say, it may take a few centuries to get there; and there will undoubtedly be warming and cooling periods mixed in with that, as there were (for example) towards the end of the 19th. century in the CET. Even if we start in 1772 there are about 7 dramatic warming and cooling periods before the relatively calm 1900-1990 period. It’s important to understand the causes of those past non-anthropogenic warming and cooling periods.

        As you say, the CET is not the world, but it correlates well with what we have from Europe (hardly any records from anywhere else) in the early 19th. century. We have to work with what we have.

      • Marco Says:

        Richard, we indeed have to work with what we have, and from that we know regional variability is large. So large that the CET does not correlate well with what we have from Europe. Take for example Stockholm:
        http://people.su.se/~amobe/stockholm/stockholm-historical-weather-observations-ver-1.0/graphics/annual-temperatures/stockholm_annual_1756_2012_eng.png
        You can get the raw data and compare it to CET, and find plenty of years where there are major differences between the two. For example, the coldest year in the Stockholm record is 1867. In the CET it is a very average year. In the CET 1879 is the coldest year, while in Stockholm it’s relatively cold, but there are at least 20 years colder than that. In the period 1756 to 2012 the average annual difference between CET and Stockholm is 3.5 degrees, but you can find plenty of variation around that, ranging from a minimum of 1.5 to a maximum of 6 degrees difference. That’s how “well” this correlates in the West/Northwest of Europe. I haven’t checked the Central Europe reconstruction, as that is already a combination of many records, but wouldn’t be surprised to see similar major variability between that reconstruction and CET and Stockholm.

      • Richard Mallett Says:

        Yes, in CRUTEM4 the best correlations in the longest records (starting before 1800) with CET are :-

        1. Brussels
        2. De Bilt
        3. Karlsruhe
        4. Basel
        5. Kobenhavn
        6. New Haven
        7. Bern
        8. Stuttgart
        9. Geneve
        10. Hohenpeisenberg

        Stockholm is 21st. out of 40. You are wise to look at individual station records, not combinations.

  7. Another Week of Climate Disruption News – May 11, 2014 – A Few Things Ill Considered Says:

    […] 2014/05/10: SimpleC: How lessons from space put the greenhouse effect on the front page […]

