Canadian journalist Donna Laframboise. Former National Post & Toronto Star columnist, past vice president of the Canadian Civil Liberties Association. New posts: Mondays & Wednesdays.
Rajendra Pachauri, chairman of the Intergovernmental Panel on Climate Change (IPCC), rarely misses an opportunity to spread alarm. In September 2007, during a presentation at United Nations headquarters in New York, he declared that “20-30% of plant and animal species [are] at risk of extinction” due to global warming.
In December of that year, when the IPCC received the Nobel Peace Prize, Pachauri used his Nobel lecture to tell the world that failure to prevent climate change “could prove extremely harmful for the human race and for all species that share common space on planet earth.” He warned that, if global average temperature exceeded “about 3.5 ºC, model projections suggest significant extinctions (40%–70% of species assessed) around the globe” [italics added].
In February 2008, Pachauri cited the 20-30% figure while addressing a committee of the North Carolina legislature. In March, the bulletin of the International Atomic Energy Agency published an interview with him, in which he discussed the extinction issue in a similar manner.
That November Pachauri told an audience at a Zurich university that climate change “will reduce biodiversity and perturb functioning of most ecosystems” – once again referencing the 20-30% figure. By December he’d taken this message to Poland. Never tiring of the theme, in September 2009 he again raised the extinction fear at the UN in New York, then at the Copenhagen climate summit that December, then in an article he authored for the UK’s Guardian newspaper in April 2010.
It is by no means clear how the IPCC arrived at these scary numbers, however. In Chapter 4 of the 2007 Working Group 2 report, in a section titled “Global synthesis including impacts on biodiversity,” one paragraph ends with the following:
Based on all above findings and our compilation (Figure 4.4, Table 4.1) we estimate that on average 20% to 30% of species assessed are likely to be at increasingly high risk of extinction from climate change impacts possibly within this century as global mean temperatures exceed 2°C to 3°C relative to pre-industrial levels… The uncertainties remain large, however, since for about 2°C temperature increase the percentage may be as low as 10% or for about 3°C as high as 40% and, depending on biota, the range is between 1% and 80% (Table 4.1; Thomas et al., 2004a; Malcolm et al., 2006). As global average temperature exceeds 4°C above pre-industrial levels, model projections suggest significant extinctions (40-70% species assessed) around the globe (Table 4.1).
Please note the hedging words scattered through that passage. When people talk about estimates, possibilities, large uncertainties, likelies, maybes, and suggestions; when their guesses range from 10 to 40%, and from 1 to 80% – they aren’t saying much of anything. Nevertheless, the IPCC’s thinking on this matter does seem to be summed up by the excerpt above.
Please also note the two studies that get mentioned by name. One is by Chris Thomas (plus 18 co-authors). The second is by Jay Malcolm (plus four co-authors). According to the IPCC’s Table 4.1 (cited three times in the excerpt above), the Malcolm paper, which discusses vegetation only, estimates that different kinds of ecosystems – such as tundra, scrubland, and deciduous forest – could lose between 2 and 47% of their current area.
When the public hears about extinction, however, it thinks of animals. Table 4.1 says the Thomas paper (described as examining both flora and fauna planet wide) estimates that between 9 and 31% of species are “committed to extinction” if the average global temperature rises 1.2 to 2 degrees.
The dozens of other research papers listed in Table 4.1 are more limited in scope. One examines only eucalyptus plants. Others look at Australia’s golden bowerbird, Brazil’s cerrado plants, Hawaii’s honeycreepers (small birds), or Antarctic mollusks (snails). One study discusses waterfowl habitat in America’s Prairie Pothole region.
It would appear, therefore, that the IPCC’s 20-30% planet-wide extinction estimate rests heavily on the Thomas paper. It is the only research cited by the IPCC that claims to be global and to have considered both animals and plant life. Incidentally, I’m not the only person who has come to this conclusion. University of Virginia School of Law professor Jason Johnston raised this matter in a May 2010 research paper (info & download link for 82-page PDF here).
Cue the dramatic music, because this is where the train leaves the track. What Pachauri’s many audiences have had no way of knowing is that the Thomas paper was controversial long before the IPCC decided to accord it center stage in its analysis.
THE NOTORIOUS THOMAS PAPER
In 2004, Nature – a UK science weekly – started off the New Year with a bang. The cover of its January 8th issue featured a fabulous close-up photo of a lizard. “Feeling the heat. Biodiversity losses due to global warming” declared the headline. Pages 145-148 introduced the Thomas paper to the world, after which the story was picked up by mainstream news outlets. As one scholar later observed: “It is rare for a scientific paper to be the lead item on the evening news, or to fill the front pages of our national newspapers, but the Thomas et al. paper received exceptional worldwide media attention.”
Unfortunately for Pachauri, many experts consider this paper to be a load of rubbish.
Enter Daniel Botkin. Considered one of the pre-eminent biologists of the 20th century, he helped develop some of the first computer models used by ecologists. In addition to degrees in physics and biology, he has four decades of professional experience under his belt. He has taught at several universities including Yale and the University of California, Santa Barbara – where he was chair of the Environmental Studies program for six years.
Botkin calls the Thomas study “the worst paper I have ever read in a major scientific journal.” On his blog he explains:
First, the paper uses a theory that is inappropriate and illogical for the question. Second, the data on which the calculations are based — the areas of the world’s biomes — are crude, lacking estimates of measurement error. My textbook Environmental Science: Earth as a Living Planet has a chapter on the scientific method in which I state that “a measurement without a statement about its degree of uncertainty is meaningless.”
That this was a paper with shortcomings is confirmed by the fact that, by July 2004 (six months after it first appeared), Nature had received, edited, and published three separate critiques.
Nature Critique #1
The first pointed out that, rather than using well-established, universally recognized methods to arrive at their conclusions, the Thomas team had employed a novel (and therefore unproven) analytical approach.
It added that the likelihood of errors was amplified by the fact that the Thomas paper incorporated findings from many studies that used a range of techniques (apples, oranges, and lemons had all been treated similarly).
Echoing Botkin, the critique authors said they were not aware of “any means of quantifying” the uncertainty associated with “the simplistic link” the Thomas paper had attempted to draw between a reduction of habitat and a particular species’ risk of extinction.
Nature Critique #2
The second critique accused the Thomas authors of circular mathematical reasoning and of jumping to conclusions. It then continued:
The effects of global change on extinction risk are difficult to anticipate. Global warming will increase some habitats and their species-holding capacity, just as warming reduces other habitats. The net effect for biodiversity of these habitat expansions and contractions is not obvious, particularly as species ranges may shift poleward from the tropics, where the greatest number of species is currently.
Nature Critique #3
The third critique pointed out that because no one yet understands the role genetics plays when species attempt to adapt to changing environments, certain assumptions in the Thomas paper “may not be justified” and certain of its methods may “yield poor results” and “may be inaccurate.”
This critique said that while the Thomas paper’s overall conclusion was “compelling,” the paper “could be greatly underestimating the threat to biodiversity from climate change.” That logic, of course, cuts both ways. Once we understand genetics better, we may discover the extinction risk to be less than we thought rather than worse.
The misgivings about the Thomas paper didn’t end there. In July 2005, three months prior to when the Malcolm paper (on which the IPCC chose to rely, above) received its publication green light, a conservation biologist at Oxford University had his own paper accepted for publication in a prominent British journal. Owen Lewis devoted 6,000 words (3 times the length of this blog post) to explaining why the findings of the Thomas paper were highly questionable.
Lewis argues that since we don’t know how many species are currently constrained by climate alone (as opposed to, say, natural predators), we can’t tell if today’s distribution patterns reflect the true limit of the climate they’re able to tolerate. He points out that the “widespread ability of species to persist if transplanted or introduced outside their current range” suggests the natural world is more resilient than we might think. (Tomatoes, for example, are native to South America. Introduced to Europe during the 1500s, they thrived to such an extent they then became a cornerstone of Italian cuisine.)
Lewis is troubled by the fact that the Thomas paper studied only populations known to occupy relatively small geographic ranges. This is a problem, he says, because it is “well known that species with small geographic ranges are particularly prone to extinctions.” Moreover:
…only a small fraction of the species included by Thomas et al. are from tropical forests, but these forests account for over 50% of terrestrial biodiversity (perhaps considerably more) and may be less affected by climate change than habitats at higher latitudes.
According to Lewis, because so little is known about the tropical invertebrates (insects, worms, snails, butterflies, and so forth) that constitute “the bulk of global biodiversity” scholars “are certainly not in a position to predict their future ranges” – or to know how large their habitat would need to be in order to ensure their survival.
Then there’s the fossil record. Lewis says there’s evidence that temperatures both increased and decreased during the past 10,000 years “to a greater extent than the minimum warming scenario investigated by Thomas et al., and at a similarly rapid rate.” Those temperature changes appear to have “had relatively little impact on extinctions,” he says. It’s also possible, he suggests, that “those species most sensitive to climate change” have already been weeded out via natural selection.
The long and short of it? Lewis thinks it’s highly inappropriate for the findings in the Thomas paper to be extrapolated to the entire globe. The layers upon layers of uncertainty, he says, should “make us very wary of the accuracy” of these predictions.
By June 2006, a scholar at the Helmholtz Centre for Environmental Research in Germany had also submitted a paper to a scientific journal disputing the Thomas findings. Carsten Dormann says the Thomas researchers simply “assume that species distributions are affected overwhelmingly by climate, that species will…not adapt to climate and that the statistical methods are robust.”
While it’s one thing, he says, to employ ecological models to generate hypotheses for further testing, it’s another matter entirely to present the results of these modeling exercises as predictions that policy makers and the public then interpret as forecasts. The problems associated with the Thomas approach, he says, “are so numerous and fundamental that common ecological sense should caution us against putting much faith in…their findings.”
Dormann suggests that the Thomas researchers used data that isn’t appropriate for their purpose, that “many papers reporting on species distribution have not provided the scientific rigour” necessary to be reliable, and that extrapolating from small studies in order to make global statements is fraught with danger. “Species distribution analyses are no easy game,” he insists. They require “intimate knowledge of the species, of the statistics and a lot of thought.”
Arguing that far more attention should be paid to validating ecological models before ecologists use them to predict the future, Dormann says his own paper “is intended to call on the scientist employing species distribution models…to reflect more thoroughly on their merits and limitations.”
By March 2007 a paper by Botkin (the eminent biologist who makes his appearance at the top of this section) had also been published. Co-authored with 18 other scholars from the US, Denmark, Spain, the UK, Australia, and Switzerland, this paper accuses many researchers of making predictions about the future by employing techniques whose reliability has never been confirmed. “Of the modeling papers we have reviewed, only a few were validated,” they report.
In the opinion of these authors, the Thomas paper “may have greatly overestimated the probability of extinction.” Like Oxford biologist Lewis, above, they believe the past sheds important light on the threat climate change may pose in the future:
…the fossil record indicates that, in most regions, surprisingly few species went extinct during the [last 2.5 million years] – in North America, for example, only one tree species is known to have gone extinct…
…Until recently, it was thought that past temperature changes were no more rapid than 1 degree Celsius…per millennium, but recent information from both Greenland and Antarctica, which goes back approximately 400,000 years, indicates that there have been many intervals of very rapid temperature change, as judged by shifts in oxygen isotope ratios. Some of the most dramatic changes…are actually of greater amplitude than anything projected for the immediate future.
Declaring that serious problems “need to be overcome” before “too much weight can be placed” on the methods used in the Thomas paper, Botkin and his co-authors make it clear that this is not top-notch research.
to be continued… Part 2 here