Panicology


by Hugh Aldersey-Williams


  The Vesuvius eruption that destroyed Pompeii ranks a mere five on the volcanic explosivity index (VEI), a scale analogous to the Richter scale. Like Richter’s, it is logarithmic, both for intensity and for the likelihood of an eruption. Thus a six, such as Mount Pinatubo or Krakatoa, has ten times the power of a five, but whereas a five can be expected every decade or so, a six is a once-a-century event. We can expect a seven eruption every millennium, an eight every 10,000 years, and so on. These eruptions throw out so much ash and gas that, like Tambora in 1815, they can affect the entire planet. Even the Pinatubo eruption managed to depress the global temperature by 0.5°C for several years. The last eruption with a VEI of eight was more than 20,000 years ago. One need hardly add that ‘The volcano that could wipe out life on Earth’, in the Daily Mail’s headline, is ‘due to erupt any day now’.
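The relationship described here – each step up the VEI multiplying eruptive power by ten while dividing the expected frequency by ten – can be put as a quick back-of-envelope sketch. This is purely illustrative; the only input taken from the text is the assumption that a VEI 5 recurs roughly once a decade.

```python
# Illustrative sketch of the VEI relationship described above:
# each step up the scale means roughly 10x the power and 10x the rarity.
# Baseline assumption from the text: a VEI 5 recurs about every decade.

def vei_profile(vei, base_vei=5, base_interval_years=10):
    """Return (power relative to a VEI 5, expected recurrence interval in years)."""
    steps = vei - base_vei
    relative_power = 10 ** steps                   # 10x power per VEI step
    interval = base_interval_years * 10 ** steps   # 10x rarer per VEI step
    return relative_power, interval

for vei in range(5, 9):
    power, interval = vei_profile(vei)
    print(f"VEI {vei}: {power:>4}x the power of a VEI 5, "
          f"roughly one eruption per {interval:,} years")
```

On these assumptions a six comes out as a once-a-century event and a seven as a once-a-millennium one, matching the figures in the passage.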

  Chilling News

  ‘Will global warming trigger a new ice age?’ Guardian

  One of the most thrilling headlines of recent years was surely this one tucked away on an inside page of the London commuters’ paper, Metro: ‘British Winter Olympics in 2031?’ Was the government about to fund a huge investment in refrigeration units and snow-blowers so that our Olympians could at last realize their potential on home soil? No need. It would all happen thanks to natural forces. ‘The “plus side” of climate change is that Britain may be cold enough to host the Winter Olympics in 25 years’ time, a scientist has warned.’ And you thought the weather getting warmer was the plus.

  The clue is in the word ‘warned’. After all, we wouldn’t need a warning in order to enjoy a plus, would we? The real story concerned the decidedly less enticing possibility of a new ice age, although space did not permit an explanation of how this might come about.

  With global temperatures on the rise by many measures, ice age warnings have been out of fashion lately. But this was not always the case. In 1973, the American magazine Science Digest wrote: ‘the world’s climatologists are agreed… that we do not have the comfortable distance of tens of thousands of years to prepare for the next ice age… once the freeze starts, it will be too late.’ A couple of years later, a New York Times headline ran thus: ‘Scientists ponder why the world’s climate is changing: major cooling widely considered to be inevitable’.1 Popular science books and television documentaries amplified the alarm. The most notorious of these, Lowell Ponte’s The Cooling, claimed that ‘cooling has already killed hundreds of thousands of people in poor nations… If it continues, and no strong measures are taken to deal with it, the cooling will cause world famine, world chaos, and probably world war, and this could all come by the year 2000.’2

  What would bring on a new ice age? It depended on who you read, but there was no shortage of choices. Climatologist Stephen Schneider cited the possibility of an extreme snow deluge or disintegration of Antarctic ice sheets. Science writer John Gribbin’s Future Weather discussed possible fluctuations in the output of the sun and dust in the atmosphere. The astronomer Fred Hoyle thought meteorites might do it.

  All were writing during a cool spell in the world’s climate. The 1910s to the 1940s had been unusually warm, creating dust-bowl conditions in the American Midwest. But thereafter, it was cooler than average long enough for climatologists to ‘become used to the idea that the world was in a cooling phase’, as Gribbin admits in a later, rather different book on climate change, Hothouse Earth.3

  An ice age is perhaps the climatic change we can imagine most clearly, partly because we have evidence of what previous ice ages were like, and partly because it brings with it the visible transformation of water, the staff of life, from a usable liquid to an inaccessible solid – a dramatic and disastrous change of state that has no equivalent in global warming or other major climatic shifts. We have a folk memory of the Little Ice Age in the seventeenth and eighteenth centuries, reinforced in the paintings of Bruegel and Avercamp and in stories of the Pilgrim Fathers being helped through their first winter in the New World by the native Americans. And, at least in cultures that experience winter, we possess a darker mythology of the cold personified by ice queens and ice maidens, snowmen and Jack Frost, which has no warm-world counterpart.

  Scientific evidence obtained from a range of sources indicates that ice ages occur every 100,000 years or so, punctuated by shorter interglacial periods lasting 10,000–20,000 years. Put another way, the Earth is a world of ice where, every now and again, circumstances conspire to produce a temporary thaw.

  These circumstances relate to the Earth’s changing position and orientation relative to the sun. It is only an approximation to say that the Earth orbits at a constant distance from the sun and therefore always receives the same quantity of solar radiation to heat it. In fact, the planet’s orbit is slightly eccentric, and the axis about which it is spinning wobbles in various ways. These wobbles and eccentricities exhibit cyclical patterns called Milankovich cycles after the Serbian mathematician who spent his life investigating them. Calculating the periods of these cycles, and therefore how they overlap, is still no simple task, never mind explaining how much each of them affects the sun’s ability to heat the Earth’s surface, which is one reason why there is such uncertainty over what influences our climate.

  In 1982, Gribbin was clear, however. Our interglacial holiday is almost over, and ‘we are moving rapidly into an orbital configuration appropriate for a full Ice Age’. A string of bad winters might be all it takes: ‘Northern Hemisphere summers are already cool enough for the ice-sheets to remain if once they become established.’4

  There are shorter-term, but still severe, fluctuations in temperature that also challenge science. About 12,000 years ago, towards the end of the last ice age, there was a millennium-long cold spell known as the Younger Dryas (after the tundra flower whose fossil pollen has been used to analyse the phenomenon). And then, 8,200 years ago, came a shorter cold snap. The prevailing theory is that both events were triggered when reservoirs of glacier meltwater gushed into the ocean, although the relevant dates don’t correspond perfectly. This ice-cold water, the theory goes, interrupted the so-called ocean conveyor, the circulation of water (and heat) among all the seas, thereby preventing the Gulf Stream from warming the North Atlantic. Although scientists have known about the Dryas event for nearly a century, it is only since 1990 that it has been explained as a side effect of a broader warming trend. Before that, it was simply viewed as the stuttering last gasp of the ice age.

  Revived fears of a new ice age depend heavily on this still poorly understood mechanism thought to be responsible for the Dryas event. This time, fast-melting Arctic ice would flick the Atlantic switch. The popular term ‘ice age’ used in this context is misleading, as the change would not affect the entire northern hemisphere as a full ice age does, nor would it last as long as a full ice age. The world as a whole might get warmer, but northern Europe would cool – and quickly. Britain’s climate might not become like that of the south of France after all, but more like that of Newfoundland. In 2004, around the time of the release of The Day After Tomorrow, the apocalyptic film loosely based on this scenario, the Guardian included a new ice age on a list of ten possible global catastrophes assembled on the premise that ‘in this post-9/11 world of paranoia, no doomsday scenario is too outlandish to be taken seriously’.

  Outlandish they may be, but predictions of a drastic cooling do deserve serious consideration. The conventional method of evaluating risk is to look at the probability of the risk combined with the impact of the threatened event if it does happen. The probability of an abrupt cooling, even in northern Europe, may be low compared to the likelihood of other climate change. But its impact – on agriculture, and on where and how people live – could well be more ruinous than a few degrees of global warming.
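The risk calculus invoked here – probability multiplied by impact – can be made concrete with a toy comparison. All the numbers below are invented purely for illustration; the passage itself puts no figures on either scenario.

```python
# Toy illustration of the conventional risk calculus described above:
# expected loss = probability of the event x impact if it occurs.
# Every number here is invented for illustration only.

def expected_loss(probability, impact):
    """Simple expected-loss measure: probability times impact."""
    return probability * impact

# A low-probability, high-impact abrupt cooling can outrank a
# likelier but milder outcome on this measure.
gradual_warming = expected_loss(probability=0.9, impact=10)
abrupt_cooling = expected_loss(probability=0.05, impact=300)

print(f"gradual warming: {gradual_warming}, abrupt cooling: {abrupt_cooling}")
print(abrupt_cooling > gradual_warming)
```

The point of the sketch is only that a rare event can still dominate the calculation if its impact is large enough.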

  Unfortunately, we know little about how the oceans work. We have not had long to observe them, and there is only indirect evidence of their past behaviour, for example in ice core samples. Like the atmosphere, the oceans are a complex system in which huge forces are engaged. We know something of the ocean surface from satellite observations, but little about what goes on beneath, where the sheer mass of water, its movements and its stored heat are colossal. For example, just 3 metres of the oceans’ depth has the same heat capacity as the entire Earth’s atmosphere, yet the depth of ocean that effectively mixes with the atmosphere is ten to a hundred times greater than this. Even more than in atmospheric science, understanding relies on computer models, and some global-scale effects with vital implications for our future, such as the ‘bipolar seesaw’ under which one hemisphere is thought to warm if the other cools, are still poorly understood.

  Wallace Broecker of the Lamont-Doherty Earth Observatory in New York State, who made the original link between the ocean conveyor shutdown and climate change, has pointed out that our circumstances now, near the end of an interglacial warm period, are hardly comparable with those at the end of the ice age when the Dryas event took place. He believes that it would take a substantial temperature increase of around 5°C to flick the ocean switch, a rise not expected to be seen even under the more pessimistic climate change scenarios until around 2100. And Craig Wallace, then at the Climatic Research Unit at the University of East Anglia, felt bound to disappoint the compilers of the Guardian’s catalogue of doom, announcing that, although his simulations suggested a weakening of the Gulf Stream by 10–50 per cent by this date, ‘global warming will continue, which means the planet will still get hotter, only slightly less so’.

  Recent findings suggest that the Gulf Stream may be slowing more rapidly than expected, yet the North Atlantic in general is actually warming – leaving our future climate as hard to predict as ever. Nevertheless, none of the UK climate scenarios to be published in 2008, based on the 2007 report of the Intergovernmental Panel on Climate Change, is expected to make a case for cooling.

  What does the more distant future hold? What happened to that fateful ‘orbital configuration appropriate for a full Ice Age’? Well, nothing, actually. It’s still coming. But exactly when is anybody’s guess. The uncertainties in the Milankovich cycles leave plenty of scope for argument. Some believe that the Earth has already begun to cool; it’s just that we haven’t noticed because of all the warming! This view has prompted climate change sceptics to hail our emissions of carbon dioxide as a ‘wonderful and unexpected gift of the industrial revolution’.5 According to Bill McGuire, professor of geophysical hazards at University College London, ‘we should expect our planet to be plunged into bitter cold within the next few thousand years’6 – but this was in a popular Guide to the End of the World published in 2002. In the same year, climatologists came to the more cheerful view that we are actually in the midst of a longer-than-usual interglacial period, and still have at least 50,000 years to go before the big freeze.

  8. Our Declining Resources

  A bare cupboard is a perpetual worry. In 1972, The Limits to Growth plotted downward graphs showing that oil and gas, gold, zinc, copper, lead and mercury could all run out within twenty years. They didn’t. Now wheat, grazing land, fish and water are said to be in short supply. Will there be enough to go round? At the same time, there are less direct costs to human growth as we lose diversity, both cultural and biological, that holds for us a deeper, more spiritual value.

  Wild Talk

  ‘Earth faces extinction crisis’ Independent

  The newspapers have three basic stories about the diversity of nature. First, the good news: curious creature (or plant) discovered. Next, the bad news: pretty creature (or plant) on the verge of extinction. The third category comes across as more bad news: invading foreign species threatens our delicate flowers (or animals).

  The good, the bad and the ugly, but often none is what it seems. A spectacular example of the good news came along in 2006. ‘Discovered: Europe’s first new mammal in 100 years’, The Times announced. The Cypriot mouse was indeed an addition to the relatively short roster of European mammals, but it was strictly a rediscovery of a species previously thought to have gone extinct. Two harlequin frogs and six British butterflies were among other species coming ‘back from the brink’ in the same year.

  The bad news may be misleading too, not because it is not bad, but because it is not news. Actual extinctions may be occasion for mourning but they are hard to date precisely and are rare for well-known species. There are no obituaries for plants and animals. Instead, we are reproached with picture spreads of exotic species, their mug-shots arrayed like war-dead, sometimes with dramatic red stamps across them declaring that they are ‘endangered’ or ‘threatened’. ‘Wave goodbye to hippos and polar bears’, as the Daily Mail tearfully had it. More rarely, this story is generalized. ‘Earth facing “catastrophic” loss of species’ was a recent headline in the Guardian. ‘Scientists call for action in biodiversity crisis: Warning that world faces next mass extinction.’

  Outrunning both of these in frequency, though, in the British media at least, are stories of invasion and conquest, where one species comes into conflict with others. Favoured species tend to get written up on their way down (‘Endangered birds being wiped out by grey squirrels’), those we dislike on their way up (‘Moth that can kill humans is found breeding in Britain’). These articles express fears of human immigration by proxy, as a Daily Telegraph headline-writer could not help but reveal: ‘Forget the plumbers, now Poland is sending us its rare butterflies’. But biology does not discriminate in this way. Whether we fancy a species or regard it as a pest, it is a species nevertheless. Invasion may seem to be good for local biodiversity, but it is often bad, as the invading species can harm the survival chances of many indigenous species.

  These vignettes suggest accurately enough that the balance of nature is always changing and that there will always be ‘winners’ and ‘losers’ among individual species of plants and animals. What they do not reliably show is the systematically increasing endangerment – and loss – of species from the Earth. In 1996 the International Union for the Conservation of Nature updated its ‘Red Lists’ of threatened species around the world. They counted 20 per cent of all mammals, for example, as ‘vulnerable’, ‘endangered’ or ‘critically endangered’. When the lists were updated in 2006, seventy species of mammal had gone extinct, although those considered vulnerable or worse had held steady at 20 per cent. All other classes of species from birds to insects to plants were worse off to varying degrees, with amphibians especially badly hit, rising from 2 per cent threatened to 31 per cent.1

  In press stories, the threatened species is often exotic, its existence threatened by practices such as tropical logging over which we tell ourselves we have no control. The creature’s predicament is to be pitied, but there’s nothing we can do. It was thus a surprise to have this normally rather abstract moral dilemma fetch up on English doorsteps when eleven local authorities used it as a reason to impose a moratorium on all new housing planning applications in parts of Surrey and the Thames valley, having been advised by the government’s conservation agency, English Nature, that development would pose a threat to three heathland bird species living there. The entire front page of the Independent displayed heroic photographs of ‘The birds that blocked 20,000 new homes’.

  This brings home the big questions about biodiversity loss. How great is it? Which way is it heading? Is it our fault? What must we give up so that the Dartford warbler, the nightjar and the woodlark may prosper? And, shockingly perhaps, does biodiversity matter anyway?

  There have always been species losses, most notably during five mass extinction events. Best known is the probable asteroid strike on the Earth that led to the extinction of 85 per cent of species, including the dinosaurs, 65 million years ago. The other extinctions all occurred more than 200 million years ago, including the super-volcano or asteroid that wiped out 95 per cent of species 251 million years ago. However, these catastrophic but infrequent events account for only 4 per cent of species extinctions over time. Extinctions occur continually due to natural selection. A species’ lifetime is typically a few million years, and most species that have ever lived are now extinct.

  All these losses are more than compensated in the normal scheme of things by the rate at which new species evolve. Natural speciation is thought to give rise to three new species a year, although it is hard to be sure as scientists have not observed the process directly. This process has ensured that the present era of life on Earth is more diverse than at any time in its 600-million-year history.

  However, the current rate of species loss is very great, perhaps a hundred to a thousand times this rate of increase, as we humans compete against other species for food and water and destroy their habitat. The ‘globalization of nature’, by which invasive species spread with either deliberate or inadvertent human assistance, is accelerating this loss.2 Scientists estimate that the Earth loses on average a species a day, including roughly one bird and one mammal every year – 2007’s mammal was the Yangtze river dolphin, the probable demise of which did occasion a rare obituary on the front pages of the Independent in August of that year. While recorded extinctions seem tolerably low, the loss of populations across what an economist might term a healthy ‘basket of species’ is much higher. WWF’s Living Planet Index is such a basket, representing 1,300 land, freshwater and marine species, and it has declined by 30 per cent from 1970 to 2003.3
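The arithmetic behind these multiples can be made explicit with a quick sketch, using only the figures quoted in the passage: a speciation rate of about three new species a year, and a loss rate put at a hundred to a thousand times that.

```python
# Rough sketch using only the figures quoted in the passage:
# speciation of ~3 new species a year, and a loss rate said to be
# 100 to 1,000 times that figure.

speciation_per_year = 3
losses_low = speciation_per_year * 100    # lower bound: 300 a year
losses_high = speciation_per_year * 1000  # upper bound: 3,000 a year

# The passage's 'a species a day' (~365 a year) falls inside this range.
print(f"Implied losses: {losses_low} to {losses_high:,} species a year")
```

On these figures, losses run from roughly one species a day at the low end to nearly ten a day at the high end, which is consistent with the estimate quoted above.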

  There are several good reasons to be concerned about this. Americans such as the biologist Edward O. Wilson have found that the argument that works best with Congress and organizations like the World Bank is a utilitarian one. We depend on plants and animals not only for food, but also for oxygen generation, water capture, protection against natural disasters, clothing, labour, transport and medicines. And there is huge untapped potential among species yet to be exploited. According to the UK Natural Environment Research Council, we obtain 90 per cent of our calories from just thirty crops, for example, but there are 1,650 species that could be grown for food. The Amazon river turtle deserves to be saved, Wilson has said, not only for its own sake, but because it tastes good – and could yield 400 times the amount of meat produced by cattle raised in the same area of cleared forest. Meanwhile, the pharmaceutical multinationals are hacking their way through the jungles of Central America in search of natural drugs.

 
