The Coming Plague


by Laurie Garrett


  Biologists were appalled. Like archivists frantic to salvage documents for the sake of history, ecologists scrambled madly through the planet’s most obscure ecospheres to discover, name, and catalogue as much flora and fauna as possible—before it ceased to exist. All over the world humans, driven by needs that ranged from the search for wood with which to heat their stoves to the desire for exotic locales for golf courses, were encroaching into ecological niches that hadn’t previously been significant parts of the Homo sapiens habitat. No place, by 1994, was too remote, exotic, or severe for intrepid adventurers, tourists, and developers.

  At Harvard University, Dr. E. O. Wilson was one of the leaders of a worldwide effort to catalogue the world’s species and protect as much of the planet’s biodiversity as possible. He estimated that there were 1.4 million known species of terrestrial flora, fauna, and microorganisms on earth in 1992, and perhaps as many as 98.6 million yet to be identified. The vast majority of those unknown plants and creatures, he argued, were living in the world’s rain forests.3 There the plentiful supply of rain, tropical sunlight, and nutrient-rich soil bred such striking diversity that Wilson found 43 different species of ants living on a single tree in the Amazon.4 Devoted biologists were literally risking their lives in a mad rush to identify the missing 10 to 98.6 million species, some 50 percent of which were thought to reside in the rain forests of Amazonia, Central Africa, and South Asia.
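The arithmetic behind Wilson’s figures is worth making explicit; the sketch below is illustrative only, and the round 100-million total is an assumption implied by the quoted numbers rather than a figure Wilson stated:

```python
# Back-of-the-envelope check of the species estimates quoted above.
known = 1.4e6            # terrestrial species catalogued by 1992
unknown_upper = 98.6e6   # upper estimate of species yet to be identified
total_upper = known + unknown_upper   # implied upper bound: 100 million

rainforest_share = 0.50  # roughly half the unknown species were thought
                         # to reside in rain forests
print(f"Implied total species (upper bound): {total_upper:,.0f}")
print(f"Unknown rain-forest species (upper bound): "
      f"{unknown_upper * rainforest_share:,.0f}")
```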

  The pace of the loss was staggering—on the order, by UN estimates, of 4.75 million acres annually.5

  Whether they were supplying the highly profitable heroin and cocaine markets (a trade that in the Andes had devastated upward of 90 percent of the Colombian forest and only slightly smaller percentages of the forests of Ecuador, Peru, and Bolivia), feeding the wealthy world’s fast-food beef habits, or meeting its coffee demand, entrepreneurs of the developing nations were responding to all too present economic incentives when they destroyed their natural ecologies.6 Without competing economic incentives for protecting those ecospheres, it seemed unrealistic to expect local human beings to take meaningful steps to reverse, or even slow, the pell-mell pace of deforestation.7

  Using Landsat satellite imagery that was enhanced to reflect geographic features that might be hidden in flat photographs, David Skole and Compton Tucker, of the University of New Hampshire and NASA’s Goddard Space Flight Center, made computer estimates of destruction in the Amazon between 1978 and 1988. Six percent of the Amazon’s upper canopy and 15 percent of its total forest mass had, they concluded, effectively been destroyed.

  Though it was well known to biologists that tiny isolated pockets of dense vegetation surrounded by devastation couldn’t support a diverse range of species, none of the prior ecosphere calculations had factored in such islets of forest. When Skole and Tucker studied the Amazon, however, they realized that many areas looked like a checkerboard, with slashes and zigs and zags of devastation slicing the rain forest into ever-thinner islets bordered by constantly thickening swaths of desertification or development. Humanity didn’t nibble into the forest from its edges; it built huge superhighways that plunged into the pristine center and side roads that bisected one subsection after another.

  So, the two scientists concluded, about 15,000 square kilometers of Amazonia were being directly destroyed by human beings every year, but another 38,000 square kilometers were indirectly destroyed annually by the isolation and fragmentation process.8 Over the decade, that combined effect represented the loss of a forested area larger than the United Kingdom of Great Britain and Northern Ireland. It also implied that between 1978 and 1988 Amazonia effectively lost 15 percent of its productive forest.
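The arithmetic behind those figures can be laid out simply; in the sketch below the United Kingdom land area and the pre-1978 Amazon forest extent are assumed round numbers added for scale, not figures from Skole and Tucker:

```python
# Rough check of the deforestation figures quoted above.
direct_km2_per_yr = 15_000     # outright destruction each year
indirect_km2_per_yr = 38_000   # isolation and fragmentation each year
combined = direct_km2_per_yr + indirect_km2_per_yr   # 53,000 km2 per year

decade_loss = combined * 10    # the 1978-1988 study period
uk_area_km2 = 244_000          # assumed UK land area, for comparison
print(f"Combined annual loss: {combined:,} km2")
print(f"Decade loss: {decade_loss:,} km2, about "
      f"{decade_loss / uk_area_km2:.1f} times the area of the UK")

amazon_forest_km2 = 3_500_000  # assumed pre-1978 forest extent
print(f"Effective share of forest lost: {decade_loss / amazon_forest_km2:.0%}")
```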

  When ecospheres were so severely stressed, certain species of flora and fauna that were best suited to adapt to the changed conditions would quickly dominate, often at the expense of less flexible competitors. The net result would be a marked decline in diversity. This could clearly be visualized when, for example, a tropical area was cut to make way for a golf course. Though the golf course was composed of flora and fauna, its range of diversity was strictly controlled by human beings. At the course’s periphery Nature would constantly try to push its way back in, but the invaders were usually limited to the hardiest, most aggressive plants and animals. If humans ceased trying to control the golf course, those sturdy aggressor species would swiftly move in, but it would be years before the original scope of diversity was restored—if ever.

  Both deforestation and reforestation could, therefore, give rise to microbial emergence. If an ecology had been entirely devastated, and its eventual replacement species were of inadequate diversity to ensure a proper balance among the flora, fauna, and microbes, new disease phenomena might emerge.

  Such was the case in 1975–76 in the Atlantic seaside town of Lyme, Connecticut. Like many New England coastal communities that dated back to the colonial era, Lyme was a quaint town of two-hundred-year-old buildings, birch trees, and homes interspersed with pockets of picturesque pastoral scenery.

  During the mid-1970s fifty-one residents of the town came down with what looked like rheumatoid arthritis. The ailment, dubbed Lyme disease, would by 1990 have surfaced in all 50 states and parts of Western Europe. Though scattered reports of Lyme would emanate from states with ecologies as disparate as those of Alaska and Hawaii, more than 90 percent of all cases were reported out of coastal and rural areas between Long Island, New York, and Maine. New York would, by 1988, lead the world in Lyme diagnoses with 6.09 cases per 100,000 adults, and reported cases from the northeastern states would double every year between 1982 and 1990.9
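If the reported caseload really doubled every year from 1982 to 1990, the compounding is dramatic; a minimal sketch of that arithmetic (no actual case counts are assumed):

```python
# Cumulative effect of annual doubling over the 1982-1990 period.
start_year, end_year = 1982, 1990
doublings = end_year - start_year      # eight one-year doubling periods
fold_increase = 2 ** doublings
print(f"{doublings} annual doublings -> {fold_increase}x the 1982 caseload")
```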

  The typical Lyme disease patient suffered localized skin reddenings that were indicative of tick bites, followed days to months later by skin lesions, meningitis, progressive muscular and joint pain, and arthritic symptoms. Untreated, the ailment could be lifelong, leading to a range of neurological disorders, amnesia, behavioral changes, serious pain syndromes in the bones and muscles, even fatal heart disease or respiratory failure.10 Once physicians learned of Lyme, the disease was undoubtedly overdiagnosed in endemic areas of the Northeast,11 but there remained a clear upward trend in the United States in bona fide cases, and by 1992 Lyme was the most reported vector-borne disease in the country.

  Most Lyme sufferers lived in wooded areas that were inhabited by common North American wild animals: deer, squirrels, chipmunks, and the like. Notably absent in these untroubled, quiet woods were the ancient predators, such as wolves, cougars, and coyotes. Keeping deer and small mammal populations in check had, in fact, become a major headache for affluent wooded communities all over North America.

  In 1982, Dr. Allen Steere of Tufts University in Boston discovered that Lyme patients were infected with a previously little-studied spirochete bacterium, Borrelia burgdorferi.12 Subsequently he and other physicians showed that many of the dreadful symptoms of the disease were the result of the immune system’s protracted battles with the microbe.13

  Scientists soon determined that the Borrelia bacteria were transmitted to people by a tick, Ixodes dammini. While the tick was happy to feed on Homo sapiens, its preferred lunch was deer blood, specifically that of the white-tailed deer then common to the North American woods. About 80 percent of all North American cases were linked to either residing in a deer habitat or hiking through such an area.14

  Harvard’s Andy Spielman showed, however, that getting rid of the deer in a region didn’t eliminate Lyme disease. While the incidence of the disease among human beings might decline, it didn’t go away. Further, there was a seasonal periodicity to Lyme outbreaks that coincided with the life cycle of the I. dammini tick, but not necessarily with that of the deer.15

  Spielman and his lab staff figured out that the ubiquitous northeastern mouse Peromyscus leucopus was the natural reservoir for the B. burgdorferi bacterium that caused Lyme disease. The immature ticks lived on the mice and fed on the rodents’ blood. The mice, which were harmlessly infected with the bacteria, passed their B. burgdorferi on to the ticks. As spring approached, the winter thaw each year witnessed surges in the populations of both the P. leucopus mice and their tick passengers. The two species, rodent and arachnid, shared the ecology of low scrub brush that grew along the sand-duned shores and woodlands of the American Northeast. The deer grazed through these areas, picking up I. dammini ticks, which, while feeding on deer blood, passed on the bacteria.

  The deer carried the ticks with them as they made long foraging journeys through woodlands and into suburban yards. Because there were no predators around to keep the deer population in check, their sheer numbers were great enough to force the animals to scour boldly for food, often stepping right into suburban front yards and patios to nibble at carefully cultivated azaleas and lawns. That, in turn, guaranteed that three more species—Homo sapiens, felines, and canines—would come in contact with I. dammini ticks and the B. burgdorferi bacteria they carried.16

  Studies in New York showed that the territory inhabited by the I. dammini tick was expanding at a steady and rapid rate, as deer, pet dogs, humans, rodents, and even some birds carried the ticks farther and farther from the initial outbreak sites. By 1991, Lyme, the disease, and I. dammini, its vector, had spread widely throughout wooded and scrub-brush ecospheres all over the Northeast. Their invasion, and the epidemic it spawned, were new.17

  To understand how, and why, Lyme disease had suddenly emerged in North America, Spielman and his colleagues tried to recapitulate the history of the expansion of I. dammini’s territory.18

  The work took Spielman’s group back in time to the arrival of British colonists in North America. When the Pilgrims landed in Massachusetts they set to work with Puritanical fervor clearing local forests and building settlements, Spielman said. By the late eighteenth century Massachusetts was the center of North America’s iron industry, and remaining forests of the region were denuded to supply fuel for iron smelting. By the nineteenth century most of the woodlands of the entire Northeast had been so thoroughly devastated that housing construction required importation of wood from what were then the western territories.

  “The result was an ecology just as artificial as a concrete parking lot,” Spielman said, speaking of the later return of flora and fauna to the denuded areas. The grand tall trees, oaks and larches, never returned, nor did the large carnivorous animals. What did replace the old forests was an ecology similar to what probably had comprised the edges of the woods in the sixteenth century: scrub brush, small birches and other nonshade trees, meadows, deer, chipmunks, voles, squirrels, and birds.

  “It’s an artificial landscape that we have created, largely by neglect, here in the East,” Spielman said, adding that the new ecology was filled with insect and rodent vectors, “lurking out there in this system of change.”

  Into the denuded forests came aggressor flora and, unchallenged by predators, the deer, rodents, and I. dammini ticks. As their numbers soared, bringing the deer, in particular, back from the edge of extinction in the Northeast, a new disease paradigm emerged.19

  As the invasion of I. dammini ticks and deer into artificially reforested areas demonstrated, no matter how hard Homo sapiens struggled to pave the world, Nature never ceased trying to force its way back. No area could escape the steady global spread of plant, animal, and insect species. In the absence of natural predators or competitors, alien species introduced into artificial ecologies—including mega-cities—could quickly overwhelm all suitable niches. And with the immigrant species could—and had—come microbes that were new to the local environment.

  The Lyme case demonstrated the fallacy of viewing flora and fauna per se as “natural.” From the point of view of microbial opportunity, loss of original biodiversity couldn’t be corrected merely by introducing a handful of aggressor species.

  During the early 1980s ecologists Paul and Anne Ehrlich of Stanford University developed the “Rivet Hypothesis” of diversity. They thought of the ecosphere as a huge airplane held together by steel rivets, or species. As each species died out, the total mass of the “airplane” might remain the same, but rivets were lost, weakening the overall structure. Eventually, once a critical number of rivets had been lost, the plane would come apart and crash.

  The epochal “Rivet Hypothesis” was given credibility by several experiments conducted in laboratories around the world. Scientists grew plants in environmentally sealed greenhouses filled with devices that measured carbon dioxide, oxygen, and total biomass. And it turned out that the more diverse the species assortment in a greenhouse—even when total biomass, sunlight, and all other factors were equal—the greater the oxygen production and general vitality of the little ecosphere.

  In a survey of nineteen tropical forest ecospheres, researchers from the Missouri Botanical Garden found striking evidence that the changing ratio of oxygen to carbon dioxide was already having dire effects: forest turnover rates were increasing dramatically. Whole sections of forest biota “rivets” were dying and regenerating with radically escalating haste. In several major forests—particularly in Central Africa and Amazonia—turnover rates over the 1970–94 period had increased 150 percent every five years. The result, wrote Al Gentry (who died in a plane crash over Ecuador while making these surveys), was a net decrease in biodiversity as the older, massive hardwood trees, and the multitude of flora and fauna that existed in the ecospheres they created, died off and were replaced by a limited range of aggressive smaller trees and tropical vines. These fast-growing species had less dense wood, and their dominance threatened to transform forests from carbon sinks into net carbon sources, further exacerbating the CO2 imbalance and the ozone crisis. Gentry predicted an accelerated rate of species extinctions and a radical change in the density and diversity of the world’s rain forests, all occurring at astonishing speed.20

  From an atmospheric scientist’s point of view the most crucial issue was the decline in oxygen production from the earth’s flora due both to its overall declining mass and to the lowered range of diversity among surviving vegetation. Coupled with increased production of carbon dioxide owing primarily to human fossil fuel consumption and forest burning, and the expected increase in oxygen-dependent Homo sapiens, a clear chemical crisis loomed.

  The most immediate impact was chemical destruction of the earth’s ozone layer. This invisible layer of gas, composed of ozone (oxygen molecules of three atoms rather than the usual two), had unique physical properties. The molecules absorbed specific wavelengths of light, screening out radiation in the ultraviolet band. Little light in those wavelengths emanated upward from the earth’s surface, but the planet was bombarded with such radiation from the sun. If not for the ozone layer, the planet would be humanly uninhabitable, bathed in mutation-causing ultraviolet light.

  Throughout the 1980s researchers, particularly at NASA’s Goddard Space Flight Center, amassed evidence that the ozone layer was weakening, especially over the South Pole. Over Antarctica an actual seasonal hole had developed in the ozone layer, through which poured levels of ultraviolet light unprecedented in known human history.

  By 1990 a fierce debate raged in scientific circles over the size and significance of that ozone hole. But something was undoubtedly happening to the global ecology. Glaciers were retreating in some parts of the world, skin cancer rates were up in Australia and southern Chile, surface temperatures of oceans in some areas had risen, and mean surface air temperatures were up. Some researchers found, in fossils and deep glacial core samples, evidence of such periods of warming in the earth’s past, indicating that such events could all be part of a historic cycle on the planet. Further, it was possible that the bulk of the warming was induced not by human pollution and rain forest destruction, but by natural catastrophic events such as the 1991 eruption of the Mount Pinatubo volcano in the Philippines.21

  There was strong evidence, however, that halogens, particularly chlorine and bromine compounds, were making their way via human pollution into the atmosphere. These were the breakdown products of thousands of plastics, pesticides, fuels, detergents, and other modern materials. Once inside the ozone layer, the halogens acted as chemical scavengers, breaking ozone molecules apart and binding their oxygen into heavier molecules that then fell out of the protective layer into lower tiers of the earth’s atmosphere. In this way, ozone was actively depleted.

  Most Western scientists insisted that the pollution- and deforestation-driven ozone depletion and global warming hypothesis was correct, though among believers there were significant differences of opinion about its current and forecast severities. The strongest evidence supported an estimate of a net global temperature increase of half a degree centigrade during the twentieth century, with five degrees centigrade marking the difference between, on the one hand, the Ice Age and, on the other, a severely deleterious greenhouse warming effect.

  The first outcome of this warming was a higher surface water evaporation rate, which, in turn, led to greater levels of rainfall and monsoon in key areas of the planet. In places that normally had low levels of rainfall, such as the Sahara Desert, there would be even less precipitation. The net result would be greater extremes in water distribution, with severe droughts afflicting some parts of the planet, flooding and hurricanes hitting others. That, in turn, was expected to alter everything from the migrations of birds to the feeding patterns of blue whales; from habitat ranges of malarial mosquitoes to the amount of the planet’s arable land suitable for profitable agricultural growth.22

 
