The Coming Plague


by Laurie Garrett


  For three days scientists presented evidence that validated McNeill’s words of foreboding: viruses were mutating at rapid rates; seals were dying in great plagues as the researchers convened; more than 90 percent of the rabbits of Australia died in a single year following the introduction of a new virus to the land; great influenza pandemics were sweeping through the animal world; the Andromeda strain nearly surfaced in Africa in the form of Ebola virus; megacities were arising in the developing world, creating niches from which “virtually anything might arise”; rain forests were being destroyed, forcing disease-carrying animals and insects into areas of human habitation and raising the very real possibility that lethal, mysterious microbes would, for the first time, infect humanity on a large scale and imperil the survival of the human race.

  As a member of a younger generation trained in an era of confident, curative medicine and minimal concern for infectious diseases, I experienced such discussion as the stuff of Michael Crichton novels rather than empiric scientific discourse. Yet I and thousands of young scientists also reared in the post-antibiotic, genetic engineering era had to concede that there was an impressive list of recently emergent viruses: the human immunodeficiency virus that caused AIDS, HTLV Types I and II which were linked to blood cancers, several types of recently discovered hepatitis-causing viruses, numerous hemorrhage-producing viruses discovered in Africa and Asia.

  In February 1991 the Institute of Medicine (IOM), which is part of the U.S. National Academy of Sciences, convened a special panel with the task of exploring further the questions raised by the 1989 scientific gathering and advising the federal government on two points: the severity of the microbial threat to U.S. citizens and steps that could be taken to improve American disease surveillance and monitoring capabilities. In the fall of 1992 the IOM panel released its report, Emerging Infections: Microbial Threats to Health in the United States,3 which concluded that the danger of the emergence of infectious diseases in the United States was genuine, and authorities were ill equipped to anticipate or manage new epidemics.

  “Our message is that the problem is serious, it’s getting worse, and we need to increase our efforts to overcome it,” Lederberg said on the day of the report’s release.

  After the release of the report, the U.S. Centers for Disease Control and Prevention in Atlanta began a soul-searching process that would, by the spring of 1994, result in a plan for heightened vigilance and rapid response to disease outbreaks. The slow response to the emergence of HIV in 1981 had allowed the epidemic to expand by 1993 to embrace 1.5 million Americans and cost the federal government more than $12 billion annually in research, drug development, education, and treatment efforts.

  The CDC was determined that such a mistake would not be repeated.

  But there were dissident voices in 1993 who protested both the American scientific community’s often narrow emphasis on viruses and its focus on threats posed solely to U.S. citizens. Disease fighters like Joe McCormick, Peter Piot, David Heymann, Jonathan Mann, and Daniel Tarantola argued forcefully that microbes had no respect for humanity’s national borders. Furthermore, they said, in much of the world the most dangerous emerging diseases were not viral, but bacterial or parasitic. A far larger view was needed, they argued.

  Other critics stressed that a historical perspective on mankind’s bumbling, misguided attempts to control the microbes would reveal that much of the fault lay with the very scientific community that was now calling for vigilance. What seemed to make sense as microbe control action, viewed from the academic and government offices of the world’s richest country, argued the likes of Uwe Brinkmann, Andrew Spielman, and Isao Arita, could prove disastrous when executed in the planet’s poorer nations.

  The critics charged that Americans, by virtue of their narrow focus on the appearance of disease within the United States, were missing the real picture. It was a picture captured in the sight of a little Ndbele girl wrapped in a green kanga. She lay on the hardened clay floor of a health care clinic outside Bulawayo, Zimbabwe. Her mother sat beside her, casting beseeching looks at every stranger who entered the two-room clinic. The four-year-old girl cried weakly.

  “That is measles,” said the clinic director, pointing a stern finger at the child. The director led an observer outside to show the local innovations in toilet hygiene and efforts to increase the protein content of village children’s diets.

  When he returned an hour later to the wattle-clay clinic, the mother was rocking back and forth on the balls of her feet, tears silently streaming down her face; the child’s soft crying had ceased. A few hours later the mother and her husband placed across bicycle handlebars a rolled straw mat containing their little girl’s body and, staring blankly at the horizon, forlornly walked the bike down the red clay road.

  At a time when mothers of the world’s wealthiest nations arranged to have their children “immunized” by deliberately exposing the youngsters to measles, mumps, even chicken pox, these diseases were forcing parents in some of the world’s poorest nations to find ways to cope with the expected deaths of more than half their children before the age of ten.

  The long list of vaccines and prescription drugs that American physicians urged their patients to take before traveling south of Tijuana was ample testimony to the health impact of the world’s wide gulf in wealth and development. In the 1970s Americans and Europeans who were distressed by the poverty of the Southern Hemisphere poured money into the poorest countries for projects intended to bring their populations into “the modern age.” The logic of the day was that the health status of a population would improve as the society’s overall structure and economy grew to more closely resemble those of the United States, Canada, and Western Europe.

  But by 1990 the world’s major donor institutions would be forced to conclude that modernization efforts seemed only to have worsened the plight of the average individual in the Third World, while enhancing the power, wealth, and corruption of national elites and foreign-owned institutions. Bucolic agricultural societies were transformed in the space of a single generation into countries focused around one or more vast urban centers that grew like ghastly canker sores over the landscape, devouring the traditional lifestyles and environments of the people and thrusting young job seekers into sprawling semi-urban slums that lacked even a modicum of proper human waste disposal or public health intervention.

  In the industrialized free market world of the 1970s, people at all societal strata became increasingly conscious of the link between environmental pollution and personal health. As the dangers of pesticide misuse, lead paint, asbestos fibers, air pollution, and adulterated foods became apparent, residents of the world’s wealthiest countries clamored for regulations to curb contamination of the environment and the food supply.

  With the discovery of Earth’s ozone holes, the world’s scientists initiated a debate about global responsibility for preventing further pollution destruction of the planet’s protective gaseous layer. Similarly, marine biologists argued with increasing vehemence that all the nations of the world shared responsibility for the sorry state of Earth’s oceans and the near-extinction or endangerment of its fish, coral, and mammal populations. Conservationists turned their attention to global wildlife protection. And biologists like Harvard’s E. O. Wilson and the Smithsonian’s Thomas Lovejoy warned of a global mass flora and fauna extinction event that would rival that of the great Cretaceous period dinosaur die-off.

  Citing the fossil evidence for five great extinction events in Earth’s ancient history, Wilson asked how much more environmental destruction at man’s hand the world could tolerate: “These figures should give pause to anyone who believes that what Homo sapiens destroys, Nature will redeem. Maybe so, but not within any length of time that has meaning for contemporary humanity.”4

  As humanity approached the last decade of the twentieth century, the concept of a Global Village—first elucidated in the 1960s by Canadian philosopher Marshall McLuhan as a description of the sense of worldwide interconnectedness created by mass media technology—had clearly entered mass consciousness in the context of Earth’s ecology. Environmentalists were thinking on the macro level, plotting ways to change the whaling policies of places as disparate as Japan, Alaska, Russia, and Norway. The World Bank decided to include ecological concerns in its parameters for issuing loans to developing countries. The Chernobyl nuclear accident proved, in the eyes of many scientists, that it was folly to consider toxic risk control a problem whose solutions were always constrained by issues of national sovereignty.

  And in 1992 the United States elected a Vice President who espoused an ambitious global Marshall Plan to protect the environment. Albert Gore warned that nothing short of a massive worldwide shift in human perspective, coupled with elaborate systems of international regulation and economic incentives, would be adequate to ensure the survival of the planet’s ecology. And he adopted the rhetoric of critical environmentalists, saying, “Those who have a vested interest in the status quo will probably continue to stifle any meaningful change until enough citizens who are concerned about the ecological system are willing to speak out and urge their leaders to bring the earth back into balance.”5

  At the macro level, then, a sense of global interconnectedness was developing over such issues as economic justice and development, environmental preservation, and, in a few instances, regulation. Though there were differences in perspective and semantics, the globalization of views on some issues was already emerging across ideological lines well before the fall of the Berlin Wall. Since then it has sped up, although there is now considerable anxiety expressed outside the United States about American domination of the ideas, cultural views, technologies, and economics of globalization of such areas as environmentalism, communication, and development.

  It wasn’t until the emergence of the human immunodeficiency virus, however, that the limits of, and imperatives for, globalization of health became obvious in a context larger than mass vaccination and diarrhea control programs. From the moment it was discovered in 1981 among gay men in New York and California, AIDS became a prism through which the positive lights by which societies hoped to be viewed were fractured into thousands of disparate and glaring pieces. Through the AIDS prism it was possible for the world’s public health experts to witness what they considered to be the hypocrisies, cruelties, failings, and inadequacies of humanity’s sacred institutions, including its medical establishment, science, organized religion, systems of justice, the United Nations, and individual government systems of all political stripes.

  If HIV was our model, leading scientists concluded, humanity was in very big trouble. Homo sapiens greeted the emergence of the new disease first with utter nonchalance, then with disdain for those infected by the virus, followed by an almost pathologic sense of mass denial that drew upon mechanisms for rationalizing the epidemic that ranged from claiming that the virus was completely harmless to insisting that certain individuals or races of people were uniquely blessed with the ability to survive HIV infection. History, they claimed, would judge the 1980s performance of the world’s political and religious leaders: would they be seen as equivalent to the seventeenth-century clerics and aristocracy of London who fled the city, leaving the poor to suffer the bubonic plague; or would history be more compassionate, merely finding them incapable of seeing the storm until it leveled their homes?

  Over the last five years, scientists—particularly in the United States and France—have voiced concern that HIV, far from representing a public health aberration, may be a sign of things to come. They warn that humanity has learned little about preparedness and response to new microbes, despite the blatant tragedy of AIDS. And they call for recognition of the ways in which changes at the micro level of the environment of any nation can affect life at the global, macro level.6

  Humanity’s ancient enemies are, after all, microbes. They didn’t go away just because science invented drugs, antibiotics, and vaccines (with the notable exception of smallpox). They didn’t disappear from the planet when Americans and Europeans cleaned up their towns and cities in the postindustrial era. And they certainly won’t become extinct simply because human beings choose to ignore their existence.

  In this book I explore the recent history of disease emergence, examining in roughly chronological order examples that highlight reasons for microbial epidemics and the ways humans respond, as cultures, scientists, physicians, bureaucrats, politicians, and religious leaders.

  The book also examines the biology of evolution at the microbial level, looking closely at ways in which disease agents and their vectors are adapting to counter the defensive weapons used to protect human beings. In addition, The Coming Plague looks at means by which humans are actually aiding and abetting the microbes through ill-planned development schemes, misguided medicine, errant public health, and shortsighted political action/inaction.

  Finally, some solutions are offered. Fear, without potential mitigating solutions, can be very volatile. It has, throughout history, prompted the lifelong imprisonment of the victims of a disease. Perhaps less onerously, it can lead to inappropriate expenditures of money and human resources aimed at staving off a real or imagined enemy.

  What is required, overall, is a new paradigm in the way people think about disease. Rather than a view that sees humanity’s relationship to the microbes as a historically linear one, tending over the centuries toward ever-decreasing risk to humans, a far more challenging perspective must be sought, allowing for a dynamic, nonlinear state of affairs between Homo sapiens and the microbial world, both inside and outside their bodies. As Harvard University’s Dick Levins puts it, “we must embrace complexity, seek ways to describe and comprehend an ever-changing ecology we cannot see, but, nonetheless, by which we are constantly affected.”

  Now in his eighties and retired from the daily practice of medicine, my Uncle Bernard wonders how many American physicians today would recognize a case of malaria, diphtheria, rheumatic fever, tuberculosis, or typhus without needing the guiding advice provided by time-consuming laboratory diagnostic analysis. He doubts whether most physicians in the industrialized world could diagnose an old scourge, like yellow fever or dengue fever, much less spot an entirely new microbe. As he and the rest of the pre-antibiotic era physicians of the developed world retire and age, Bernard asks if doctors of the year 2000 will be better or worse equipped to treat bacterial pneumonia than were physicians in his pre-antibiotic days.

  Preparedness demands understanding. To comprehend the interactions between Homo sapiens and the vast and diverse microbial world, perspectives must be forged that meld such disparate fields as medicine, environmentalism, public health, basic ecology, primate biology, human behavior, economic development, cultural anthropology, human rights law, entomology, parasitology, virology, bacteriology, evolutionary biology, and epidemiology.

  The Coming Plague tells the stories of men and women who struggled to understand and control the microbial threats of the post-World War II era. As these disease vanquishers retire, the college laboratories and medical schools grow full of youthful scientific energy, but it is not focused on the seemingly old-fashioned, passé tasks that were invaluable in humanity’s historic ecological struggles with the microbes. As we approach the millennium, few young scientists or doctors anywhere in the world can quickly recognize a tiger mosquito, Peromyscus maniculatus mouse, pertussis cough, or diphtherial throat infection.

  The skills needed to describe and recognize perturbations in the Homo sapiens microecology are disappearing with the passing of the generations, leaving humanity, lulled into a complacency born of proud discoveries and medical triumphs, unprepared for the coming plague.

  1

  Machupo

  BOLIVIAN HEMORRHAGIC FEVER

  Any attempt to shape the world and modify human personality in order to create a self-chosen pattern of life involves many unknown consequences. Human destiny is bound to remain a gamble, because at some unpredictable time and in some unforeseeable manner nature will strike back.

  —Mirage of Health, René Dubos, 1959

  Karl Johnson fervently hoped that if this disease didn’t kill him soon somebody would shoot him and put him out of his misery. The word “agony” wasn’t strong enough. He was in hell.

  Every nerve ending of his skin was on full alert. He couldn’t bear even the pressure of a sheet. When the nurses and doctors at Panama’s Gorgas Hospital touched him or tried to draw blood samples, Johnson inwardly screamed or cried out.

  He was sweating with fever, and he felt the near-paralytic exhaustion and severe pain he imagined afflicted athletes who pushed their training much too far.

  When nurses on the Q ward first looked at Johnson lying beside his two colleagues they recoiled from the sight of his crimson blood-filled eyes. All over Johnson’s body the tiny capillaries that acted as tributaries flowing to and from the veins’ rivers of blood were leaking. Microscopic holes had appeared, out of which flowed water and blood proteins. His throat hurt so much he could barely speak or drink water, thanks to a raw and bleeding esophageal lining. Word around the hospital was that the three were victims of a strange and contagious new plague that felled them in Bolivia.

  In brief moments of lucidity Johnson would ask how many days had passed. When a nurse told him it was Day Five, he groaned.

  “If my immune system doesn’t kick in fast, I’m a dead man,” he thought.
