Pandemic


by Sonia Shah


  * * *

  Rita Colwell’s research on Vibrio cholerae’s secret history in the sea catapulted her to the highest echelons of scientific research, including a six-year stint as director of the National Science Foundation. By the time the seventh pandemic enveloped Haiti, she was seventy-six years old. The impact of marine vibrios on humans had never been more apparent. Climate change had rendered the seas increasingly warm, and vibrio infections rose around the world, not just in Haiti. In the warming North Sea and Baltic Sea, vibrio infections soared.38 In the United States between 2006 and 2008, vibrio infections grew by 43 percent. Pathogenic vibrios had spread into places where they’d never been problematic before, like Alaska, Chile, and Iceland, infesting shellfish and threatening those who ate them.39

  I met Colwell in the fall of 2011 in her office at the far end of the sprawling University of Maryland campus in College Park, where she is a distinguished professor. (She is also a distinguished professor at Johns Hopkins University and the chair of two microbial-detection companies.) She is well aware of the paradigm shift her work has triggered. “Thirty years ago,” she said, “we were ridiculed to even say that the bacterium existed in the environment. But now it is in textbooks, the evidence is so overwhelming! It is understood!” Even after all these years, she still sounds surprised.

  But Colwell is not done shaking up the scientific establishment. It is no peculiarity of cholera that the environment shapes its dynamics, she said. As the climate changes, the environment will play an equally salient role in the dynamics of other, novel infectious diseases. Inside cholera’s story was a new explanatory framework for understanding emerging diseases, one in which the environment—biological, social, political, and economic—is both the source and the driver. This insight, Colwell said, has such far-reaching implications that it’s tantamount to a scientific revolution, on the order of the paradigm shift from Hippocratic medicine to germ theory. She calls it the Cholera Paradigm.40

  * * *

  Determining just how the changing climate will influence infectious diseases is not straightforward. Odd weather combinations shape outbreaks of infectious disease in unpredictable ways. It was a cold snap in winter that led wild mute swans to alter their migration patterns in 2006 and ferry H5N1 into more than twenty European countries.41 It was a mild winter in 1999 that allowed mosquitoes to breed in the sewers of New York City all season, followed by a summer drought, which forced thirsty birds to congregate in crowded watering holes, leading to the city’s first West Nile virus outbreak.42

  Clearly environmental conditions shaped these outbreaks, but could anyone have predicted just how? Consider an environmentally sensitive pathogen like Plasmodium falciparum, which causes mosquito-borne malaria. More rainfall can lead to more malaria—since it creates puddles and ponds that malaria-carrying mosquitoes breed in—or less, since runoff and floodwaters wash mosquito eggs away. Similarly, droughts can lead to more malaria by turning rivers into stagnant, mosquito-friendly ponds—or less malaria, since dry weather desiccates mosquitoes’ bodies.

  Still, certain correlations between weather and infectious disease are clear. The heaviest rainfalls (those in the top 20 percent) preceded 68 percent of the outbreaks of waterborne diseases that occurred between 1948 and 1994 in the United States.43 Cases of West Nile virus rise by 33 percent after heavy rainfall.44 And scientists agree that warmer temperatures will expand the range for the kinds of creatures that bring us diseases: bats, mosquitoes, and ticks among them.45 It’s already started. In Costa Rica, certain bat species have moved into higher-than-normal elevations, and in North America they have expanded their wintering ranges to the north.46 The mosquito carrier of yellow fever and dengue, Aedes aegypti, long restricted to the southeastern Gulf states, popped up in California in 2013.47 The Asian tiger mosquito, Aedes albopictus, spread northward and into higher altitudes in Italy.48 Ticks have expanded northward and into higher altitudes in northern Europe and the eastern United States.49

  Warmer weather makes life easier for these disease vectors. It can speed up their life cycles, too. Bark beetles destroy the tissues of trees by laying their eggs in tunnels under the bark. In warmer weather, the beetles can switch from a two-year life cycle to a one-year life cycle. Since the late 1990s, the beetles have been attacking increasingly younger trees and a wider range of tree species, killing nearly 30 billion conifers from Alaska to Mexico. In some states, like Wyoming and Colorado, one hundred thousand beetle-killed lodgepole pines fall every single day.50 A climatically sped-up life cycle may be one reason why. Other pathogens can similarly accelerate their cycles. Malaria parasites can cut days off their developmental cycle when the ambient temperature rises. That makes it more likely that they’ll develop into infective forms within the short life span of their mosquito carriers.

  Thus as our climate shifts toward warmer temperatures, hotter seas, and more volatile precipitation, cholera and its progeny will likely benefit. Simply by altering the distribution of pathogens, climate shifts will increase the burden of disease, as populations are exposed to new pathogens to which they lack immunity.

  But that’s what we can predict about the pathogens with which we are already acquainted. What about the pathogens we haven’t yet met? According to the microbiologist Arturo Casadevall, the rise in Earth’s ambient temperature could unleash whole new kingdoms of them.

  * * *

  We live in a world saturated with fungi. We inhale dozens of fungal spores with every breath, and stomp across ground that teems with them.

  Fungi can be potent pathogens. Unlike viruses, which require living cells to survive, fungi can persist even after all their hosts are dead, because they feed on dead and decayed organic material. They can also survive independently in the environment in the form of highly durable spores.51

  Fungi are major pathogens of plants, as any backyard gardener knows. Some, such as Phytophthora infestans, which caused the famine-producing potato blight, changed the course of human history. Others, such as the bat scourge Pseudogymnoascus destructans and the amphibian-plaguing chytrid fungus, have brought whole species to the brink of extinction.52

  And yet while pathogenic bacteria and viruses regularly plague humans, aside from the odd yeast infection or case of athlete’s foot, we suffer from very few fungal pathogens. This may be a function of our warm-bloodedness, Casadevall says. Unlike the reptiles, plants, and insects that regularly fall prey to fungal pathogens, mammals keep their blood at a scorching temperature—more than 35° above Earth’s average ambient temperature of 61° Fahrenheit—regardless of the weather around us. Most fungi, adapted to environmental temperatures, can’t take the heat of our blood and perish in our oven-like bodies.

  Heat is such an effective antidote to infection that even cold-blooded reptiles exploit it, producing “artificial fevers” by sunning themselves to raise their internal temperatures when suffering infections. Similarly, scientists have shown that warming frogs’ bodies to 98° Fahrenheit cures them of chytrid fungus infection.

  Warm-blooded mammals’ superior defense against fungal pathogens, Casadevall speculates, may explain the mystery of how mammals came to dominate over reptiles after the extinction of the dinosaurs. The cold-blooded lifestyle is far more efficient than ours: warm-bloodedness requires us mammals to consume ten times more calories a day than we would if we were cold-blooded.53 “You mammals,” Casadevall chided, staring out at his audience at a midmorning talk on the subject, “you just had breakfast, and you’re probably already thinking about lunch.” (My stomach grumbled agreeably.) A crowd of crocodiles wouldn’t have had to think about food for a week, he pointed out. And yet after the dinosaurs went extinct, their fellow reptiles did not stage a second act; diminutive, inefficient—but fungal-pathogen-free—mammals rose to dominance instead.

  Warm-bloodedness would have provided especially critical protection against pathogens during Homo sapiens’ early years on Earth. Most of our pathogens back then were adapted to the ambient temperature, because they lived at least part of their lives in the environment. (There weren’t enough of us around in those days for them to make a full-time living in our bodies.) Keeping our blood warm foiled them. Unfortunately, today most of our pathogens come from other mammals, which means that by the time they get to us they’ve already adapted to warm blood. Still, we run fevers to scorch them out anyway, Casadevall points out, an atavistic gesture to an earlier era when our interior heat saved us from pathogens.

  The problem is that our warm blood repels fungal pathogens only because its temperature diverges from the ambient temperature around us, to which fungi are accustomed. If fungal pathogens evolved to tolerate higher temperatures, that gradient would disappear. It is technically possible: in lab experiments, fungi that usually perish at temperatures above 82° Fahrenheit can be bred to tolerate temperatures up to 98°. Climate change could produce the same result, on a planetwide scale, slowly but inexorably training fungi to tolerate increasing temperatures, including, at some point, the heat of our blood.

  Heat-tolerant fungi, if they emerge, would pose an infectious disease threat like no other, Casadevall says. Except for warm-bloodedness, we have no defenses against them. “If you don’t believe me, ask the amphibians,” he says, referring to the fungal pathogens that have decimated them. “Ask the bats.”54

  As the global temperature rises, fungal pathogens have already started to creep into the infectious-disease landscape. In California and Arizona, the soil-dwelling fungi Coccidioides immitis and C. posadasii caused seven times more human infections—dubbed “Valley Fever”—in 2009 than in 1997.55 Disease-monitoring programs such as HealthMap and Pro-MED increasingly carry reports of outbreaks of fungal diseases. HealthMap reported twice as many in 2011 as in 2007, and Pro-MED reported seven times as many in 2010 as in 1995.56 These could be random peaks that will be followed by dips—or they could be harbingers of a wave of climate-change-driven fungal pathogens to come.

  * * *

  Climate change, like all of the other ways we’ve put ourselves at risk of pandemics today, is a product of modernity. We can trace each excess atom of carbon in the air today to specific activities that occurred with the rise of capitalism, from the firing up of the first coal-powered factories to the gas-guzzling cars and jets of today. That suggests that tackling the next pandemic requires grappling, in one way or another, with the novel problems created by industrialization and globalization. But that would solve only part of the problem. Tomorrow’s pandemic may be a product of modernity, but pandemics in general are not. In fact, the specter of contagion has been haunting our kind for millions of years.

  While the dynamics of infectious disease, from that of cholera in the nineteenth century to the emerging pathogens of today, are dictated by specific historical conditions, our modern confrontation with pathogens is just the latest skirmish in a much longer, more fraught, and more complex struggle between us and microbes.

  NINE

  THE LOGIC OF PANDEMICS

  There’s no straightforward record of the ancient pandemics that plagued us. They can be discerned only obliquely, by the contours of the long shadows they’ve cast. But according to evolutionary theory and a growing body of evidence from genetics and other fields, pandemics and the pathogens that cause them have shaped fundamental aspects of what it means to be human, from the way we reproduce to the way we die. They shaped the diversity of our cultures, the outcomes of our wars, and lasting ideas about beauty, not to mention our bodies themselves and their vulnerability to the pathogens of today. Their powerful and ancient influence informs the specific ways modern life provokes pandemics, the way the tides shape the currents.

  Disease is intrinsic to the fundamental relationship between microbes and their hosts. All it takes to confirm that is a brief tour through the history of microbial life and a peek inside our own bodies. Humans dominate the planet in modern times, but in the past, it was the microbes that ruled. By the time our earliest ancestors, the first multicellular organisms, emerged in the sea around 700 million years ago, microbes had been colonizing the planet for nearly 3 billion years. They had radiated into every available habitat. They lived in the sea, in the soil, and deep inside Earth’s crust. They could withstand a wide range of conditions, from temperatures as low as 14°F to as high as 230°F, feeding on everything from sunlight to methane. Their hardiness allowed them to live in the most extreme and remote places. Microbes colonized the pores inside rocks, ice cores, volcanoes, and the ocean’s depths. They thrived in even the coldest and saltiest seas.1

  For the microbes, our bodies were simply another niche to fill, and as soon as we formed, they radiated into the new habitats our bodies provided. Microbes colonized our skin and the lining of our guts. They incorporated their genes into ours. Our bodies were soon home to 100 trillion microbial cells, more than ten times the number of human cells; one-third of our genomes were spiked with genes that originated in bacteria.2

  Did our ancestors willingly play host to the intruding microbes that colonized their interiors? Possibly. But probably not. For like the outsized military of an insecure state, we developed a swollen arsenal of weaponry to surveil, police, and destroy microbes. We shed layers of skin to rid ourselves of the microbes that would colonize its surface. We constantly blinked our eyelids to wash microbes off our eyeballs. We produced a bacteria-killing brew of hydrochloric acid and mucus in our stomachs to repel microbes that might attempt to colonize their interiors. Every cell in our body developed sophisticated methods of protecting itself from microbial invasion, and the capacity to kill itself if it failed. Specialized cells—white blood cells—coursed through our bodies with no other role than to detect, attack, and destroy intruding microbes. In the time it took you to read these lines, a flood of them washed through your entire body, surveilling for signs of microbial invasion.

  The development of these immune defenses attests to the ongoing threat that microbes must have posed. To survive, our bodies had to be finely tuned to fight contagion. Our immune defenses were not some vestigial backup system, like a retired security guard relaxing in the back of a rarely visited shop. They were ever alert, on a hair trigger. Today, just looking at a picture of a person suffering a microbial attack, such as someone with skin lesions or who is sneezing, will cause our white blood cells to start pumping out elevated quantities of immune fighters such as the cytokine interleukin-6, just as if we’d been invaded by a microbe ourselves.3

  Sustaining this battle readiness against microbes wouldn’t have been easy. Anytime our immune system was activated, we needed to consume more oxygen. During periods when we had to expend energy elsewhere, such as when our bodies were incubating and nursing our young, we were forced to let down our guard. Then as now, we lacked sufficient resources to keep the expensive immune system running. Protecting the body from the appetites of microbes is, in the lingo of biologists, “costly.” We paid the price because surviving in a microbial world required it.4

  But even though the immune system helped us ward off pathogenic incursions into the body, it did not seal us off entirely. Far from it: to this day, any diminishment in our battle readiness—or change in microbes’ ability to foil our defenses—results in violent confrontation. When our immune defenses are weakened by age or disease or exhaustion, microbes invade our cells. Once they do that, they wreak havoc in a variety of ways. Some replicate with abandon, using up our nutrients or damaging our tissues in the process. Others, like cholera, secrete toxins that help them replicate or spread. Some simply provoke reactions from other sensitive bodily systems. Their methods vary, but the result is the same: they thrive while we sicken.

  We call these culprits “pathogens,” but really they are just microbes doing the same thing they do everywhere else: feeding, growing, and spreading. And they do so relentlessly. That’s their nature. Under optimal conditions, microbes double their numbers every half hour. They never age. So long as there’s enough food around, they won’t die unless something kills them. That is to say, they will predictably exploit the resources available to them to the fullest extent possible. If that results in epidemics and pandemics, so be it.

  The logic of microbial life and the nature of our immune defenses conjure up a pandemic-scarred past. But there’s more. Evolutionary biologists and geneticists interpret certain anomalies, such as unusual signatures in our DNA and strange behaviors that are otherwise difficult to explain, as clues, too. For a growing number of experts, they’re as suggestive as the trembling hands of a seemingly unscathed trauma victim would be to a criminal detective: only a violent, pandemic-plagued past could explain them.

  * * *

  These anomalies are not what most people would consider either strange or hard to explain. They are two fundamental parts of our life cycle: sexual reproduction and death. We take them as given. But for evolutionary biologists, they are puzzling developments in our evolution that demand explanation.

  Grasping this rather counterintuitive notion requires a brief digression into what’s called the “selfish gene theory” of evolution. The basic idea is that genes—or, rather, the genome, which is the entire complement of genes in a given individual—are the movers and shakers of evolution. The genome consists of long, twisted molecules of DNA (or RNA) carried in each of our cells; bits of it (the genes) provide instructions for a wide range of biological traits, from eye color to nose shape to the sound of one’s voice. According to the selfish gene theory, all of evolution can be boiled down to their machinations. Some genes, by dictating or “coding” for traits that help them spread more widely, become dominant. Others, which code for traits that are useless or harmful to their own dissemination, die out.

 
