Pandemic


by Sonia Shah


  What could have caused such a dramatic change so quickly? The scientist who discovered the loss of the gene, the sialic-acid expert Ajit Varki, suspects that it was a pandemic. That’s because, besides a variety of other roles in cell-to-cell interactions, sialic acids are used by pathogens to invade cells. (Pathogens bind to them, which is like turning a key in a lock, allowing them access to the interior of the cell.) A pandemic caused by a pathogen that invaded cells using the particular sialic acid that was lost could have killed off all the individuals who produced it, leaving only those who didn’t. Varki suggests that it was probably some form of malaria, noting that the malaria parasite Plasmodium reichenowi, which today causes malaria in chimpanzees, binds to the lost sialic acid, which is called N-glycolylneuraminic acid, or Neu5Gc.17

  That malaria-like pandemic had profound consequences for the survivors. Their cells, unlike those of every other primate and all other vertebrates, no longer produced Neu5Gc. That meant that any attempt at conception between a survivor and anyone who hadn’t lived through the pandemic would have failed. The survivor’s immune system would register Neu5Gc-laden sperm cells, or those of a developing fetus, as foreign and attack them; as Varki’s experiments on genetically engineered mice have shown, survivors could reproduce only with each other.

  A new species would have been born. Indeed, according to fossil evidence, the first upright-walking hominid species, Homo erectus, diverged from its predecessor, the ape-like Australopithecus, right around the time when Neu5Gc was lost. If Varki is right, our first pandemic helped make us human.18

  The striking thing about these findings about ancient pandemics is that the paradoxical observations they’re based on were made in the course of unrelated inquiries. Both the discovery of our lost sialic acid and that of the diversity in our pathogen-recognition genes were flukes. Varki discovered the lost sialic acid in 1984, when he administered horse serum to a patient with bone-marrow failure and found that the patient’s immune system reacted to the sialic acids in it. He spent decades figuring out why, stumbling upon the story of the ancient pandemic in the process. Scientists discovered the diversity in our pathogen-recognition genes in the course of attempting organ transplants. Unless the donor and recipient shared identical pathogen-recognizing HLA genes, surgeons found, the recipient’s immune system would attack the donor’s organ as if it were pathogenic. Attempts to match donors and recipients according to their HLA genes slowly revealed the vast scale of variation among us. And yet despite the happenstance nature of these discoveries, both led to conclusions that jibed with the theories of evolutionary biologists, separately attempting to resolve their own paradoxes. We’d probably know even more about our pandemic past if we actually attempted to search for it on purpose.19

  * * *

  While the tracks that ancient pandemics left are faint, at least for now, their aftershocks are not. They can be felt by all of us, from the idiosyncrasies in our immune systems to the historical trajectories of our ancestors, in ways that scientists are just now starting to understand.

  Ancient epidemics led to the development of our heightened immune responses. These now predispose us to a range of ills, including spontaneous abortions. Five percent of all women experience recurrent, spontaneous abortions due to immunological reasons: in one way or another, the mother’s immune system, spuriously sensing a foreign intruder, attacks the fetus. Our bodies respond similarly to any tissues and cells of fellow Homo sapiens. That’s why, unless the immune systems of transplant recipients are medically suppressed, they will almost certainly attack donor organs (besides those donated by an identical twin).20

  Our heightened immune responses, in particular those we developed to survive the ancient pandemic that Varki discovered, may predispose us to developing cancer, diabetes, and heart disease if we consume red meat. Red meat, being the flesh of mammals, is rich in Neu5Gc, the sialic acid we lost. Consuming it may trigger the same kind of immune reaction in us that mating with Australopithecus did among our ancestors 2 million years ago. Our bodies, registering their tissues as foreign and pathogenic, attempt to fight them off with inflammation. Those tiny inflammatory responses, over time, may increase the risk of developing cancer, heart disease, and diabetes, all of which have been linked to inflammation. In lab experiments, Varki found, mice genetically engineered to react to Neu5Gc with inflammation as we do suffer a fivefold increase in cancers when exposed to the sialic acid.21

  Genetic variants that helped us survive pathogens in the past now burden us with heightened risks of contracting other diseases and conditions. The most famous is the sickle-cell gene, which deforms red blood cells. This gene spread among people in sub-Saharan Africa who suffered malaria epidemics because it slashed the death rate from that disease. In 2010, more than 5 million infants were born with the gene. But while it helps them survive malaria, those born with a double dose of the gene suffer sickle-cell anemia, a disorder that is fatal in the absence of modern medicine.22

  Similarly, the gene that helped Africans survive sleeping sickness now puts them at risk of kidney disease, which may explain the high rates of kidney disease in African Americans today.23 Genetic changes that helped people survive malaria made them more susceptible to other pathogens such as cholera.24 The genetic mutation that allowed people to survive leprosy, which is present in 70 percent of modern Europeans, is now associated with inflammatory bowel diseases such as Crohn’s disease and ulcerative colitis. Other genetic mutations, which bestowed on Europeans greater protection against bacterial infections, simultaneously damaged their ability to digest gluten. The result is celiac disease, which afflicts up to 2 percent of European populations today.25

  Genes that studded our red blood cells with proteins resulting in what is now known as the A and B blood groups, which may have evolved to help protect people from severe infections during pregnancy, now make people more susceptible to arterial and venous thromboembolism.26 Particular variants in our pathogen-recognition genes, which protected us from ancient epidemics, correlate with a range of autoimmune disorders, from diabetes and multiple sclerosis to lupus.27 Whether or not people survive HIV or malaria, or launch an adequate immune response to measles, depends upon their particular pathogen-recognizing HLA genes, which evolved to help ward off the pathogens of the past.

  Ancient epidemics and pandemics have cast a long shadow upon us. While connections between our genetic adaptations to ancient epidemics and our vulnerability to modern pathogens have only recently been detectable, thanks to advances in genetic research, scientists expect that many more such connections exist and are yet to be found. It may be that much of our vulnerability to the pathogens of today—and tomorrow—is shaped by how our ancestors survived the pathogens of the past.28

  * * *

  Given the outsize role pathogens and pandemics have played in our evolution, it stands to reason that they’ve probably helped shape our behavior, too. According to psychologists, historians, and anthropologists, they have. The evolutionary psychologists Corey L. Fincher and Randy Thornhill theorize that culture itself—the differentiation of populations into behaviorally and geographically distinct groups—originated as a behavioral adaptation to an epidemic-filled past.

  The theory starts with the idea of “immune behaviors.” These are social and individual practices that help people elude pathogens, such as avoiding certain landscape features like wetlands or swamps, or practicing certain culinary rituals, like adding spices with antibacterial properties to foods. These behaviors are not necessarily purposely designed to protect people from pathogens; people may not have even been aware that they helped do so. But immune behaviors, once developed, stick around because the people who indulge in them are less vulnerable to infectious diseases. The behaviors, passed down through the generations, become entrenched.

  In our early evolution, when human mobility was relatively limited, immune behaviors would have been highly localized, since pathogens and their victims would have been so intimately adapted to each other. Traces of this are detectable today. In Sudan, anthropologists have found that immune behaviors that protect people from a pathogen called Leishmania differ from village to village. This variability likely corresponds with the variability of the pathogens to which they’re exposed: what works in one place against one strain of the pathogen may not work as well elsewhere against different strains. Indeed, more than one hundred genetically distinct strains of Leishmania have been found over small geographic distances.29

  The specificity of immune behaviors would have made interactions with outsiders especially risky. Because outsiders would not be privy to the specialized knowledge of local pathogens and the immune behaviors required to avoid them, they could undermine and disrupt these practices (or introduce nonlocal strains of the pathogens). Thus the value of insiders would have grown relative to that of outsiders. With it, so would practices that highlight those differences—like costumes and tattoos—and attitudes that police them, like xenophobia and ethnocentrism. The result, over time, is the development of distinct cultural groups.

  The disease historian William McNeill hypothesizes that these highly localized immune behaviors contributed to the development of the caste system in India, in which strict rules limit contact between castes, and there are elaborate rules for purifying the body if contact occurs. These may be the result, in part, of each group having specific immune behaviors tailored to its local pathogens, McNeill speculates, and the resulting need for a system that policed group boundaries.30

  Suggestively, in places where there are more pathogens, there are more ethnic groups (among traditional peoples), and vice versa.31 Of all the various factors that could potentially predict the level of ethnic diversity in a given region, pathogen diversity is one of the strongest.32 And in experiments, people who are made more aware of pathogens express greater allegiance to their ethnic group, suggesting that biases toward one’s own group, the basis of cultural difference, are indeed linked to fear of disease. In a 2006 study, anthropologists found that eliciting people’s fears of contagion (for example, by indicating that a glass of milk they were about to drink was spoiled) heightened their ethnocentric attitudes, compared to people whose fears of contagion were not thus heightened.33

  The differentiation of cultural groups by pathogens also dictated the outcome of confrontations between them. Groups of people have been able to vanquish other groups by wielding what McNeill calls an “immunological advantage.” They simply introduce pathogens to which they’ve adapted but against which their rivals have no immunity. It happened in West Africa three thousand years ago, when Bantu-speaking farmers who’d adapted to a deadly form of malaria penetrated the interior of the continent, bringing the pathogen with them. They rapidly defeated the hundreds of other linguistic groups believed to have populated the region in what historians call the “Bantu expansion.” Immunological advantages allowed the people of ancient Rome to repel invading armies from northern Europe, who perished from the Roman fevers to which locals had adapted. The protection afforded by Rome’s immunological advantage rivaled those of a standing army. “When unable to defend herself by the sword,” the poet Godfrey of Viterbo noted in 1167, “Rome could defend herself by means of the fever.”34

  Most famously, Europeans conquered the Americas starting in the fifteenth century by decimating native peoples with the Old World pathogens to which they had no immunity. Smallpox introduced by Spanish explorers killed the Incas in Peru and nearly half the Aztecs in Mexico. The disease spread throughout the New World, destroying native populations ahead of European settlement.35 Meanwhile, the people of tropical Africa repeatedly repelled the forays of European colonizers, who were felled by the malaria and yellow fever to which locals had adapted. (One unhappy result was the development of the brutal Atlantic triangle trade of the sixteenth to nineteenth centuries. Having failed to establish colonies in sub-Saharan Africa, Europeans carried captives from Africa across the ocean to the Americas to serve as slave labor on their sugar plantations.) These and other confrontations, decided by the immunological distinctions among us, continue to reverberate through modern society today.36

  * * *

  Seemingly contradictory ideas about beauty, in particular the attractiveness of potential mates, may have evolved as immune behaviors, too. While the precise architecture of romance remains decidedly mysterious, evolutionary biology suggests a few general rules. One is that people should be attracted to mates who will be good coparents and help them produce viable children. That’s just simple logic: people attracted to bad coparents tend not to have many kids, or not many who survive, diluting their numbers over time.

  The contradiction is that in the case of humans, the attractiveness of mates doesn’t seem to correlate with their likelihood of being good coparents. Cross-cultural studies have shown that women find male facial features that are controlled and made more pronounced by the hormone testosterone—broad chins, deep-set eyes, and thin lips—attractive. In general, the more testosterone a male has, the more likely he is to be attractive to women.37 And yet, at the same time, the more testosterone a male has, the less likely it is that he will be a good coparent. Compared to low-testosterone males, high-testosterone males are more likely to engage in antisocial behavior and are less likely to get married. If they do marry, they are more likely to get divorced, have extramarital affairs, and act violently toward their spouses. A high level of testosterone, in that case, should make males less attractive to females. But it’s just the opposite.38

  Broad chins and deep-set eyes, in other words, are like the peacock’s tail. Long, heavy, and conspicuously showy, the peacock’s tail is a clear hindrance to male birds’ survival. Female peahens looking for good mates should prefer male birds with less showy tails. But numerous studies have shown that, like human females who prefer high-testosterone males, peahens prefer male birds with the longest, fanciest tails.

  The reason why, evolutionary biologists say, is that a peacock’s long, fancy tail—precisely because it is a hindrance—signals to the peahen that he is a strong, able mate. It’s advertising. And one thing it advertises is the strength of the peacock’s defenses against pathogens. Peacocks with the longest, fanciest tails, scientists have found, have stronger immune systems and are less pathogen-infested than peacocks with shorter tails. And choosing them over peacocks with short, dull tails does help peahens enjoy greater reproductive success. Peahens who mate with long-tailed peacocks have bigger babies at birth who are more likely to survive in the wild, compared to peahens who mate with shorter-tailed peacocks. And so despite the fact that their dazzling, elaborate tails are a hindrance to peacocks’ survival, peahens continue to find them attractive.

  The male features in humans that indicate high-testosterone levels may perform a similar function. They, too, advertise the strength of a male’s immune system: high levels of the hormone correlate with stronger immune defenses. It may be that females find high-testosterone facial features attractive for the same reasons that peahens find long, showy tails attractive: they demonstrate the pathogen-fighting prowess of their mates.

  In a study of twenty-nine different cultures, psychologists found that those that placed more emphasis on the physical attractiveness of potential mates were indeed those with higher burdens of pathogens. Another study found that females who express greater awareness of contagion prefer males with more masculine features. There’s also experimental evidence to support the link between ideas about male beauty and contagion. Scientists experimentally manipulated subjects’ fear of contagion (for example, by showing them pictures of white fabric stained with blood) and then asked them to judge male features. They found that women whose awareness of pathogens had been heightened preferred images of males with more masculine features, compared to women who were not thus provoked.39

  Another curious facet of attractiveness and mate choice that may have originated as a strategy to survive ancient epidemics has to do with pathogen-recognizing HLA genes. Choosing a mate with pathogen-recognition genes different from your own improves the chances that your children will be able to survive a broad range of pathogens. Indeed, couples whose pathogen-recognition genes differ enjoy greater reproductive success than couples whose pathogen-recognition genes are more similar. (They suffer fewer spontaneous abortions, and their children are more closely spaced in age, suggesting that they experience fewer miscarriages.)

  Of course, the composition of other people’s pathogen-recognition genes can influence our choices about mates only if we can somehow distinguish between people with similar pathogen-recognition genes and those with exotic pathogen-recognition genes. Although most people are unaware of it, it turns out that we can. Numerous studies have shown that people, like other animals, can sense the composition of others’ pathogen-recognition genes by scent. (Precisely how pathogen-recognition genes influence body odor is unclear. It may revolve around how the proteins coded by the genes bind to cells or affect the odor-producing bacterial flora in the body.) And people have preferences based on those odors. In one study, subjects whose pathogen-recognition genes had been typed were asked to wear cotton T-shirts for two nights in a row (while refraining from using perfumed soaps or other scented products and from eating foods that produce strong odors). The T-shirts were then stuffed into unlabeled jars, which were presented to the subjects to sniff. Each preferred the scent of those T-shirts worn by people whose pathogen-recognition genes differed the most from their own.40

  That’s not to say we choose mates based solely, or even in part, on their body odor, of course. But it’s quite possible that we had to in our epidemic-plagued past. To this day we can sniff out the difference and feel a twinge of residual desire based on it.

  Microbes have exerted a similarly powerful influence on us via their perch inside our bodies. Scientists are just starting to unravel the mysteries of the microbes that live in and on us, collectively known as the microbiome. So far, they’ve found that these microbes are often invisible puppet masters, too, with critical processes such as brain development in mammals, sex in insects, and immunity in mice triggered solely by the presence of certain microbes.41 The microbes that live in human guts influence our risk of developing obesity, depression, and anxiety. They may play a role in controlling our behavior as well. Experimentally ridding mice of their microbes altered their behavior in suggestive ways, reducing both their anxiety responses and their ability to perform tasks requiring memory; exposing one mouse to the microbes of another led it to behave in ways that mimicked the other.42
