Guns, Germs, and Steel

by Jared Diamond


  Hence let’s begin by temporarily setting aside our human bias and considering disease from the microbes’ point of view. After all, microbes are as much a product of natural selection as we are. What evolutionary benefit does a microbe derive from making us sick in bizarre ways, like giving us genital sores or diarrhea? And why should microbes evolve so as to kill us? That seems especially puzzling and self-defeating, since a microbe that kills its host kills itself.

  Basically, microbes evolve like other species. Evolution selects for those individuals most effective at producing babies and at helping them spread to suitable places to live. For a microbe, spread may be defined mathematically as the number of new victims infected per each original patient. That number depends on how long each victim remains capable of infecting new victims, and how efficiently the microbe is transferred from one victim to the next.
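
  As a rough, purely illustrative sketch of that definition, the snippet below (all figures hypothetical) treats the spread number as the product of how long a victim stays infectious and how efficiently the germ is passed along during that time.

```python
# A minimal sketch of the "spread number" described above: the expected
# number of new victims infected per original patient. All figures are
# hypothetical and serve only to show how the two factors trade off.

def spread_number(infectious_days, contacts_per_day, transmission_prob):
    """Expected new infections produced by one original patient."""
    return infectious_days * contacts_per_day * transmission_prob

# A germ that keeps its host infectious for a week, with three close
# contacts a day and a 1-in-10 chance of transmission per contact:
print(spread_number(7, 3, 0.10))   # about 2.1 new victims per patient

# A germ that incapacitates its host after only two days must transmit
# far more efficiently to achieve the same spread:
print(spread_number(2, 3, 0.35))   # about 2.1 new victims per patient
```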

  Microbes have evolved diverse ways of spreading from one person to another, and from animals to people. The germ that spreads better leaves more babies and ends up favored by natural selection. Many of our “symptoms” of disease actually represent ways in which some damned clever microbe modifies our bodies or our behavior such that we become enlisted to spread microbes.

  The most effortless way a germ could spread is by just waiting to be transmitted passively to the next victim. That’s the strategy practiced by microbes that wait for one host to be eaten by the next host: for instance, salmonella bacteria, which we contract by eating already infected eggs or meat; the worm responsible for trichinosis, which gets from pigs to us by waiting for us to kill the pig and eat it without proper cooking; and the worm causing anisakiasis, with which sushi-loving Japanese and Americans occasionally infect themselves by consuming raw fish. Those parasites pass to a person from an eaten animal, but the prion responsible for laughing sickness (kuru) in the New Guinea highlands used to pass to a person from another person who was eaten. It was transmitted by cannibalism, when highland babies made the fatal mistake of licking their fingers after playing with raw brains that their mothers had just cut out of dead kuru victims awaiting cooking.

  Some microbes don’t wait for the old host to die and get eaten, but instead hitchhike in the saliva of an insect that bites the old host and flies off to find a new host. The free ride may be provided by mosquitoes, fleas, lice, or tsetse flies that spread malaria, plague, typhus, or sleeping sickness, respectively. The dirtiest of all tricks for passive carriage is perpetrated by microbes that pass from a woman to her fetus and thereby infect babies already at birth. By playing that trick, the microbes responsible for syphilis, rubella, and now AIDS pose ethical dilemmas with which believers in a fundamentally just universe have had to struggle desperately.

  Other germs take matters into their own hands, figuratively speaking. They modify the anatomy or habits of their host in such a way as to accelerate their transmission. From our perspective, the open genital sores caused by venereal diseases like syphilis are a vile indignity. From the microbes’ point of view, however, they’re just a useful device to enlist a host’s help in inoculating microbes into a body cavity of a new host. The skin lesions caused by smallpox similarly spread microbes by direct or indirect body contact (occasionally very indirect, as when U.S. whites bent on wiping out “belligerent” Native Americans sent them gifts of blankets previously used by smallpox patients).

  More vigorous yet is the strategy practiced by the influenza, common cold, and pertussis (whooping cough) microbes, which induce the victim to cough or sneeze, thereby launching a cloud of microbes toward prospective new hosts. Similarly, the cholera bacterium induces in its victim a massive diarrhea that delivers bacteria into the water supplies of potential new victims, while the virus responsible for Korean hemorrhagic fever broadcasts itself in the urine of mice. For modification of a host’s behavior, nothing matches rabies virus, which not only gets into the saliva of an infected dog but drives the dog into a frenzy of biting and thus infecting many new victims. But for physical effort on the bug’s own part, the prize still goes to worms such as hookworms and schistosomes, which actively burrow through a host’s skin from the water or soil into which their larvae had been excreted in a previous victim’s feces.

  Thus, from our point of view, genital sores, diarrhea, and coughing are “symptoms of disease.” From a germ’s point of view, they’re clever evolutionary strategies to broadcast the germ. That’s why it’s in the germ’s interests to “make us sick.” But why should a germ evolve the apparently self-defeating strategy of killing its host?

  From the germ’s perspective, that’s just an unintended by-product (fat consolation to us!) of host symptoms promoting efficient transmission of microbes. Yes, an untreated cholera patient may eventually die from producing diarrheal fluid at a rate of several gallons per day. At least for a while, though, as long as the patient is still alive, the cholera bacterium profits from being massively broadcast into the water supplies of its next victims. Provided that each victim thereby infects on the average more than one new victim, the bacterium will spread, even though the first host happens to die.
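
  The arithmetic behind that last point can be made concrete with a short, hypothetical sketch: when each victim infects more than one new victim on average, the number of cases grows from one generation of transmission to the next, even though every individual host eventually dies or recovers; when each victim infects fewer than one, the chain of infection fizzles out.

```python
# Expected number of cases in each "generation" of transmission, starting
# from a single patient. The spread numbers are hypothetical.

def case_counts(spread_number, generations=8):
    cases = [1.0]
    for _ in range(generations):
        cases.append(cases[-1] * spread_number)
    return cases

# More than one new victim per patient: the germ keeps spreading.
print([round(c, 1) for c in case_counts(1.5)])
# roughly [1.0, 1.5, 2.2, 3.4, 5.1, 7.6, 11.4, 17.1, 25.6]

# Fewer than one new victim per patient: the outbreak dies out.
print([round(c, 2) for c in case_counts(0.8)])
# roughly [1.0, 0.8, 0.64, 0.51, 0.41, 0.33, 0.26, 0.21, 0.17]
```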

  SO MUCH FOR our dispassionate examination of the germ’s interests. Now let’s get back to considering our own selfish interests: to stay alive and healthy, best done by killing the damned germs. One common response of ours to infection is to develop a fever. Again, we’re used to considering fever as a “symptom of disease,” as if it developed inevitably without serving any function. But regulation of body temperature is under our genetic control and doesn’t just happen by accident. Many microbes are more sensitive to heat than our own bodies are. By raising our body temperature, we in effect try to bake the germs to death before we get baked ourselves.

  Another common response of ours is to mobilize our immune system. White blood cells and other cells of ours actively seek out and kill foreign microbes. The specific antibodies that we gradually build up against a particular microbe infecting us make us less likely to get reinfected once we become cured. As we all know from experience, there are some illnesses, such as flu and the common cold, to which our resistance is only temporary; we can eventually contract the illness again. Against other illnesses, though—including measles, mumps, rubella, pertussis, and the now defeated smallpox—our antibodies stimulated by one infection confer lifelong immunity. That’s the principle of vaccination: to stimulate our antibody production without our having to go through the actual experience of the disease, by inoculating us with a dead or weakened strain of microbe.

  Alas, some clever microbes don’t just cave in to our immune defenses. Some have learned to trick us by changing those molecular pieces of the microbe (its so-called antigens) that our antibodies recognize. The constant evolution or recycling of new strains of flu, with differing antigens, explains why your having gotten flu two years ago didn’t protect you against the different strain that arrived this year. Malaria and sleeping sickness are even more slippery customers in their ability rapidly to change their antigens. Among the slipperiest of all is the virus that causes AIDS, which evolves new antigens even as it sits within an individual patient, thereby eventually overwhelming his or her immune system.

  Our slowest defensive response is through natural selection, which changes our gene frequencies from generation to generation. For almost any disease, some people prove to be genetically more resistant than are others. In an epidemic those people with genes for resistance to that particular microbe are more likely to survive than are people lacking such genes. As a result, over the course of history, human populations repeatedly exposed to a particular pathogen have come to consist of a higher proportion of individuals with those genes for resistance—just because unfortunate individuals without the genes were less likely to survive to pass their genes on to babies.
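
  A toy calculation shows how such a shift in gene frequencies plays out. The sketch below tracks a hypothetical resistance gene across generations, assuming, purely for illustration, that carriers are somewhat more likely than non-carriers to survive repeated epidemics and pass the gene on.

```python
# Toy model of natural selection favoring a disease-resistance gene.
# The survival figures are hypothetical; only the direction and pace of
# the change matter here, not any real disease or population.

def next_frequency(p, fitness_resistant=1.0, fitness_susceptible=0.8):
    """One generation of selection on a resistance gene at frequency p."""
    mean_fitness = p * fitness_resistant + (1 - p) * fitness_susceptible
    return p * fitness_resistant / mean_fitness

p = 0.01   # the resistance gene starts out rare
for generation in range(0, 101, 10):
    print(f"generation {generation:3d}: resistance-gene frequency {p:.2f}")
    for _ in range(10):
        p = next_frequency(p)
```

  Run for a hundred generations with these made-up numbers, the model shows the rare resistance gene rising from about one percent of the population to near fixation: the population-level protection described above, bought at the expense of the susceptible individuals in each epidemic.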

  Fat consolation, you may be thinking again. This evolutionary response is not one that does the genetically susceptible dying individual any good. It does mean, though, that a human population as a whole becomes better protected against the pathogen. Examples of those genetic defenses include the protections (at a price) that the sickle-cell gene, Tay-Sachs gene, and cystic fibrosis gene may confer on African blacks, Ashkenazi Jews, and northern Europeans against malaria, tuberculosis, and bacterial diarrheas, respectively.

  In short, our interaction with most species, as exemplified by hummingbirds, doesn’t make us or the hummingbird “sick.” Neither we nor hummingbirds have had to evolve defenses against each other. That peaceful relationship was able to persist because hummingbirds don’t count on us to spread their babies or to offer our bodies for food. Hummingbirds evolved instead to feed on nectar and insects, which they find by using their own wings.

  But microbes evolved to feed on the nutrients within our own bodies, and they don’t have wings to let them reach a new victim’s body once the original victim is dead or resistant. Hence many germs have had to evolve tricks to let them spread between potential victims, and many of those tricks are what we experience as “symptoms of disease.” We’ve evolved countertricks of our own, to which the germs have responded by evolving counter-countertricks. We and our pathogens are now locked in an escalating evolutionary contest, with the death of one contestant the price of defeat, and with natural selection playing the role of umpire. Now let’s consider the form of the contest: blitzkrieg or guerrilla war?

  SUPPOSE THAT ONE counts the number of cases of some particular infectious disease in some geographic area, and watches how the numbers change with time. The resulting patterns differ greatly among diseases. For certain diseases, like malaria or hookworm, new cases appear any month of any year in an affected area. So-called epidemic diseases, though, produce no cases for a long time, then a whole wave of cases, then no more cases again for a while.

  Among such epidemic diseases, influenza is one personally familiar to most Americans, certain years being particularly bad years for us (but great years for the influenza virus). Cholera epidemics come at longer intervals, the 1991 Peruvian epidemic being the first one to reach the New World during the 20th century. Although today’s influenza and cholera epidemics make front-page stories, epidemics used to be far more terrifying before the rise of modern medicine. The greatest single epidemic in human history was the one of influenza that killed 21 million people at the end of the First World War. The Black Death (bubonic plague) killed one-quarter of Europe’s population between 1346 and 1352, with death tolls ranging up to 70 percent in some cities. When the Canadian Pacific Railroad was being built through Saskatchewan in the early 1880s, that province’s Native Americans, who had previously had little exposure to whites and their germs, died of tuberculosis at the incredible rate of 9 percent per year.

  The infectious diseases that visit us as epidemics, rather than as a steady trickle of cases, share several characteristics. First, they spread quickly and efficiently from an infected person to nearby healthy people, with the result that the whole population gets exposed within a short time. Second, they’re “acute” illnesses: within a short time, you either die or recover completely. Third, the fortunate ones of us who do recover develop antibodies that leave us immune against a recurrence of the disease for a long time, possibly for the rest of our life. Finally, these diseases tend to be restricted to humans; the microbes causing them tend not to live in the soil or in other animals. All four of these traits apply to what Americans think of as the familiar acute epidemic diseases of childhood, including measles, rubella, mumps, pertussis, and smallpox.

  The reason why the combination of those four traits tends to make a disease run in epidemics is easy to understand. In simplified form, here’s what happens. The rapid spread of microbes, and the rapid course of symptoms, mean that everybody in a local human population is quickly infected and soon thereafter is either dead or else recovered and immune. No one is left alive who could still be infected. But since the microbe can’t survive except in the bodies of living people, the disease dies out, until a new crop of babies reaches the susceptible age—and until an infectious person arrives from the outside to start a new epidemic.
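
  That burn-through-and-die-out pattern is easy to reproduce in a minimal simulation of a closed population with no births and no newcomers; all the parameters below are hypothetical, and the outcome in a typical run is that the infection count falls to zero once nearly everyone is immune or dead.

```python
import random

# Minimal sketch of an acute crowd disease in a closed population
# (no births, no immigrants). All parameters are hypothetical.
population = 1000
susceptible = population - 5
infected = 5
immune_or_dead = 0
contacts_per_step = 3     # contacts per infected person per time step
transmission_prob = 0.3   # chance of passing the germ to a susceptible contact
removal_prob = 0.4        # chance per step that an infected person recovers or dies

step = 0
while infected > 0:
    new_infections = 0
    for _ in range(infected * contacts_per_step):
        # Each contact hits a random member of the population; only a
        # susceptible contact can catch the germ.
        if random.random() < susceptible / population:
            if random.random() < transmission_prob:
                new_infections += 1
    new_infections = min(new_infections, susceptible)
    removals = sum(random.random() < removal_prob for _ in range(infected))
    susceptible -= new_infections
    infected += new_infections - removals
    immune_or_dead += removals
    step += 1

print(f"epidemic over after {step} steps: {susceptible} never infected, "
      f"{immune_or_dead} immune or dead, {infected} still infected")
```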

  A classic illustration of how such diseases occur as epidemics is the history of measles on the isolated Atlantic islands called the Faeroes. A severe epidemic of measles reached the Faeroes in 1781 and then died out, leaving the islands measles free until an infected carpenter arrived on a ship from Denmark in 1846. Within three months, almost the whole Faeroes population (7,782 people) had gotten measles and then either died or recovered, leaving the measles virus to disappear once again until the next epidemic. Studies show that measles is likely to die out in any human population numbering fewer than half a million people. Only in larger populations can the disease shift from one local area to another, thereby persisting until enough babies have been born in the originally infected area that measles can return there.

  What’s true for measles in the Faeroes is true of our other familiar acute infectious diseases throughout the world. To sustain themselves, they need a human population that is sufficiently numerous, and sufficiently densely packed, that a numerous new crop of susceptible children is available for infection by the time the disease would otherwise be waning. Hence measles and similar diseases are also known as crowd diseases.

  OBVIOUSLY, CROWD DISEASES could not sustain themselves in small bands of hunter-gatherers and slash-and-burn farmers. As tragic modern experience with Amazonian Indians and Pacific Islanders confirms, almost an entire tribelet may be wiped out by an epidemic brought by an outside visitor—because no one in the tribelet had any antibodies against the microbe. For example, in the winter of 1902 a dysentery epidemic brought by a sailor on the whaling ship Active killed 51 out of the 56 Sadlermiut Eskimos, a very isolated band of people living on Southampton Island in the Canadian Arctic. In addition, measles and some of our other “childhood” diseases are more likely to kill infected adults than children, and all adults in the tribelet are susceptible. (In contrast, modern Americans rarely contract measles as adults, because most of them get either measles or the vaccine against it as children.) Having killed most of the tribelet, the epidemic then disappears. The small population size of tribelets explains not only why they can’t sustain epidemics introduced from the outside, but also why they never could evolve epidemic diseases of their own to give back to visitors.

  That’s not to say, though, that small human populations are free from all infectious diseases. They do have infections, but only of certain types. Some are caused by microbes capable of maintaining themselves in animals or in the soil, with the result that the disease doesn’t die out but remains constantly available to infect people. For example, the yellow fever virus is carried by African wild monkeys, whence it can always infect rural human populations of Africa, whence it was carried by the transatlantic slave trade to infect New World monkeys and people.

  Still other infections of small human populations are chronic diseases such as leprosy and yaws. Since the disease may take a very long time to kill its victim, the victim remains alive as a reservoir of microbes to infect other members of the tribelet. For instance, the Karimui Basin of the New Guinea highlands, where I worked in the 1960s, was occupied by an isolated population of a few thousand people, suffering from the world’s highest incidence of leprosy—about 40 percent! Finally, small human populations are also susceptible to nonfatal infections against which we don’t develop immunity, with the result that the same person can become reinfected after recovering. That happens with hookworm and many other parasites.

  All these types of diseases, characteristic of small isolated populations, must be the oldest diseases of humanity. They were the ones we could evolve and sustain through the early millions of years of our evolutionary history, when the total human population was tiny and fragmented. These diseases are also shared with, or similar to the diseases of, our closest wild relatives, the African great apes. In contrast, the crowd diseases, which we discussed earlier, could have arisen only with the buildup of large, dense human populations. That buildup began with the rise of agriculture starting about 10,000 years ago and then accelerated with the rise of cities starting several thousand years ago. In fact, the first attested dates for many familiar infectious diseases are surprisingly recent: around 1600 B.C. for smallpox (as deduced from pockmarks on an Egyptian mummy), 400 B.C. for mumps, 200 B.C. for leprosy, A.D. 1840 for epidemic polio, and 1959 for AIDS.

  WHY DID THE rise of agriculture launch the evolution of our crowd infectious diseases? One reason just mentioned is that agriculture sustains much higher human population densities than does the hunting-gathering lifestyle—on the average, 10 to 100 times higher. In addition, hunter-gatherers frequently shift camp and leave behind their own piles of feces with accumulated microbes and worm larvae. But farmers are sedentary and live amid their own sewage, thus providing microbes with a short path from one person’s body into another’s drinking water.

  Some farming populations make it even easier for their own fecal bacteria and worms to infect new victims, by gathering their feces and urine and spreading them as fertilizer on the fields where people work. Irrigation agriculture and fish farming provide ideal living conditions for the snails carrying schistosomiasis and for flukes that burrow through our skin as we wade through the feces-laden water. Sedentary farmers become surrounded not only by their feces but also by disease-transmitting rodents, attracted by the farmers’ stored food. The forest clearings made by African farmers also provide ideal breeding habitats for malaria-transmitting mosquitoes.

  If the rise of farming was thus a bonanza for our microbes, the rise of cities was a greater one, as still more densely packed human populations festered under even worse sanitation conditions. Not until the beginning of the 20th century did Europe’s urban populations finally become self-sustaining: before then, constant immigration of healthy peasants from the countryside was necessary to make up for the constant deaths of city dwellers from crowd diseases. Another bonanza was the development of world trade routes, which by Roman times effectively joined the populations of Europe, Asia, and North Africa into one giant breeding ground for microbes. That’s when smallpox finally reached Rome, as the Plague of Antoninus, which killed millions of Roman citizens between A.D. 165 and 180.

 
