Some viruses also lie low in specific tissues, biding their time. The best known are chickenpox and herpes. In fact, chickenpox (Varicella) is a member of the Herpesvirus family and is unrelated to the true Poxviruses (smallpox, cowpox, and so on). Several related variants of herpesvirus cause cold sores and genital herpes. Although the symptoms may be suppressed by treatment or vanish spontaneously, herpes never disappears completely. A few viruses remain hidden in a quiescent state. Symptoms may re-emerge under certain circumstances—if, say, the victim undergoes a period of stress. Chickenpox may also lie latent in nerve cells, re-emerging later in life as shingles, a painful skin rash. After reemerging, the virus may be passed on to others.
Development of genetic resistance to disease
“What does not kill me makes me stronger.”
—Friedrich Nietzsche
On average, the healthier, faster zebras escape being eaten by the lion and survive to carry on the species. Disease, like large predators, preferentially carries off the young, the old, the weak and crippled, and the feeble-minded, together with those who have no friends, family, or allies to help them. Vulnerability is not merely a physical matter. Declining mental alertness may increase vulnerability to disease due to lack of appropriate behavior. From a Darwinian perspective, both predation and disease improve the species, often in a rather nonspecific manner, by selecting for healthy and vigorous individuals. In addition, more specific effects occur.
When a virulent epidemic rages through human populations, some survive and some die. In the days before vaccination, antibiotics, and modern medical technology, what decided who was fortunate and who was not? In addition to sheer luck, both social and biological factors affect the chances of catching a disease, as well as the likelihood of surviving if infected. We start with the strictly biological factors.
First, we must distinguish immunity from resistance. Both protect against infection, although in quite different ways. Immunity occurs within a single lifetime. It results from previous infection by the same disease, or one closely related. The immune system remembers, and when exposed again, it rapidly extinguishes the invader. This assumes that a person survived the first encounter with the disease. Vaccination is based on deliberately exposing people to mild or crippled variants of a disease. This prepares the immune system for meeting the real-life, dangerous version of the disease. Immunity may be full or partial. It may last a lifetime or just a few years. Immunity is not inherited and cannot be passed on to children.
Resistance is genetic. A person is born with it or not, and resistance operates the first time you are exposed to a disease. After a lethal epidemic has passed, the humans who are resistant will have survived. Some who are sensitive will also have survived, for assorted other reasons. Nonetheless, if the death rate is significant, the proportion of people carrying genes for resistance will increase. The survivors pass these resistance genes on to their children, and the next generation starts out with a higher proportion of resistant individuals. If the same disease returns, it will kill a substantial number of the sensitive individuals in the new generation, and the proportion of resistant individuals will go up again. After several recurrences, the majority of the human population will be resistant.
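The arithmetic behind this paragraph can be sketched numerically. All figures below (the starting fraction of resistant individuals and the death rates) are hypothetical assumptions chosen only to show the direction of change, not numbers from the text:

```python
# Illustrative sketch: how recurrent epidemics raise the frequency of a
# resistance trait. The initial frequency and death rates are assumed
# values for demonstration only.

def next_generation(resistant_frac, death_rate_sensitive, death_rate_resistant=0.0):
    """Fraction of survivors who carry resistance after one epidemic."""
    resistant_survivors = resistant_frac * (1 - death_rate_resistant)
    sensitive_survivors = (1 - resistant_frac) * (1 - death_rate_sensitive)
    return resistant_survivors / (resistant_survivors + sensitive_survivors)

frac = 0.10  # assume 10% of the population starts out resistant
for epidemic in range(1, 6):
    frac = next_generation(frac, death_rate_sensitive=0.5)
    print(f"after epidemic {epidemic}: {frac:.0%} resistant")
```

With these assumed numbers, the resistant fraction climbs past 50% within a handful of recurrences, matching the qualitative claim that after several epidemics the majority of the population will be resistant.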
Both smallpox and bubonic plague illustrate the emergence of resistance among humans. The earliest smallpox epidemics recorded in Japan had a 70%–90% death rate. By the mid–twentieth century, although smallpox was still dangerous, the death rate had fallen to around 10%–20%. Bubonic plague shows a similar history. The first European outbreaks in the mid-1300s were highly lethal, and several successive epidemics over the following century reduced the population of Europe by two-thirds. The same disease returned in the 1660s. Despite being called the Great Plague of London, both the infection rates and the probability of death among those infected were much lower. London lost scarcely 10% of its population.
Some diseases go extinct
If a particular infection returns periodically, it will find fewer susceptible victims each time. Eventually, sensitive humans may become so rare that the disease cannot find enough victims to continue transmitting itself, and it may go extinct. This has happened to several diseases. Although many ancient records are ambiguous or lack medical detail, others describe outbreaks of pestilence whose symptoms are no longer familiar. Many of the early plagues of Rome and China are either no longer with us or have evolved beyond recognition. The Great Plague of Athens, described so graphically by Thucydides, is the classic example.
The mysterious English sweating sickness caused quite a stir in historical times but failed to survive. The sweating sickness appeared in London in 1485. It was probably brought by mercenary troops from France who helped Henry VII seize the throne from Richard III. Symptoms included the sudden onset of fever, headaches, and “great swetyng and stynkyng with rednesse of the face and all the body.” Most victims recovered, although a significant minority died within a day or two. Fatalities were oddly erratic; in some communities, 30%–50% were killed, while in other towns, almost none of those taken sick actually died.
The English and Germans were susceptible, but the Scots, Welsh, and Irish (that is, those of Celtic lineage) were mostly not affected. Neither were the French, who, if truly guilty of bringing it to England, must have suffered from only a mild form of the disease. The worst epidemic, that of 1528–1529, spread to Germany, Holland, Scandinavia, Switzerland, Lithuania, Russia, and Poland, but ignored France and the rest of Southern Europe. This suggests a strong genetic element in susceptibility. Curiously, the English upper classes were hit harder than the common people. Two successive Lord Mayors of London died in the first epidemic of sweating sickness, and in 1517, Cardinal Wolsey fell seriously ill but survived. In all, five outbreaks of sweating sickness occurred over about half a century, and then the disease faded away. A similar but milder disease, the “Picardy sweats,” appeared in France during the 18th century, supporting the idea of a French connection. No known disease today has these symptoms.
Milder germs or mutant people?
When a disease gets milder, what has really happened? Did the disease change, or did the humans? Germs may mutate to avoid killing their victims too quickly, in order to spread themselves around. Humans may become resistant because sensitive individuals die off. Both processes occur in real life. Syphilis became milder. Humans became resistant to measles. In many cases, such as with malaria or leprosy, both processes have occurred. Because we are embarrassed talking about death and dislike thinking of the millions of humans weeded out by influenza, measles, and smallpox, we tend to talk of a disease getting milder even when humans became resistant.
Consider two alternative approaches for a disease to avoid killing its victims too fast. One is for the disease to become genuinely milder and nonlethal. Alternatively, the disease may remain lethal but kill only slowly. This is probably what happened to leprosy. Historical accounts suggest that leprosy was once highly contagious and far more virulent. Today leprosy is difficult to catch and will still kill if untreated, but this takes many years. Both victims and disease have changed genetically over time. Many Europeans carry genes for resistance to infection by leprosy.
Today we have direct genetic evidence for human resistance to schistosome worms, malaria, tuberculosis, leprosy, typhoid/cholera, HIV (AIDS), hepatitis B, and hepatitis C. The great sensitivity of indigenous Americans, both North and South, to influenza, measles, smallpox, and other Old World diseases implies that, here, too, genetic resistance has evolved in the Old World populations.
Group survival involves more than individual resistance
When a human society shows an altered response to a disease that is passed from parent to child, several possibilities exist apart from genetics. Behavioral avoidance is any cultural change that leads to protection. People who use mosquito nets and wear insect repellent become “culturally resistant.” No genetic changes have occurred, but cultural resistance can be “inherited,” in the sense that customs are passed from one generation to another. Some social effects also have an underlying biological basis.
People often use the term herd immunity to refer to two distinct protection mechanisms. Here we use the terms indirect immunity and herd resistance. Indirect immunity occurs when an immune or resistant majority shields a minority of sensitive individuals from infection. Let’s suppose that 90% of a human population is either immune or resistant to some particular infection. The other 10% will be protected because the disease will find it very difficult to transmit itself through the population. The minority of sensitive people are hiding in the biological shadow of those who cannot catch—or, therefore, transmit—the infection. In practice, immunizing 75% or so of a population often breaks the chain of infection well enough to protect the unvaccinated minority. The exact numbers depend on the nature of the disease and its transmission mechanism.
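The “75% or so” figure connects to the standard epidemiological threshold formula, coverage > 1 − 1/R0, where R0 is the average number of people one case infects in a fully susceptible population. The formula and the sample R0 values below are textbook epidemiology, not figures from this chapter:

```python
# Herd-protection threshold sketch: once more than 1 - 1/R0 of a
# population is immune or resistant, each case infects fewer than one
# new person on average and the chain of transmission breaks.
# The R0 values below are rough illustrative figures.

def herd_threshold(r0):
    """Fraction that must be immune to push effective spread below 1."""
    return 1.0 - 1.0 / r0

for disease, r0 in [("influenza-like", 2), ("smallpox-like", 5), ("measles-like", 15)]:
    print(f"{disease} (R0={r0}): protect {herd_threshold(r0):.0%} of the population")
```

A disease with R0 around 4 gives a threshold of exactly 75%, which is why immunizing roughly three-quarters of a population often suffices; highly contagious diseases demand much higher coverage.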
Herd resistance is quite different and results from having a large population with plenty of genetic diversity. The population has many alternative versions of the genes that protect against infection. Some versions work better against one disease; other versions of the same genes work poorly against the first disease but act well against other infections. Different individuals carry different versions of these protective genes. Even if a totally new and highly virulent disease appears, a large, genetically diverse population will contain some individuals who are inherently resistant. Even during the worst outbreaks of Ebolavirus, around 10% of those infected survived. Even if most individuals die, the species will survive.
Thus, the species, viewed as a unit, may be resistant despite the fact that most individuals are sensitive. Life goes on. Note that we are not talking about “diversity” in the sense of artificially mixing individuals of different races to produce a politically correct community. Most local human populations have considerable internal genetic diversity, especially in the immune system. Despite hitting previously unexposed populations, the Black Death in Europe, smallpox in the New World, and Ebolavirus in Africa all had mortality rates of 60%–80% against “racially pure” local peoples. Contrast the introduction of myxomatosis to Australia. Myxomatosis is a lethal virus disease of rabbits that was released in Australia to control the rabbit population. The initial epidemic killed over 98% of the rabbits. These rabbits were the inbred descendants of just a handful of European rabbits that had colonized the Australian continent. The rabbit population had little genetic diversity, so the die-off was colossal.
The implications of resistance to infection
Over the ages, humans have developed resistance to many infections. Some of these diseases have gone extinct; others have evolved to survive. When a relatively resistant human population and its diseases meet a previously isolated and sensitive population, there are major repercussions. The devastation of American Indians, both North and South, by measles and smallpox introduced by Europeans is a classic example. To Europeans, measles is a mild childhood disease, and smallpox, though not mild, has a death rate of only 10%–20%. American Indians had never been exposed to either disease and, therefore, had no chance to evolve resistance. Consequently, they died in droves.
Many other examples are known in which disease has devastated one side (sometimes both) in human conflicts. Sometimes disease fights for the home team. The colonial takeover of Africa was hindered more by malaria, sleeping sickness, yellow fever, and other gruesome tropical diseases than by military resistance from stone-age warriors, despite the world-renowned bravery of such peoples as the Zulu. However, dense urban populations, who have been previously ravaged by some pestilence and have developed resistance, generally have the advantage. When they come into contact with less dense populations, on the fringes of civilization, the barbarians usually sicken and die. Unfair as it may seem, pestilence usually fights on the side of the Empire, evil or otherwise.
Although it is clearly beneficial to be resistant to disease, sometimes there is an unexpected price to pay. We are beginning to realize that certain hereditary diseases are the dark side of resistance to infectious disease. Sickle cell anemia is a by-product of resistance to malaria, and cystic fibrosis of resistance to diarrheal diseases, especially typhoid. To understand this, we must review the mechanism of inheritance. Humans, like all higher animals, have two copies of each gene, one inherited from their mother and the other from their father. Thus, if one copy is damaged by mutation, a back-up is present.
Resistance to disease often results from having one mutant copy of a gene that is defective in its original function. Children who inherit two defective copies, one from each parent, may suffer from a hereditary defect. Children with one good and one mutant (that is, resistant) copy will be resistant to the disease in question. Children who get two good copies of the gene will be healthy but still be susceptible to the disease. In practice, a balance is struck, depending on how common and how dangerous the disease is and how crippling the hereditary defect is.
Resistance to malaria via the sickle cell gene reduces the oxygen-carrying capacity of the blood. One good copy of this gene allows the blood to carry enough oxygen. Those with one good gene and one mutant gene are resistant to malaria. Those with two defective copies might, in theory, be even more resistant to malaria. Unfortunately, they do not live long enough to find out, because they suffer from sickle cell disease and their blood cannot carry enough oxygen.
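The inheritance arithmetic in the two paragraphs above can be made concrete with a classic Mendelian cross between two carrier parents. Labeling the normal allele “A” and the sickle cell allele “S” follows genetics convention; the code itself is only an illustrative sketch:

```python
# Cross between two carrier parents (genotype AS), each passing one
# allele at random. Offspring genotypes: AA = healthy but susceptible
# to malaria, AS = malaria-resistant carrier, SS = sickle cell disease.
# The expected ratio is the classic 1:2:1.

from itertools import product
from collections import Counter

def offspring_ratios(parent1, parent2):
    """Genotype frequencies from all equally likely allele pairings."""
    counts = Counter("".join(sorted(pair)) for pair in product(parent1, parent2))
    total = sum(counts.values())
    return {genotype: n / total for genotype, n in counts.items()}

print(offspring_ratios("AS", "AS"))  # {'AA': 0.25, 'AS': 0.5, 'SS': 0.25}
```

On average, one child in four inherits two defective copies and suffers sickle cell disease, two in four are resistant carriers, and one in four is healthy but susceptible to malaria. This is the balance struck between the danger of the infection and the cost of the hereditary defect.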
We are the survivors of the frequent epidemics that have emerged in the relatively short time since humans began huddling together in overcrowded towns and cities. Consequently, unlike most wild animals, modern-day humans carry many dubious genetic alterations that have allowed us to muddle through the short-term crises of successive plagues. How has this affected our overall health? Have these changes affected our behavior, intelligence, or other mental abilities? The precise effects are mostly unknown, although we are beginning to see a few glimmers, thanks to modern genetic analysis. One fascinating recent link is between the prion protein, whose malformed version is responsible for mad cow disease, and certain receptors in the brain. The healthy form of the prion protein appears to protect the NMDA receptors from overstimulation. In genetically engineered mice, extra NMDA receptors produce higher intelligence. Would changes in susceptibility to mad cow disease change our intelligence? Perhaps. We are the children of pestilence, held together by genetic jury-rigging.
Hunting and gathering
Early humans were hunter-gatherers. Men hunted game; women gathered roots, nuts, and fruit. Our ancestors roamed in small bands, rarely meeting other tribes. Early hunter-gatherers occupied prime land with plenty of large game. Today’s few remaining hunter-gatherers inhabit marginal areas in jungles or semidesert. Thus, the early hunter-gatherers were probably better fed. On the other hand, they did not have the option of visiting a modern hospital if injured, nor of trading skins and furs for portable DVD players and candy bars. Nevertheless, with some reservations, today’s hunter-gatherers are the best illustration we have of conditions before most of mankind settled into an agricultural way of life.
Patterns of infection vary greatly between hunter-gatherers and settled, agricultural societies. Two major factors are intertwined: low population size and high mobility. Ancient hunter-gatherers almost certainly had much less infectious disease than we have today. As already noted, before the growth of dense human populations, most of our epidemic diseases did not exist. Furthermore, small, mobile, and relatively isolated tribes would rarely have been infected by contact with other groups. Today’s hunter-gatherers tend to catch most of the diseases current among the settled farmers who live nearby. Nevertheless, they are still far less likely to be infected with the parasitic worms and intestinal protozoa that are circulated by the droppings of domestic animals and by irrigation programs. Their lifestyle of roaming over dry plains also protects them from the malaria that is typically found in marshy, wet, and irrigated regions.
Life expectancy and developing civilization
Overall, early hunter-gatherers were probably healthier and better fed than the sedentary farmers who followed them. Before civilization, life expectancy was probably around 30–40 years. Women bore five or six children, and infant mortality was perhaps as low as 30%, with a fair number of children dying between infancy and adulthood. Although most deaths were caused by infections, accidental deaths were also probably frequent among hunter-gatherers. The agriculturists who followed were more civilized than hunter-gatherers, in the sense of having better technology. However, their stationary lifestyle made them more susceptible to infections, and as villages grew into towns and towns into cities, disease became progressively more of a burden. Despite having more food, early farmers often had poorly balanced diets, as they relied on just a few staple crops. Meat consumption was low, as domestic animals were too valuable to slaughter routinely, and only the rich ate meat regularly.
In early societies, outbreaks of infection from domestic animals were probably quite frequent. But most of these would have died out rather quickly, due to lack of sufficient people—and animals—to keep the infection in circulation. Only after populations of a third to half a million were available could such new infections adapt to humans and survive as specifically human diseases. Although cities go back to roughly 4000 B.C. in Sumer, they were originally too small to support continuous epidemic disease.
Germs, Genes, & Civilization: How Epidemics Shaped Who We Are Today