The Plague Cycle

by Charles Kenny


  It’s a sign of the health risk of malaria in tropical Africa that this dangerous genetic condition has lasted simply because it can be a partial defense against the disease. The genetic marker for sickle cells is present in nearly one in five people in northern Angola and around one in ten in much of the rest of the continent south of the Sahara and north of South Africa. Outside of Africa, the sickle cell trait is very rare because, historically, its costs considerably outweighed its benefits in areas with less lethal malaria strains: people with the trait were more likely to die, and so less likely to have children and pass the trait on to the next generation.4

  Another example of an effective inherited biological response is the human leukocyte antigen, which is associated with recognizing microbes within the body. Justin Cook of the University of California, Merced, has found that those in, or descended from, European and Middle Eastern regions have high levels of human leukocyte antigen diversity. Africans come next, and the original native populations of the Americas and Oceanic states last (both because of less evolutionary pressure connected to disease and because the original inhabitants were a smaller group of humans in the first place).5

  But it’s worth emphasizing again that genetic variance hasn’t made any group immune from a disease that affects the rest of humanity—doubtless in part because diseases have evolved in response. And the limits to the biological protections against disease help explain why animals, including humans, have developed a whole range of different instinctive responses to reduce their exposure to microbes.

  * * *

  Any time a horse stamps its hoof, tosses its head, or twitches its tail to disperse flies, it’s demonstrating an instinctual response to infection. Many animals go to great lengths to avoid fouling their nests with their own feces, while others, including baboons, rotate roosting sites so that any individual site is clear of parasitic larvae by the time they return. Sheep and cattle alike avoid eating forage that is close to recently dropped feces or (in the case of cattle) areas laced with tick larvae.

  Disgust is a human instinctual response to infection risk according to scientists at the London School of Hygiene and Tropical Medicine. There are, they suggest, “a universal set of disgust cues… including bodily wastes, body contents, sick, deformed, dead or unhygienic people, some sexual behavior, dirty environments, certain foods—especially if spoiled or unfamiliar—and certain animals.”6 Worldwide, contact with these cues leads to shuddering, grimacing, heightened blood pressure, reports of nausea, and the desire to withdraw. The relationship between infectious disease risks and these universal disgust cues is clear.

  Disgust is also a source of fears of contact, especially sexual contact. This is probably why the idea of barriers to the flow of body fluids as a disease preventative is very old indeed. The first mention of a condom-like device to stop the transmission of a deadly condition appears in Greek mythology, when King Minos of Crete used the bladder of a goat over his penis to protect his wife Pasiphae from the serpents and scorpions that swam in his semen and would routinely kill those with whom he had intercourse. Apparently, neither the scorpions nor the goat-bladder condom prevented Minos from fathering more than a dozen children. (The couple were well matched: Pasiphae’s tryst with a bull while disguised in a cow costume led to the birth of the Minotaur.)

  Disgust cues may also lie behind beliefs linking bad smells to sickness—the “miasma theory.” Vegetius’s De Re Militari (in English: Concerning Military Matters) is a fourth-century CE text on the training, logistics, strategy, and tactics of the Roman Army. Among other things, it’s the source of the maxim often repeated by military industrialists: “Let him who desires peace prepare for war.” But the book also contains advice on sanitation that displays miasma thinking at work: “If a large group stays too long during the summer or autumn in one place, the water becomes corrupt, and because of the corruption, drinking is unhealthy, the air corrupt, and so malignant disease arises which cannot be checked except by frequent changes of camp.”7

  Fear of disease is also, of course, a source of the instinctual fear of strangers. When newcomers attempt to join a troop of apes, they’re kept at the outskirts of the group and often attacked by dominant group members for weeks or months before being admitted. While this may also involve protection of food resources or the exclusion of breeding rivals, one reason for the behavior may be to expose infection and allow it to run its course before a new member is fully admitted. Think of it as a primate form of quarantine.8

  Staying away from infected people and keeping infected people away is a reasonable strategy. Continent-level isolation is what kept the original Americas and Australia free of so many Old World infections until the last few hundred years. At a local level, quarantine and social distancing can reduce the average number of new people each person with the disease goes on to infect. In some cases (and the less contagious the disease, the easier this is), limiting connections between people in an infected community can drop the number of new people each disease carrier infects to below one. If you manage that for long enough, the outbreak dies out.
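
  The last point is essentially arithmetic: if each carrier infects fewer than one new person on average, expected case numbers shrink with every generation of transmission. Below is a minimal illustrative sketch (not from the book); the starting figure of 1,000 cases and the reproduction number of 0.8 are assumptions chosen purely for illustration.

```python
# Toy calculation: expected cases per "generation" of transmission when each
# carrier infects fewer than one new person on average. All numbers here are
# illustrative assumptions, not figures from the book.

def expected_cases(initial_cases, reproduction_number, generations):
    """Return the expected case count for each transmission generation."""
    cases = [float(initial_cases)]
    for _ in range(generations):
        cases.append(cases[-1] * reproduction_number)
    return cases

if __name__ == "__main__":
    for generation, count in enumerate(expected_cases(1000, 0.8, 10)):
        print(f"generation {generation:2d}: ~{count:,.0f} expected cases")
    # With 0.8 new infections per case, 1,000 cases dwindle to roughly 100
    # within ten generations; sustained long enough, the outbreak dies out.
```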

  The 2003 SARS coronavirus outbreak, in which infected people rapidly showed clear symptoms, was controlled through quarantine measures. Cross-country tracing and isolation of people who’d been in contact with a known SARS victim quickly found and sequestered the majority of the people infected by the virus worldwide.9 Even if it’s hard to completely end a large-scale outbreak of a more contagious, less obvious disease in the same way as SARS, you can at least slow its spread by keeping people apart. That is why isolation and social distancing were both introduced in 2020 to reduce the chance that Covid-19, sneezed or coughed out by an infected person, reached a previously uninfected host.

  The instinctually understood reality that isolation can be effective may be one reason why surveys from around the world in the spring of 2020 found considerable majorities in favor of lockdown and social distancing strategies—more than 70 percent in Senegal and about three-quarters of Americans in mid-April, for example.10 It may also be why when you make study participants think about infectious disease—showing them pictures of someone with measles, perhaps—those participants report themselves less keen to socialize with others than people who haven’t been shown the pictures (and so probably aren’t thinking about disease).11 Worryingly, test subjects made to think about infection also show more racial prejudice.

  And this experiment encapsulates the checkered history of exclusion through the ages: a sometimes reasonable response to disease threats evolves all too frequently into a mindless justification for deadly prejudice. Nativists, racists, bigots, and the rich have always been prone to use disease outbreaks combined with the exclusion instinct as an opportunity to act on their theories of superiority. Often left unsaid but nevertheless implied is that a disease has flared up because of the moral or intellectual failings of the victims, their genetic inferiority, or their lack of manners.12

  * * *

  The earliest written sources suggest that people have long appreciated the risk of contagion and understood the benefits of exclusion. For thousands of years cities have confined disease sufferers to their quarters, and in the earliest civilizations, officials sometimes obligated soldiers returning from campaigns to burn their clothing and shields.13

  An association between disease and exclusion has also shaped a number of religious customs. The Book of Leviticus in the Bible includes detailed instructions for diagnosing, isolating, and treating victims of leprosy (not the condition we now know of as Hansen’s disease, but another ailment):

  When a man shall have in the skin of his flesh a rising, a scab, or bright spot, and it be in the skin of his flesh like the plague of leprosy; then he shall be brought unto Aaron the priest… and the priest shall look on him, and pronounce him unclean.… And the leper in whom the plague is, his clothes shall be rent, and his head bare, and he shall put a covering upon his upper lip, and shall cry, Unclean, unclean. All the days wherein the plague shall be in him he shall be defiled; he is unclean: he shall dwell alone; without the camp shall his habitation be.… And on the eighth day he shall take two he lambs without blemish.… And he shall slay the lamb in the place where he shall kill the sin offering and the burnt offering, in the holy place.

  Treatments also involve the sacrifice of turtle doves, alongside the plentiful use of cedar wood, fine flour, and oil.

  Because of the natural fear of contagion, ministering to the infected has long been associated with devotion and morality, and that’s particularly true of the condition that came to be called leprosy in the Middle Ages: Hansen’s disease itself. An apocryphal story tells of the future King David I of Scotland in 1100 coming upon his sister, wife of King Henry I of England, who was kissing the feet of lepers with devotion. David warns that her husband will never kiss her again. His sister replies, “Who does not know that the feet of an Eternal king are to be preferred to the lips of a mortal king.”14

  Compare that with Diana, Princess of Wales, nine hundred years later, suggesting, “It has always been my concern to touch people with leprosy, trying to show in a simple action that they are not reviled, nor are we repulsed.”15

  The infected suffer lesions that can damage facial bones and the extremities as well as nerve endings, all to the point that fingers, toes, and even whole limbs may fall off. Hansen’s disease is not in fact very contagious: it is only communicable after prolonged exposure and (apparently) only to the hereditarily predisposed. Lifelong quarantine is an unnecessary and cruel response. But the Chinese Book of Han in 2 CE reports that leprosy victims were sent to a hospital where they were isolated.16 Similarly, in Europe during the first 250 years of the second millennium, the misapplied biblical suggestion of uncleanliness made those accused of leprosy subject to religious dictate over their treatment.

  Once pronounced leprous after priestly or magisterial examination, the victim was taken under a black cloth to the altar, where a priest threw earth from the cemetery over his or her head, uttering “dead unto the World but alive unto Christ.” The priest read out a list of prohibitions against entering churches, taverns, or marketplaces, talking to someone upwind of their body, traversing narrow lanes, or touching wells, streams, fountains, or children. And then the leper was led out of town to the lazar house, where they would live with fellow sufferers. The leper’s property was passed on to inheritors, and the only legal recognition of his or her continuing earthly presence was that a wife couldn’t divorce an infected husband.17

  Nineteen thousand leprosaria, or lazar houses, operated across Europe, stationed on the downwind side of towns. They were filled with victims of Hansen’s disease and others who a priest, magistrate, or jury had determined were at least deserving of the condition.18 The moral nature of leprosy was made clear by Richard of the Abbey of St. Victor in Paris in the 1200s: “Fornicators, concubines, the incestuous, adulterers, the avaricious, usurers, false witnesses, perjurers, those likewise who look upon a woman lustfully… all are judged to be leprous by the priests,” he wrote.19 Confession was thought the only cure, lechery the likely cause.

  Along with Jewish people, lepers were ordered to wear special clothes by the Catholic Church in 1215, and the occupants of lazar houses were regularly accused of the same plots and conspiracies, and suffered the same horrible consequences. In 1321, King Philip V of France became convinced that the heads of lazar houses were planning to poison wells with reptile parts and human excrement in an attempt to contaminate all of France with leprosy. Under torture, some lazar heads admitted the plot, and claimed funding had come from Jews and distant Muslim kings. Across France, lepers were tortured and burned at the stake—inevitably, Jewish people were burned alongside them. Not for the first or last time, disease victims weren’t only blamed for their condition but mistrusted and maltreated as a result.

  In 1364, Guy de Chauliac, Pope Clement VI’s doctor, listed the marks and signs of leprosy in his manuscript La Grande Chirurgie. The list became a standard reference tool used by doctors when they were called to provide evidence at leper trials, and helped end the mass incarceration of those accused of incubating the condition.20 As we’ll see, de Chauliac wasn’t as successful when it came to responding to the Black Death—the great health disaster of his age—but he did report one theory about the plague’s origin: “In some places, they believed that the Jews had poisoned the world.” It was a conspiracy theory that was to have grim consequences, because alongside the hideous natural death toll, others died at the hands of those looking for a “foreign” scapegoat for the disease.

  On April 13, 1348, the year the Black Death returned to Europe after an eight-century absence, the Jewish quarter of Toulon was sacked and forty victims were dragged from their homes and murdered. In the following days, surrounding towns and villages followed suit. These were the first of a series of brutal attacks across Europe as rumors spread that the plague was the result of Jews poisoning wells under the instruction of a rabbi named Jacob, who was set on world domination. By the end of 1348, the popular madness surrounding well poisoning had spread throughout much of Germany. The Jewish population of Basel and Strasbourg were herded into specially constructed houses that were set on fire. In Speyer and Worms, Jewish communities headed off murder through mass self-immolation. More than two-thirds of the German towns with significant Jewish populations saw pogroms between 1348 and 1350.21 We’ve seen that anti-Semitism was not created by the plague, but the plague gave anti-Semites a murderous excuse. (And the roots ran deep: people in areas where more plague-related pogroms occurred in the fourteenth century were considerably more likely to vote for the Nazi Party in the 1930s, a half-millennium later.)22

  Jews were the most horribly treated but not the only suspect group to be singled out as exclusion became the public policy response to the plague. Over time, health commissioners began regulating schools, church services, religious processions, and the movement of beggars, soldiers, and prostitutes. Authorities could lock people into their own houses, seize and burn belongings of the sick, or send victims to pest houses. (Death rates in those houses soared thanks not just to plague but malnutrition, starvation, and other infections of the abandoned.)23

  Historian of medicine Dorothy Porter argues that health legislation “was targeted at restricting the movements of the morally outcast, such as prostitutes and sodomites, ‘ruffians’ and beggars, as well as the plague sick-poor, who were assumed to pose equally serious threats to civil order.”24 As with leprosy, the plague was considered a judgment on the unworthy.

  Eugenia Tognotti of the University of Sassari notes that medicine “was impotent against plague; the only way to escape infection was to avoid contact with infected persons and contaminated objects.”25 So social distancing made sense. The Florentine writer Boccaccio, who reported on the failure of his city’s efforts to keep the plague away, describes one strategy to survive in his fictional account of a group of men and women who abandon the plagued city altogether and retreat to an estate two miles out of town. He suggests those who thought “there was no medicine for the disease superior or equal in efficacy to flight” were “the most sound, perhaps, in judgment, as they were also the most harsh in temper.”26

  And sufficient isolation really worked: in a later plague outbreak in France from 1720 to 1722, nearly nine out of ten villages with fewer than one hundred people were spared a case of the plague altogether. Compare that to mortality that reached 30 to 40 percent in large towns.27 Staying away from urban concentrations of rats, fleas, and people—in cut-off, largely self-sufficient farming communities—was a smart move.

  But commentators at the time wondered if some of the exclusionary measures did more harm than good—and they had a point. There’s good reason to doubt that isolation in houses slowed the movement of rats that helped carry plague, and significant grounds for thinking isolation and incarceration of the sick along with healthy relatives did harm.

  The regulations certainly increased the power of health officials. And because disinfection and quarantine were expensive—cleaning out 1,536 homes in Milan in 1576 cost the equivalent of more than one hundred pounds of gold—taxes rose rapidly in cities undergoing a plague epidemic.28 Alongside war, public health became a major force behind the growing scale of government.

  Another health-safeguarding idea that gained currency was restricting the arrival of peaceful people from other cities and states, since plagues always seemed to come from “elsewhere.” Florence imposed fines on visitors from plague-affected cities and appointed a municipal health commission with the authority to forcibly remove infected persons (the stated reason was that “a corruption or infection of the air” might arise from them).29

  In 1348, Venice kept ships in the harbor from docking for thirty days to see if those on board came down with the plague. Venetian colonies followed suit. In Marseilles in 1383, the isolation period was extended to forty days, and as that practice spread, it came to be referred to as “quarantine” (quaranta is the Italian for “forty”).30

  The proliferation of sea and land quarantines may have helped reduce the extent of later plague pandemics. Four thousand troops manned Austria-Hungary’s southern border, holding many travelers in quarantine for as long as forty-eight days, fumigating trade goods, and putting suspect goods in a warehouse. People were considered expendable enough that if they developed plague symptoms, they could be shot. The last major plague outbreak in Western Europe, in 1720, was traced to a ship that had evaded the quarantine system by bribing the authorities. It went on to kill one hundred thousand people in and around Marseilles.31

  * * *

  The association of disease with outsiders—either socially undesirable or geographically distant—continued through the centuries. Take syphilis, first recorded in Europe in the armies of Charles VIII of France, fighting in Italy in 1494 (newly returned members of Columbus’s crew were among the troops). As that army disbanded, it helped spread the disease across Europe, where it became known variously as the Naples Disease, the Spanish Sickness, the French Pox, the German Sickness, and the Polish Sickness, depending on the course of its spread and the traditional national prejudices of the country most recently infected.32 When it reached the Middle East, it was called “the European Pestilence.” And the Japanese labeled it “the Chinese Pox.” (Similarly, the Irish were blamed for bringing cholera to the US in 1832 and the Italians for spreading polio, while tuberculosis was called “the Jewish disease.”)

 
