
The Panic Virus


by Seth Mnookin


  In New England, the Puritans were also learning about inoculation, but their acceptance of the procedure was slower. In 1706, a “Coromantee” slave of Cotton Mather’s named Onesimus described for the minister being inoculated as a child in Africa. In the following years, Mather told friends about his slave, who “had undergone an Operation, which had given him something of ye Small-Pox, & would forever praeserve him from it.” Mather, whose wife and three youngest children had died of measles, soon became a passionate advocate of the procedure; still, it wasn’t until 1721, in the midst of an epidemic in which eight hundred Bostonians died and half the city became ill, that he was able to persuade a local doctor named Zabdiel Boylston to inoculate a pair of slaves, along with Mather’s and Boylston’s young sons. After a brief period of illness, all recovered.

  Mather, a Puritan minister best known for his involvement in the Salem Witch Trials several decades earlier, began preaching to anyone who would listen that inoculation was a gift from God. This view, he quickly discovered, was not a popular one: Not long after he and Boylston publicly announced their results, Mather’s house was firebombed. An accompanying warning read, “COTTON MATHER, You Dog, Dam you. I’l inoculate you with this, with a Pox to you.” Some of the procedure’s most vocal opponents feared that inoculation would spread smallpox rather than guard against it. Others cited biblical passages—especially apropos was Job 2:7, which read, “So went Satan forth from the presence of the Lord and smote Job with boils, from the sole of his foot, unto his crown”—as proof that smallpox was a form of divine judgment that should not be second-guessed or interfered with. (It was for this reason that gruesome vaccination scars came to be known as the “mark of the beast.”)

  The tendency of our forefathers to view smallpox as an otherworldly affliction is easy to understand: It is one of the world’s all-time nastiest diseases. After a dormant period in the first several weeks following infection, the virus erupts into action, causing bouts of severe anxiety, lacerating headaches and backaches, and crippling nausea. Within days, small rashes begin to cover the hands, feet, face, neck, and back. For an unlucky minority, those rashes lead to internal hemorrhaging that causes victims to bleed out from their eyes, ears, nose, and gums.

  Most of the time, however, the progression of the disease is not so swift. Over a period of about a week, the initial rashes transform first into pimples and then into small, balloonlike sacs, which render some of the afflicted all but unrecognizable. Three weeks after infection, the vesicles begin to fill with oozing pus. Several days later, after these increasingly foul boils are stretched to capacity, they burst. The resulting stench can be overpowering: One eighteenth-century account described “pox that were so rotten and poisonous that the flesh fell off . . . in pieces full of evil-smelling beasties.” Throughout the 1700s, between 25 and 30 percent of all smallpox victims died. Even those survivors who were not permanently blinded did not escape unscathed, as the vast majority of them were left with scars across their cheeks and noses.

  The fact that for many people the threat of being afflicted with smallpox was not enough to overcome an innate resistance to having infected pus smeared on an open wound can likely be attributed in part to a phenomenon called the “disgust response.” In a 2001 paper, sociologists Valerie Curtis and Adam Biran speculated about a possible evolutionary explanation for what the cognitive scientist Steven Pinker has called human beings’ “intuitive microbiology”: “Bodily secretions such as feces, phlegm, saliva, and sexual fluids, as well as blood, wounds, suppuration, deformity, and dead bodies, are all potential sources of infection that our ancestors are likely to have encountered,” Curtis and Biran wrote. “Any tendency towards practices that prevented contact with, or incorporation of, parasites and pathogens would have carried an advantage for our ancestors.”6 Looked at from this perspective, it’s a testament to smallpox’s sheer hellishness that anyone willingly underwent the crude inoculation efforts of the early eighteenth century.

  There were, to be sure, other explanations for opposition to inoculation, including the colonists’ hair-trigger resistance to anything that was perceived as infringing upon individual liberties. Even the smallpox epidemic that engulfed Boston in 1752, in which 7,669 of the city’s 15,684 residents were infected, did not sway the procedure’s most fervent opponents. In the years to come, these reactions would, at least for a brief while, become secondary to the fear of losing the struggle that defined America’s very existence: the Revolutionary War.

  On September 28, 1751, nineteen-year-old George Washington and his half-brother, Lawrence, left the family plantation at Mount Vernon for Barbados. This was no vacation: Lawrence hoped the West Indian island’s warmer climate would help cure his tuberculosis. The day after completing their five-week trip, George and Lawrence were persuaded to go to dinner with a local slave trader. “We went,” Washington wrote, “myself with some reluctance, as the smallpox was in his family.” As Elizabeth Fenn recounts in Pox Americana, her history of smallpox outbreaks during the American Revolution, Washington’s concern was justified: Exactly two weeks after that dinner he wrote in his diary that he had been “strongly attacked with the small Pox.” Washington, overcome with the anguish of the disease, would not make another journal entry for close to a month.

  Twenty-four years later, with Washington newly installed as commander in chief of the Continental Army, the colonies were struck with the deadliest smallpox epidemic in their brief history. In Boston, the death toll reached five per day, then ten, then fifteen; by the time it was at thirty, the city’s churches no longer even bothered to ring their funeral bells. It was in the midst of this environment that Washington mounted an ill-conceived wintertime attack on the British forces that were holed up in the walled city of Quebec. Throughout December, ragtag American battalions and straggling troops marched to Canada from as far south as Virginia. Some reached the American encampment already infected with smallpox; others, weakened by their travels, found themselves thrust into a prime breeding ground for the virus. Many of the new conscripts had ignored recommendations that they get inoculated before reporting for duty; once ill, they regularly disregarded procedures for alerting commanding officers about their infections.

  As 1775 drew to an end, Benedict Arnold, then the leader of the northern forces, warned Washington that any further spread of smallpox could lead to the “entire ruin of the Army.” Since the disease was endemic in Europe, many British soldiers had been infected when they were boys—an age when survival rates were relatively high—and were now, like Washington, immune, which allowed them to occupy smallpox-stricken towns and fight against smallpox-infected regiments without fear of falling ill.

  The Americans had no such luxury. By Christmas Day, roughly a third of the troops massed outside Quebec had fallen ill. Still, the Continental forces stuck to their plan, and launched an assault on New Year’s Eve. The American and Canadian coalition maintained its largely ineffectual siege until May, when newly arriving British troops forced an embarrassing retreat. It was the first battlefield defeat in America’s history.

  Determined not to repeat the mistakes of Quebec, Washington spent much of 1776 torn about whether to require variolation for new conscripts. It was a torturous decision: On the one hand, inoculation would protect his soldiers and dampen what was rapidly becoming a full-fledged, smallpox-induced fear of enlistment. On the other, it would mean that significant numbers of American troops would be out of commission for weeks on end. More than once in 1776, Washington issued decrees requiring inoculation, only to change his mind days later.

  In the end, repeated rumors of Tory biowarfare tipped the balance. “It seems,” Josiah Bartlett told a fellow congressional delegate in early 1777, that the British plan was “this Spring to spread the small pox through the country.”7 That February, Washington ordered his commanders to “inoculate your men as fast as they are enlisted.” “I need not mention the necessity of as much secrecy as the nature of the Subject will admit of,” he wrote, “it being beyond doubt that the Enemy will avail themselves of the event as far as they can.” Washington’s plan did remain secret—and from that point forward, American soldiers could focus their energies on defeating the British instead of on maintaining their health.

  6 As part of their research, Curtis and Biran tried to identify universal objects of disgust. While there were some notable differences—in India, people were sickened by the thought of food cooked by menstruating women, while in the United Kingdom, subjects were repulsed by cruelty to horses—everyone listed bodily fluids and decaying or spoiled food near the top of their lists. Curtis and Biran also discovered a near-universal physical reaction that accompanies disgust: moving back the head, wrinkling the eyes and nose, and turning down the mouth. Don’t believe them? Imagine being trapped in an overflowing outhouse.

  7 These were not paranoid fears: In 1763, the commander of the British forces in North America recommended giving rebellious Native Americans blankets that had been sprinkled with ground-up smallpox-infected scabs.

  CHAPTER 2

  MILKMAID ENVY AND A FEAR OF MODERNITY

  By the time the United States Constitution was adopted in Philadelphia in 1787, the benefits of inoculation were clear: When naturally occurring, smallpox was lethal up to a third of the time; when the result of variolation, that ratio dropped to under 2 percent.8 At the time, there was only a vague understanding of precisely why inoculation was so effective. Today, that process is much better understood. As soon as the immune system realizes the body has been attacked by a foreign body, a type of white blood cell protein called an antibody jumps into action. After identifying the interloper, the antibody carefully traces its contours in order to manufacture an exact mirror image of the invader’s perimeter. Once this has been completed, the immune system is able to disarm the antigen by enveloping it in much the same way that a lock envelops a key. (Strategically, antibodies are more generals than front-line troops: They enlist a specialized type of white blood cell for these nitty-gritty, surround-and-destroy missions.) What makes this defense system so effective is that antibodies have an excellent “memory”: Once they’ve successfully defeated a pathogen, they retain their operating instructions so that they are ready to spring into action if the same disease returns. It’s an elegant system that beautifully demonstrates the sophistication of the human organism, but there is a rub: Because the body must get sick before it can produce antibodies, variolation carried with it the risk of death from the very disease it was meant to protect against.

  The discovery that essentially benign viruses could spur the production of antibodies that protected against lethal diseases changed that calculus dramatically. Like so many history-changing scientific breakthroughs, this one came about through the examination of a seemingly prosaic fact of ordinary life. For centuries, people had observed that milkmaids almost never came down with smallpox. (One popular rhyme played off the fact that virtually everyone else had been scarred by the disease: “Where are you going to, my pretty maid? I’m going a-milking sir, she said. What’s your fortune, my pretty maid? My face is my fortune, sir, she said.”) It wasn’t until the eighteenth century that gentlemen farmers across Europe began more actively exploring the reasons why this might be the case. An English scientist and naturalist named Edward Jenner, among others, speculated it could have to do with the milkmaids’ frequent contact with open blisters on the udders of cowpox-infected cows.

  In 1796, Jenner enlisted a milkmaid named Sarah Nelmes and an eight-year-old boy named James Phipps to test his theory. Jenner transferred pus from Nelmes’s cowpox blisters onto incisions he’d made in Phipps’s hands. The boy came down with a slight fever, but nothing more. Later, Jenner gave Phipps a standard smallpox inoculation—which should have resulted in a full-blown, albeit mild, case of the disease. Nothing happened. Jenner tried inoculating Phipps with smallpox once more; again, nothing.9

  The implication of Jenner’s work was momentous. If cowpox-induced antibodies protected against smallpox—and if cowpox could be transferred directly from person to person—an all-powerful weapon against one of the most ruthless killers in history was suddenly at hand. What’s more, if vaccination were not limited to those who could afford to convalesce while receiving high-quality medical care, preventive health measures could be practiced on a much more egalitarian basis.

  These new realities not only exponentially increased the number of people who could be vaccinated, they also opened up the possibility of “herd immunity,” a mechanism whereby individuals for whom a given vaccine does not work are protected by the successful immunization of those around them. (There are a number of reasons vaccines might not be beneficial to an individual. The main ones are limitations of the vaccine itself—no vaccine is effective one hundred percent of the time—and specific reasons a patient is unable to receive a vaccine, such as poor health or a preexisting immune deficiency.) Herd immunity occurs when a high-enough percentage of a population has been successfully vaccinated to create a barrier in which the immune members of society protect the unimmunized by making it impossible for a virus to spread in the first place. It is like a herd of buffalo encircling its weakest members to protect them from predators.
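
  The book gives no numbers here, but the barrier it describes can be made concrete with a standard epidemiological rule of thumb: if each infected person would, on average, pass the virus to R0 others in a fully susceptible population, then roughly 1 - 1/R0 of the population must be immune before chains of transmission die out on their own. The short Python sketch below is only an illustration of that relationship; the R0 values in it are assumptions chosen for the example, not figures from the text.

```python
# Illustrative sketch (not from the book): the common approximation for the
# herd-immunity threshold is 1 - 1/R0, where R0 is the average number of
# people one case would infect in a fully susceptible population.

def herd_immunity_threshold(r0: float) -> float:
    """Return the fraction of a population that must be immune so that,
    on average, each case infects fewer than one new person."""
    if r0 <= 1.0:
        return 0.0  # the outbreak cannot sustain itself even with no immunity
    return 1.0 - 1.0 / r0

if __name__ == "__main__":
    # Hypothetical R0 values, used only to show how the threshold rises
    # as a disease becomes more contagious.
    for label, r0 in [("moderately contagious", 3.0), ("highly contagious", 6.0)]:
        share = herd_immunity_threshold(r0) * 100
        print(f"{label} (R0 = {r0}): roughly {share:.0f} percent must be immune")
```

  Under this rule of thumb, a more contagious disease does not simply need proportionally more coverage; the threshold climbs toward 100 percent as R0 grows, which is one reason gaps in vaccination matter most for the most contagious infections.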

  The relative safety of the cowpox vaccine also made state-sponsored immunization drives more appealing. Spain instituted mass vaccination programs in its colonies as early as 1803, and the Netherlands, the United Kingdom, and parts of the United States soon followed. Initially, with smallpox’s deadly power still fresh in people’s minds, these efforts were widely accepted, but over time, as the disease began to be perceived as less of a threat, resistance to vaccination grew. By the middle of the century, compliance had dropped to the extent that laws mandating vaccination were being passed in the U.K. and in a number of the American states. These compulsory vaccination programs, in turn, fueled even more impassioned resistance, creating a vicious cycle that continues to this day.

  Looked at in a vacuum, it’s remarkable how static the makeup, rhetoric, and tactics of vaccine opponents have remained over the past 150 years. Then, as now, anti-vaccination forces fed on anxiety about the individual’s fate in industrialized societies; then, as now, they appealed to knee-jerk populism by conjuring up an imaginary elite with an insatiable hunger for control; then, as now, they preached the superiority of subjective beliefs over objective proofs, of knowledge acquired by personal experience rather than through scientific rigor.

  But if we think about vaccine resistance as prompted by what Harvard history of science professor Anne Harrington calls the feeling of being “broken by modern life,” the parallels are less startling. A century and a half ago, in the early stages of the Second Industrial Revolution, there was growing anxiety throughout the Western world about the dehumanizing nature of modernity. In the United States, the American Revolution’s promise of a country where every person would be free to pursue the good life had been replaced by the reality of fetid, squalid cities where horse carcasses rotted in open-air sewers and newly arrived immigrants were crammed together in abasing, lawless tenements.

  One manifestation of this widespread disillusionment was the flowering of all manner of utopian and spiritualist movements, ranging from Idealism and Occultism to Swedenborgianism and Transcendentalism. By celebrating individualism and holding intuition above empiricism, these philosophies promised a more authentic and meaningful form of existence amid a harsh, chaotic world. Alternative medical practitioners—who proselytized that physical suffering was a symptom of spiritual disorder and implied that anyone opposing their methods did so because he believed in a “one size fits all” approach to medicine—were able to exploit these conditions with particular effectiveness. (The apotheosis of this approach came in 1866, with Mary Baker Eddy’s founding of Christian Science, a religion that eschewed medical interventions in favor of prayer-induced psychosomatic cures.) When the American Medical Association was founded in 1847, a hodgepodge of newly marginalized “irregular physicians” gravitated toward the Eclectic Medicine movement, which embraced everyone from homeopaths to hydropaths, mail-order herbalists to Native American healers.

  Today, a similarly colorful collection of practitioners has seized on the anxiety created by the bureaucracy of managed care and the generally effacing qualities of modern society in order to peddle a brand of medicine whose primary appeal lies in its focus on the patient as a whole person. Defeat Autism Now! (DAN!), a group that accredits doctors who want to treat patients according to its nontraditional protocols, typifies a medically permissive movement that lists the AMA and the AAP among its primary bogeymen. (A recent DAN! conference included exhibits and presentations by mail-order herbalists, energy healers, and purveyors of home purification systems and hyperbaric oxygen chambers.) As was the case a century and a half ago, a distinct advantage of this avowed break from the mainstream is the built-in defense it offers from accusations of misconduct: Any criticism can be dismissed as a power play by the medical establishment against alternative practitioners who challenge it.

 
