Pox

by Michael Willrich


  But on the other side of the balance, Jacobson provided a crucial source of constitutional authority for the post–World War II “rights revolution.” Constitutional scholars have often noted that in the great reproductive rights decisions of the late twentieth century, civil liberties attorneys and the U.S. Supreme Court revived the old discredited language of substantive due process and changed its basic purpose from the protection of economic rights to the creation of private rights of bodily autonomy and integrity. But the antivaccinationists had made such arguments well over a half century earlier in the long line of cases that culminated in Jacobson. As civil liberties attorneys, women’s rights advocates, and liberal judges fought to extend constitutional due process to encompass reproductive rights during the 1960s and 1970s, they brandished Harlan’s language from Jacobson. Supreme Court Justice William O. Douglas cited Harlan’s words in Doe v. Bolton, a 1973 decision that overturned Georgia’s abortion law, to support the proposition that “the freedom to care for one’s health and person” was “fundamental” and only a “compelling state interest” could justify interference with that liberty. In other major reproductive rights cases, the Court cited Jacobson to defend the existence of a constitutional right to sexual privacy and to support the claim that “a State’s interest in the protection of life falls short of justifying any plenary override of individual liberty claims.”124

  The Jacobson decision has assumed a significance that neither Pastor Henning Jacobson nor Justice John Marshall Harlan could have anticipated in 1905. But the long afterlife of that case underscores an important fact about the contentious history of civil liberties in modern America: free speech wasn’t the half of it. Beginning with the vaccination struggles of the turn of the century, in an era of fast-growing institutional power, ordinary Americans again and again challenged the courts to create new protections for personal liberties—including rights to individual autonomy, medical privacy, and bodily integrity. Harlan’s opinion had treated those claims with a measure of respect. At the very least, he recognized that they were worth fighting for. He said, “There is, of course, a sphere within which the individual may assert the supremacy of his own will and rightfully dispute the authority of any human government, especially of any free government existing under a written constitution, to interfere with the exercise of that will.”125

  But Harlan recognized that under the necessitous conditions of modern life, human freedom sometimes meant little without purposeful governmental action. And so, in Jacobson v. Massachusetts, the U.S. Supreme Court gave its blessing to an unpopular but effective public health technology that would one day be used to eradicate the most deadly disease the world has ever known.

  EPILOGUE

  Gone are the days of the pesthouse and the detention camp—the tent city thrown up at the edge of town, its gas-fired torches standing sentry through the night. Gone, too, the days when we looked into the pockmarked face of a stranger on a crowded streetcar, or a loved one across the table. We have lost the habit of rolling up our sleeves to display our vaccination scars to the medical inspector at the border, the nurse at the schoolhouse door, or the conductor on the departing train. With each passing year, more of us have no scar to show. All of these things are gone, because smallpox is gone.

  America’s turn-of-the-century war on smallpox did not kill humankind’s ancient foe. But it did mark the beginning of the end for the disease in the United States. The deadly New York smallpox epidemic that started in All Nations Block on Thanksgiving Day 1900, setting Alonzo Blauvelt’s vaccination corps into motion in the tenements and factories, was to be the city’s last. Boston, too, had seen its final smallpox epidemic during the deadly 1901–3 visitation that sealed the city’s reputation as a “hotbed of the anti-vaccine heresy.” Over the next twenty-nine years, the city reported a hundred-odd cases, just four of them fatal, and then the pox vanished for good. The story was much the same in Philadelphia, Cleveland, Seattle, and other places where smallpox had raged during the first years of the century.1

  By World War I, a rough pattern had taken hold. Outbreaks of malignant variola major became rare events, aggressively stamped out by America’s increasingly well-organized health departments through a combination of mass vaccination and swift isolation of patients. Having learned something on the vaccination battlegrounds of the turn of the century, public health professionals self-consciously eschewed compulsion and force for public education and the promotion of the idea that every citizen had a positive right to good health. As C.-E. A. Winslow of the Yale School of Medicine observed, “Public health conceived in these terms will be something vastly different from the exercise of the purely police power which has been its principal manifestation in the past.” Of course, every profession seeks to elevate itself by disclaiming the backwardness of its predecessors. And the new public health, far from a retreat, implied a much more ambitious program for governing everyday life in America. But over time, ordinary Americans did more fully accommodate themselves to the call for mass vaccination when the deadlier form of smallpox invaded their communities. When variola major reappeared in Detroit in 1924, causing 163 deaths, a half-million residents submitted to vaccination in a single month.2

  But the new mild type of the disease remained far more difficult to control. Variola minor became the dominant form of smallpox in the United States. Between 1921 and 1930, the United States reported nearly 400,000 cases of smallpox, with a case-fatality rate of less than 1 percent. During the next decade, 108,000 cases were reported, with a case-fatality rate of just .38 percent. As smallpox continued to lose its lethal force, Americans remained ambivalent—or apathetic—about smallpox vaccination. Health departments relied on school mandates and voluntary action to maintain vaccination levels. But by the 1930s, only nine states had compulsory vaccination laws on the books, and four states had laws banning compulsion. During the 1930s, public health experts voiced the old refrain that “the United States lags behind other civilized countries in vaccination protection.” And they were right. With 5,000 to 50,000 cases still occurring each year, health officials estimated that only one in two Americans had ever been vaccinated.3

  The antivaccination movement had continued to challenge the authority of American public health officials. As the Birmingham, Alabama–based Southern Medical Journal lamented in 1921, “All the fools are not dead yet.” Since the Supreme Court’s ruling in Jacobson v. Massachusetts, antivaccinationists had relentlessly railed against school vaccination requirements. They would continue to do so even after the Court, in a 1922 opinion written by Justice Louis D. Brandeis, dismissed a constitutional challenge to a local school vaccination mandate, stating that the Jacobson ruling had effectively decided the question.4

  Time and again, however, when malignant variola major reared its head, the American people bared their arms. As Assistant Surgeon General R. C. Williams of the U.S. Public Health Service commented in 1946, “When you get a scare, everyone within 100 miles gets vaccinated.”5

  In 1947, when a traveler on a bus from Mexico City carried smallpox to Manhattan, more than six million New Yorkers lined up in a single month to get vaccinated. In dramatic contrast to the 1901–2 epidemics in the city, the New York City Health Department did not resort to compulsion and force, instead reaching out to the public through the radio and newspapers, while using the full agencies of the local government to trace cases and contacts. In the end, the city suffered only twelve cases and just two deaths.6

  By the time of the New York outbreak, smallpox had grown scarce in the United States. America’s last confirmed outbreak struck Hidalgo County, in the lower Rio Grande Valley of Texas, in 1949.7

  At the time, few American states mandated smallpox vaccination. Beginning in the late 1930s, nine states and the Territory of Alaska enacted the first laws mandating immunization for another deadly childhood disease—diphtheria. The discovery of the polio vaccine and the ensuing national vaccination campaign during the 1950s changed everything, turning compulsory immunization from a political liability into a popular cause. Between 1958 and 1965, all fifty states enacted new legislation requiring schoolchildren to undergo vaccination for smallpox and other diseases. By 1969, twelve states had mandated a full slate of childhood immunization shots that included smallpox, measles, polio, diphtheria, pertussis, and tetanus. And more states were jumping on board each year. A new era of compulsory immunization had begun.8

  With no reported cases of smallpox in the United States in more than twenty years, the annual tally of six to eight deaths from complications of vaccination became increasingly unacceptable. In 1971, the United States Public Health Service, the agency that seventy years earlier had sent C.P. Wertenbaker across the South to help communities fight smallpox, recommended that routine childhood vaccinations against smallpox be discontinued. Within three years, every American state had repealed its smallpox vaccination mandate for schoolchildren.9

  As of 1967, smallpox still killed 2 million people every year across the globe. The World Health Organization—leading an unprecedented international campaign—launched an offensive to wipe smallpox from the planet. In an exceptional example of Cold War–era cooperation, the eradication campaign was heavily funded by the United States with the Soviet Union providing enormous quantities of vaccine. The geographical canvas for this massive effort spanned dozens of developing countries in Asia, Africa, and Latin America. Two inventions proved crucial: the introduction of freeze-dried vaccine (which retained its efficacy for months at high heat) and the manufacture of the bifurcated needle, a cheap forked tool that enabled health workers to get four times as many vaccinations from a single unit of vaccine.10

  The eradicators developed a strategy, known as “ring vaccination” or “surveillance-containment,” that resembled a modern, high-tech version of the methods employed by Manhattan’s turn-of-the-century vaccination corps. As each new outbreak of smallpox was reported, a vaccination team descended on the scene, vaccinating everyone they could find in the immediate vicinity and placing the area under close surveillance until the outbreak had subsided. Taking the fight to smallpox, rather than striving for universal vaccination, the surveillance-containment strategy enabled the eradicators to cut short the transmission of smallpox, even in countries that had poorly vaccinated populations. The eradicators had to work around civil wars and surmount cultural barriers; in rural Afghanistan, for example, vaccinators ran up against purdah traditions that limited their access to women and children.11

  When containment teams met outright resistance, they responded with verbal pressure, legal coercion, and, in extreme cases, forcible vaccination. One senior WHO epidemiologist, a physician from the American Centers for Disease Control and Prevention (CDC) named Dr. Stanley Music, recalled how his team’s initial efforts to carry out the containment policy in rural Bangladesh “resembled an almost military style attack on infected villages.... In the hit-and-run excitement of such a campaign, women and children were often pulled out from under beds, from behind doors, from within latrines, etc. People were chased and, when caught, vaccinated.” Dr. Music explained the thinking of the vaccinators. “We considered the villagers to have an understandable though irrational fear of vaccination,” he said. “We just couldn’t let people get smallpox and die needlessly. We went from door to door and vaccinated. When they ran, we chased. When they locked their doors, we broke down their doors and vaccinated them.” The strategy proved highly effective at containing smallpox. But it came at a high price. As one historian of the South Asia eradication program delicately observed, “coercion can leave behind a residue of resentment that sours public attitudes toward the next vaccination campaign.”12

  As reported smallpox cases dwindled, teams conducted “scar surveys” of high-risk areas, inspecting people for vaccination scars or facial pockmarks, just as U.S. military surgeons had done when the Army moved across Luzon during the Philippine-American War. The last naturally occurring case of variola major occurred in a young girl in Bangladesh in late 1975. The final case of variola minor was reported in a hospital cook in Merca, a port town in southern Somalia, on October 31, 1977. On May 8, 1980, the World Health Assembly declared, “[T]he world and all its peoples have won freedom from smallpox, which was a most devastating disease sweeping in epidemic form through many countries since earliest time, leaving death, blindness and disfigurement in its wake.” The Assembly recommended that countries across the world discontinue smallpox vaccination.13

  The smallpox eradication program severed smallpox from its human host—a monumental achievement. Alas, the campaign did not annihilate the variola virus. As immunization levels around the world fell after 1980, the virus took on a new and ominous existence in the laboratory.

  The WHO had authorized two laboratories to keep frozen stocks of variola—the CDC in Atlanta and the Research Institute for Viral Preparations in Moscow. By the time the Soviet Union collapsed in 1991, British and American intelligence agencies had believed for some time that the USSR had been developing weapons-grade variola. Those fears were confirmed in the mid-1990s. Civil defense agencies prepared for the worst. Long-standing concerns about the proliferation of weaponized smallpox virus intensified after the terrorist attacks of September 11, 2001, soon followed by the anthrax murders.14

  On December 13, 2002, President George W. Bush announced his administration’s plan to protect the nation from a smallpox attack. The plan, which many in the scientific community had opposed, involved compulsory vaccination of a half-million U.S. military personnel, followed by a voluntary campaign to vaccinate a roughly equal number of frontline hospital workers and public health department staff—the health workers most likely to come into contact with the virus during an outbreak. After that, the plan called for the voluntary vaccination of some 10 million firefighters, police, and other “first responders.” The military vaccination campaign went smoothly enough. But the civilian campaign quickly collapsed. Only 38,000 health workers agreed to be vaccinated, and many American hospitals refused to participate at all.15

  The complex concerns elicited by the civilian program would have been familiar to the many Americans who refused vaccination at the turn of the twentieth century. Many of the health workers believed they had a specific medical condition that made smallpox vaccination particularly hazardous for them. (In fact, experts believe as many as one in five Americans today may have contraindications to smallpox vaccination, including immune systems weakened by HIV.) Others worried about the common side effects of smallpox vaccine—still known as “the most dangerous vaccine.” Many felt the risk of a bioterrorist attack was too low to make getting vaccinated a good bet. (The invasion of Iraq had revealed that Saddam Hussein held no secret stockpile of variola.) Another key factor was the lack of a federal program, in the first stages of the vaccination campaign, to compensate people for death, injury, or lost work due to the vaccination. In the end, the failed civilian program reported nearly nine hundred adverse reactions to vaccine, including one death. The military program reported seventy-five cases of heart inflammation and one death.16

  It was a revealing episode. In the absence of a palpable threat of an outbreak, few twenty-first-century Americans would step forward and get vaccinated against smallpox. Clearly, ignorance had little to do with it. Presumably, the 400,000 health workers who declined to roll up their sleeves were exceptionally well-informed about the risks. Even the relatively small risks of the vaccine were deemed unacceptable as long as the threat of a smallpox attack seemed remote.

  Even as smallpox itself disappeared from America and the world in the final decades of the twentieth century, vaccines themselves proliferated. Thanks in large part to the polio success story, so did vaccine laws. By the century’s end, all fifty states mandated that children receive immunization shots to protect them against seven different diseases. The number continues to grow. State-mandated vaccination is far more extensive than it was a century ago. But most states now provide precisely the sort of exemptions that the turn-of-the-century antivaccinationists in Europe and the United States had demanded. The people may now ask to be exempted for medical and religious reasons, or even, in some states, for conscientious objections to vaccination.17

  For all of this, public distrust of vaccines is on the rise, caused in part by the unprecedented complexity of the childhood immunization landscape and fueled by the explosive communicative power of the Internet. No longer do rumors of sore arms and lost limbs circulate via word of mouth across communities of workers; a bottomless archive of information and misinformation about vaccines is just a few keystrokes away. According to the CDC’s National Immunization Survey, in 2008 nearly 40 percent of American parents of young children refused or delayed giving them at least one routine shot—up from 22 percent in 2003. One quarter of American parents believe vaccines cause autism, though there is no scientific evidence to support that belief and at least a dozen major scientific studies have concluded there is no connection. In March 2010, the federal “vaccine court” ruled that the theory that a mercury-containing preservative long used in vaccines caused autism was “scientifically unsupportable.” But no one now expects a single court ruling to silence the vaccination controversy.18

  The vaccination question a century ago was in important respects markedly different from the current debate. Then the controversy centered on a single vaccine used to fight one horrific infectious disease. Today, healthy children under six routinely receive nearly a dozen separate vaccines, some mandated by state law, all recommended by the CDC, that offer protection against viruses ranging from varicella (chicken pox) to the human papillomavirus. Each of these vaccines raises its own particular issues of safety, parental authority, or even, in the case of the HPV vaccine, sexual mores. Trying to check actual epidemics of smallpox, turn-of-the-century health officials likened their power to the military defense of the nation. Today’s vaccination skirmishes are by comparison a peacetime struggle, mostly fought out in the absence of visible diseases—an absence made possible, in large part, by vaccines. The vaccine politics of the present moment reflect twenty-first-century Americans’ still evolving conceptions of the family, their affective ties to particular local and virtual communities, and their complex views of a modern administrative and welfare state that was still in its infancy a century ago. Antivaccination arguments today often convey an attenuated sense of social responsibility that is all too pervasive in contemporary American culture. Our politics of health must be understood in its own historical context.19

 
