The Vaccine Race


by Meredith Wadman


  In the 1960s and 1970s the cells became the object of a bitter, epochal feud between Hayflick and the U.S. government, first over whether they were safe for vaccine making and then over who owned them. Hayflick’s preternaturally proprietary feelings for the cells—he once described them as “like my children”—led him to defiantly decamp from the Wistar to a new job three thousand miles away at Stanford University with the entire stock of WI-38 cells. His escape infuriated the Wistar’s director, Koprowski, who had his own money-making designs on the cells.

  Hayflick’s flight with the cells would eventually make him the target of a career-derailing investigation by the National Institutes of Health, which had funded his work deriving WI-38.8 Then, just as the tug-of-war over ownership of the WI-38 cells peaked in the second half of the 1970s, profound changes occurred in attitudes and laws governing who could make money from biological inventions. In the space of a very few years, biologists went from being expected to work for their salaries and the greater good—and nothing more—to being encouraged by their universities and the government to commercialize their inventions for the benefit of their institutions, the U.S. economy—and themselves.

  Although the WI-38 cells were launched long before these changes took place—and eighteen years before the Supreme Court decreed that a living entity like a WI-38 cell could be patented—that is not to say that money has not been made from them. The huge drug company Merck in particular has made billions of dollars by using the WI-38 cells to make the rubella vaccine that is part of the vaccine schedule for U.S. babies and preschoolers—ensuring more than seven million injections each year, not including those in more than forty other countries where the Merck vaccine is sold. The Wistar Institute too until the late 1980s enjoyed a handsome royalty stream from vaccines made by its scientists using the WI-38 cells—including a much-improved rabies vaccine that replaced sometimes-dangerous injections. Cell banks today charge several hundred dollars for a tiny vial of the cells.

  But the tale of the WI-38 cells involves much more than money—and more than the highly unusual story of Hayflick, the iconoclastic scientist who launched them. It involves the silent, faceless Swedish woman whose fetus was used to derive the cells without her consent. It involves the dying patients into whose arms the WI-38 cells were injected with the aim of proving that the cells did not cause cancer. It touches on the ordinary American children who perished from rabies before WI-38 cells were used to make a better vaccine, and on the U.S. military recruits who died from adenovirus infections when the Pentagon stopped giving service members the vaccine against that virus, made in WI-38 cells. It involves the abortion opponents who, now as then, harbor a deep moral abhorrence of any vaccines made using human fetal cells.

  It is also about Stanley Plotkin, a young scientist who stubbornly fought powerful competitors by using the WI-38 cells to develop a superior rubella vaccine—and the purely political roadblocks that nearly stopped him. And it is about the one-, two-, and three-year-old orphans on whom Plotkin tested that vaccine, with the blessing of the archbishop of Philadelphia. It involves the irony of the untold millions of miscarriages, stillbirths, and infant deaths that have been prevented by a rubella vaccine made using cells from an aborted fetus.

  • • •

  These pages are full of medical experiments that we find abhorrent today. Young, healthy prisoners are injected with hepatitis-tainted blood serum; premature African American babies with experimental polio vaccine; intellectually disabled children with untried rubella vaccine.

  We recoil in horror. It is easy to condemn out of hand the scientists who conducted these experiments on the most voiceless and powerless among us. And their actions were in many cases horrifying and inexcusable. But it is more instructive—and perhaps more likely to prevent similar betrayals in the future—to try to understand why they did what they did.

  The experiments began, in large part, during World War II.*

  They grew out of the exigencies of the war, when an ends-justify-the-means mentality took over in U.S. medicine in the interest of keeping soldiers healthy at the front, because civilization was at stake.9 Everyone was expected to do their part for the cause—even institutionalized people with grave disadvantages or disabilities. When the war ended, the mentality didn’t. In the two decades following the war and in several cases into the 1970s, medical researchers experimented on people—almost always vulnerable people—making them sick and sometimes killing them.10

  These scientists were perceived and perceived themselves as part of a heroic quest to defeat disease. They were ambitious, driven, and well funded by the U.S. government. And they got results.

  During World War II and in the two decades following it, childhood mortality declined strikingly, in large part because of dramatic inroads against infectious diseases. Antibiotics that became available in the 1940s turned often-lethal diseases like typhoid fever and dysentery into less-grim maladies and slashed both the incidence of tuberculosis and its lethality. Vaccines against diphtheria, polio, and whooping cough hammered these childhood killers. Infectious diseases as a cause of death among children were rare by the middle of the 1960s.

  The men who conducted unethical human experiments in this era were not medical outliers. They were top physicians and researchers operating with the full backing of the U.S. government, private funders, and esteemed medical schools and hospitals. But in 1966, when a landmark paper in the New England Journal of Medicine exposed the harm being done to powerless people in scores of experiments, the government implemented new protections.11 The surgeon general launched a requirement that people give their informed consent to participate in research studies funded by the U.S. government’s health agencies and that researchers win preapproval for their human experiments from an independent committee charged with examining the risks and benefits to participants.12 Since then, those protections have been strengthened, expanded, and written into U.S. law. Today’s system of human-subject protections is not perfect. In fact, it has serious shortcomings and vocal critics.13 But it is worlds better than the feeble effort that existed half a century ago.

  To remove the history of human exploitation from vaccines and medicines that were developed in the postwar era is impossible. The knowledge that allowed their development is woven into them. Should we therefore shun them? Definitely not. Take rubella as a case in point. As I write this in the summer of 2016, 1,700 babies in a dozen countries have been born with abnormally small heads or other brain malformations; their mothers were infected with the Zika virus while pregnant.14 Zika’s emergence is a vivid reminder of what life was like in the United States in 1964. Then, there was no rubella vaccine and tens of thousands of American babies were born gravely damaged by the rubella virus, which selectively harms fetuses in the womb. Like Zika, rubella homes in on the brains of fetuses; it also ravages their eyes, ears, and hearts. But today, thanks to the vaccine that was perfected in experiments on institutionalized orphans and intellectually disabled children, indigenous rubella has been wiped out in the Western Hemisphere. Cases occur only when they are imported from other countries.

  We can’t turn the clock back. The only way we can partially make it up to these children and untold others is to honor their contributions by making them meaningful—by continuing to vaccinate against rubella and the other diseases that made childhood a perilous journey before vaccines against them existed. We also need to strive constantly to enforce and improve the regulations and laws that protect research subjects so that in the future such abuses never happen again. We might also remember, when judging the men who took advantage of vulnerable human beings in order to advance both human health and their own careers, that they were creatures of their time, just as we are of ours. Rather than training our criticism on them, it might be more useful to ask ourselves this: what are we doing or accepting or averting our eyes from today that will cause our grandchildren to look at us and ask, How could you have let that happen?

  PART ONE

  The Cells

  CHAPTER ONE

  Beginnings

  Philadelphia, 1928–48

  Once upon a time there was a boy. He lived in a village that no longer exists, in a house that no longer exists, on the edge of a field that no longer exists, where everything was discovered and everything was possible.

  —Nicole Krauss, The History of Love1

  When he was about eight years old, Leonard Hayflick had a scare that caused him to run to his mother in tears and imprinted itself vividly on his memory. Hiking one day with friends in Cobbs Creek Park, near his home in southwest Philadelphia, he slipped when crossing the creek on stepping-stones. One of his sneakers got soaked.

  The young boy immediately panicked. Polio is spread through contaminated water, and the terror of the paralyzing, sometimes-fatal disease was acute in the mid-1930s. Hayflick sat down, crying, and took his shoe and sock off, desperately rubbing his foot with the nearest chunk of dirt or grass he could find. He went home to his mother, who tried to comfort him.

  His fears were understandable. What had been a rare disease in the nineteenth century had become all too common in the twentieth. Annual summer surges in polio cases had mothers keeping their children out of public swimming pools. Not even the most privileged Americans were safe. Indeed, because the wealthy grew up in cleaner environments, they were less likely to be exposed to polio and to develop protective antibodies as children. President Franklin D. Roosevelt, the man who was running the nation from a wheelchair, had been paralyzed by the dread disease at age thirty-nine.

  In fact, infectious diseases of all kinds were a real threat in the 1930s. Children died of scarlet fever; of influenza; of tuberculosis; of measles. There were no reliable vaccines to prevent these common maladies. The first antibiotics wouldn’t be prescribed until the late 1930s. Hayflick remembers the orange signs from Philadelphia health authorities that would appear periodically on the front doors of afflicted households, proclaiming in huge black font: This house is quarantined because of the presence of measles—or another infectious disease.

  But Hayflick had not contracted polio from his brush with the water of Cobbs Creek. He was luckier than thousands of American children in the 1930s.

  Hayflick came from humble beginnings. He was born on May 20, 1928, to Nathan and Edna Hayflick, the new, young owners of a narrow brick row house in a working-class neighborhood in southwest Philadelphia. Hayflick’s parents were part of a Jewish migration across the Schuylkill River out of the slums of south Philadelphia that began before World War I and continued in the 1920s.2 The new arrivals launched thriving synagogues and Hebrew schools. The sidewalks were wide and the families young. The schools were less than first-rate, but that did not tamp down the ambitions of many families who were determined to build better lives.

  Hayflick’s own father, Nathan, when he was eight years old, had been living in a south Philadelphia row house, occupied by thirteen family members, abutting a rough red-light district.3 In this densely packed neighborhood of dark, cobblestoned streets and alleys that often lacked pavement or sewers, filth and excrement stuck in the cracks, frozen, during the winter and then thawed in the spring.4 Single outhouses served dozens of people. The Philadelphia authorities, notorious for their indifference and corruption, did virtually nothing to improve conditions. In fact, they paid attention to sanitation in south Philadelphia only when outbreaks of cholera, typhus, or diphtheria blighted the area and threatened to spread.5 The crowding and filth made the slum a perfect incubator for the devastating influenza pandemic of 1918–19, when hospitals, homes, and morgues were overwhelmed and corpses spilled into the street.6 The Hayflick clan seemed to emerge unscathed, although Nathan’s mother soon died of tuberculosis. Nathan, in his midteens, went to work driving the horse-drawn family milk cart.

  Within a few years he had landed a job at the Climax Company, a leading Philadelphia denture-designing firm. He would advance to become a master denture designer, serving clients including Albert Einstein. Lunching one day at a popular diner, he met Edna Silver, a quiet, thoughtful young beauty who was also from south Philadelphia and who, like him, was the child of Eastern European immigrants. The couple married in 1927 and moved across the river.

  Their three-bedroom row house was soon full. Leonard Hayflick was born the year after they married, and eighteen months later Edna gave birth to a daughter, Elaine. The date was November 11, 1929. Two weeks earlier the bottom had fallen out of the U.S. stock market, launching the Great Depression.

  By 1933 half of the city’s banks had failed and just 40 percent of the workforce was fully employed.7 The Philadelphia County Relief Board began distributing shoes to schoolchildren—seven hundred pairs every day.8 Nathan Hayflick’s wages were cut and the Hayflicks became one of ninety thousand city families to lose their houses.9 They moved to a nearby rental. Eventually the family’s finances recovered enough for Leonard’s parents to buy a modest row house in the same neighborhood, where Hayflick spent his teenage years.

  Despite the difficulties his family endured during the Depression, Hayflick says he doesn’t remember ever being hungry or aware of the family’s financial duress.

  “I never had a motivation to make money, ever,” he says. “The Depression and my mother’s and father’s experiences played no role in my outlook on life.”10 But it did, he told another interviewer in 2003, impact him in the following way: “Being brought up in the Depression has a lot to do with my work ethic, my belief in myself, and [the belief] that I should have confidence in what I think is true and correct as long as it is demonstrably so.”11

  In that interview he also recalled that his parents’ broad-mindedness instilled in him a bent for challenging convention. “My mother and father were very liberal. . . . I take enjoyment in challenging dogma. If there is anything that I challenge, it is orthodoxy.”

  Hayflick was exposed to lab life early. On Saturdays he sometimes tagged along with his hardworking father to the lab at the Climax Company. Nathan Hayflick kept his son occupied by sitting the boy in front of a Bunsen burner with plaster molds and easily melted Melotz metal to liquefy and pour into them. Leonard Hayflick learned from his uneducated but talented father to be at home in a lab. He also saw in him the costs of a stunted education. His father was fascinated by all manner of scientific questions but hadn’t the tools or the energy to pursue them: upon returning home after his thirteen-hour workdays, he would fall asleep on the couch.

  Hayflick’s mother taught him not to be afraid to ask questions. Her answers were patient and explicit when he asked why street-corner newspaper vendors were shouting—whether their three-inch headlines were reporting the kidnapping of the Lindbergh baby when Hayflick was not yet four years old or Hitler’s troops marching into Austria when he was nine.

  A chance gift from his favorite uncle ignited a passion when Hayflick was ten or eleven years old. Jacob Silverman, a smart, natty thirtysomething bachelor, gave his nephew a Gilbert Company chemistry set. It came with test tubes, a test tube holder, tongs for grasping the test tubes, and an alcohol lamp with a ground-glass head on it to cover the wick and keep the alcohol from evaporating.

  The young boy was stunned to learn that the universe was composed—as was believed at that time—of a mere ninety-two elements, and that they behaved in such extraordinary ways when they were combined. He was entranced by the color changes, the bursts of flame, and the substances that mysteriously precipitated out in the bottom of his test tubes. At age eighty-six Hayflick would still have the set’s manual and alcohol lamp.

  It wasn’t long before Hayflick had exhausted the chemicals and experiments that came with the set. With a neighborhood friend named Teddy Cooper, he began biking across large swaths of southwest Philadelphia in search of new chemicals and the glassware that would show them off to the best advantage. The duo got to know a kind sales clerk at Dolbey and Company, a chemical and glassware supplier near the University of Pennsylvania. The thin, bespectacled salesman eventually let them look at obsolete supplies in the store’s basement, and they came home laden with outmoded retorts and condensers.

  Soon Hayflick was busy building his own basement lab; he walled in a corner, put in a workbench, and mounted shelves where he proudly displayed his chemicals in labeled bottles. He and Cooper also asked the friendly salesclerk to sell them some metallic sodium—a volatile element the consistency of a hard stick of butter that’s kept under kerosene to mute its explosiveness and that bursts into a flame of burning hydrogen when it’s submerged in water. The clerk told Hayflick he couldn’t do it without a letter from his mother. Hayflick went home and composed one, which his trusting mother signed in her beautiful cursive. Hayflick and Cooper began riding the neighborhood’s back alleys after rainstorms. They deposited chunks of the metallic sodium in the water-filled holes where laundry poles stood in good weather, then sped gleefully away.

  By the time he graduated from high school, Hayflick had developed a keen sensitivity to injustice, especially when it involved him. He won the Bausch + Lomb Honorary Science Award for being the best science student at John Bartram High School but marched into the principal’s office and returned it indignantly after learning he had placed second to a female classmate in the race for the coveted Philadelphia Mayor’s Scholarship. That scholarship would have paid his tuition at any university or college in the country.

 
