Pandemic

by Sonia Shah

  It all happened while the medical establishment continued to repudiate the idea that waste-contaminated water transmitted cholera.48

  The city of New York happened to clean up its water supply around the same time, also while rejecting the idea that dirty water spread cholera. There, the precipitating factor was the demand, from the city’s brewers, for better-tasting water to brew their beers. For fifty years—and two explosive cholera epidemics—the Manhattan Company had distributed contaminated groundwater to New Yorkers, even as residents pled for better-tasting water and a more abundant supply for fire extinguishing and street cleaning. But when the brewers added their voices to the outcry about the dirty water, which put their beers at a competitive disadvantage, the business-friendly city council finally resolved to rectify the problem.49 By then, the Bronx River could no longer feasibly supply sufficient clean water, so the city tapped the distant Croton River, requiring a forty-two-mile-long aqueduct.50

  Croton water started to flow into New York City in 1842.51 Few people signed up at first. But after cholera struck in 1849, they lined up by the thousands. By 1850, the water department’s annual revenue had nearly doubled over the previous year. The voluminous Croton water allowed the city to flush its once stagnant sewers, too, and it started expanding its network of sewers in the 1850s. By 1865, the city had built two hundred miles of sewer pipes to carry its wastewater out to the rivers. The last cholera outbreak in New York City, in 1866, took fewer than six hundred lives. And then the disease vanished from the city for good.52

  Neither city had any awareness that it had implemented strategies based on Snow’s antimiasmatist insights. Londoners attributed the cessation of cholera to the diversion of stinky sewer gases; in New York, cholera’s demise was attributed to the street-cleaning efforts of the board of health. “If we had had no Health Board,” a newspaper editor noted at the time, “we would probably have had a great deal more cholera.”53

  That may not have mattered much for those two cities, which solved their cholera problems despite miasmatism, but it made a big difference elsewhere. While New Yorkers and Londoners enjoyed cholera-free lives of clean water and sanitation, their revolutionary way of life was as likely to ignite social changes elsewhere as a wet match. Cutting-edge late nineteenth-century architecture in Naples, Italy, for example, focused on elevating buildings above low-lying miasmas, leaving them bereft of easy access to clean drinking water.54 The ultramodern city of Hamburg, Germany, in the late nineteenth century distributed unfiltered river water contaminated with sewage to its residents with proud efficiency.55

  Across much of the European continent, the teachings of the German chemist Max von Pettenkofer, who’d convinced attendees of the International Sanitary Conference in Constantinople in 1866 to disavow John Snow’s drinking-water theory, prevailed. For Pettenkofer, poisonous clouds caused cholera. This belief had all kinds of untoward ramifications, not least his advocacy of flight during cholera outbreaks. Pettenkofer said that when the poisonous cholera clouds formed, “rapid evacuation” was always a “salutary measure.” When cholera broke out in Provence, France, in 1884, Italian authorities went so far as to distribute free rail tickets and charter steamships to rapidly evacuate Italian immigrants—and their cholera—out of infected France. Back in Italy, the returnees proceeded to seed a new outbreak in Naples.56

  Until miasmatism was renounced by the medical establishment and replaced by a new paradigm that could integrate cholera’s cures into its explanatory framework, such physician-approved practices would continue to abet cholera.

  * * *

  That new paradigm arrived in the late nineteenth century. “Germ theory” posited the idea that microbes, not miasmas, cause contagions. The theory rested on a spate of discoveries. Microscopy had finally come back into fashion, allowing scientists to revisit the microbial world first spied by Leeuwenhoek two centuries earlier. And then, by conducting experiments on animals, they determined the specific role these microbes played in animal diseases. The French chemist Louis Pasteur discovered the microbial culprit behind a disease of silkworms in 1870; the German microbiologist Robert Koch discovered that Bacillus anthracis caused anthrax in 1876.57 These findings were still incendiary to miasmatists, but they were fundamentally different from the ones they’d rejected in the past. They didn’t arrive on the scene sporadically, seemingly out of nowhere, but with an increasingly steady regularity. And they were encased in a powerful explanatory framework. Germ theory, along with accounting for the nature of contagions, provided a radically new way to think about health and illness more generally. Rather than being the result of complex disequilibria involving amorphous external and internal factors, poor health was now discernible at the microscopic level.

  In 1884, Koch made a splash at a conference on cholera in Berlin by announcing he’d discovered the microbe responsible for causing cholera: Vibrio cholerae. (In fact, Koch wasn’t the first to spy the bacteria—an Italian doctor named Filippo Pacini had isolated what he called a “choleraic microbe” in 1854.) And Koch had developed a method of proving that the bacteria caused the disease. His method, known as “Koch’s postulates” and used up until the 1950s, involved a three-step proof. First, he extracted the offending microbe from a patient ill with the disease. Second, he grew the microbe in the lab on a nutrient-laden petri dish. Third, he administered the lab-reared microbe to a healthy individual. If that person fell ill with the disease in question, the microbe was proved to be the culprit.

  But Koch couldn’t accomplish the proof for Vibrio cholerae and cholera (infecting experimental animals with Vibrio cholerae is notoriously difficult).58

  “Koch’s discovery alters nothing,” the leading miasmatist Pettenkofer scoffed, “and, as is well known, was not unexpected by me.” Other experts called Koch’s discovery “an unfortunate fiasco.” An 1885 British medical mission (the head of which considered Pettenkofer “the greatest living authority on the etiology of cholera”) reported that the vibrio Koch had discovered had nothing to do with cholera at all.59

  To prove that the bacteria didn’t cause cholera, Pettenkofer and his allies devised a daring demonstration. Pettenkofer procured from a patient dying of cholera a vial of stool, teeming with hundreds of millions of vibrios, and drank it.60 The fluid went down “like the purest water,” he proclaimed. Twenty-seven other prominent scientists, including Pettenkofer’s assistant, did the same. A popular Paris magazine covered their shenanigans with a drawing of a man eating feces while shitting a bouquet of violets. The caption read: “Dr. N. consumes a cholera-ridden feces orally; five minutes later, he produces a bouquet of violets … at the other end.”61 Although both Pettenkofer and his assistant came down with cholera-like diarrhea, the assistant suffering bouts every hour for two days, all of the cholera drinkers survived, which Pettenkofer considered a successful repudiation of Koch’s germ theory.62

  The stand-off between miasmatism and germ theory continued for several more years. Then an 1892 outbreak of cholera in Hamburg sealed miasmatism’s fate. According to miasmatic theory, the city’s western suburb of Altona, which like Hamburg lay along the banks of the Elbe River, should have fallen prey to the miasmas that caused cholera in Hamburg as well. And yet it didn’t. It was impossible for experts to deny the reason why: Altona filtered its drinking water, while Hamburg did not. Strikingly, none of the 345 residents of an apartment block called Hamburger Hof—within the political boundaries of Hamburg but receiving water from Altona’s filtered supply—fell ill at all.63

  With this stark vindication of Koch’s claims (and the long-dead Snow’s), miasmatism’s last advocates were forced to surrender. Hippocratic medicine, after a two-thousand-year-long reign, had been knocked off its throne. In 1901, Pettenkofer shot himself in the head and died. A few years later, Koch received the Nobel Prize in Physiology or Medicine. The germ theory revolution was complete.64

  With the end of miasmatism, cholera’s terrifying tenure in North America and Europe entered its closing years. The gains against cholera in London and New York spread. Municipalities across the industrial world improved their drinking water through filtration and other techniques. After 1909, when liquid chlorine became available, municipalities started chlorine disinfection.65 The few waterborne pathogens that survived twentieth-century water treatment and filtration grew milder.66

  The germ theory revolution improved treatments for cholera as well. In the early 1900s, the Anglo-Indian pathologist Leonard Rogers proved that salty fluids cut cholera mortality by a third, after which the once mocked injections of saline gained popularity.67 Scientists steadily refined rehydration therapy over the course of the twentieth century. Today, a saline solution mixed with a bit of lactate, potassium, and calcium is an antidote to cholera as effective as a shot of insulin to a patient in a diabetic coma. Oral rehydration therapy, by simply and quickly curing cholera and other diarrheal diseases, is considered one of the most important medical advances of the twentieth century.68

  That’s not all. There are vaccines against cholera, too, which aim to replicate the protective immunity enjoyed by cholera survivors by delivering whole killed cells and subunits of cholera toxin. Although nobody yet knows how that immunity works, products such as Shanchol, a cheap oral vaccine licensed in 2009, and the traveler’s vaccine Dukoral are nevertheless 60–90 percent effective, at least for a few years, making them useful additions to the anticholera arsenal.69 (Since they require multiple doses and take several weeks to become effective, the WHO recommends that they be used in conjunction with other cholera prevention measures; as of this writing, neither is available in the United States.) Even simpler methods pioneered by the microbiologist Rita Colwell and her colleagues, such as filtering untreated water through a few layers of sari cloth, catch 90 percent of the vibrio bacteria in contaminated water, reducing cholera infections by 50 percent.

  Medicine finally figured out how to cure cholera. It just hadn’t done so fast enough to deliver humanity from nearly a century of cholera pandemics.70

  * * *

  Today, when new pathogens emerge, there’s no decades-long delay in figuring out how they spread. Modern biomedicine quickly identifies new pathogens’ modes of transmission. That HIV spread through sexual contact, and SARS through aerosols, was obvious from the very first clusters of cases. Medicine’s ability to rapidly unravel modes of transmission allows preventive strategies to be promptly devised, like condoms in the case of HIV, or face masks in the case of SARS, or safe burials in the case of Ebola.71 (Of course, that doesn’t mean those strategies can or will be implemented—HIV had infected nearly 75 million people worldwide by 2014 despite the medical establishment’s insights into how to prevent it by practicing safe sex.)

  But we still can’t rely on modern medicine to save us from the threat posed by new pathogens.

  For one thing, even when scientists devise new cures, we are not necessarily able to produce them at the right scale and at the right time. Drug development is slow and constrained by the economic concerns of the for-profit pharmaceutical industry. If the market for a new drug is modest, it doesn’t matter how big the public-health need for it is, or how solid the scientific evidence supporting its effectiveness: that drug is unlikely to get to market. There are precious few drugs developed for diseases, like malaria and Ebola, that selectively afflict the poor. Malaria sickens hundreds of millions of people every year, but since most of those victims have less than $1 a year to spend on health care, the market for new malaria drugs is vanishingly small. Today, the most cutting-edge drugs available for the disease are based on a botanical compound called artemisinin, a two-thousand-year-old Chinese remedy. Ebola affects far fewer people than malaria but poses a much more alarming public-health threat. As of 2014, there were no drugs or vaccines available for Ebola, either. “‘Big Pharma,’” as a headline in London’s The Independent put it in 2014, “failed deadly virus’ victims.”72 That means that pathogens that prey upon the untreated poor can amplify and spread into broader populations.

  Another problem in counting on medicine to save us from emerging pathogens has to do with how new paradigms in medicine have replaced the old. Modern medicine may have no equivalent to the Hippocratic Corpus that is studied with Talmudic intensity, but its guiding philosophy is equally pervasive. Modern biomedicine’s fundamental approach to solving complex problems is to reduce them to their smallest and simplest components. In its estimation, heart disease is a problem of cholesterol molecules in the blood; human consciousness is a chemical reaction in the brain. Each minuscule component of the complex phenomena of health and disease is studied by specialized experts, usually in isolation.73

  When my doctors learned I had contracted MRSA, for example, they didn’t ponder the landscape, or my home environment, or my immune status, or the animals that lived in my house, or my diet. They targeted the bug and only the bug. MRSA existed on one side of an invisible divide; I was on the other, gun in hand.

  Modern medicine’s reductionist approach is the exact opposite of Hippocratic medicine’s approach, which was fundamentally holistic and interdisciplinary, and called upon a range of expertise to elucidate disease processes, from engineering and geography to architecture and the law.74 That is no coincidence. Germ theory, with the reductionist approach it represented, was a revolutionary new paradigm for medicine. Revolutionary new paradigms generally don’t accommodate old ones or subsume their principles and approaches. They destroy the old ideas and purge the ranks of loyalists.

  The limits of reductionism became clear to me during my struggle with MRSA. One of the worst abscesses I developed started during a summer holiday; over the course of a week it turned from a stinging pinprick into a slow-motion volcano of pus and blood that debilitated my leg to the extent that I could not comfortably walk or drive. I doused my body daily with hospital-issue disinfectant. I changed the dressing twice a day, and all my clothes, too. When the long-worn bandages started to irritate the skin under them, which had turned itchy and angrily red, I rushed to the store to search for something better, finding new “nonirritating” bandages to replace the old bandages I’d bought, now revealed as the “irritating” kind. (Who knew?)

  My overriding fear was that the MRSA-filled pus would seep into one of the new fissures opening up under the bandages and tape that secured the dressing, allowing the pathogen to establish an even deeper foothold. The words of the microbiologist echoed in my mind: He could have lost a leg.

  The MRSA basket in my bathroom expanded to become a MRSA shelf; my medical supplies to battle the microbe grew to include a messy crowd of boxes of sterile pads, tape, antibiotic cream, and drawing salve, something I’d read about online somewhere.

  The battle continued like this for years. The abscesses kept coming back, in the same mysterious places. And every time, I redoubled my antimicrobial efforts, with more boiling of cloths, more wiping of counters, more drugs, more sprays, more bleach baths to rout out the intruder.

  Finally, by year three of MRSA, I stopped fighting. Not for any good reason. I just got tired. One day a bump appeared, and though I noticed it, I could not bring myself to deal with it. I did not scratch or squeeze or apply ointment or heat or bleach. And, incredibly, it went away on its own. I didn’t revel in the triumph. I figured it was just a one-time thing. But it happened again and again. It was as if once I stopped fighting it, it lost gusto for the fight, too. The abscesses seemed to get smaller and less noticeable. If I was patient enough, in time, without prodding, without any intervention at all, they’d quietly go away on their own.

  I have no idea why this happened. Had my immune system figured out how to quell MRSA’s appetites? Had some other strain of Staphylococcus aureus in my body repressed its growth? Was it my diet or exercise regime that undermined its ability to spread? Or perhaps it had nothing to do with me at all. Perhaps my symptoms had been the result of the anti-MRSA treatments themselves, or something in my environment. Whatever happened, I suspect it had to do with more than just the microbe my doctors and I had surgically focused our ire upon. There was some kind of Hippocratic interplay going on, between internal factors and possibly external ones, too.

  Modern medicine, singularly focused on the microscopic, is poorly suited to grasp such interactions. And yet most of our new crop of pathogens similarly cross disciplinary boundaries. Pathogens in animals, studied by veterinarians, spill over into people, studied by physicians. But because the two fields rarely interact, the crossovers escape detection. Ebola virus afflicted chimps and other apes before the 2014 epidemic in West Africa. Could the human outbreak have been caught earlier had doctors and vets been collaborating all along? West Nile virus killed crows and other birds for a month before the human outbreak in New York City occurred. In that case, it was a veterinary pathologist at the Bronx Zoo who finally linked the two outbreaks and pinpointed the virus as West Nile.75 It’s not just the experts who’ve drifted apart; patients consider the two fields separate and unrelated, too. Fewer than a quarter of HIV patients ask their vets about the health risks posed to them by their pets. These risks include salmonella (carried by turtles and other reptiles), MRSA (carried by dogs and cats), and, before the importation of African rodents was banned in 2003, monkeypox from pet prairie dogs.76

  Biomedical experts rarely collaborate with social scientists. In one survey of biomedical experts, around half admitted to being “unreceptive” toward social sciences. Most of the others expressed ambivalence.77 (They mostly objected to the messiness of social science research, compared to the controlled experimentation that medicine relies on.) And so when new pathogens cause outbreaks, biomedical causes and solutions are immediately sought, while social and political factors—like John Snow’s sidelined discoveries about contaminated water in the nineteenth century—are treated as minor contributors. When West Nile virus broke out in New York City, the containment strategy revolved primarily around attacking the biomedical cause of the disease: the insect vector that carries the virus. The nonbiomedical factors, such as the loss of diversity among bird species, went unaddressed.

 
