The Fever


by Sonia Shah


  Surveillance workers skipped remote and hard-to-reach villages and took extra blood samples from more accessible residents instead. Back in the labs, there’d be two- and three-month backlogs of unexamined blood slides towering over microscopists.106

  Meanwhile, the Anopheles mosquito, having escaped the attention of antimalarial spray teams, continued to be exposed to low levels of the insecticide on DDT-doused crops, and its populations were growing increasingly tolerant of the toxin.

  DDT use in agriculture, especially in developing countries, soared. International development experts pushed for more DDT coverage on the rice and cotton fields of the developing world, convinced that this would unleash the massive harvests required to end hunger and poverty. In 1952, American chemical companies sold twenty-five million pounds of DDT overseas. Over the following decades, its exportation more than tripled.107

  Mosquitoes alit on the DDT-dusted vegetation, and in the DDT-contaminated streams and puddles, they laid their eggs. And what didn’t kill them only made them stronger.

  Solving these myriad problems required innovative, locally sensitive solutions, but with malariology all but dead, WHO was forced to dispense increasingly equivocal advice. Remembers the organization’s José Nájera, “A solution was sought in oversimplification and standardization.”108

  WHO soft-pedaled the spread of insecticide-resistant mosquitoes, urging the spray teams to continue. In 1962, it claimed that resistant mosquitoes “in no case” put the prospect of eradication “in jeopardy.”109 (The Royal Society of Tropical Medicine and Hygiene reached the opposite conclusion that same year, claiming that resistant mosquitoes were “seriously interfering with progress” in the campaign.)110 In 1970, WHO allowed that resistant mosquitoes challenged the outcome of eradication in a few countries, but called the problem “more of an inconvenience than a major obstacle.”111 In 1973, WHO advised countries to continue spraying—but to use more expensive alternative insecticides instead.112

  And the organization rubber-stamped mass medication programs. In Brazil, antimalaria leaders confiscated supplies of table salt from all commercial establishments and homes, replacing it with salt loaded with fifty milligrams of chloroquine per gram.113

  With WHO’s blessing, leaders in Angola, Cambodia, French Guiana, Ghana, Guyana, Indonesia, Iran, Irian Jaya, the Philippines, Sarawak, Suriname, and Tanzania followed suit. Millions of people around the globe, whether infected or not, regularly drugged themselves with their daily bread.114

  In most places, it took about six months of mass medication to trigger the emergence of drug-resistant parasites, sending malaria rates back up to their pre-medicated-salt levels.115

  The United States’s five-year funding commitment for the malaria eradication campaign came to a close in 1963.

  Globally, annual malaria cases had fallen from 350 million to 100 million, a historic low, but everywhere one looked, poverty and malnourishment and instability still reigned.116 Paul Russell and IDAB had claimed that malaria’s demise would lead to greater prosperity and more land in cultivation, but when WHO hired an economist to describe just how, according to the historian Randall Packard, “no one was able to provide the data he needed.”117

  What Russell and IDAB hadn’t figured on was the countereffect of rising numbers of surviving people. In Western countries, where malaria receded with the onset of industrial development, declining death rates were matched by declining birth rates. But when malaria was surgically excised from countries sprayed with DDT and other chemicals—leaving intact the unelectrified shacks and the landless peasants who lived in them—the result was quite different.

  In Sri Lanka, for example, the population grew by over 3 percent a year between 1921 and 1975; the University of Michigan public health expert Peter Newman attributed as much as 60 percent of this growth to falling death rates thanks to the malaria-eradication effort. But birth rates did not decline, and the growing population, demanding food, medicine, and education, soon outstripped the modest gains in economic growth that malaria’s decline had unleashed. As scholars pointed out, death rates had modernized, but birth rates remained ancient.

  On Sardinia, the lives of the locals lengthened, but agricultural productivity and economic production continued a steady decline that had begun in the 1940s. Wealthy outsiders rented summer residences on the newly malaria-free island, but tourism didn’t enrich the local Sards, many of whom fled the island in search of temporary jobs elsewhere.118 “Their lot in life had improved little,” writes the historian John Farley, “and only noncritical tourists wearing blinders could call what had happened progress . . . Tourists have certainly replaced mosquito vectors in Sardinia but the indigenous population remain second-class citizens.”119 The medical anthropologist Peter Brown calls the transformation of Sardinia “modernization without development.”120

  At the same time, the political calculus that set the funding stream aflow had shifted. In the United States, enthusiasm for bold chemical attacks on pestilence started to give way to fears of poisoning and overpopulation. By the 1960s, public health experts had reached a new consensus, that the most serious problem facing humanity was not excess death from disease, but just the opposite: overpopulation.121 Under the new way of thinking, less malaria didn’t mean more people to produce more food; it meant more people to eat food and use up scarce resources. Saving people from sickness just condemned them to death by starvation.122 Wasn’t malaria really a “blessing in disguise,” the naturalist William Vogt posited, since “the malaria belt is not suited for agriculture, and the disease has helped to keep man from destroying it”?123 Indeed, the United Nations’ Food and Agriculture Organization’s first world food survey, published in 1946, had laid blame for the malnourishment of over half the world’s population on the decline in mortality from infectious disease.124 As public apprehension grew, malaria eradicationists found themselves under attack for their shortsightedness.125 Critics attacked Russell as a “dangerous doctor” whose ideas were “creating problems faster than they are solving them,” as Russell put it.126

  Public enthusiasm for DDT had soured, too. The first off notes sounded a few years after DDT’s public launch, when the USDA admitted that the nation’s milk had been tainted with the toxin. It turned out that many creatures that had at first seemed impervious to DDT had actually absorbed and stored minuscule amounts of it in their fat tissues. So long as fat tissue keeps DDT in stasis, it’s a safe enough locale for the compound. But with a half-life of eight years, fat-ensconced DDT lasted long enough to persist throughout the food chain, and each creature that ate another received a full complement of its prey’s lifetime stores of DDT. Some creatures high on the food chain accrued dangerous concentrations of the stuff in their bodies.127 Robins, for example, had been eliminated entirely from a 185-acre plot at Michigan State University, thanks to DDT spraying against elm bark beetles and mosquitoes. The earthworms fed on the fallen leaves of the sprayed trees, and when robins ate the worms the following spring, they accumulated enough DDT to kill them, or to stymie reproduction for two years in those few who survived.128 Cows that fed on DDT-dusted crops stored the DDT in their fat tissue, and secreted it into the milk that scores of American children poured over their morning bowls of Cheerios.129

  By 1955, the nation had been coated with so much DDT that the typical American diet delivered 184 micrograms of DDT—.0002 of a lethal dose—every day.130 That probably would have been unsettling enough. But that wasn’t all. As the cold war heated up, so, too, did American and Soviet testing of nuclear weapons, releasing invisible clouds of radioactive material into the stratosphere. By 1956, Newsweek magazine fretted, there was sufficient strontium 90 up there “to doom countless of the world’s children to inescapable and incurable cancer.” In 1959, Consumer Reports reported on a major study of strontium 90 concentrations in milk, and the film On the Beach terrorized mass audiences with its dark vision of a post–nuclear holocaust world.131

  Fears of DDT raining down upon the nation emulsified with larger fears of mass poisoning by secret, invisible toxins. Nobody knew if bioaccumulated DDT actually posed any threat to human health,132 but noting the dead birds rotting on their lawns, many Americans couldn’t help but wonder: Would humans be next? After the publication of the biologist Rachel Carson’s potent 1962 book, Silent Spring, which pointed out the folly of using widespread pesticides without a solid understanding of their health and environmental impacts, then-president Kennedy convened a committee that recommended, over the aggrieved howls of the chemical companies, that the government phase out the use of their iconic DDT and other similar compounds.133

  The United States’s five-year allocation for the global DDT blitz against malaria ran dry a few months later. Nobody asked for any more.

  With the abrupt end of the U.S. contribution, funds to the WHO special account for the malaria-eradication program “stopped cold,” writes the historian James Webb.134 USAID formally withdrew from the program; UNICEF halved its malaria staff.135 For many countries, burdened by the expense and trouble of the campaign, this was just the excuse they needed. Soon some national governments were spending more on garbage collection.136

  Malaria resurged.

  From a low of 18 cases in 1963, malaria swept over Sri Lanka, sickening more than 500,000 in 1969.137 Over roughly the same period the caseload in India zoomed from 50,000 to more than 1 million138; in Central America, from 70,000 to nearly 120,000139; in Afghanistan, from 2,300 to 20,000.140 A year after Europe was declared free of malaria in 1975, a two-year malaria epidemic roiled Turkey.141

  At its 1969 meeting, the World Health Assembly directed WHO to abandon the eradication effort.142 The “dramatic recrudescence” of malaria “will not be possible to stop” without new national commitments, nowhere to be seen.143 WHO dispatched teams to visit malaria programs around the world, advising them to switch from trying to eradicate malaria to learning to live with it.144 In 1974, PAHO jumped ship, too.145

  Paul Russell and Fred Soper were devastated. Russell avoided the subject altogether, remembers Andrew Spielman, “withdrawing from contact with students and faculty” at Harvard. Soper pretended it hadn’t happened. When he wrote his memoirs, he made “no significant mention” whatsoever of the campaign he’d helped inspire.146

  Between 1957 and 1967, the war against Plasmodium cost $1.4 billion, or about $9 billion in 2009 dollars.147, 148 To this day, malariologists and historians disagree over whether it was worth it. The optimists point out that, in just over a dozen years, malaria had been lifted from the shoulders of 32 percent of the human population, which is no small thing. And the first baby steps toward a public health infrastructure had been taken in some of the most remote corners of the planet. The maps that malaria-eradication teams created, for example, had proven crucial to the success of WHO’s smallpox-eradication campaign, doubtlessly a high point for global public health.149

  Where some see incremental progress, others see what WHO’s Tibor Lepes described as “one of the greatest mistakes ever made in public health.”150 Malaria had been eradicated from just eighteen countries in the world, all of them either prosperous, socialist, or island nations.151 That left some two billion souls still burdened with malaria.

  And the malaria that stalked them was in almost every way more vicious and harder to control than it had been before the eradication effort.152 Our best and cheapest weapons—chloroquine and DDT—had been rendered toothless. By the early 1960s, Plasmodium falciparum parasites resistant to chloroquine distributed in table salt had emerged in Colombia, Brazil, Venezuela, and Thailand.153 Chloroquine-resistant P. falciparum arrived in Kenya and Tanzania by 1978, and spread throughout the continent within a decade.154 Around the globe, thirty-eight species of Anopheles mosquitoes had developed resistance to either DDT or its cousin compound dieldrin.155

  Worse, the shallow reservoir of public attention and political will to fight malaria had been spent. In 1979, WHO announced its triumphal success in wiping smallpox off the face of the earth,156 and reoriented itself away from attacking specific diseases and toward a commitment to providing basic health care to the masses instead.157 The worldwide malaria-eradication campaign faded away to a soon-forgotten footnote. Not even WHO, which stopped using precise measures of malaria, bothered seriously tracking the disease anymore.158

  Some experts blamed the failure of the campaign on pesticide-heavy agriculture, which sped up insecticide resistance. Two Columbia University public health experts, for example, called malaria’s post-campaign resurgence in India a “social cost” of growing high-yield crops.159 Others blamed WHO for its lack of sensible leadership, particularly in the years after U.S. funding dried up. “All logic went out the window,” complains Spielman.160 Thanks to the skyrocketing price of oil, upon which its production depended, DDT’s price had spiked near the end.161 What if it hadn’t? Russell blamed people in general. “Resistant strains of Homo sapiens” were the problem, he wrote, “impatient bureaucrats” and “deans of schools of public health,” with their trendy ideas about social medicine.162

  If nothing else, the failure of the spray-gun war showed the folly of treating malaria as a single disease with a single solution. For when the war fell apart, there were a thousand different reasons why.163

  In many places today, the spray teams, surveillance workers, and microscopists first put to work on eradicating malaria continue to go through the motions, shadows of the long-dead program. Properly controlling malaria, as opposed to attempting to exterminate it, requires different kinds of workers with different skills, but in many places government leaders feared the political fallout from firing so many eradication workers. So they continue to spray, a little, and collect blood slides, at least a few. But with minimal financing and less oversight, there’s no sensible purpose to it.

  When chloroquine-resistant P. falciparum arrived in Chepo, Panama’s Kuna village, the eradication-era workers put in their ghostly appearances: a lone, sporadically paid guy from the country’s Vector Control Agency, perhaps, and the occasional sprayman with a canister of insecticide making a few rounds. It’s a tepid response that makes little sense given malaria’s changing epidemiology or Panama’s limited budget. In 1996, Panama collapsed its antimalarial service into the general Health Department, as per WHO’s advice. The government has only nineteen cents per capita to spend to tackle malaria. Now when the disease breaks out, they just flex a tired old muscle. The people get bitten, the sprayers head out. The mosquitoes come back. It starts over again.164

  10. THE SECRET IN THE MOSQUITO

  After the ignominious failure of the 1950s-era DDT blitz, malaria disappeared from the headlines. Books on the topic went out of print. Scientists stopped studying the disease; educators stopped teaching it. So completely did malaria vanish from the public mind that many people in the West grew up thinking that there was, literally, no more malaria in this world.

  Take Lance Laifer, a hedge-fund manager turned antimalaria organizer. He knew nothing of malaria until he happened to catch a television program on it in 2005. “I didn’t know it still existed,” he later told The Wall Street Journal. “I didn’t know it was still killing people. I thought it was eradicated a long time ago. I was just flabbergasted.”1

  In fact, over the course of Laifer’s lifetime, the problem of malaria had worsened considerably, especially in Africa.

  Although the planet’s malarial heartland, sub-Saharan Africa, was excluded from the global malaria-eradication program of the 1950s and 1960s, eradicationists hadn’t given up on the continent altogether. In 1960, WHO decided to launch what it called a “pre-eradication” campaign in Africa, a series of demonstration projects and studies that would lead to a viable eradication strategy for the continent. The plan, WHO said, would eradicate malaria from Africa by 1979.2

  Eradication projects commenced in Liberia, Cameroon, Uganda, and, most extensively, in Nigeria. But it couldn’t be done. Even in Nigeria, after six years of near-perfect coverage with proven-effective chemicals, malaria hung on. The evidence appeared irrefutable. Even with the best, most effective tools—and no matter how much money was spent—malaria parasites would always find succor on the African savannah, a stronghold from which to perennially infect and reinfect the rest of the continent.

  With eradication off the table, public health experts had little else to recommend to fight Plasmodium. The fractious dissolution of the global eradication program had left the antimalaria community underfinanced and roiled with misgivings. The prospects of even reducing malaria seemed challenging, given how compromised both DDT and chloroquine, the two cheapest and most effective antimalaria weapons, had become. Chloroquine-resistant parasites and DDT-resistant mosquitoes, grizzled survivors of the eradication era, lurked across the continent. And DDT in particular had become anathema. The United States banned it in 1972 (by then the chemical industry had moved on to more lucrative and often more toxic insecticides) and geared up to push for an international ban as well. When that happened in 2000, many African governments banned DDT, too, despite a last-minute exclusion for its use against public health threats. Few public health leaders stuck their necks out for the internationally despised chemical.3

  That’s not to say there was nothing that could be done about malaria in Africa. Even without DDT and chloroquine, there were many possible strategies and tools that could tame the scourge, from leveling roads and providing electricity and safe water, to improving housing and installing mosquito-proof window screening, to strengthening healthcare systems and heightening public awareness. Mosquito habitats could be minimized, managed, and avoided. Simply paying local clinicians more money, or beefing up a local health clinic, could have helped lighten the malarial burden, as the health policy expert Anne Mills has pointed out.

 
