The Coming Plague

by Laurie Garrett


  By 1977 Amin’s government had committed so many atrocities both domestically and against its neighbors that the Western powers and Soviet Union had terminated diplomatic and trade relations. In response to British condemnation of Amin’s human rights practices, said to include wholesale rape of women nationwide, as well as summary executions of tens of thousands of citizens of all ages, the dictator personally executed Anglican archbishop Luwum in front of hundreds of witnesses and television cameras.

  “Thousands of innocent Ugandans have been floating in the river Nile in what the dictator and butcher Amin calls accidents,” charged Radio Tanzania on the day of Archbishop Luwum’s execution. “If black African states condemn white minority rule [in South Africa and Rhodesia], they must also condemn atrocities committed in black-ruled states.”

  By early 1978, according to the International Commission of Jurists in Geneva, Amin had summarily executed some 100,000 of his citizens; the trade agreement of the East African Community of Uganda, Kenya, and Tanzania had been formally dissolved; and both Kenya and Tanzania were placed on war-readiness status.

  In October 1978, Amin’s troops invaded the Kagera District in northern Tanzania. A pastoral area lining the western shores of Lake Victoria, Kagera had no industry, only one small city (Bukoba), a scattering of hamlets, and no ability to defend itself against Amin’s marauding forces. The Ugandan Air Force softened up the rolling verdant hillsides of Kagera with bombing raids. Troops followed, laying waste to the thatched huts and wattle structures of villages from one end of the district to another. For two months Amin’s troops occupied a 700-square-mile area of Tanzania, killing hundreds of peasants, practicing deliberate rape of the women that was intended to humiliate their men, slaughtering most of the region’s livestock, and driving some 40,000 peasants into exile.

  Nyerere appealed for support from the Organization of African Unity (OAU) and the United Nations. None was forthcoming.

  In December 1978, Tanzanian troops went to war with Uganda, fighting over the Kagera region for two months. Having beaten back Amin’s troops, the Tanzanians pushed on toward the capital, Kampala.

  On April 11, 1979, the Amin government was toppled. Idi Amin went into exile in Libya, and Tanzania put Lule in power.

  The five-month war between Tanzania and Uganda—which was puny by international standards—devastated the infrastructures of Uganda and northern Tanzania, and left the economies of both nations in a shambles. The combined impact of war and previous years of Amin’s wantonness left Uganda in need of $2.3 billion in emergency reconstruction aid. It hurt Kenya’s coffee trade, which had relied in part on Ugandan beans. And for the tiny, landlocked nations of Burundi and Rwanda it brought all trade to a standstill.31

  When Lule’s staff took over the national bank, they discovered that Uganda was $250 million in debt to foreign interests, and less than $200,000 could be found in the nation’s coffers. During his reign, Amin simply printed more money whenever resources dwindled, causing inflation to run at 200 percent a year. Prior to the war, gasoline sold in Kampala for $39 a gallon, housing rents increased 41 percent in a single year, and per capita income plummeted.32

  Well before the war erupted, most health professionals who could manage to do so had fled the country, and the severe economic difficulties created by the Amin government prompted wholesale looting of all undefended facilities.

  Widespread famine followed the end of the war, claiming at least 50,000 lives. Wildlife conservation groups throughout the world protested as starving Ugandans slaughtered and consumed elephants, hippos, elands, giraffes, monkeys, and other animals by the thousands.

  Between 1975 and 1980, Uganda, its entire health infrastructure devastated, experienced epidemics of malaria, leprosy, tuberculosis, cholera, visceral leishmaniasis (kala-azar), and virtually every vector-borne ailment known to the continent.33 A French team found evidence of more exotic diseases as well, when they took blood surveys of villagers in western Uganda. Ebola, Marburg, Lassa, West Nile fever, Crimean-Congo hemorrhagic fever, and Chikungunya were among the viruses found in the blood of the region’s populace.34

  Between 1971 and 1977, Uganda had its worst measles epidemic in over forty years, with high death rates among children seen all over the country. So great was the country’s chaos that no agency kept count of the death toll. Gonorrhea soared during the Amin years, particularly among soldiers. Because the country was bereft of antibiotics, most cases went untreated. Routine vaccination for such diseases as whooping cough and tetanus came to a halt, and the incidence of these diseases rose dramatically.

  Starving, sick refugees poured by the tens of thousands across borders to Zaire and Sudan, taking their diseases with them.

  Makerere University, which had been the primary medical training center for East Africa’s doctors, was looted right down to its electrical sockets and bathroom tiles. By the end of the 1970s, the nation of Uganda would be completely out of toilet paper, antibiotics, aspirin, sterilizers, cotton wool, bed linens, soap, clean water, light bulbs, suturing equipment, and surgical gowns.35 Rumors of strange disease outbreaks were rampant, but there was nobody left to investigate these claims.

  Such tragic events, with the resultant epidemics and health crises, were mirrored all over the world. From Pol Pot’s reign of terror in Cambodia to the Cold War-manipulated battlefields of Central America, the world’s poorest countries spent extraordinary amounts of money on domestic military operations and warfare. And the microbes exploited the war-ravaged ecologies, surging into periodic epidemics.

  The World Health Organization, with a staff of only 1,300 people and a budget smaller than that spent on street cleaning every year by the city of New York, tried to combat such seemingly intractable public health problems with donated vaccines, technical assistance, and policy statements.36

  On September 12, 1978, WHO convened a meeting of ministers of health from over 130 nations in Alma-Ata37 in the U.S.S.R. The conference issued what would be hailed years later as a pivotal document in the international public health movement: the Declaration of Alma-Ata. Inspired in part by U.S. Surgeon General Julius Richmond’s Health Goals 1990, which in 1975 systematically outlined the status of Americans’ health and set goals for improvement, the Alma-Ata Declaration called for “the attainment by all peoples of the world by the year 2000 of a level of health that will permit them to lead a socially and economically productive life.”

  The ten-point Alma-Ata Declaration defined health as “a state of complete physical, mental, and social well-being, not merely the absence of disease or infirmity,” and declared it “a fundamental human right.” It decried health care inequities, linked human health to economic development, and called upon the governments of the world to develop financially and geographically accessible primary health care facilities for all their people.

  Declaring health a human right forced issues of disease control onto the newly powerful agenda of global civil liberties. In 1976 the UN General Assembly voted to enter into force the International Covenant on Civil and Political Rights.38 It was the strongest vilification of tyranny, discrimination, violations of basic freedoms, and injustice ever passed by the UN. Also that year the UN passed the International Covenant on Economic, Social, and Cultural Rights,39 which specifically recognized “the right of everyone to the enjoyment of the highest attainable standard of physical and mental health.”

  John Evans of the World Bank elucidated three key demarcations in health problems that he felt were tied to the economic development and status of each nation: the infectious disease stage, the mixed phase, and the chronic disease stage. In the poorest, least developed nations of the world, the majority of the population suffered illness and death due to communicable and vector-borne diseases. With improvements in economic development, Evans said, came a painful period of mixing, in which the poorer members of society succumbed to infectious diseases while the wealthier urban residents lived longer, disease-free lives that were eventually cut short by chronic ailments such as cancer and heart disease.

  In the most developed nations, Evans argued, infectious diseases ceased being life-threatening, some disappeared entirely, and the population generally lived into its seventh decade, succumbing to cancer or heart disease. The bottom line, from Evans’s perspective, was that infectious diseases would no longer pose a significant threat to postindustrial societies.

  “We must never cease being vigilant,” Richmond said, “but it is altogether proper to shift resources towards prevention of chronic diseases. With political will, tremendous strides can be made.”

  Though the World Bank perspective informed most long-term planning, there were voices within the academic public health community who loudly questioned the three-phase assumptions. While not disputing that curative medicine had made genuine strides, particularly since the 1940s, and agreeing that control of disease was linked to societal wealth, they rejected the idea that there might be a direct correlation between stages of national development and individual disease. In their view, the ecology of disease was far more complex, and waves of microbial pestilence could easily occur in countries with enormous gross national products. Conversely, well-managed poor countries could well control pestilence in their populations.

  The debate centered on a two-part question: when and why did most infectious diseases disappear from Western Europe, and what relevance did that set of events have for improving health in the poorest nations in the last quarter of the twentieth century?

  University of Chicago historian William H. McNeill spent the early 1970s studying the impact epidemics had on human history since the beginning of recorded time, and then reversed his query to ask which human activities had prompted the emergence of the microbes. In 1976, his book Plagues and Peoples40 created a sensation in academic circles because it argued with the force of centuries of historical evidence that human beings had always had a dramatic reciprocal relationship with microbes. In a sense, McNeill challenged fellow humans to view themselves as smart animals swimming in a microbial sea—an ecology they could not see, but one that most assuredly influenced the course of human events.

  Like Evans, McNeill saw stages over time in human relations with the microbes, but he linked them not so much to economic development as to the nature at any given moment of the ecology of a society. He argued that waterborne parasitic diseases dominated the human ecology when people invented irrigation farming. Global trade routes facilitated the spread of bacterial diseases, such as plague. The creation of cities led to an enormous increase in human-to-human contact, allowing for the spread of sexually transmitted diseases and respiratory viruses.

  Over the long course of history, McNeill said, pathogenic microbes sought stability in their relationships with hosts. It was not to their advantage to wipe out millions of nonimmune human beings in a single decade, as happened to Amerindians following the arrival of Columbus and Cortez. With the Europeans came microbes to which the residents of the Americas had no natural immunity, and McNeill estimated, “Overall, the disaster to Amerindian populations assumed a scale that is hard for us to imagine. Ratios of 20:1 or even 25:1 between pre-Columbian populations and the bottoming-out point in Amerindian population curves seem more or less correct.”41

  This was not an ideal state for the microbes, he argued, because such massive death left few hosts to parasitize. After centuries of doing battle with one another, humans and most parasites had settled into a coexistence that, if not comfortable for humanity, he argued, was rarely a cause of mass destruction. Still, he sternly warned, “no enduring and stable pattern has emerged that will insure the world against locally if not globally destructive macroparasitic excesses.”

  Other historians of disease had tried to link the emergence of epidemics to the social and ecological conditions of human beings,42 but none had presented as lucid an argument as McNeill’s, and it promoted widespread reappraisal of both historic events and contemporary public health policy.

  Nobel laureate Sir Macfarlane Burnet was moved from his perspective as an immunologist to issue similar warnings about humanity’s overconfidence. True, he said, vaccines and antibiotics had rendered most infectious diseases of the Northern Hemisphere controllable. But, he cautioned, “it is almost an axiom that action for short-term human benefit will sooner or later bring long-term ecological or social problems which demand unacceptable effort and expense for their solution. Nature has always seemed to be working for a climax state, a provisionally stable ecosystem, reached by natural forces, and when we attempt to remold any such ecosystem, we must remember that Nature is working against us.”43

  The policy implications were clear, Burnet said. Start by looking at the ecological setting of disease transmission. If the ecology could be manipulated without creating some untoward secondary environmental impact, the microbe could be controlled, even eradicated.

  René Dubos, who served in the 1970s as a sort of elderly patron saint of disease ecology because of his vast contributions to research on antibiotics and tuberculosis during the pre-World War II period, also favored an ecological perspective of disease emergence, but laid most of the blame for epidemics on Homo sapiens rather than on the microbes. In Dubos’s view, most contagious disease grew out of conditions of social despair inflicted by one class of human beings upon another. Dubos believed tuberculosis, in particular, arose from the social conditions of the poor during Europe’s Industrial Revolution: urban crowding, undernutrition, long work hours, child labor, and lack of fresh air and sunshine.

  “Tuberculosis was, in effect, the social disease of the nineteenth century, perhaps the first penalty that capitalistic society had to pay for the ruthless exploitation of labor,” Dubos argued.44

  For Dubos, unbridled modernization could be the enemy of the poor, bringing development and freedom from disease to the elites of societies, but consigning their impoverished citizens—particularly those living in urban squalor—to lives of microbial torture.

  “The greatest strides in health improvement have been achieved in the field of disease that responded to social and economic reforms after industrialization,” he wrote.45 He strongly felt that infectious diseases remained a major threat to humanity, even in the wealthy nations, and warned physicians not to be fooled into complacency by what he termed “the mirage of health.”

  At the University of Birmingham in England, Thomas McKeown led a team of researchers who reached the conclusion that rapid urbanization, coupled with malnutrition, was the key factor responsible for the great epidemics of England and Wales from medieval times to the beginning of the twentieth century. Conversely, McKeown credited improvements in access to nutritious food for England’s lower classes with at least half the reduction in premature mortality in the country between 1901 and 1971, and insisted that the bulk of all improvements in survival preceded the advent of modern curative medicine.46 McKeown based his assertions on a meticulous scanning of English and Welsh government medical records maintained over the period, which indicated that premature mortality rates decreased radically before the age of antibiotics.

  Joe McCormick had heard it all, argued one position or another over beers with CDC colleagues, and recognized grains of truth scattered through each position, from the World Bank to the angry socialist dependency theorists. But all the hand-wringing and theorizing wasn’t going to provide the resources needed to get rid of Lassa.

  For nearly three years he had been tramping around West African villages testing residents and rats for Lassa virus infection. By 1979 McCormick had reached the conclusion that Lassa was an entrenched endemic disease, causing thousands of cases of illness of varying degrees of severity each year. The only way to rid Sierra Leone of human Lassa cases would be to eliminate contact between the rats and humans—an option he considered doable if millions of dollars were spent improving the country’s rural housing and hospitals.

  The alternative was mass education about rat avoidance and ribavirin therapy for those who suffered Lassa fever. That prospect was also orders of magnitude too expensive for the impoverished state.

  In late June 1979, McCormick returned to CDC headquarters to take over Karl Johnson’s job as chief of the Special Pathogens Branch, leaving Webb in charge of the Sierra Leone laboratory. For many years to come, he would return to the West African country to further study the Lassa virus, hoping to find ways to limit the impact of the disease on the developing countries of West Africa.

  Shortly after his return to Atlanta, the World Health Organization called to formally request McCormick’s assistance in investigating a suspect epidemic in Sudan. It was believed that Ebola was the culprit.

  According to Sudanese epidemiologist Osman Zubeir, the outbreak began sometime in early August in N’zara, spread quickly, and was still raging when he notified WHO in mid-September. Zubeir placed the area under quarantine and was preparing a surveillance effort.

  McCormick hastily gathered supplies and the first assistant he could get his hands on—a new EIS officer, Dr. Roy Baron. Within a matter of hours, the pair were on board a flight to Khartoum, and McCormick was giving Baron a rapid-fire lesson on Ebola, Sudan, field operations, and self-protection.

  Joe tugged at his dark brown goatee with anticipatory excitement, relishing a second chance to crack the mysteries of Ebola. McCormick showed Baron the only available maps of the region, made in 1955. He described the difficulty of finding villages, which were deliberately hidden in the ten-foot-tall Sudan grass and swamps.
