The Coming Plague

by Laurie Garrett


  “This approach had certain notable successes,” he said in a key policy address:

  During the last quarter century, per capita GNP growth in developing countries has averaged three percent—nearly the same growth rate as the rich countries. Average life expectancy in developing countries has increased from 35 years to 50 years—the level attained in Western Europe only at the beginning of the twentieth century … . Some developing countries have achieved such high rates of growth that our grant aid to them has ended, and our principal form of economic interaction with them is now largely in trade and private investment.

  These overall gains, however, have masked a crucial fact: that while some developing countries have achieved dramatic per capita GNP growth—some at rates of over 7 percent—many others have made very little progress. These averages also conceal wide differences in the extent to which various groups within the poor countries have benefitted from development. For in most less developed countries, the so-called modern sector of urban areas and large farms have been the major beneficiaries of growth, while the urban and rural poor—whose numbers have been rapidly increasing and who form the majority in most developing countries—have generally been left behind.13

  The World Bank didn’t begin to view health care as a specific part of its mission until 1975, when its Health Sector successfully argued that trickle-down modernization would never adequately remedy the acute needs of the poorest of the poor. Between 1975 and 1978 the World Bank gave loans or provided technical assistance for seventy health-related projects in forty-four countries, emerging as the world’s biggest health lender. During that three-year period, the World Bank loaned poor countries $400 million for primary health care facilities and mosquito control; $160 million for family planning and nutrition projects; and $3.9 billion for water sanitation efforts.14

  At the close of the decade, the World Bank again assessed its efforts, deciding to shift policy further toward financing the development of primary health care infrastructures for, among other things, “promotion of proper nutrition, provision of maternal and child health care, including family planning, prevention and control of endemic and epidemic diseases.”15

  As the twentieth century drew to a close, the majority of the world’s population still suffered and died from diseases due to unclean water.16 During the 1970s one out of every four people on earth suffered diseases due to roundworms, acquired from polluted waters or foods. A World Bank study found that 85 percent of the residents of Java had hookworm. Some 1.7 billion people annually suffered some additional parasitic infection acquired from polluted water, according to WHO.17

  Sometimes a major water development project could directly increase the incidence of disease by changing the local ecology in ways that were advantageous to the microbes. The most often cited example of this was the Aswan High Dam, with its apparent association with an increased incidence of schistosomiasis.18

  Schistosomes are parasitic organisms with a complex life cycle in which, at different stages of the organism’s development, the creature grows inside snails, on the surface of freshwater plants, and inside human beings. Its eggs are excreted via human waste into water supplies and are taken up by riverbank and lakeside snails. Inside the snails the eggs hatch and the organisms advance into the larval stage. Those larvae are excreted by the snails back into the lake or river, where they come to rest on the stems and leaves of underwater plants, usually along banks. People who bathe, play, or work in the watery area brush against these plants, and the larvae readily pass through their skin into the bloodstream.

  Depending on which species of schistosome is involved (Schistosoma japonicum, S. haematobium, S. mekongi, S. mansoni, or S. intercalatum), the larvae make their way into the human liver, spleen, urinary tract, kidney, rectum, or colon, where they grow into worms. The worms may remain indefinitely, shedding their eggs, which the human host then passes on into water supplies, repeating the cycle.

  The worms can produce an enormous range of illnesses in people, from minor local skin infections and virtually unnoticeable mild fatigue to life-threatening heart disease, epilepsy, kidney failure, and malignant cancer in the organs in which they reside. Because the range of symptoms is so vast, it is virtually impossible to say with certainty how many people in an endemic area have schistosomiasis: indeed, the definition of schistosomiasis has always been a matter of dispute.

  Given the uncertainties inherent in schistosomiasis diagnosis, it was always difficult to prove specific trends in the incidence of the disease. Nevertheless, there was scientific agreement that the enormous Aswan High Dam radically changed the ecology of the Nile, slowing the flow of the once uncontrolled river, preventing annual floods, and creating the huge Lake Nasser. And those changes prompted shifts in the schistosome population.

  For millennia nearly every Egyptian had lived in close proximity to the Nile, the rest of the country being largely desert, so the potential for human exposure to any changed disease risk along the river was very high. Yet at no stage of the 1950s planning or construction of the Aswan High Dam was the ecology of human disease taken into consideration by the Egyptian authorities, the Western financial interests that initiated the project, or the Soviet government, which, with much fanfare, completed the dam.

  The slowing of the Nile flow rates caused a marked shift in the types of schistosome species prevalent in Egypt, from S. haematobium to S. mansoni. For the Egyptian people this meant a shift from organisms that primarily attacked young children, mostly producing urinary tract disorders, to organisms that targeted young adults, causing often severe disorders of the spleen, liver, circulatory system, colon, and central nervous system.19 Similar shifts in schistosome populations and human disease followed construction of the Sennar Dam in Sudan and the Akosombo Dam in Ghana.

  The Aswan High Dam’s impact on schistosomiasis was questioned by some because there was a lack of sound comparative data on the incidence of the disease in Egypt prior to construction. But there was an additional reason to challenge the wisdom of building massive water projects without first assessing their potential health impact: Rift Valley fever.

  Carried by mosquitoes (Aedes pseudoscutellaris), Rift Valley fever was, prior to 1977, considered largely a veterinary disease, one that primarily attacked bovine and ovine livestock, though sporadic cases among ranchers were seen. The virus was first noticed in 1930, when an outbreak of spontaneous abortions, stillbirths, and adult die-off occurred among sheep and cattle in Kenya,20 and Rift Valley fever epidemics subsequently occurred throughout Africa wherever European livestock species, which had no immunity to the virus, were introduced to the continent.21

  The virus produced hemorrhagic disease similar to yellow fever, with marked lethal effects on developing fetuses and newborns. In nonimmune animals its impact could be devastating: intravenous injections of minute quantities of the virus into laboratory mice produced death in less than six hours in 100 percent of the test animals.22

  In 1977, six years after completion of the Aswan Dam, James Meegan and his colleagues with the U.S. Navy Medical Research Unit based in Egypt proved that a widespread human epidemic in the Aswan area was due to Rift Valley fever. Over 200,000 people fell ill, 598 died of hemorrhagic disease, and livestock losses were so great that the country experienced severe meat shortages.23 The scientists concluded that the epidemic began as an isolated outbreak among livestock in northern Sudan, but spread—either via human migration or wind-carried mosquitoes—to Aswan. Once in Aswan, the infected mosquitoes thrived in the 800,000 hectares of dam-created floodlands. The disease had never previously been seen in Egypt.

  Similar dam-related epidemics of Rift Valley fever would occur during the 1980s in Mauritania, Senegal, and Madagascar, and in the 1990s the disease would revisit Aswan, causing a severe epidemic.24

  By the mid-1980s major donor groups, particularly the World Bank, would acknowledge the health care downside to dam construction and instruct applicants for major water project funding to submit disease impact studies as part of their project proposal. In all cases, however, it would be decided that the benefits to society of hydroelectricity and flood control far outweighed the disease potential, particularly if steps were taken to improve local primary health infrastructures.

  By 1980 the World Bank would conclude, belatedly, that the worldwide malaria eradication campaign had failed, noting that cases of the disease had increased an astonishing 230 percent on the Indian subcontinent over a mere four years’ time (1972–76). Most other vector-borne diseases, just a decade earlier considered easy to eliminate, had experienced “a startling increase in their incidence over the last decade.”25 Sleeping sickness (trypanosomiasis), bilharzia (schistosomiasis), river blindness (onchocerciasis), and Chagas’ disease were all increasing in frequency, often in the very countries that had, over that period, received billions of donated and loaned U.S. dollars.

  Something was clearly amiss. The world’s leading agencies were forced to retreat from the grand optimism of the fifties and sixties. Explanations had to be found, blame fixed, solutions suggested.

  By the end of the 1970s the World Bank’s solution was to urge poor nations to spend more on primary health care and disease prevention. This was done mostly through persuasion; the World Bank suggested, for example, that “because of the emotional appeal of health issues, it may be politically attractive to redistribute welfare through government provision of health care.”26

  Reaching U.S. health care expenditure levels, even as a function of per capita annual spending, would, however, represent an extraordinary feat for most of the world’s poor nations. According to the Carter administration, in 1976 in the United States there was a 1:600 ratio of physicians to the general population; virtually 100 percent of drinking water supplies were considered free from infectious disease; people consumed, on average, 133 percent of their minimum caloric need every day; 99 percent of adults were literate; and 3.3 percent of GNP was directed toward health care spending, for a per capita rate of $259.

  In contrast, Tanzania had one physician for every 18,490 citizens; safe drinking water was available to less than 40 percent of the population; the average citizen consumed only 86 percent of the minimum daily caloric need; 34 percent of the adult population was illiterate; and the government spent 1.9 percent of its GNP on health care, for a total of $3 annually per capita. Even if Tanzania doubled the percentage of its GNP devoted to health care, reaching U.S. percentage levels, it would still be spending less than $10 a year on each of its citizens. To reach U.S. annual expenditure rates of $259 per citizen, the Tanzanian government would have to strip funding from nearly every other program in its budget.27

  “It is stupid to rely on money as the major instrument of development when we know only too well that our country is poor,” Tanzania’s one-party state proclaimed in its historic Arusha Declaration of 1967. “It is equally stupid, indeed it is even more stupid, for us to imagine that we shall rid ourselves of our poverty through foreign financial assistance rather than our own financial resources.”

  Tanzania sought to create an infrastructure of modestly trained paramedics who worked out of tiny concrete or wattle clinics dispersed throughout the villages inhabited by most of the nation’s ten million citizens. Between 1967 and 1976, the Tanzanian Mtu ni Afya and Chakula ni Uhai village health campaigns increased the number of maternal/child health clinics by 610 percent and the number of rural paramedics by 470 percent, and built 110 new medical facilities (for a total of 152 clinic structures nationwide by 1976). Life expectancy over that time increased seven years, reaching 47 (compared to 70 in Europe in 1976). Infant mortality also showed modest improvement, decreasing to 152:1,000 babies, compared to a 1967 level of 161:1,000 (with 1976 European infant mortality at 20:1,000).28

  Recognizing its acute need for physicians, the government built Muhimbili Medical School in Dar es Salaam and sent many bright young Tanzanians overseas for medical training, hoping to increase its national physician population by about 65 doctors a year. By 1975 the paramedic-to-patient ratio was 1:454, but the physician-to-patient ratio had actually worsened, in part due to anti-Asian bigotry. Many of East Africa’s best-educated residents were Indians, brought decades earlier as indentured labor by British colonialists in need of a literate bureaucratic class. In 1972 Uganda’s dictator, Idi Amin (whose proclaimed hero was Adolf Hitler), ordered all Asians, numbering some 50,000 to 80,000, to leave the country immediately or face execution. No hue and cry of protest was raised by any other African government. Thousands of Indians, most of whom had spent all their lives in East Africa, fled not only Uganda but the continent as a whole.29

  Though such problems plagued all the poor nations on the planet, they were particularly acute in Africa because of its severe political and military instability. Nowhere else in the world were governments so recently freed from centuries of European colonialism. The Portuguese colonies of Guinea-Bissau, Angola, Mozambique, and Cape Verde only gained independence in the mid-1970s, after more than a decade of bloody civil war. In the southern part of the continent, warfare and instability would persist until the fates of Rhodesia, South Africa, Angola, and Southwest Africa were decided.

  CENTRAL EAST AFRICA

  To the north of those countries (which would eventually be named Zimbabwe, South Africa, Angola, and Namibia, respectively), lay a string of majority-ruled independent states sworn to boycott the still white-ruled southern states and support their various liberation movements. The Frontline States, as they were called, included Tanzania, Zambia, Mozambique, and, to a less militant degree, Lesotho and Botswana. Guerrilla troops representing the future governments of the region freely moved inside the Frontline States, and Lusaka was a sort of command post for SWAPO (South-West Africa People’s Organization), ZAPU (Zimbabwe African People’s Union), ZANU (Zimbabwe African National Union), and South Africa’s ANC (African National Congress). Political exiles from the troubled south poured into the Frontline States, exacerbating their already acute economic difficulties. Furthermore, trade was severely impaired by the states’ self-imposed boycott of South African ports and markets.

  Elsewhere on the continent, civil instability was legion. Mobutu brutally smashed all dissent within Zaire. Self-appointed Emperor Bokassa ruled the Central African Republic with such brutality that he would eventually be overthrown by French paratroopers and tried for cannibalism and genocide. In an alleged anti-corruption cleanup campaign, junior elements of the military violently seized power in Ghana. Civil unrest due to religious and tribal disputes raged through Sudan, Morocco, Ethiopia, Mauritania, Angola, and Rwanda. Much of the warfare stemmed from the artificial national boundaries drawn by the colonial powers in the nineteenth and early twentieth centuries, dividing ancient tribal lands, extended families, and traditional power structures.

  The superpowers, as well as the People’s Republic of China, sought to manipulate these seemingly endless battles, hoping to align African governments with either the United States, the U.S.S.R., or China. As a result, obscene amounts of money were spent on the military and police forces of impoverished countries, squandered by dictators who made “gifts” to their nations’ power elites in exchange for support, or wired to the bank accounts of arms dealers worldwide.

  Clearly, those funds were not spent on health care. Consider the examples of Tanzania and Uganda.

  In 1979 Tanzania was celebrating its recent military victory over Uganda. Though the world’s seventh pandemic30 of cholera had struck Dar es Salaam and the lethal Vibrio bacteria coursed through the open sewer lines that crisscrossed the streets of the capital, little attention was paid to anything but the war. Pretty young girls proudly proclaimed victory across their rear ends, wearing kangas made from fabric emblazoned with the news. Young men wore their military uniforms as they strutted, heads held high, along Independence Avenue or past ANC headquarters on Nkrumah Street.

  On his way to the Dar es Salaam airport in April 1979, Yusufu Lule anxiously cast his eyes about, taking in the city’s street scene for what he suspected would be the last time. After years of exile, he was about to take the reins of government in Uganda. Though he had agitated for Idi Amin’s overthrow for years, the prospect of returning was frightening.

  “It is chaos. We have a whole generation who don’t know right from wrong. For years they have seen such brutality—rape, murder, theft, torture. I am going to a place where morality has no meaning,” Lule said with apparent dread.

  Sixty-eight days later, Lule would be overthrown and Uganda would spin into a cycle of short-lived and vengeful governments.

  It all began in 1971, when the Ugandan military overthrew the elected government of Milton Obote, putting a semi-literate, temperamentally violent man named Idi Amin in charge of the nation of some 18 million. Ten years earlier, Uganda had been considered one of the finest jewels in the British Empire’s crown: a rich cornucopia of agricultural wealth with a well-established infrastructure of colonial and missionary schools, hospitals, roads, and trade. But Obote’s government was itself marked by corruption that fueled unrest and paved the way for the 1971 military coup.

  Amin destroyed the nation’s prosperity and drove his country into a state of hellishness unlike anything it had previously experienced.

  In 1975 Tanzanian President Julius Nyerere denounced Amin as “an oppressor, a black fascist, and a self-confessed admirer of fascism.” A few months later, Amin declared that, by ancient tribal rights, parts of Sudan, Kenya, and Tanzania belonged to Uganda. To drive home his point, Amin publicly executed a group of Kenyan students studying at universities in Entebbe and Kampala.

 
