The Coming Plague


by Laurie Garrett


  Many urban diseases, including tuberculosis, declined in the cities of the Northern Hemisphere at about the same time as these social reform campaigns emerged. In addition to these changes in physical ecology, urban residents’ lives were improved through such political and Christian reform efforts as elimination of child labor, establishment of public school systems, shortening of adult workweeks, creation of public health and hospital systems, and a great deal of boosterism about “sanitation.”

  At the peak of the Industrial Revolution, before such reforms were widely instituted, life in the cities had become so unhealthy that birth rates were lower than death rates. For a city like London this meant that the child and adult workforce of nearly one million people could be maintained only by recruiting fresh workers from the countryside. But by 1900 birth rates had soared, death rates had plummeted, and life expectancies had improved markedly. Nearly all contagious diseases—including tuberculosis—steadily declined, reaching remarkably low levels well before curative therapies or vaccines were developed. In England and Wales, for example, the tuberculosis death rate dropped from a high of 3,000 per million people in 1848–54 to 1,268 per million in 1901 and to 684 per million in 1941, just before antibiotic treatment became available.20

  A similar pattern could be seen for infectious diseases, particularly tuberculosis, in the United States. In 1900, TB killed about 200 of every 100,000 Americans, most of them residents of the nation’s largest cities. By 1940, prior to the introduction of antibiotic therapy, tuberculosis had fallen from being the number two cause of death to number seven, claiming barely 60 lives in every 100,000.21

  By 1970, tuberculosis was no longer viewed as the scourge of the industrialized world’s cities.22 The World Health Organization then estimated that about 3 million people were dying annually of the disease, some 10 to 12 million people had active TB, and with antibiotic therapy the mortality rate had dropped to about 3.3 deaths in every 100,000 TB cases. Most new infections were then occurring not in the industrial cities of the Northern Hemisphere but in villages and cities of the developing world. The microbe’s ecology had changed geographically, but the disease continued to be concentrated in urban areas.23

  The enormous decline of tuberculosis in the Northern Hemisphere was viewed as a great victory, even though at the time TB raged across Africa, Asia, and South America.

  Why this apparent victory over a microbe had occurred—what specific factors could be credited with trouncing tuberculosis—was a matter of furious debate from the 1960s through the 1990s. Resolution of the debate could have been useful in two ways: in helping public health authorities anticipate problems in their cities that might promote the emergence or reemergence of infectious diseases in the future, and in guiding urban development in the Third World by identifying which expenditures—drawn from ever-shrinking national reserves—might have the biggest impact on their public’s health.

  But the waters were muddy. British researcher Thomas McKeown argued that nutrition was the key—improved diets meant working people could withstand more disease.24 Rene Dubos was equally certain that it was elimination of the horrendous working conditions of the men, women, and children of the Industrial Revolution, coupled with improved housing, that merited credit for the decline in TB.25

  Medical historian and physician Barbara Bates, of the University of Kansas, argued persuasively that the bold TB control programs of the early twentieth century, sparked by German scientist Robert Koch’s 1882 discovery of the M. tuberculosis bacterium and leading to mandatory quarantines in medical sanitariums, had little or no impact on the decline of the disease.26

  Bates insisted: “The goal of prevention was frequently compromised. Physicians often discharged still infectious patients, and men and women with communicable disease left institutions against advice. Instead of a system that cured and prevented disease, society had built one that met some needs of sick and dependent people, spared families some of the burdens of care at home, and reduced the public’s fear of infection, if not the actual threat. These unanticipated results grew out of political, social, and economic transactions in which medical understanding of tuberculosis played only a subordinate part.”27

  Another way to answer the question of what factors had been key to Europe’s TB decline was to study the disease in an area that was making a transition during the twentieth century that was roughly equivalent to Northern Europe’s Industrial Revolution a century earlier. South Africa fit the bill, and was a good place to test the hypotheses of McKeown, Dubos, and others: the country’s European descendants had a standard of living and disease patterns analogous to their counterparts in the Northern Hemisphere. But the African and Indian residents suffered from marked economic and social deprivation. Their communities bore a striking resemblance to the squalid living conditions endured by London’s working classes in the 1850s.

  Though antibiotics and curative medicine existed, and the modes of transmission of the bacteria were scientifically understood, tuberculosis death rates rose 88 percent in South Africa between 1938 and 1945. Cape Town’s rate increased 100 percent; Durban’s, 172 percent; and Johannesburg’s, 140 percent. In the rural areas, despite poverty and hunger, active TB cases and deaths never exceeded 1.4 percent in any surveyed group. But in the cities, incidence rates as high as 7 percent were commonplace by 1947. Nearly all TB struck the country’s black and so-called colored populations.

  The key to the increase in TB seemed not to rest with the health care system, for little had changed during those years. Nor were the diets of black South Africans much altered—they had been insufficient for decades.

  The answer, it seemed, was housing. From 1935 to 1955, South Africa underwent its own Industrial Revolution, having previously been a largely agrarian society. As had been the case a century earlier in Europe, this required recruitment of a cheap labor force into the largest cities. But South Africa had an additional factor in its socioeconomic paradigm: racial discrimination. The recruited labor was of either the black or the colored race, prescribed by law to live in designated areas and carry identity cards which stipulated their sphere of mobility. The government subsidized an ambitious program of urban housing development for white residents of the burgeoning cities, but government-financed housing construction for black urbanites during the period of metropolitan expansion actually declined by 471 percent.28

  By the 1970s, South Africa was generating 50,000 new tuberculosis cases a year, and the apartheid-controlled public health agencies were arguing that blacks had some unidentified genetic susceptibility to the disease. In 1977 the government made many of its worst TB statistics disappear by setting new boundaries of residence for blacks. The TB problem “went away” when hundreds of thousands of black residents of the by then overly large cities were forcibly relocated to so-called homelands, or when their urban squatter communities were declared outside the city’s jurisdiction and, therefore, its net of health surveillance.29

  If the South African paradigm could be applied broadly, then, it lent some support to Dubos’s theories, underscoring human squalor as the key ecological factor favoring M. tuberculosis transmission, but it failed to support Dubos’s other assertions about the role of working conditions.

  One thing that cities of the wealthy industrialized world and the far poorer developing world had in common was an ecology ideal for the emergence of sexually transmitted diseases, particularly syphilis. Certainly people had sex regardless of where they lived, but cities created options. The sheer density of Homo sapiens populations, coupled with the anonymity of urban life, guaranteed greater sexual activity and experimentation. Since ancient times urban centers had been hubs of profligacy in the eyes of those living in small towns and villages.

  Houses of both male and female prostitution, activities mainstream societies often labeled “deviant,” such as homosexuality, orgies, even religiously sanctioned sexual activity, were common in the ancient cities of Egypt, Greece, Rome, China, and the Hindu empires and among the Aztecs and the Mayans. The double standard of chastity in the home and risque behavior in the anonymity of the urban night dates back as far as the beginning of written history.

  It seemed surprising, then, that syphilis did not reach epidemic proportions until 1495, when the disease broke out among soldiers fighting on behalf of France’s Charles VIII, then waging war in Naples. Within two years, however, it was known the world over. Syphilis seemed to have hit Homo sapiens as something completely new, because in the fifteenth and sixteenth centuries it struck in a far more fulminant and deadly form than it would take by the dawn of the twentieth century.

  Syphilis was caused by a spirochete—or spiral-like—bacterium called Treponema pallidum, the same organism that caused the childhood skin disease known as yaws. Evidence of the existence of yaws clearly dated back to ancient times, yet of syphilis there was no hint prior to the fifteenth century.

  Several twentieth-century theories were offered to explain this puzzle. The most obvious solution was to blame Amerindians, Christopher Columbus, and his crew. Their 1492 voyage to the Americas and subsequent return to Spain coincided with the 1495 emergence of the disease during the Franco-Italian wars. So it seemed circumstantially convenient to conclude that syphilis originated among the Native American peoples, was picked up by Spanish sailors, and carried back to Spain.

  There were, however, two problems with that suggestion. First of all, yaws was an ancient disease in both the Old World and the Americas, passed by skin contact between people. If yaws had existed on every continent, certainly the potential for syphilis had also always been present worldwide.

  Second, syphilis wiped out Amerindians in the fifteenth century with a ferocity equal to the force of its attack on North Africans, Asians, and Europeans. If it had been endemic in the Americas, the Amerindians should have developed at least partial immunity to it.

  The most likely explanation for the apparently sudden emergence of syphilis came in the late 1960s from anthropologist-physician Edward Hudson, who argued that syphilis was a disease of “advanced urbanization,” whereas yaws was “a disease of villages and the unsophisticated.”30 In Hudson’s view the spirochete could best exploit the ecology of the village by taking advantage of the frequent cuts and sores on the legs of children, coupled with the close leg-to-leg contact of young people who slept together in rural hovels and huts.

  When the spirochete settled under the skin it produced only a localized infection that eventually healed. Transmission could occur only during the few weeks when the sore was raw and skin-to-skin contact could allow the organism to jump from one person to another. This occurred most easily among children who played or bedded down together.

  Sexual transmission of the spirochete, however, required a far more complex human ecology in which many hundreds or thousands of people interacted intimately every day and a large percentage of the population regularly had sexual intercourse with a variety of partners.

  Other twentieth-century theorists went further, arguing that the sexually transmitted microbes could only emerge in a population of Homo sapiens or animals in which a critical mass—perhaps even a definable number—of adults in the population had frequent intercourse with more than one partner. Clearly, they argued, a strict society in which every adult had sex only with their lifetime mate would have an extremely low probability of providing the Treponema spirochete with the opportunity to switch from a skin contact yaws producer to sexual syphilis.

  Conversely, in cities where social taboos were less enforceable or respected, the possibility of multiple-partnering and, therefore, sexual passage of disease was far greater.
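  The “critical mass” argument can be made concrete with the threshold logic that later epidemiologists would formalize. The short sketch below is purely illustrative and is not drawn from Garrett’s text: it uses the standard reproduction-number relation for sexually transmitted infections, R0 = beta × c × D, where beta is the chance of transmission per partnership, c the average rate at which people acquire new partners, and D the duration of infectiousness. Every numerical value in it is a hypothetical assumption, chosen only to show how a village-like rate of partner change stays below the spread threshold while a city-like rate exceeds it.

```python
# Illustrative sketch only (not from the book): the standard threshold relation
# for a sexually transmitted infection, R0 = beta * c * D, where
#   beta = transmission probability per partnership,
#   c    = mean rate of new-partner acquisition (partners per year),
#   D    = mean duration of infectiousness (years).
# The "critical mass" of multiple-partnering corresponds to the partner-change
# rate at which R0 first exceeds 1. All numbers below are hypothetical.

def r0(beta: float, partners_per_year: float, duration_years: float) -> float:
    """Basic reproduction number under the simple proportionate-mixing model."""
    return beta * partners_per_year * duration_years

def critical_partner_rate(beta: float, duration_years: float) -> float:
    """Partner-change rate above which the infection can spread (R0 > 1)."""
    return 1.0 / (beta * duration_years)

if __name__ == "__main__":
    beta, duration = 0.3, 0.5  # hypothetical values for illustration only
    print(f"Spread requires > {critical_partner_rate(beta, duration):.1f} new partners/year")
    for c in (1, 4, 10):       # village-like vs. city-like partner change
        print(f"c = {c:>2} partners/yr  ->  R0 = {r0(beta, c, duration):.2f}")
```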

  Following the Black Death of the fourteenth century, most of Europe experienced two or three generations of disarray and lawlessness. Death had taken a toll on the cities’ power structures and, in many areas, the worst of the survivors—the most avaricious and corrupt—swept in to fill the vacuums.

  “The crime rate soared; blasphemy and sacrilege was a commonplace; the rules of sexual morality were flouted; the pursuit of money became the be-all and end-all of people’s lives,” Philip Ziegler wrote.31 The world was suddenly full of widows, widowers, and adolescent orphans; none felt bound by the strictures of the recent past. Godliness had failed their dead friends and relatives; indeed, the highest percentage of deaths had occurred among priests. Europe, by all accounts, remained so disrupted for decades.

  One could hypothesize the following scenario for the emergence of syphilis: the spirochete was endemic worldwide since ancient times, usually producing yaws in children. But on rare occasions—again, since prehistory—it was passed sexually, causing syphilis.32 These events were so unusual that they never received a correct diagnosis and may well have been mistaken for other crippling ailments, such as leprosy. But amid the chaos and comparative wantonness that followed the Black Death, that necessary critical mass of multiple-partner sex was reached in European cities, allowing the organism to emerge within two or three human generations on a massive scale in the form of syphilis.

  In the late twentieth century similar debates about the emergence of other sexually transmitted diseases would take place—debates that might have been easier to resolve if questions regarding the sudden fifteenth-century appearance of syphilis had been settled.

  II

  By the late 1980s there were five billion people on the planet, up from a mere 1.7 billion in 1925.

  The cities became hubs for jobs, dreams, money, and glamour, as well as magnets for microbes.

  Once entirely agrarian, Homo sapiens was becoming an overwhelmingly urbanized species. Overall, the most urbanized cultures of the world were also, by 1980, the richest; and with the notable exception of China, the richest individual citizens usually resided in the largest cities or their immediate suburbs.

  Propelled by obvious economic pressures, the global urbanization was irrepressible and breathtakingly rapid.

  By 1980 less than 10 percent of France’s population was rural; on the eve of World War II it had been 35 percent. The number of French farms plummeted between 1970 and 1985 from nearly 2 million to under 900,000.33

  In Asia only 270 million people were urbanites in 1955. By 1985 there were 750 million, and that figure was expected to top 1.3 billion by 2000.34

  Worldwide, the percentage of human beings living in cities climbed steadily: from less than 15 percent in 1900, it was expected to exceed 50 percent by 2010.35 About 60 percent of this extraordinary urban growth was due to babies born in the cities; 40 percent of the new urbanites were young adult rural migrants or immigrants moving from poor countries to the large cities of wealthier nations.36

  The most dramatic rural/urban shifts were occurring in Africa and South Asia, where tidal waves of people poured continuously into the cities throughout the latter half of the twentieth century. Some cities in these regions doubled in size in a single decade.37

  The bulk of this massive human population surge occurred in a handful of so-called megacities—urban centers inhabited by more than 10 million people. In 1950 there were two megacities: New York and London. Both had attained their awesome size in less than five decades, growing by just under 2 million people each decade. Though the growth was difficult and posed endless problems for city planners, the nations were wealthy, able to finance the necessary expansion of such services as housing, sewage, drinking water, and transport.

  By 1980, however, the world had ten megacities: Buenos Aires, Rio de Janeiro, São Paulo, Mexico City, Los Angeles, New York, Beijing, Shanghai, Tokyo, and London. And even wealthy Tokyo found it difficult to accommodate the needs of its new population, which grew from a mere 6.7 million in 1950 to 20 million in 1980.

  But this was only the beginning. Continued urban growth was forecast, and it was predicted that by 2000 there would be 3.1 billion Homo sapiens living in increasingly crowded cities, with the majority crammed into 24 megacities, most of them located in the world’s poorest countries.38

  Throughout the 1980s a key shift would occur, and most of the nations experiencing the greatest population growth would also rank among the poorest countries in the world. They would be hard pressed to meet the health and service challenges posed by the cities’ extraordinary escalation in need.

  The World Health Organization concluded that “urban growth, instead of being a sign of economic progress, as in the industrialized country model, may thus become an obstacle to economic progress: the resources needed to meet the increasing demand for facilities and public services are lost to potential productive investment elsewhere in the economy.”39

  According to the World Bank, African cities were increasing in size by 10 percent a year throughout the 1970s and 1980s, which constituted the most rapid proportional urbanization in world history.
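  A rough back-of-the-envelope check (a hypothetical calculation, not one taken from the World Bank figures themselves) shows what that growth rate implies: compounding at 10 percent a year doubles a city’s population in a little over seven years, consistent with the single-decade doublings described above. The starting population of one million in the sketch is purely illustrative.

```python
# Back-of-the-envelope sketch (illustrative, not from the text): what a constant
# 10 percent annual growth rate implies for a city's size over time.
import math

def doubling_time(annual_growth_rate: float) -> float:
    """Years needed for a population to double at a fixed annual growth rate."""
    return math.log(2) / math.log(1 + annual_growth_rate)

def population_after(initial: float, annual_growth_rate: float, years: int) -> float:
    """Population after compounding growth for a given number of years."""
    return initial * (1 + annual_growth_rate) ** years

if __name__ == "__main__":
    rate = 0.10  # ~10 percent a year, the World Bank figure cited above
    print(f"Doubling time at 10%/yr: {doubling_time(rate):.1f} years")  # ~7.3 years
    # A hypothetical city of 1 million grows to ~2.6 million within a decade.
    print(f"After 10 years: {population_after(1_000_000, rate, 10):,.0f} people")
```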

  In 1970, in the Americas there were three city residents for every rural resident; by 2010 the ratio would be four to one. The same shift was forecast for Europe, both Western and Eastern. Some Asian countries were predicted to have five urban residents for every one rural individual by 2010.

  [Figure: Global Per Capita Earnings]

  During the 1970s and 1980s this crush of urban humanity was causing severe growing pains that directly affected human health, even in the wealthier nations. Japan, which was quickly becoming one of the two or three wealthiest countries on the planet, was reeling under Tokyo’s expanding needs. By 1985 less than 40 percent of the city’s housing would be connected to proper sewage systems, and tons of untreated human waste would end up in the ocean.40

 
