The Coming Plague

by Laurie Garrett


  As the virus found its way into communities of poverty, the burden on urban public hospitals was critical. Unlike Canada and most of Western Europe, the United States had no system of national health care. By 1990 an estimated 37 million Americans were without any form of either public or private health insurance. Too rich to qualify for government-supported health care, which was intended only for the elderly and the indigent, but too poor to purchase private insurance, millions of Americans simply prayed that they wouldn’t fall ill. Another 43 million Americans were either chronically uninsured or underinsured, possessing such minimal coverage that the family could be bankrupted by the required deductible and co-payments in the event of serious illness.111

  Any disease that hit poor urban Americans disproportionately would tax the public hospital system. But AIDS, which was particularly costly and labor-intensive to treat, threatened to be the straw that broke the already weakened back of the system.112

  “We are fighting a war here,” declared Dr. Emilio Carrillo, president of the New York City Health and Hospitals Corporation, which ran the city’s network of public medical facilities. “People are sick and dying from AIDS, tuberculosis is rampant, malnutrition, drug addiction, and other diseases resulting from poverty are also at epidemic levels, while at every level of government, city, state, and federal, the health care system is facing cutbacks. Only the number of sick people and people in need of basic health care is not being cut back. Among them there have been no reductions, no downsizing. They are still coming in to us for treatment.”

  A 1990 survey of 100 of the nation’s largest public hospitals (conducted by the National Association of Public Hospitals) revealed worsening situations in all American cities and predicted collapse of the “public safety net” offered by the system. A microbe that had emerged in America only a decade earlier was threatening to topple the system.

  By 1987, 3 percent of the women giving birth in hospitals in New York City were HIV-positive, as were some 25 percent of their babies, according to the U.S. Public Health Service. Nearly two-thirds of those mothers and babies were born in public hospitals located in largely African-American or Hispanic neighborhoods of Brooklyn and the Bronx. The following year the state of New York concluded that one out of every 61 babies born in the state was infected with the virus. But that rate varied radically by neighborhood: in posh, semi-rural communities located far from New York City fewer than one out of every 749 babies was born HIV-positive in 1988. But in desperately poor neighborhoods of the South Bronx one out of every 43 newborns, or 2.34 percent, was infected—and every one of them was born in a public hospital.113 Those numbers could only be expected to worsen as the epidemic’s demographics shifted into younger, predominantly heterosexual population groups.114

  A significant percentage of the nation’s HIV-positive population was also homeless, living on the streets of American cities. A 1991 study, led by Andrew Moss, of homeless men and women in San Francisco found that 3 percent of those who had no identifiable risk factors for HIV exposure were infected. Another 8 percent of the homeless were HIV-positive due to injecting drug use, prostitution, or sex with an infected individual. Overall, more than one out of every ten homeless adults in San Francisco carried the virus.115

  HIV wasn’t the only microbe that was exploiting opportunities in America’s urban poor population: hepatitis B (which by 1992 was responsible for 30 percent of all sexually transmitted disease in America), syphilis, gonorrhea, and chancroid were all appearing less commonly in Caucasian gay men and with alarming, escalating frequency in the heterosexual urban poor, particularly those who used crack cocaine or heroin. By 1990 two-thirds of New York State’s syphilis cases, for example, were African-Americans residing in key areas of poverty, and within that population male and female infection rates were equal.

  In 1993 the New York City Health Department announced that life expectancy for men in the city had declined, for the first time since World War II, from a 1981 level of 68.9 years to a 1991 level of 68.6 years. This occurred even though outside New York City life expectancies for men in the state had risen during that time from 71.5 years to 73.4 years. Though rising homicide rates played a role, city officials credited AIDS with the bulk of that downward shift. By 1987 AIDS was already the leading cause of premature death for New York City men of all races and classes; by 1988 it was the number one cause for African-American women as well.

  Well before AIDS was claiming significant numbers of Americans, Harlem Hospital chief of surgery Dr. Harold Freeman calculated that men growing up in Bangladesh had a better chance of surviving to their sixty-fifth birthday than did African-American men in Harlem, the Bronx, or Brooklyn. Again, violence played a significant role in the equation, but it alone could not explain why hundreds of thousands of men living in the wealthiest nation on earth were living shorter lives than their counterparts in one of the planet’s poorest Third World nations. Average life expectancy for Harlem’s African-American men born between 1950 and 1970 was just 49 years. Freeman indicted disease, poverty, and inequitable access to medical care as the primary factors responsible for the alarming death rate among African-American men.116

  Well before a new tuberculosis epidemic struck several U.S. cities, the warning signs were there for all to see: rising homelessness, fiscal reductions in social services, complacency in the public health sector, rampant drug abuse, and increases in a number of other infectious diseases. The emergence of novel strains of multiply drug-resistant TB came amid a host of clangs, whistles, and bells that should have served as ample warning to humanity. But the warning fell on unhearing ears.

  During the Ronald Reagan presidency American fiscal policies favored expansion of the investment and monetary sectors of society and simultaneous contraction of social service sectors. Economist Paul Krugman of the Massachusetts Institute of Technology estimated that 44 percent of all income growth in America between 1979 and 1989 went to the wealthiest 1 percent of the nation’s families, or about 800,000 men, women, and children. On the basis of Federal Reserve Board data, Krugman calculated that total wealth (which included far more than the cash income measured above) was more concentrated in the hands of the nation’s super-rich than at any time since the 1920s. By 1989, the top 1 percent richest Americans controlled 39 percent of the nation’s wealth.

  Several studies showed that by the end of 1993 more than 25 million Americans were hungry, consuming inadequate amounts of food. In 1993 one in ten Americans was compelled to stand at least once a week on a breadline, eat in a soup kitchen, or find food through a charitable agency. And the numbers of people living below the federally defined poverty line increased three times faster between 1982 and 1992 than the overall population size. In 1992 some 14.5 percent of all American citizens lived in conditions of legally defined poverty. Most were single mothers and their children.117

  Though difficult to measure precisely, the numbers of homeless people in America rose steadily between 1975 and 1993,118 and the demographics of the population shifted from the traditional hard-core group of older male vagrants and alcoholics to a younger, more heterogeneous contingent that included large numbers of military service veterans, chronically institutionalized mental patients, individuals with severe cocaine or heroin habits, and newly unemployed families and individuals. Estimates of the size of the nation’s homeless population ranged from about 200,000 to 2,200,000, based on head counts in emergency shelters and a variety of statistical approaches to the problem.119

  Even more difficult to calculate was the rise in housing density in urban areas. As individuals and whole families faced hardships that could lead to homelessness, they moved in with friends and relatives. One estimate for New York City during the 1980s suggested that 35,000 households were doubled up in public housing, along with 73,000 double-density private households. Assuming each family averaged four members, that could mean that more than 400,000 men, women, and children were packed into double-density housing.120

  Finally, a large percentage of the urban poor population cycled annually in and out of the criminal justice system. Young men, in particular, were frequently incarcerated in overcrowded jails and prisons. In 1982 President Ronald Reagan called for a war on drugs: by 1990 more men were in federal prisons on drug charges alone than had comprised the entire 1980 federal prison population for all crimes combined. The pace of federal, state, and county jail construction never came close to matching the needs created by the high arrest rates. As a result, jail cells were overcrowded, and judges often released prisoners after shortened terms, allowing them to return to the community. This, too, would prove advantageous to the microbes.

  Some of the microbial impact of this urban Thirdworldization might have been controllable had the U.S. public health system been vigilant. But at all tiers, from the grass roots to the federal level, the system was by the mid-1980s in a very sorry state. Complacent after decades of perceived victories over the microbes, positioned as the runt sibling to curative medicine and fiscally pared to the bone by successive rounds of budget cuts in all layers of government, public health in 1990 was a mere shadow of its former self.

  An Institute of Medicine investigation determined that public health and disease control efforts in the United States were in a shambles. Key problems included “a lack of agreement about the public health mission” between various sectors of government and research; a clear failure of public health advocates to participate in “the dynamics of American politics”; lack of cooperation between medicine and public health; inadequate training and leadership; and severe funding deficiencies at all levels.

  “In the committee’s view,” they wrote, “we have let down our public health guard as a nation and the health of the public is unnecessarily threatened as a result.”121

  An example of public health’s disarray that proved painfully embarrassing to officials during the 1980s was provided by measles. In 1963 a safe, effective measles vaccine became widely available in the United States and childhood cases of the sometimes lethal disease plummeted steadily thereafter. In 1962 half a million children in the United States contracted measles; by 1977 fewer than 35,000 cases were reported annually and many experts forecast that virtual eradication of the disease would soon be achieved.

  But problems were already apparent in 1977: many children who had been vaccinated before they were fourteen or fifteen months old later developed measles, and researchers soon understood that timing was crucial to effective immunization. Vaccination schedules were adjusted accordingly, executed nationwide with vigor, and the number of measles cases in the country continued to decline. The only serious emergences of the microbe took place in communities where a significant number of parents refused, for religious reasons, to have their children vaccinated.122

  By the early 1980s the United States had achieved 99 percent primary measles vaccination coverage for young children, and only 1,497 measles cases occurred in the country in 1983.

  In 1985, however, a fifteen-year-old girl returned from a trip to England to her Corpus Christi, Texas, home and promptly developed the rubeola rash that was characteristic of measles. The virus quickly spread through her high school and the local junior high school. Ninety-nine percent of the students had, during infancy, received their primary live-measles immunizations; 88 percent had also had their recommended boosters. Nevertheless, fourteen students developed measles.123

  Blood tests performed during the outbreak on more than 1,800 students revealed that 4.1 percent of the children, despite vaccination, weren’t making antibodies against the virus, and the lowest levels of antibody production were among those who hadn’t had boosters. All the ailing teens fit that category. The clear message was: (1) primary immunization, in the absence of a booster, was inadequate to guarantee protection against measles; and (2) having even a handful of vulnerable individuals in a group setting was enough to produce a serious outbreak.124

  The crucial importance of proper timing of vaccination and booster follow-up was further supported by other measles outbreaks among groups of youngsters whose primary vaccination rates exceeded 97 percent.125 In 1989 the measles rate in the United States climbed considerably. More than 18,000 cases of measles occurred, producing 41 deaths: a tenfold increase since 1983. Forty percent of the cases involved young people who had received their primary, but not booster, vaccinations; the remainder had had no shots, or their vaccinations were administered at improper times.

  Though some pediatricians and policy makers found the 1989 numbers worrisome, nobody forecast an epidemic. Measles epidemics were considered Third World problems by 1989.

  But an epidemic did occur. The incidence of measles in the United States leapt by 50 percent between 1989 and 1990. More than 27,000 U.S. children, half of them under four years of age, contracted measles during 1990; 100 died of the disease.

  Hardest hit was New York City, with 2,479 reported measles cases.

  CDC investigators were baffled by the severity of illnesses in the 1990–91 epidemic.

  “These kids are much sicker, and death rates are definitely higher,” the CDC’s Bill Atkinson said. “We don’t know whether it’s because the strain of measles out there is more virulent, or the kids are more susceptible.”

  Many of the ailing children, particularly in New York City, had never been vaccinated. They hadn’t even received their primary shots, much less boosters.

  “Now the majority of cases are in unvaccinated children,” Dr. Georges Peter, chair of the American Academy of Pediatrics, said. “Measles is the most contagious of all the vaccine-preventable diseases. The nature of the problem has clearly changed—it is undoubtedly a failure to vaccinate. And what this really is, is indication of a collapse in the public health system, of lack of access to health care.”

  What was going on? Were parents deliberately keeping their children away from doctors? Were Americans suddenly phobic about immunizations?

  The answers, it turned out, could be found in the demographics of the population of children with measles. The vast majority lived in large cities—New York, Chicago, Houston, Los Angeles—and were nine times more likely to be African-American or Hispanic than Caucasian.

  As the epidemic persisted in 1991, worsening in New York City’s African-American and Hispanic populations, it was evident that the microbe had successfully emerged in populations of poor urban people with little or no access to health care. This underlying social weakness also facilitated surges in whooping cough and rubella cases during 1990–93.126

  In 1978 the U.S. Surgeon General had declared that measles would be eradicated from the country by 1982, and an ambitious immunization campaign was mounted. By 1988, however, conditions of poverty, health care collapse, and public health disarray had grown so acute that the United States had a poorer track record on all childhood vaccination efforts than did war-torn El Salvador and many other Third World countries.127

  In some inner-city areas—notably in New York City—only half of all school-age children had been vaccinated. For much of the urban poor in America the only point of access to the health care system was the public hospital emergency room. Families spent anxious, tedious hours queued up in urban ERs because they felt that they had no choice: there were no clinics or private physicians practicing in the ghettos, few alternative sources of basic care. But few poor families were willing to put up with a daylong line in the ER simply to get their children immunized, particularly if it meant loss of a day’s pay.128

  Further study of the measles crisis revealed that some deaths and many cases—indeed, most at the key hospitals—went unreported. The city of New York uncovered up to 50 percent underreporting in the region’s largest inner-city hospitals during the 1991 epidemic. It was possible that up to 5,000 cases of the disease occurred in New York City, though only half that number were officially reported.129

  In 1993, World Health Organization adviser Dr. Barry Bloom, of the Albert Einstein College of Medicine in the Bronx, announced that the United States had fallen behind Albania, Mexico, and China in childhood vaccination rates.130

  At the World Summit for Children convened by the United Nations in September 1990, the Bush administration was in the dubious position of pledging sweeping concern for the health and survival of the world’s children while hoping no one would publicly note that the health status of America’s impoverished kids rivaled that of children in much of Africa and South Asia.

  “This society is so wealthy, obviously this country is better off than the Third World. But this country should be ashamed of the child mortality rates and health,” decried Jim Weill, of the Children’s Defense Fund, at the Summit. “The U.S. ranks 19th in the world on infant mortality, 29th in low birthweight babies, 22nd on child mortality for children under five, and, perhaps most amazing, 49th in the world on child immunization, for our non-white children. We kill our children.

  “Let’s face it, when it comes to America’s children we live in the Third World.”

  Not only had America’s cities sunk to Third World levels of childhood vaccination and access to health care, but its surveillance and public health systems had reached states of inaccuracy and chaos that rivaled those in some of the world’s poorest countries.131

  Weill’s words had barely been uttered when officials at the CDC acknowledged that America’s public health system was also doing a worse job of handling tuberculosis than did many African nations.

 
