Pox


by Michael Willrich


  Smallpox patient at the Tampa pesthouse, 1900. COURTESY OF THE STATE ARCHIVES OF FLORIDA

  In his personal papers and public writings, C. P. Wertenbaker was serious, dispassionate, and reserved—a gentleman scholar of the Service stripe. In his field reports to Washington, he dutifully noted whites’ belief that they had a natural immunity to the disease they called “nigger itch,” but he considered this popular belief a sign of ignorance and a bane to scientific smallpox work. He did not normally indulge in expansive statements of racial ideology, “scientific” or otherwise. But in one letter, which he sent to a Mississippi health official in 1910, the federal surgeon revealed some of his assumptions about the state, and fate, of African American health. “There is no question in my mind,” Wertenbaker wrote, “but that the negro constituted the gravest menace to the country in which they lived, from a sanitary standpoint.” “The negro is like a child,” he continued, “incapable of carrying on any effectual sanitary work unless guided and directed by the white people.... Unless there is a marked change in sanitary conditions among the negroes, I believe that within the next 100 years the negro will be almost as scarce in this country as the Indian now is. I believe that the extinction of the race is imminent.”51

  With those few lines Wertenbaker revealed a cast of mind entirely conventional among white medical authorities of his time and place. Such theories had a long lineage. In the antebellum period, southern medical writers had used just such claims to defend the institution of slavery. Observing that African American slaves were less prone than whites to contract malaria and yellow fever (because, we now know, of an inherited genetic resistance to the mosquito-borne pathogens that caused those diseases), slaveholders lauded their chattels’ natural fitness for back-breaking labor in the coastal rice and cotton fields. Ideologues claimed the intelligence and moral dispositions of African Americans were so deficient that slaves needed their white masters’ protection and restraint. In the post–Civil War era, white medical experts ridiculed the freed people’s claims to equal citizenship. During the 1890s and 1900s, physicians interpreted African Americans’ high mortality and morbidity rates as evidence of black people’s supposed biological inferiority, insisting that they brought disease upon themselves by sexual vices and intemperance. Using the flawed late nineteenth-century census returns to bolster their case, white experts claimed that the health of African Americans had plummeted since emancipation. This proved, the authorities claimed, that blacks had benefited from slavery and were so ill suited to freedom that they were now destined for extinction. Such medical racism led leading life insurance companies to refuse policies to African Americans.52

  In The Philadelphia Negro (1899), his pathbreaking work of urban sociology, the young African American scholar W. E. B. Du Bois calmly showed that the prevailing theories of African American health rested on sloppy science and wishful thinking. Since little reliable data existed regarding African American health during slavery, Du Bois pointed out, claims that the health of the race had undergone a dramatic decline since emancipation were, at best, unsubstantiated. Of the myth that blacks were doomed for extinction, Du Bois wrote that it represented “the bugbear of the untrained, or the wish of the timid.” But such medical falsehoods had devastating consequences. They inured the nation to the real—and substantially preventable—health problems of poor African Americans in the North and South. The average life expectancy for blacks was thirty-two, compared to nearly fifty for whites. Infant mortality rates were shockingly high. Black men and women were disproportionately struck by many chronic and infectious diseases, including heart disease and consumption (pulmonary tuberculosis), a major killer in the African American population. “In the history of civilized peoples,” Du Bois wrote, rarely had so much “human suffering” been viewed with “such peculiar indifference.”53

  That indifference was not just a cultural phenomenon. It was a systemic feature of the white-dominated medical profession, especially in the South. Reputable physicians refused to treat African Americans. As southern cities built new public hospitals in the late nineteenth century, most excluded blacks or relegated them to inferior Jim Crow wards. Such demeaning treatment, Du Bois observed, intensified the “superstitious” fear of hospitals and medicine that he considered “prevalent among the lower classes of all people, but especially among Negroes.” As a consequence, most poor blacks did not seek medical aid from a white physician until they were desperately ill. “Many a Negro would almost rather die than trust himself to a hospital.”54

  The best hope for African American health care lay with the black medical profession. By 1900 more than 1,700 black physicians practiced in the United States, up from about 900 a decade earlier. African American medical schools, nursing schools, and hospitals opened during the same period. Industrial schools such as Booker T. Washington’s Tuskegee Institute instructed poor blacks in the use of toothbrushes and everyday hygiene. As significant as these developments were, they could not quickly correct a pattern of institutional neglect so long in the making. As late as 1910, the entire state of South Carolina had only 66 professional black physicians, or one physician for every 12,000 black people. The ratio for white people was about 1 to 800. African American professional medicine existed mainly in urban areas. In the rural South, where most African Americans lived, black physicians were scarce. When rural blacks took ill, they still relied, as they had during slavery, on the informal medical knowledge of friends and relatives, root doctors, and practitioners of magical medicine. In a period of explosive growth in the American medical profession, it remained all too common for African Americans to take ill, suffer, and die without receiving any medical attention.55

  Even in an era of such systemic neglect, the realization that smallpox was spreading among African Americans across the South was bound to cause alarm among white public health officials. White officials understood from their own observations in the field that smallpox spread like wildfire through unvaccinated populations, regardless of their color. Since the majority of Southerners, white and black, had never been vaccinated, officials made some effort to explain the early prevalence of the disease among blacks.

  White medical commentators marveled at African Americans’ sociability: their “gregarious habits,” their fondness for going on “excursions” and mingling “promiscuously,” their “close association and intermixing.” And the commentators were not just talking about sex. Many fretted about “religious negroes,” who seemed ever to be gathering in one meeting or another. During an outbreak, African American churches were usually among the first places quarantined—right after the black schools. Even the playfulness of African American children was deemed a threat to the public health. In the autumn of 1899, as sharecroppers in Concordia Parish, Louisiana, brought in the harvest, piling the seed cotton high in their cabins, one white official worried that children would pollute the cotton with smallpox: “On this inviting heap the darky children romp by day and sleep by night with that habitual disregard of cleanliness characteristic of the race.” The writer knew he could count on his readers’ imagination to complete the scenario. With the infected cotton bound for market, and from there to the mills, and from the mills to the homes of unsuspecting white consumers, who could say how far smallpox would travel from those sharecroppers’ shacks?56

  Racial anxieties permeate the official record of the southern epidemics. But the record also contains clues about the deeper causes of the prevalence of smallpox among African Americans. While poor nutrition and overcrowded living conditions made black people especially susceptible to smallpox, institutionalized racism fostered African Americans’ long-standing distrust of white doctors. Neglected and mistreated by the medical profession, the vast majority of southern blacks had never been examined by a physician, let alone been vaccinated, and would just as soon keep it that way. African Americans were understandably reluctant to report cases of smallpox in their homes or neighborhoods to white authorities. As the Atlanta Constitution noted during the Birmingham epidemic, “[T]he negroes there have a great dread of the pesthouse and use every effort to avoid having their friends and relatives taken there.” In other places, the physical or cultural distance from white medical authority was so great that such subterfuge was unnecessary. Traveling through Georgia in 1899, Wertenbaker kept stumbling upon African American settlements or sections of towns with names like “Hell’s Half Acre,” where smallpox had spread for four or five months, sometimes longer, without attracting the least notice from whites. “The disease became epidemic before it was known,” he said.57

  The close living conditions of African American laborers, even in the most rural of settings, aided the spread of smallpox. Especially efficient carriers, it seemed, were itinerant laborers in the fast-growing rural nonagricultural sector, including men who worked at turpentine stills, in phosphate and coal mines, and on the railroads. Unvaccinated African Americans who slept in crowded cabins, shared tents in mining camps, or huddled for warmth in railroad boxcars were extraordinarily vulnerable to airborne germs. Transient black workers, forbidden by law, custom, and their own poverty from sleeping in a white-owned tavern or inn, frequently stayed overnight in the home of a black family, where they shared rooms and often beds with children and other family members. In February 1899, a white Carrollton, Kentucky, physician named F. H. Gaines examined a transient African American man with a “suspicious eruption on his forehead and wrists.” Dr. Gaines diagnosed the eruption as smallpox. He learned from his patient that he had been put off the Madison and Cincinnati packet three days earlier and had spent the next three nights with three separate black families. When the man realized Gaines intended to take him to a pesthouse, he made a quick escape. Two weeks later, smallpox erupted in all three families.58

  A truism holds that in the Jim Crow South, whites and blacks lived side by side, while in the “promised land” of the urban North de facto racial segregation prevailed in the housing market. But the history of the southern smallpox epidemics suggests just how much social distance actually existed between the races in southern places. Jim Crow laws, which proliferated in the 1890s, stripped most African Americans of the suffrage, forced them into separate compartments on trains and streetcars, and relegated black children to the most poorly funded schools. For all of their flaws, the public health reports reveal some of the collective impact of this emerging regime of white supremacy, even as they attest to the vitality of black social institutions. Reports traced smallpox clusters to African American boardinghouses, schools, churches, restaurants, opera houses, and a few houses of ill fame—including one in Richmond, Kentucky, whose keeper served well-attended court-day dinners to the community.59

  Booker T. Washington had it right. Infectious disease drew no color line. People did—with their customs, practices, institutions, and laws. The color line, in any event, rarely held. Even when local authorities tried to keep smallpox at bay by ordering quarantines of African American sections—as officials did in 1900 in Wertenbaker’s native Albemarle County—smallpox crossed that line. When whites did catch smallpox, a disease that had in some places gone unnoticed for months suddenly attracted public attention. The formerly invisible disease became visible.60

  Which is not to say it became intelligible. For at that point, as Wertenbaker observed time and again, another problem presented itself. The public refused to believe mild type smallpox was the real thing.

  The smallpox came to Stithton, Kentucky, on a winter’s day in 1899, when the Barker boy rode home from Louisville on a bicycle. A peculiar rash speckled the young cyclist’s face, and the town physician who examined him feared the worst. He instructed the boy to ride home and stay there, and then rang the Hardin County health officer. Accompanied by several excited physicians, Dr. C. Z. Aud took a ride out to the Barker place. Aud looked the boy over, ran his fingers over the papules, and in the presence of his attentive colleagues and the boy’s father, diagnosed smallpox. Mr. Barker did not gasp with alarm, he did not plead for a second opinion, he did not ask what could be done to save the boy. He just let the Hardin County health officer know that his opinion wasn’t worth all that much at the Barker place. “I was not very politely told by the old man,” Aud recalled, “that he had had small-pox himself, and knew a great deal more about it than I did, and he would not submit to vaccination.” Barker’s two daughters refused to bare their arms, either. Mrs. Barker said she had already been vaccinated. So Aud and his entourage left. When he got back to his office, Aud learned that Mr. Barker had already called a lawyer to see if he could “get damages from a doctor for saying his son had small-pox when it was a lie.” To Barker, Aud’s diagnosis amounted to libel. Time would tell that Barker did not know so much about smallpox. Two weeks later, he and his daughters broke out in pox.61

  Though most rural Southerners had never come near a case of small-pox, they expected to know it when they saw it. And when their expectations were not met, they did not, as a rule, defer to the professional expertise of public health officers. Dr. J. R. Burchell of Clay County, Kentucky, found himself the object of “many a cursing” when he warned his neighbors that smallpox was spreading among them. “One gentleman’s idea of smallpox,” this health officer reported, “was that when a man had small-pox he was in a hell of a bad fix, and as no one had been in that condition, therefore there had been no small-pox.” It proved a difficult position with which to argue. Public health officers at points across the South agreed that one of the greatest obstacles to smallpox control was the doubt that existed in people’s minds as to the true nature of this new disease. Frequent bouts with naysayers led some officers to wish, in published government health reports, for the appearance of a “fool-killer”: a fatal case of smallpox. As one North Carolina official put it, the best cure for a doubting public was “a good first-class case of small-pox.”62

  That even a second-class case of smallpox could arouse so little public concern speaks to the amount of physical suffering that Americans raised in the nineteenth century expected to endure during their lives. Even in ordinary times, southern newspapers advertised patent medicines promising relief from all kinds of fevers and “itching skin diseases.” It took something stronger than mild smallpox to make people welcome government doctors into their lives. Even in a “mild” outbreak, Wertenbaker might see as many as a dozen grotesque confluent cases and one or two deaths. In December 1900, one of Wertenbaker’s Service colleagues, Assistant Surgeon John D. Long, inspected a gang of African American railroad workers in a Washington train station. The men had just finished digging a tunnel for the new West Virginia Short Line Railroad and were making their way south. For months, a disease—variously called “Cuban itch,” “nigger itch,” or “black measles”—had been spreading among white and black workers in the Short Line construction camps. As Long questioned the men, he jotted down their symptoms: “headache, fever, general weakness, vomiting, and pain in the neck and back,” followed by a rash that went through the usual stages of “vesiculation, pustulation, and desquamation.” Most of the men had been unable to work (or collect wages) for up to two weeks. The camps they had left behind had seen at least 140 cases of smallpox, with 4 deaths. That was “mild” smallpox.63

  Clusters of severe cases occurred during otherwise mild epidemics often enough to keep Wertenbaker in an almost constant state of apprehension. In each fatal outbreak he envisioned smallpox regaining its historical virulence. From a public health perspective, though, the most dangerous thing about mild type smallpox was that it did not lay people low enough. Some people recovered without ever taking to their beds. Particularly in the convalescent stage of the disease—when patients would ordinarily be confined under close quarantine—people with mild type smallpox often felt well enough to go about their business. Children with infectious scabs on their faces and hands played in the streets. Contagious men and women worked in the fields and factories, ran grocery stores, and mingled in the crowd on court day. Secretary Lewis of the North Carolina Board of Health complained that a man with mild smallpox was “exactly in the right condition for visiting around among the neighbors, or loafing at the railway station, or above all, attending a gathering of any kind—political preferred.” The eruption might be so insignificant as to attract no notice. Nevertheless, it was “the genuine article,” Lewis warned, “and capable of causing in the unvaccinated the most virulent and fatal form of the disease.”64

  The turn of the century is remembered today as the advent of the modern expert, when university-trained professionals in medicine, the sciences, and law acquired a new authority in American life. But southern health officials often found the public, business interests, and even their own local governments unwilling to accept their warnings or yield them the diagnostic ground. Like Mr. Barker of Stithton, many citizens saw no reason to elevate the medical opinion of a health official above their own.

  Like other Americans of the period, blacks were accustomed to experiencing any number of fevers and skin eruptions during their lives. Their first inclination in naming a new disease was to compare it with others they had known. After inspecting a confluent black patient in a room crowded with “eight or ten negroes” in Princeton, Kentucky, a physician found his diagnosis of smallpox challenged by an “old negro” who said he had survived smallpox himself. “Dat nigger nebber had no small-pox,” the man declared, insisting that the “little bumps on him” were caused by “big-pox” (syphilis).65

  As local health authorities raised the pressure—making proclamations, ordering quarantines, calling for compulsory vaccination—critics raised their protests. Some citizens denounced the government officials as capricious and corrupt. Others relied, as rural blacks had since slavery, on the power of rumor. As Wertenbaker frequently witnessed in the field, nothing outran a rumor. Communities of cotton mill workers, who notwithstanding their claims to white privilege were among the most exploited and marginalized of southern laboring people, were deeply distrustful of medical authority. In Charlotte, Danville, and other places in the throes of industrial change, Wertenbaker found the expert claims of health authorities undone by rumors circulating among the mill workers that no smallpox existed.66
