At Home


by Bill Bryson


  Syphilis, as we have seen, had been around for a long time. As early as 1495, just three years after the voyage of Christopher Columbus that introduced it to Europe, some soldiers in Italy developed pustules “like grains of millet” all over their faces and bodies, which is thought to be the first medical reference to syphilis in Europe. It spread rapidly—so rapidly that people couldn’t agree where it came from. The first recorded mention of it in English is as “the French pox” in 1503. Elsewhere it was known as the Spanish disease, the Celtic humors, the Neapolitan pox, or, perhaps most tellingly, “the Christian disease.” Syphilis was coined in a poem by the Italian Hieronymus Fracastorius in 1530 (in his poem Syphilis is the name of a shepherd who gets the disease) but does not appear in English until 1718.

  Syphilis was for a long time a particularly unnerving disease because of the way it came and went in three stages, each successively worse than the last. The first stage usually showed itself as a genital chancre, ugly but painless. This was followed some time later by a second stage that involved anything from aches and pains to hair loss. Like first-stage syphilis, this would also resolve itself after a month or so whether it was treated or not. For two-thirds of syphilis sufferers, that was it. The disease was over. For the unfortunate one-third, however, the real dread was yet to come. The infection would lie dormant for as long as twenty years before erupting in third-stage syphilis. This is the stage nobody wants to go through. It eats away the body, destroying bones and tissue without pause or mercy. Noses frequently collapsed and vanished. (London for a time had a “No-Nose’d Club.”) The mouth might lose its roof. The death of nerve cells could turn the victim into a stumbling wreck. Symptoms varied, but every one of them was horrible. Despite the dangers, people ran the risks to an amazing degree. James Boswell contracted venereal diseases nineteen times in thirty years.

  Treatments for syphilis were severe. In the early days a lead solution was injected into the bladder via the urethra. Then mercury became the drug of choice and remained so right up to the twentieth century and the invention of the first antibiotics. Mercury produced all kinds of toxic symptoms—bones grew spongy, teeth fell out—but there was no alternative. “A night with Venus and a lifetime with Mercury” was the axiom of the day. Yet the mercury didn’t actually cure the disease; it merely moderated the worst of the symptoms while inflicting others.

  Perhaps nothing separates us more completely from the past than how staggeringly ineffectual—and often petrifyingly disagreeable—medical treatments once were. Doctors were lost in the face of all but a narrow range of maladies. Often their treatment merely made matters worse. The luckiest people in many ways were those who suffered in private and recovered without medical intervention.

  The worst outcome of all, for obvious reasons, was to have to undergo surgery. In the centuries before anesthetics, many ways of ameliorating pain were tried out. One method was to bleed the patient to the point of faintness. Another was to inject an infusion of tobacco into the rectum (which, at the very least, must have given the patient something else to think about). The most common treatment was to administer opiates, principally in the form of laudanum, but even the most liberal doses couldn’t mask real pain.

  During amputations, limbs were normally removed in less than a minute, so the most traumatizing agony was over quickly, but vessels still had to be tied off and the wound stitched, so there remained much scope for lingering pain. Working quickly was the trick of it. When Samuel Pepys underwent a lithotomy—the removal of a bladder stone—in 1658, the surgeon took just fifty seconds to get in and find and extract a stone about the size of a tennis ball. (That is, a seventeenth-century tennis ball, which was rather smaller than a modern one, but still a sphere of considerable dimension.) Pepys was extremely lucky, as the historian Liza Picard points out in Restoration London, because his operation was the surgeon’s first of the day and therefore his instruments were reasonably clean. Despite the quickness of the operation, Pepys needed more than a month to recover.

  More complicated procedures were almost unbelievably taxing. They are painful enough to read about now, but what they must have been like to live through simply cannot be conceived. In 1811, the novelist Fanny Burney, while living in Paris, suffered a pain in her right breast, which gradually grew so severe that she could not lift her arm. The problem was diagnosed as breast cancer and a mastectomy was ordered. The job was given to a celebrated surgeon named Baron Larrey, whose fame was based not so much on his skill at saving lives as on his lightning speed. He would later become famous for conducting two hundred amputations in twenty-four hours after the Battle of Borodino in 1812.

  Burney’s account of the experience is almost unbearably excruciating because of the very calmness with which she relays its horrors. Almost as bad as the event itself was the torment of awaiting it. As the days passed, the anxiety of apprehension became almost crushing, and was made worse when she learned on the morning of the appointed day that the surgeons would be delayed by several hours. In her diary she wrote: “I walked backwards and forwards till I quieted all emotions, and became, by degrees, nearly stupid—torpid, without sentiment or consciousness—and thus I remained till the clock struck three.”

  At that point she heard four carriages arrive in quick succession. Moments later, seven grave men in black came into the room. Burney was given a drink to calm her nerves—she didn’t record what, but wine mixed with laudanum was the usual offering. A bed was moved into the middle of the room; old bedding was placed on it so as not to spoil a good mattress or linens.

  “I now began to tremble violently,” Burney wrote, “more with distaste and horror of the preparations even than of the pain.… I mounted, therefore, unbidden, the bedstead, and M. Dubois placed me upon the mattress, and spread a cambric handkerchief upon my face. It was transparent, however, and I saw through it that the bedstead was instantly surrounded by the seven men and my nurse. I refused to be held; but when, bright through the cambric, I saw the glitter of polished steel—I closed my eyes.” Learning that they intended to remove the whole breast, she surrendered herself to “a terror that surpasses all description.” As the knife cut into her, she emitted “a scream that lasted intermittingly during the whole time of the incision—and I almost marvel that it rings not in my ears still, so excruciating was the agony. When the wound was made, and the instrument was withdrawn, the pain seemed undiminished … but when again I felt the instrument—describing a curve—cutting against the grain, if I may say so, while the flesh resisted in a manner so forcible as to oppose and tire the hand of the operator, who was forced to change from the right to the left—then, indeed, I thought I must have expired. I attempted no more to open my eyes.”

  But still the operation went on. As the surgeons dug away diseased tissue, she could feel and hear the scrape of the blade on her breastbone. The entire procedure lasted seventeen and a half minutes, and it took her months to recover. But the operation saved her life. She lived another twenty-nine years and the cancer never came back.

  Not surprisingly, people were sometimes driven by pain and a natural caution regarding doctors to attempt extreme remedies at home. Gouverneur Morris, one of the framers of the American Constitution, killed himself by forcing a whalebone up his penis to try to clear a urinary blockage.

  The advent of surgical anesthetics in the 1840s very often didn’t eliminate the agony of medical treatments so much as postpone it. Surgeons still didn’t wash their hands or clean their instruments, so many of their patients survived the operations only to die of a more prolonged and exquisite agony through infection. This was generally attributed to “blood poisoning.” When President James A. Garfield was shot in 1881, it wasn’t the bullet that killed him, but doctors sticking their unwashed fingers in the wound. Because anesthetics encouraged the growth of surgical procedures, there was in fact probably a very considerable net increase in the amount of pain and suffering after the advent of anesthetics.

  Even without the unnerving interventions of surgeons, there were plenty of ways to die in the premodern world. For the City of London, the death rolls—or Bills of Mortality as they were known in England—for 1758 list 17,576 deaths from more than eighty causes. Most deaths, as might be expected, were from smallpox, fever, consumption, or old age, but among the more miscellaneous causes listed (with original spellings) were:

  choaked with fat 1
  Itch 2
  froze to death 2
  St Anthony’s fire 4
  lethargy 4
  sore throat 5
  worms 6
  killed themselves 30
  French pox 46
  lunatick 72
  drowned 109
  mortification 154
  teeth 644

  How exactly “teeth” killed so many seems bound to remain forever a mystery. Whatever the actual causes of death, it is clear that expiring was a commonplace act and that people were prepared for it to come from almost any direction. Death rolls from Boston in the same period show people dying from such unexpected causes as “drinking cold water,” “stagnation of the fluids,” “nervous fevers,” and “fright.” It is interesting, too, that many of the more expected forms of death feature only marginally. Of the nearly 17,600 people whose deaths were recorded in London in 1758, just 14 were executed, 5 murdered, and 4 starved.

  With so many lives foreshortened, marriages in the preindustrial world tended to be brief. In the fifteenth and sixteenth centuries, the average marriage lasted just ten years before one or the other of the partners expired. It is often assumed that because people died young they also married young in order to make the most of the short life that lay in front of them. In fact, that seems not to be so. For one thing, people still saw the normal span of life—one’s theoretical entitlement—as the biblical three score years and ten. It was just that not so many people made it to that point. Nearly always cited in support of the contention that people married early are the tender ages of the principal characters in Shakespeare’s Romeo and Juliet—Juliet just thirteen, Romeo a little older. Putting aside the consideration that the characters were fictitious and hardly proof of anything, what is always overlooked in this is that in the poem by Arthur Brooke on which Shakespeare based the story, the characters were actually sixteen. Why Shakespeare reduced their ages is, like most of what Shakespeare did, unknowable. In any case, Shakespeare’s youthful ages are not supported by documentary evidence in the real world.

  In the 1960s, the Cambridge historian Peter Laslett did a careful study of British marriage records and found that at no time in the recorded past did people regularly marry at very early ages. Between 1619 and 1660, for instance, 85 percent of women were nineteen or older when married; just one in a thousand was thirteen or under. The median age at marriage for brides was twenty-three years and seven months, and for men it was nearly twenty-eight years—not very different from the ages of today. William Shakespeare himself was unusual in being married at eighteen, while his wife, Anne, was unusually old at twenty-six. Most really youthful marriages were formalities known as espousals de futuro, which were more declarations of future intentions than licenses to hop into bed.

  What is true is that there were a lot more widowed people out there and that they remarried more frequently and more quickly after bereavement. For women, it was often an economic necessity. For men, it was the desire to be looked after. In short, it was often as much a practical consideration as an emotional one. One village surveyed by Laslett had, in 1688, seventy-two married men, of whom thirteen had been married twice, three had been married three times, three married four times, and one married five times—all as the result of widowhood. Altogether about a quarter of all marriages were remarriages following bereavement, and those proportions remained unchanged right up to the first years of the twentieth century.

  With so many people dying, mourning became a central part of most people’s lives. The masters of mourning were of course the Victorians. Never have a people become more morbidly attached to death or found more complicated ways to mark it. The master practitioner was Victoria herself. After her beloved Prince Albert died in December 1861, the clocks in his bedroom were stopped at the minute of his death, 10:50 p.m., but at the Queen’s behest his room continued to be serviced as if he were merely temporarily absent rather than permanently interred in a mausoleum across the grounds. A valet laid out clothes for him each day, and soap, towels, and hot water were brought to the room at the appropriate times, then taken away again.

  At all levels of society mourning rules were strict and exhaustingly comprehensive. Every possible permutation of relationship was considered and ruled on. If, for example, the dearly departed was an uncle by marriage, he was to be mourned for two months if his wife survived him, but for just one month if he was unmarried or widowed himself. So it went through the entire canon of relationships. One needn’t even have met the people being mourned. If one’s husband had been married before and widowed—a fairly common condition—and a close relative of his first wife’s died, the second wife was expected to engage in “complementary mourning”—a kind of proxy mourning on behalf of the deceased earlier partner.

  Exactly how long and in what manner mourning clothes were worn was determined with equally meticulous precision by the degree of one’s bereavement. Widows, already swaddled in pounds of suffocating broadcloth, had additionally to drape themselves in black crape, a type of rustling crimped silk. Crape was scratchy, noisy, and maddeningly difficult to maintain. Raindrops on crape left whitish blotches wherever they touched it, and the crape in turn ran onto fabric or skin underneath. A crape stain ruined any fabric it touched and was nearly impossible to wash off skin. The amounts of crape worn were strictly dictated by the passage of time. One could tell at a glance how long a woman had been widowed by how much crape she had at each sleeve. After two years, a widow moved into a phase known as “half mourning,” when she could begin to wear gray or pale lavender, so long as they weren’t introduced too abruptly.

  Servants were required to mourn when their employers died, and a period of national mourning was decreed when a monarch died. Much consternation ensued when Queen Victoria expired in 1901, because it had been over sixty years since the last regal departure and no one could agree what level of mourning was appropriate to such a long-lasting monarch in such a new age.

  As if Victorians didn’t have enough to worry about already, they developed some peculiar anxieties about death. Edgar Allan Poe exploited one particular fear to vivid effect in his story “The Premature Burial” in 1844. Catalepsy, a condition of paralysis in which the victim merely seemed dead while actually being fully conscious, became the dread disease of the day. Newspapers and popular magazines abounded with stories of people who suffered from its immobilizing effects.

  One well-known case was that of Eleanor Markham of upstate New York, who was about to be buried in July 1894 when anxious noises were heard coming from her coffin. The lid was lifted and Miss Markham cried out: “My god, you are burying me alive!” She told her saviors: “I was conscious all the time you were making preparations to bury me. The horror of my situation is altogether beyond description. I could hear everything that was going on, even a whisper outside the door.” But no matter how much she willed herself to cry out, she said, she was powerless to utter a noise.

  According to one report, of twelve hundred bodies exhumed in New York City for one reason or another between 1860 and 1880, six showed signs of thrashing or other postinterment distress. In London, when the naturalist Frank Buckland went looking for the coffin of the anatomist John Hunter at St. Martin-in-the-Fields Church, he reported coming upon three coffins that showed clear evidence of internal agitation (or so he was convinced). Anecdotes of premature burials featured in even serious publications. A correspondent to the British journal Notes and Queries offered this contribution in 1858:

  A rich manufacturer named Oppelt died about fifteen years since at Reichenberg, in Austria, and a vault was built in the cemetery for the reception of the body by his widow and children. The widow died about a month ago and was taken to the same tomb; but, when it was opened for that purpose, the coffin of her husband was found open and empty, and the skeleton discovered in a corner of the vault in a sitting posture.

  For at least a generation such stories became routine in even serious periodicals. So many people became morbidly obsessed with the fear of being interred before their time that a word was coined for it: taphephobia. The novelist Wilkie Collins placed on his bedside table each night a letter bearing standing instructions of the tests he wished carried out to ensure that he really had died in his sleep if he was found in a seemingly corpselike state. Others directed that their heads be cut off or their hearts removed before burial, to put the matter comfortably (if that is the right word) beyond doubt. One author proposed the construction of “Waiting Mortuaries,” where the departed could be held for a few days to ensure they really were quite dead and not just unusually still. Another more entrepreneurial type designed a device that allowed someone awaking within a coffin to pull a cord, which opened a breathing tube for air and simultaneously set off a bell and started a flag waving at ground level. An Association for Prevention of Premature Burial was established in Britain in 1899 and an American society was formed the following year. Both societies suggested a number of exacting tests to be satisfied by attending physicians before they could safely declare a person dead—holding a hot iron against the deceased’s skin to see if it blistered was one—and several of these tests were actually incorporated into medical schools’ curricula for a time.

 
