The Pain Chronicles


by Melanie Thernstrom


  During the 1840s at least three Americans were experimenting with ether and nitrous oxide: two New England dentists and a southern country doctor. That one of the two dentists ultimately prevailed has sometimes been attributed to the economics of dentistry; unlike most surgeries, dentistry was usually elective, and the fear of pain kept patients away from it. (But, like the idea that anesthesia could only have been invented in America, this theory seems puzzling. After all, even if the surgical profession as a whole maintained a cultural bias against anesthesia, it would have taken only one surgeon to play Morton’s role.)

  The earliest of the three is generally believed to be a doctor from Georgia who, after attending ether parties as a medical student, used ether during surgery on a patient in 1842. But, concerned that the technique would strike other physicians as improper, he did not publish his findings, to his eternal regret, until after Dr. Morton had already claimed credit for ether’s discovery.

  At a popular science demonstration three years later, a Connecticut dentist named Horace Wells noticed that one of the volunteers seemed unaware of having accidentally cut his leg while leaping around the room under the influence of laughing gas. The next day, Dr. Wells had a fellow dentist pull out one of his molars while he inhaled nitrous oxide, and he found it didn’t hurt at all. Dr. Wells successfully experimented on other patients (he tried ether on some as well, but decided nitrous oxide was safer, as ether can cause vomiting and is also highly flammable). Finally, Dr. Wells persuaded the eminent surgeon Dr. John Collins Warren, founder of what became The New England Journal of Medicine and Massachusetts General Hospital, to permit a demonstration during one of Warren’s surgeries in the hospital’s glass-domed amphitheater (today known as the Ether Dome).

  The demonstration was a failure: the patient hollered at the first cut, and the crowd of surgeons and medical students jeered, “Humbug!” (Dr. Wells might have given the patient insufficient quantities of nitrous oxide, and later the patient said he had felt only a little pain but had been alarmed by the procedure.)

  A year later, Dr. Wells’s former colleague Dr. Morton persuaded Dr. Warren to let him demonstrate gas anesthesia, using ether instead of nitrous oxide, on a young man submitting to the removal of a tumor from his jaw. Accounts of the incident are vivid: the audience waiting, the tremulous young man strapped down on an operating chair on the stage of the amphitheater, the surgeon impatiently declaring, “As Dr. Morton has not arrived, I presume he is otherwise engaged,” and Dr. Morton finally bursting in, explaining that he had been delayed because the custom-made device for delivering the vapors of an ether-soaked sponge had not been ready. The patient inhaled the vapors from the sponge, lost consciousness, and did not awaken until after Dr. Warren had finished washing the blood off his wounds. When questioned about the pain, the patient answered that he had felt only a peculiar scratching at his cheek, like a hoe raking a field.

  “Gentlemen, this is no humbug,” Dr. Warren announced to the hushed crowd. Later he wrote, “The student who . . . in distant ages may visit this spot will view it with increased interest, as he remembers that here was first demonstrated one of the most glorious truths of science.” Around the world, word swiftly spread that, as a German surgeon put it, “the wonderful dream that pain has been taken away from us has become reality. Pain . . . must now bow before the power of the human mind, before the power of the ether vapor.”

  Anguished with jealousy over Dr. Morton’s success, Dr. Wells suffered a nervous breakdown after trying in vain to petition various boards for recognition of his role in the discovery. In searching for an agent superior to nitrous oxide, he developed an addiction to chloroform. He attacked two prostitutes while under the influence and committed suicide shortly before the arrival of a letter from the Paris Medical Society declaring that he was “due all the honor” for the discovery.

  In fact, he needn’t have been too jealous of his old friend. Although Dr. Morton would receive credit for the discovery of the “greatest gift ever made to suffering humanity,” the alleviation of mankind’s pain came at a personal price. Dr. Morton wasted the rest of his life in a futile attempt to patent ether in order to profit from it (he was unable to patent it, because he hadn’t invented it). He had hoped to disguise ether’s well-known reek by mixing it with orange-scented oils and other fragrances and calling it Letheon, after the Greek river of forgetfulness, but the true ingredient was evident. He died bitter and impoverished.

  THE SLAVERY OF ETHERIZATION

  In a corner of the Public Garden in Boston stands an unusual monument, one that is neither to a hero nor a battle, but rather to a medical achievement: “The discovery that the inhaling of ether causes insensibility to pain. First proved to the World at the Mass General Hospital in Boston.” The 1868 forty-foot rose-speckled marble and granite obelisk is the only such monument in the world. Other revolutionary medical advances, such as the discovery of antibiotics, do not seem to call for the visual tribute of a monument—they are celebrated, one might say, by every life they save.

  The commissioning of the ether memorial was controversial in its time, and its intricate surfaces make clear its real purpose: not merely to mark an achievement but also to address the source of the controversy that surrounded it by attempting to reconcile science’s and religion’s competing perspectives on anesthesia. The monument did so by interpreting the discovery not as a triumph of the former over the latter, but rather as a fulfillment of the prophecy from the book of Revelation inscribed on its east relief: “Neither Shall There Be Any More Pain.” Above those words an angel of mercy descends upon a sufferer, while the south and north faces of the monument depict operations performed with anesthesia. On the west side of the monument, a female allegorical embodiment of Science perches on a throne of lab equipment, and an inscription from Isaiah insists, “This also cometh forth / from the Lord of hosts . . .”

  Yet critics of anesthesia pointed out that Revelation prophesies that God—not science—“shall wipe away all tears from their eyes.” And they considered anesthesia “of the devil,” a deliberate flouting of Adam’s curse. The benefit of anesthesia is so absurdly obvious now that it’s hard for us to believe it was ever controversial, yet at the time some physicians saw only “evil” in anesthesia. It was “a questionable attempt to abrogate one of the general conditions of man,” one causing “the destruction of consciousness.” (Advocates countered that ordinary sleep is a nightly destruction of consciousness.)

  Ether, in the view of many, was a party drug that was now being brought into the most serious of arenas, inducing a state that, in its exhilaration stage, suspiciously resembled drunkenness. “Even were the reports of persons who felt no pain during an operation credible, this would not be worth the consideration of a serious-minded doctor,” a prominent surgeon declared. A full seventeen years after the discovery, New York surgeon Valentine Mott wrote a passionate defense of anesthesia, arguing that “the insensibility of the patient is a great convenience to the surgeon” (Mott’s emphasis).

  But some surgeons saw only inconvenience in having to share their operating rooms with a new medical specialist—an anesthesiologist—who might make them “a mere operator, a subordinate instead of a chief, who under all circumstances retains the supreme command,” as one Edinburgh surgeon grumbled. To such critics, anesthesia represented “the degradation of surgery against which all surgeons should guard with all their might.”

  Moreover, many surgeons considered ether “a remedy of doubtful safety,” a poison that caused unnecessary bleeding, suffocation, tuberculosis, depression, insanity, or sometimes death. The experience of pain was thought to be somehow conducive to healing. “Pain during operations is, in the majority of cases, even desirable; its prevention or annihilation is, for the most part, hazardous to the patient,” wrote a British physician. Unsurprisingly, military surgeons were among the most reluctant. “The shock of the knife is a powerful stimulant,” one such surgeon wrote; “it is better to hear a man bawl lustily” from pain than “sink silently into the grave.”

  By anesthetizing patients, the physician was seen as temporarily “killing” them and imposing a “slavery of etherization.” Perhaps he might even commit a crime such as rape against them. Like drunkenness, anesthesia was thought to induce lascivious dreams in female subjects. After hailing its discovery, the Boston surgeon Henry Bigelow himself soberly warned in his address to the Boston Society for Medical Improvement on November 9, 1846, “It is capable of abuse, and can readily be applied to nefarious ends.” Alarmed by the depiction of such dangers, even patients themselves sometimes refused anesthesia.

  Ether required patience—surgeons had to wait for it to take effect. It did not work on all patients, and even when it did, in lighter doses it created a state of semiconsciousness in which patients, alarmingly, talked or sang. It could cause vomiting, and—most perilously—it was highly flammable. How to dose the drug so as to avoid exhilaration and induce sleep was poorly understood. Different kinds of inhalers produced different results. As an 1847 article in The Lancet noted, “In some cases there is perfect insensibility to pain,” but “there are cases in which ether does not act at all or appears to act as a violent stimulus.”

  That same year, Sir James Young Simpson pioneered the use of chloroform, which caused unconsciousness without exhilaration. It swiftly replaced ether; indeed, its use became so universal that opponents of anesthesia in general were dubbed the Anti-Chloroformers. Hospital records of the era show us that many surgeons worked without any anesthesia, while others used chloroform for their initial cuts, but did without it during the rest of an operation, or limited their use of it to major operations. James Syme—the Scottish surgeon who had amputated poor George Wilson’s foot—said he would use anesthesia only “if the patient has a very great dread of pain”! Factors such as sex, age, and ethnicity were considered in decisions regarding who merited anesthesia and when.

  Many Christian churches strongly opposed using anesthesia during childbirth, on the grounds that it contradicted God’s direct commandment to Eve. The entire city of Zurich banned anesthetics on these grounds. “Pain is the mother’s safety, its absence her destruction,” wrote an obstetrician. “Yet are there those bold enough to administer the vapour of ether even at this critical juncture, forgetting it has been ordered that ‘in sorrow shall she bring forth.’ ”

  Sir James Young Simpson argued that the Genesis commandment was actually to bring forth children not in “pain” but in “toil” (the Hebrew words for Eve’s pain actually carry both meanings). And, Dr. Simpson argued, labor under anesthesia certainly involved toil. In 1853, chloroform received the ultimate imprimatur when Queen Victoria chose to use it to ease the birth of her eighth child, causing masses of women to demand “anesthesia à la Reine.”

  By the close of the nineteenth century—almost one hundred years after Humphry Davy suggested that nitrous oxide could ease pain in surgery—the acceptance of anesthesia was virtually universal, and with its acceptance, the meaning of pain in Western culture was forever altered. For if anesthesia had robbed the craft of surgery of its terrors, as Henry Bigelow put it, it also stole from pain some of its store of ancient meanings.

  Tensions between secular and sacred conceptions of medicine had long existed, but efforts had always been made to reconcile the two. Ambroise Paré, for example, famously ended his case histories with the sentence, “I dressed him, God healed him.” Since ancient times, sufferers had slept in the temples of the gods of healing and taken opium and willow bark, and for the most part, such actions were not considered inconsistent. In the nineteenth century, as Darwin’s theory provided a biological framework for understanding pain and the discovery of anesthesia allowed for its control, the medical alliance between scientific and religious perspectives finally splintered.

  In 1887, H. Cameron Gillies wrote a series of articles in The Lancet claiming that “Pain never comes where it can serve no good purpose,” because pain is God’s way of protecting the body. In reply, W. J. Collins argued that not all pain is protective: “Is this the grim comfort he [Gillies] would bring to a suffering woman, tortured slowly to death by a sloughing scirrhus of the breast, or to a man, made almost inhuman and killed by inches by the slow yet sure ravages of a rodent ulcer?”

  The medical establishment dismissed Gillies. By the end of the Victorian era, the underlying debate was conclusively settled: there was no meaning to pain. Pain was not a metaphor; it was a biological by-product of disease. The body had been claimed as the province of science, the patient dispossessed. Pain was not passion, alchemy, ordeal; in the cosmological contest between demons and deities, it was man who had won. Thousands of years of thinking about pain were swept aside as the biological paradigm of pain displaced the religious one. The telegram went out: just as the consumptives came down the mountain, acute surgical pain could be controlled by anesthesia. Medical science could now turn to chronic pain, and surely it, too, would soon be mastered.

  Surely.

  Pain Diary:

  I Get a Diagnosis

  “I’m ordering scans of both your cervical spine and your right shoulder,” my internist said. “It’s good to get to the bottom of problems before they become chronic.”

  “How long does it take to become chronic?” I said.

  “How long have you had it?” he said.

  ONE’S WHOLE LIFE AND ONE’S FATE

  “I hope you get a good result,” Kurt had told me with a measure of anxiety the night before my MRI—anxious for me, but not only for me. I could sense his desire not to have a girlfriend with a health problem. A good result—I should concentrate on hoping for a good result. But what would that be? Ordinarily, it would be no result at all, proving the test to have been superfluous.

  In the basement of the hospital, I shed my earrings and rings and blouse and bra and lay still, as if in a sarcophagus, as the MRI machine illuminated not merely the vertebrae but also the tendons, cartilage, and disks of my spine. The spine: the stem of the body, the vine from which everything flowers. I could feel the pain even then, like a white electrical current in my neck that flowed quickly through my right shoulder and sizzled in my hand—a pain I had come to know so well.

  I tried to calm myself with the Christian Science tenet my grandmother taught me: “There is no life, truth, intelligence, nor substance in matter. All is infinite Mind and its infinite manifestation, for God is All-in-all.” The greatest fear of pain patients, doctors sometimes say, is that it’s “all in their head.” Infinitely scarier, I realized as I lay there, is the idea that it isn’t. I knew the machine was seeing my body in a different way and that its record would be irrefutable. My pain would no longer be a tree falling in the forest with no one (but me) to hear it. Through feeling and thought, pretense and denial, hope and despair, the machine knew: the tree crashing, the tent bones breaking, the leper laughing—the truth.

  “There are situations in life in which our body is our entire self and our entire fate. I was my body and nothing else,” wrote the Austrian philosopher Jean Améry of his time in Auschwitz. When I read his account in college, years away from pain, I thought, Let that time never come. Let my body never be my fate.

  The next week, when I was to meet with my internist, I would know my fate; I would know, that is, whether my body was to be my fate.

  “Ah yes, the films and the radiologist’s report,” my internist said. He tacked the films up on a lighted screen set into the wall beside a life-size yellow plastic model of a skeleton. He traced his finger over the films, referring periodically to the structures on the skeleton as he began to explain. The more he talked, the more animated he grew. He liked to explain things, I could tell; he knew the material, he felt confident.

  I pictured my body as a skeleton in a medical school class.

  This is an example of cervical spondylosis, the doctor was explaining to a sea of eager students. Cervical spondylosis is a type of osteoarthritis. If you look closely at these vertebrae, you can see osteophyte formations on their surfaces. As the disks degenerated, the unprotected vertebrae rubbed against each other and developed calcium deposits, which are also known as bone spurs. The bone spurs impinge on the nerve roots, causing pain and weakness. Note that the opening of the spinal canal is abnormally narrow as well, the congenital problem of stenosis, which, in this case, aggravated the spondylosis.

  This skeleton belonged to a woman, age thirty-three. She reported right-sided pain and weakness. Looking at the skeleton as well as the MRIs, we can see that the degeneration was, in fact, more significant on the right side.

  Additionally, we can see a problem with the right shoulder here. In some people the space between the undersurface of the acromion—the bone at the top of the shoulder—and the top of the humeral head is narrow. This narrow passageway squeezes the rotator cuff—the tendons that connect the shoulder to the arm and allow it to rotate—causing what is known as impingement syndrome, with which this patient was diagnosed.

  The only unusual thing about this skeleton is the patient’s age. Symptoms of cervical spondylosis typically first appear between the ages of forty and sixty, although cases have been found in people as young as thirty. Normally, osteoarthritis is associated with aging, but in cases of premature wear such as this, the origin is presumably genetic, possibly aggravated by trauma.

 
