Quackery

by Lydia Kang

  The Water Cure Today

  Many of the tenets of the water cure are still with us today. Regular bathing practices were first popularized by hydropathy, and few Americans in the twenty-first century would consider going a single day without taking a bath or shower. (To see how far we’ve come, here is an excerpt from a letter to the Boston Moral Reformer in 1835: “I have been in the habit during this past winter of taking a warm bath every three weeks. Is this too often to follow the year round?”) Non-restrictive clothing has been adopted en masse. Modern spas—and the hydropathic therapies available at most gyms and athletic clubs—are direct descendants of the nineteenth-century hydropathic institutes. Drinking enough water every day is another universal given in modern medical practice, although the exact amount of water you should drink still attracts vigorous debate.

  Despite the arrival of quacks on the water cure scene, the original hydropaths were on to something. They came along at just the right time in the progression of history to bring about much needed changes in personal hygiene. By drinking water, getting a lot of exercise, and taking regular baths, people could indeed prevent some diseases and lead healthier lives.

  Next time you count out your water glasses for a given day, you’re in effect participating in a nineteenth-century medical curiosity. Just don’t dump buckets of cold water on your friend while claiming that you’re helping him cool the fire in his brain.

  17

  Surgery

  Of Crossbow Scalpels, the Need for Speed, 300 Percent Mortality Rates, the Theater of Surgery, and Pus-Covered Coats

  Chances are you’ve had surgery. If not? Just wait. You probably will someday. What was once a limited discipline reserved for the most extreme medical ailments is now common. Elective, often. We assume all will be sterile and painless, that our surgeons will be skilled (and actually surgeons, duh). But once upon a pus-filled time, surgery was not so tidy and precise.

  Surgery is breaking through the ultimate mortal barrier—the body itself. Slicing through skin, poking through eyeballs, sawing off bones, and ligating blood vessels means altering nature and the natural history of disease and trauma. Is it somewhat godlike? Eh, let’s leave that answer to the psychoanalysts.

  Shooting an arrow out of the patient’s neck with a crossbow … well, it seemed like a good idea at the time.

  Since ancient times, doctors have turned to surgery for fixing broken bones, treating traumatic injuries, and cutting off diseased limbs. We’ve drilled holes in skulls for headaches and epilepsy, cauterized amputations with scalding irons, and even shot arrows back out of bodies. That’s right. Arrow wounds were a chief battlefield problem from prehistoric times until the advent of firearms. Removing the arrow was a troublesome task, and on occasion, physicians thoughtfully decided that a crossbow was the ideal tool for extraction. A medieval illustration shows a poor soul clinging to a pillar while the arrow embedded in his neck is hitched to a crossbow and yanked out. And you thought you were having a bad week.

  Here, we’ll be focusing on the dawn of modern surgery, beginning in the sixteenth century, when discovery, desperation, and ingenuity (and occasionally egos) all collided. In the grand, bloody, and smelly history of the operating room, there are several stopping points that leave us rather aghast. Through the lens of the present day, the annals of surgery are filled with scientifically unsound practices and charlatans. Let’s scrub in and take a look.

  Seventeenth-century German surgical instrument. Or evil scissors. You choose.

  “Time Me, Gentlemen!” The Costs of Speed and Showmanship

  Amputation was perhaps the most common surgical procedure for thousands of years. In the face of leg wounds and deadly gangrene, it was often the best chance of survival, even if the mortality rate was as horrible as 60 percent or more (during the Franco-Prussian War of 1870, the mortality rate for amputations was a staggering 76 percent).

  Amputation instruments and a helpful how-to.

  Until the mid-nineteenth century, there was no reliable anesthesia, which meant that amputations had to be quick affairs to minimize the patient’s waking nightmare. In the name of speed, everything was often cut at the same level, termed a chop or guillotine amputation. As if that term weren’t horrible enough, French surgeons in World War I called it amputation en saucisson, likening the procedure to slicing a sausage in half. Tasty.

  A 1793 amputation. Notice how the patient is restrained by people and ropes.

  Although that might sound terrifying, if you were a badly wounded soldier, you would want a speedy sausage-chop, too. From the sixteenth to the nineteenth century, a typical leg amputation would go thusly: The patient was forcibly held down to prevent movement (and perhaps changing their mind and staggering away), while a tourniquet was applied to block off the main artery of the leg. Using a curved blade, the surgeon would cut skin and muscle around the bone, ideally in one single slice, and then saw the bone off. The open vessels would sometimes be cauterized (by hot irons, boiling oil, or chemically with vitriol), and the flesh was either left as is or sewn closed.

  All this was done in less time than it takes to watch a YouTube music video. Eighteenth-century Scottish surgeon Benjamin Bell could amputate a thigh in six seconds. French surgeon Dominique-Jean Larrey was comparatively slow. But in his defense, during the Napoleonic Wars he performed two hundred amputations in a twenty-four-hour period—roughly one every seven minutes.

  Sure, speed reduced the time the patient would spend in unendurable pain. But it also led to sloppy work. Often, bone was left protruding because the flesh withdrew from the cutting plane. Sliced flesh could be ragged, slowing the healing process. The speed of the procedure and awkward positioning to get around the limb meant accidental slices elsewhere. And no matter how fast the surgeon, operations were generally accompanied by the bloodcurdling screams of the patient.

  Sometimes, the screams came from someone other than the patient.

  Let us introduce you to Robert Liston, aka “the fastest knife in the West End.”

  Liston was a larger-than-life character, half surgeon and half entertainer within the operating theater of 1840s London. His amputation surgeries were well attended, with students peering down from the galleries. Liston, occasionally with knife clasped in teeth, barked at the onlookers, “Time me, gentlemen, time me!”

  Time him they did. Liston was fast (his amputations generally took less than three minutes from first cut to wound closure). His speed was so mighty that he once accidentally sliced off a patient’s testicles along with the leg. A free castration, to boot! Another time, he accidentally cut off the fingers of his assistant (who often held the leg in place), and one of the onlookers dropped dead from terror when the knife slashed close enough to cut his coat. The patient later died of gangrene, as did the poor assistant with his amputated fingers. Three deaths from a single operation: Liston thus became the proud surgeon who could boast a stunning 300 percent mortality rate from one surgery.

  Robert Liston in the operating theater.

  The atmosphere surrounding Liston’s flamboyant surgeries was not unique; with the advent of modern surgery, audiences were flocking to see these rock stars in action. London and Paris also boasted surgeries that were more akin to Broadway shows. There were ticket sales, with high fees for the most entertaining surgeons, tens to hundreds of spectators, and preoperative celebrity performances. The surgeon was met with applause before and during the procedure. Honoré de Balzac, a contemporary, commented that “the glory of surgeons is like that of actors.” This flavor of showboating is unimaginable today, although the idea of the celebrity physician certainly isn’t.

  Pride of Pus

  Nearly everyone has glimpsed a modern-day operating room, in real life or on TV—painstakingly sterile and shining, with sharp equipment, and masks and gloves intended to be used only once and then incinerated. Operating theaters of the nineteenth century were gross, and people preferred them that way.

  In the early to mid-1800s, you’d see a table almost blackened from blood and pus from countless preceding surgeries. No surgical gloves were worn—they had yet to be invented. Instruments were often rinsed only in water, if at all, and surgeons’ hands mostly went unwashed. And the coat worn by the doctor? It was often so caked in layers of blood that it was stiff—a sign of a “good surgeon.”

  Even surgeons themselves were not exempt from the dangers that lurked in hospitals and medical schools. Professor Jakob Kolletschka died in 1847 from sepsis after he cut one of his own fingers during an autopsy. Beginning in 1840, medical students at the Vienna General Hospital would bring their unwashed hands directly from autopsies to the obstetrics ward, killing as many as one out of three mothers from childbed fever. In contrast, the ward staffed by midwifery pupils had a 3 percent mortality rate. When the students and pupils switched wards, the horrible death rates followed the medical students and their bacteria-laden hands. The physician Ignaz Semmelweis, observing this, had the staff do something simple but miraculous: wash their hands with soap and a chlorine solution. Voilà—death rates plummeted. But tragically, no one listened.

  In the nineteenth century, Joseph Lister built upon microbiologist Louis Pasteur’s germ theory of disease and eventually revolutionized surgery by introducing the concept of antisepsis. Many pooh-poohed the idea of bacteria. An Edinburgh professor snorted, “Where are these little beasts … has anyone seen them yet?” Another surgeon insisted that “there is good reason to believe that the theory of M. Pasteur, upon which Lister bases his treatment, is unsound.” But Lister’s theories and the facts—there were fewer deaths when antiseptic chemicals such as carbolic acid and general aseptic cleanliness were used—eventually won out by the turn of the twentieth century. They even named a mouthwash after him: Listerine. And now many of us swish and spit in his honor.

  Unfortunately for our twentieth president, James Garfield, his doctors weren’t as impressed by Lister. After suffering a nonlethal bullet wound, Garfield was examined by doctors who probed the area with unwashed fingers and instruments. Pus began to form in the wound while he attempted to recover, and the doctors probed yet again with unwashed hands. He died months later, in 1881, due to complications of infection.

  Soon, even the public operating theaters and their filthy surfaces disappeared. Cleanliness, hand washing, and surgical gloves became de rigueur. Surgery would no longer be a last resort, but a keenly wielded tactical maneuver in the fight against illness.

  A Lithotomy to Remember

  Bransby Cooper was the nephew of the more famous and well-respected surgeon Sir Astley Cooper. The nephew was not a good surgeon, but apparently his uncle insisted on his appointment at Guy’s Hospital in London.

  The procedure was a simple bladder stone removal, called a lithotomy. Normally, it could be done in about five minutes. Normally, the poor patient was tied up, knees trussed with a cloth behind his neck, genitals all abloom (hence the modern lithotomy position, in which women give birth in hospitals). Normally, the surgeon cut between the anus and the scrotal sac (an area called the perineum), into the bladder, fished out the stone, and stitched everything up while the patient screamed bloody hell.

  That’s not exactly what happened when Bransby Cooper attempted the procedure. He couldn’t find the bladder. Then he couldn’t find the stone. Every surgical instrument at hand was used before Cooper wormed around with his fingers, trying to fish it out.

  By then the patient was yelling, “Oh! Let it go! Pray, let it keep in!” But to no avail. Cooper blamed the patient for having a deep perineum before yelping to his assistant, “Dodd, have you a long finger?” He eventually found the stone, but only after a whopping fifty-five minutes. The next day, the patient died, no doubt with a crater-sized hole in his nether region.

  After Thomas Wakley, founder of The Lancet, publicly exposed Cooper’s incompetence, the surgeon sued him for libel, demanding £2,000. Cooper won, but was awarded only a trifling £100.

  It would turn out to be one of the first great malpractice trials in history—but certainly not the last.

  A lithotomy in progress, 1768.

  First, Do No Harm … Oh, Never Mind

  Some surgical innovations were skin-crawling but brilliant. In the Sushruta Samhita, written around 500 BCE, the Indian surgeon Sushruta recommended, “Large black ants should be applied to the margins of the wound and their bodies then severed from their heads, after these have firmly bitten the part with their jaws.” Voilà. Insect mandibles as natural staples for wound closure. Genius, right?

  But many surgical history books also tell not-so-ingenious stories of surgically altering what probably shouldn’t be altered. Stuttering is a classic example. In the nineteenth century, German surgeon Johann Friedrich Dieffenbach would cut out a triangular wedge near the root of the tongue to cure stuttering. Others tried to “resize” the tongue or cut the frenulum, the delicate piece of tissue between the tongue and the floor of the mouth. None of these procedures worked.

  In 1831, a Mr. Preston decided that it would be a good idea to tie off a carotid artery on the side where a patient had a stroke. There’s just one problem: Strokes often occur from lack of blood flow to the brain. Trying to help by cutting off the blood supply is like helping a drought by telling a rain cloud to do business elsewhere. The patient somehow survived. Preston also recommended that perhaps both carotid arteries be tied off for treatment of strokes, epilepsy, and insanity. Thankfully, no one took his advice.

  As the fear of autointoxication—the theory that the normal end products of digestion contain poison (see Enemas & Clysters, page 163)—grew at the turn of the twentieth century, many tried to cure constipation with myriad devices and purgatives. One British surgeon, Sir William Arbuthnot Lane, took it a step further: he cut out the colon altogether. He performed more than a thousand colectomies, mostly on women. Surely, slow colons were the cause of women’s supposed shortcomings, like stupidity, headaches, and irritability. Luckily, you can snip out your colon and survive, but you’ll likely have the side effect of lots of diarrhea. Like most functioning parts of your body, colons are better left intact.

  Urethral probes, circa 1870.

  Lane was also a proponent of fixing misplaced organs. Yes, you heard that correctly. In the early twentieth century, many believed that vague abdominal and whole-body discomfort could be due to “dropped” or “misplaced” organs. The kidney was perhaps the most misplaced organ of all time. Lane blamed dropped kidneys, or nephroptosis, for causing suicidality, homicidality, depression, abdominal pain, headaches, and the more obvious physical symptoms of urinary problems. Removing even one kidney killed too many patients, so surgeons instead performed a “nephropexy”—more or less tacking kidneys back in place using sutures and occasionally rubber bands and wads of gauze. By the 1920s, the procedure was falling out of favor, with some surgeons claiming that “the most serious complication of nephroptosis is nephropexy” and that urologists seemed to have “an orgy of fixation of kidneys.” (To be fair, part of a urologist’s job is to fixate on kidneys, but the orgy part was a bit dramatic.)

  Kidneys weren’t the only body parts atrociously fiddled with by surgeons. Tonsils, the glands at the back of the throat, were also removed in excessive numbers in an effort to put a stop to all childhood infections—a well-meaning but misguided aim. Of course, the tonsillectomy has its place as a modern-day therapy for sleep apnea and recurrent tonsillitis, but as a last resort. Most would be shocked to learn that a 1934 study in New York found that, of one thousand children, more than six hundred had had tonsillectomies. These weren’t risk-free surgeries, either; many children died from the procedure each year. The promise of post-op ice cream was so not worth that.

  And no discussion of unnecessary surgery would be complete without a story of pointless fiddling with men’s tender nether regions. John Harvey Kellogg, health practitioner and cereal inventor, recommended circumcision to quell evil urges to masturbate if other methods—including bandaging genitals, covering them with cages, and tying hands—failed. The procedure should be done “without administering an anaesthetic, as the brief pain attending the operation will have a salutary effect upon the mind, especially if it be connected with the idea of punishment.” Yow. Well, as any circumcised man can tell you, the procedure doesn’t prevent masturbation.

  The Lure of the Scalpel Continues

  The public has always been lured by the promise of a quick slash and cut to fix everything. Some people love being patients so much that they’ll go under the knife for fake symptoms, while others repeatedly return to the OR in search of phantomlike physical perfection. But unlike in past centuries, surgeons and hospitals are under rigorous scrutiny for cleanliness, quality training, low mortality rates, and results that withstand both the test of time and the scientific magnifying lens. And thanks to the development of anesthesia, we no longer require the hastiness of the two-minute slash and saw. Thank goodness for that.

  18

  Anesthesia

  Of Suffocation, Soporific Sponges, Chloroform, Laughing Gas, Doping the Pets, Ether Frolics, and Noxious Farting

  Conquering pain is no easy feat. Anesthesia, from the Greek words for “absence of sensation,” has been sought after since humankind first dared to drill a hole in a head for surgery. Ancient China used hashish. The Egyptians turned to opium. Dioscorides recommended deadly mandrake with wine. In the Middle Ages, there were even recipes for a “soporific sponge,” soaked in mandrake, henbane, hemlock, and opium, then dried in the sun. It was then rinsed in hot water, squeezed but left damp, and applied to the patient’s nose for inhalation.

 
