The Knowledge: How to Rebuild Our World From Scratch
A nonsurgical way to help a baby through a difficult birth was developed sometime in the early 1600s. The birthing forceps represent a profound improvement in obstetrics, enabling the midwife or doctor to reach up the birth canal to firmly but carefully grip around the skull of the fetus and realign the head or gently pull the whole baby out.* An important refinement was for the two arms of the tool to be detachable at the pivot so they could be independently slid into position, and over time the design has gradually evolved for the arms of the forceps to follow the anatomical curvature of the mother’s pelvis (so that the instrument works in tandem with the muscular contractions) and the clamp ends to be shaped around the baby’s skull.
BIRTHING FORCEPS.
Premature and low-birth-weight babies are likely to die if they are not kept warm in a hospital incubator until they are able to better regulate their own body temperature. Modern incubators are expensive and sophisticated machines, and like many other items of medical equipment, when donated to hospitals in the developing world today they often soon fall out of service due to power surges, unavailability of spare parts, or lack of specialist technicians to repair them; some studies find that up to 95 percent of donated medical equipment is inoperative within the first five years. A company called Design that Matters is attempting to address this issue, and their ingenious solution is a great example of the sort of appropriate technology that would need to emerge in a post-apocalyptic scenario. Their incubator design uses standard automobile parts: common sealed-beam headlights are used as the heating elements, a dashboard fan circulates filtered air, a door chime sounds alarms, and a motorcycle battery provides the backup electrical supply during power outages or when the incubator is being transported. All of these parts will be readily scavengeable in the aftermath and can be repaired with the know-how of local mechanics.
MEDICAL EXAMINATION AND DIAGNOSIS
The key skill of a doctor is diagnosis—being able to identify the disease or condition that a patient is suffering from, and so to determine the appropriate course of treatment or surgical procedure. The doctor asks patients to describe the details of the onset and background of their complaints and the sets of symptoms they experience. This information is then combined with the signs discovered during physical examination. The likely causes of a complaint suggested by this process help the clinician decide what follow-up investigations to request, such as blood tests, microscopic examination of samples taken from the body, or internal imaging techniques like X-rays or CT scans. The results from these investigative efforts provide the clues for reaching a diagnosis.
After the apocalypse not only will you lose the advanced tests and scanning equipment, but much of the medical expertise itself will also be lost. Medicine and surgery, more than many other areas covered in this book, rely heavily on implicit or tacit knowledge—something you have learned how to do but would find extremely difficult to successfully convey to someone else in just words or pictures. In Britain, it takes up to a decade of medical school and on-the-job learning in a hospital to achieve competency as a registrar doctor (the equivalent of a US fellow in a subspecialty), all of this with training and hands-on demonstrations provided by someone already proficient. If this cycle of knowledge transfer breaks with the collapse of civilization, it will be impossible to teach yourself the necessary practical skills and interpretative expertise from textbooks alone. So let’s look at the very fundamentals of medicine and surgery: if all of the specialist understanding and equipment have disappeared, how can you recover the essential knowledge and skills?
Informed diagnosis relies on a variety of investigations, but until the early nineteenth century the medical profession didn’t possess a single instrument that allowed doctors to assess the internal state of the body; they had to rely on visible external signs, prodding with their fingertips for enlarged organs or masses, or tapping the abdomen and thorax for the differing sounds of underlying air or fluid. (This percussive technique was invented by a physician who was the son of an innkeeper; he is said to have gotten the idea from the method of judging the remaining level of wine in a cask.)
The tool that transformed medical diagnosis is astonishingly simple. The stethoscope need be no more than a hollow wooden tube held to the ear and pushed against the patient’s body, or even a rolled-up bundle of papers, which was how the tool was invented in 1816. René Laennec was uneasy about his ear and cheek touching the chest of a particularly buxom woman and so he improvised, realizing that the makeshift tube was not only perfectly adequate for transferring the sounds of the heart, but actually served to amplify them. A stethoscope can reveal the internal sounds of the body: not only anomalies in the sound of the heartbeat, but the wheeziness or crackling indicative of lung disease, the silence at the point of an obstructed bowel, or the faint heartbeat of a fetus.
Before the end of the nineteenth century, not only the stethoscope, but compact thermometers able to measure body temperature and inflatable cuffs linked to a gauge for measuring blood pressure, were standard items in a doctor’s kit bag. The clinical thermometer can reveal a fever indicative of infection, and the pattern shown by regular readings plotted on a temperature chart can even be suggestive of certain diseases. But the stethoscope will remain your key tool for assessing the internal condition of the human body until the post-apocalyptic civilization has relearned how to generate a very high-energy form of light. Here’s how.
In the closing decades of the nineteenth century, two curious emanations were discovered. The first of these was found to stream off the negative electrode when a high voltage was applied between two metal plates. These emissions were named cathode rays, and we now identify them as electrons: the agents of electric current in a wire that are accelerated away down the steep electric field created by the voltage. Flying electrons are rapidly absorbed by even tenuous matter like air, and so these cathode rays can travel an appreciable distance only inside a container evacuated of gas. Cathode rays could be noticed, therefore, only once scientists were able to produce effective vacuum pumps to suck out practically all the air in sealed glass canisters.
The small amount of gas left inside these early vacuum tubes produced an eerie glow as it was struck by the fast-moving electrons (an effect exploited in neon lights). The German physicist Wilhelm Röntgen wanted to exclude this light so that he could study the cathode rays penetrating through the wall of the vacuum tube, so he wrapped the tube in black cardboard. It was at this point that he noticed a fluorescent screen on the other side of the lab bench glowing a faint green. This was far too distant for the cathode rays to reach, and Röntgen nicknamed this invisible new radiation X-rays, after their mysterious nature. We now know these X-rays to be ultra-high-energy electromagnetic waves emitted when the accelerated electrons slam into the positive electrode in the vacuum tube.
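To put a rough number on “ultra-high-energy”: an electron falling through a potential difference of $V$ volts arrives carrying kinetic energy $eV$, and the most energetic X-ray photon it can emit on impact carries all of it, giving a shortest emitted wavelength (the Duane–Hunt limit) of

$$\lambda_{\min} = \frac{hc}{eV} \approx \frac{1240}{V}\ \text{nanometers},$$

so a 50,000-volt tube radiates X-rays down to about 0.025 nanometers, roughly twenty thousand times shorter in wavelength, and correspondingly more energetic, than visible light.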
To his utter astonishment, Röntgen realized that X-rays allow you to see right through solid objects, such as the contents of closed wooden boxes; most eerily of all, in 1895 he was able to use X-rays to take a photograph of the bones inside his wife’s hand. As X-rays are absorbed more easily by dense internal structures like bones than by soft tissues, the image essentially showed the shadow of her bones from energetic light shone straight through her body. X-rays are dangerous, as they are energetic enough to trigger mutations and cause cancer, and so patients should be exposed for only a short burst to capture a snapshot on photographic film, with the doctors shielded behind a lead screen. Despite these health risks, the opportunity offered by radiography to peer inside the living body in order to examine the vital organs, assess bone fractures, or locate tumors provides vastly greater capability for diagnosis than the first diagnostic tool, the stethoscope.
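The shadow picture itself follows a simple law: passing through a uniform thickness $x$ of material, the surviving X-ray intensity falls off exponentially,

$$I = I_0\, e^{-\mu x},$$

and because the attenuation coefficient $\mu$ is much larger for calcium-rich bone than for soft tissue, the bones print crisp silhouettes on the photographic film while the flesh around them barely dims the beam.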
But being able to externally sense the interior condition of the body is only half of the problem you’ll face after the Fall. It is absolutely crucial to be able to link patient examinations to an accurate understanding of how our body is actually built—to literally know ourselves inside out. So if this detailed knowledge of the intricacies of our own inner structure is lost, how could you rediscover it from scratch, and so recognize what is healthy and what is abnormal?
The internal construction of animals is familiar from butchery, but the human body has important structural differences, and so getting reacquainted with anatomy through human dissection will be imperative during the reboot. Anatomy and postmortem dissection will be crucial for the redevelopment of pathology—the understanding of the root causes of diseases. The practice of conducting a postmortem is absolutely vital in correlating the external signs and symptoms of sickness in the patient while alive with internal anatomical faults or defects that can be assessed only after death. The recognition that a particular disease is often caused by a problem in a specific organ, rather than being a systemic issue—as suggested by the premodern belief in the imbalance of bodily humors: blood, phlegm, and black and yellow bile—is pivotal to pathology, and this realization is in turn crucial for our ability to address the underlying cause of a disease, rather than simply trying to treat the symptoms that are manifested.
Once the fundamental cause has been identified, the next step is the prescription of medication or the undertaking of surgical intervention.
MEDICINES
Arriving at the correct diagnosis of a disease is useful only if you’ve already developed a set of pharmaceutical preparations that are known to be effective against particular ailments. For much of human history this has been a real stumbling block, and before the twentieth century the doctor’s medicine bag was largely ineffectual: imagine the frustration at understanding the diseases killing your patients but being powerless to stop them.
Many modern drugs and treatments derive from plants, and the traditions and folklore of herbal medicine are as old as civilization itself. Almost 2,500 years ago Hippocrates—renowned for the Hippocratic oath of the physician’s ethical code—recommended chewing willow to alleviate pain, and ancient Chinese herbal medicine similarly prescribes willow bark to control fever. The essential oil extracted from lavender has antiseptic and anti-inflammatory properties and is therefore useful as an external balm for cuts and bruises, whereas tea tree oil has been used traditionally for its antiseptic and antifungal action. Digitalin is extracted from foxgloves and can slow down the heart rate of those suffering from a fast irregular pulse, while the bark of the cinchona tree contains the antimalarial drug quinine, which gives tonic water its characteristic bitter flavor (and led to the British colonial penchant for sipping gin and tonics).
One particular class of drugs we’ll linger on for a moment is those used for pain relief, or analgesia. While these pharmaceuticals are palliative, targeting the symptom rather than the cause, they are the most commonly taken drugs in the world, for everything from everyday discomforts like headache to more serious injuries. Analgesia is an essential prerequisite for the redevelopment of surgery. Limited pain relief can be achieved by chewing willow bark, and topical analgesia, suitable for superficial injuries or minor surgical procedures such as lancing boils, is provided by chili peppers. The capsaicin molecule that gives chilies their illusory fiery burn in the mouth is known as a counterirritant, and, like the contrary cooling effect of menthol from mint plants, can be rubbed onto the skin to mask pain signals (both capsaicin and menthol are used in muscle-easing heat patches or ointments like Tiger Balm).
But the universal painkiller, used since antiquity, is provided by the poppy. Opium is the name of the milky pink sap that can be harvested from the poppy after it has flowered, and it has considerable pain-relieving qualities. Traditionally, opium is collected daily by making several shallow slices in the swollen, golf-ball-size seedpod of the poppy plant, allowing the sap to seep out and dry to a black latex encrustation that is scraped off the following morning. Morphine and codeine are the major narcotics in opium: the dried sap can contain up to 20 percent morphine. These opiates are far more soluble in ethanol than in water, and a potent (but addictive) tincture of opium, laudanum, is made by dissolving powdered opium in alcohol. A much less labor-intensive system developed in the 1930s uses several washes of water (often slightly acidic to improve solubility) to extract opiates from the poppy after the plant has been reaped, threshed, and winnowed—the poppy seeds kept for eating or replanting, just as you would do with cereals. In fact, 90 percent of medical opiates today are still harvested from poppy straw.
The risk, though, in taking crude decoctions or tinctures of plant extracts is that, without the capability for chemical analysis, you don’t know the actual concentration of the active ingredient, and taking too much can be dangerous (particularly if, like digitalin, it interferes with your heart rate). You may have a narrow window of opportunity in the dosage: trying to hit the sweet spot of administering enough to be effective, but not so much as to become lethal.
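Pharmacologists capture this window in a single ratio, the therapeutic index:

$$\mathrm{TI} = \frac{\mathrm{LD}_{50}}{\mathrm{ED}_{50}},$$

the dose lethal to half of subjects divided by the dose effective in half. With illustrative numbers: a drug that works at 10 milligrams but kills at 1,000 has a forgiving index of 100, while an index of only 2 or 3, typical of digitalis-type heart drugs, leaves almost no margin for the guesswork of a crude plant decoction.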
For the vast majority of serious and ultimately fatal conditions, from pervasive infection and septicemia to cancer, no effective treatment at all is available from simple herbal concoctions. The key enabling technology that started the phenomenal medical revolution after the Second World War was prowess in organic chemistry for isolating and manipulating pharmaceutical compounds. Pharmaceuticals today are available in precisely known concentrations, and either have been synthesized artificially, or a plant extract has been modified using organic chemistry to increase the potency or decrease the side effects of the compound. For example, a relatively simple chemical modification is made to the active ingredient in willow bark, salicylic acid, that allows it to retain its efficacy as a fever-reducing painkiller but reduces the side effect of stomach irritation. The result is aspirin, the most widely used drug in history.
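That modification is an acetylation: the irritant hydroxyl group of salicylic acid is capped with an acetyl group, classically by warming it with acetic anhydride,

$$\mathrm{C_7H_6O_3 + (CH_3CO)_2O \rightarrow C_9H_8O_4 + CH_3COOH},$$

giving acetylsalicylic acid, aspirin, with acetic acid (the sour component of vinegar) as the by-product.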
The key practice in evidence-based medicine that you’ll need to return to after the Fall is running a fair test to see if a particular compound or treatment actually works*—or whether it should be thrown out alongside useless snake oils, witch-doctor potions, and homeopathic concoctions. Ideally, objectively testing a treatment’s effectiveness in a clinical trial involves a meaningfully large number of patients split into two groups: one to receive the putative therapy and the other, the control group that forms the baseline for comparison, to be given a placebo or the current best drug. The two pillars of successful clinical trials are the random assignment of test subjects to the groups so as to remove bias, and the use of “double blinding”: neither patients nor practitioners know who has been assigned to which group until the results are analyzed. During the redevelopment of medical science after a Fall there will be no shortcuts for meticulous, methodical work, which may also call for disagreeable practices like animal testing for the sake of easing human suffering.
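As a sketch of how little machinery a fair test really needs, here is the assignment step of a randomized, double-blind trial expressed in a few lines of Python; the patient names and group labels are illustrative, and the same bookkeeping could equally be done with shuffled cards and sealed envelopes:

```python
import random

def assign_trial(patient_ids, seed=42):
    """Randomly split patients into treatment and control groups,
    issuing only blinded subject codes so that neither patients nor
    practitioners know who is receiving the real therapy."""
    rng = random.Random(seed)      # fixed seed makes the split reproducible
    ids = list(patient_ids)
    rng.shuffle(ids)               # random assignment removes selection bias
    treatment = set(ids[:len(ids) // 2])

    # Codes are issued in alphabetical order so their numbering leaks
    # nothing about group membership; the unblinding table stays sealed
    # until all the results are in.
    codes = {pid: f"subject-{i:03d}" for i, pid in enumerate(sorted(ids))}
    unblinding = {codes[pid]: ("treatment" if pid in treatment else "control")
                  for pid in ids}
    return codes, unblinding

codes, unblinding = assign_trial(["anna", "bert", "cara", "dev"])
print(codes)  # handed to staff and patients; `unblinding` is opened only at analysis
```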
SURGERY
For some conditions, the best course of action is surgery: to physically correct or remove the faulty or troublesome component of the body’s machinery. But before you can even think about attempting surgery (with a reliable chance of patient survival)—intentionally creating a wound to open the body, having a look inside, and tinkering with the workings inside like a car mechanic—there are several prerequisites that a post-apocalyptic society will need to develop. These are the three As: anatomy, asepsis, anesthesia.
We have already seen that you need to know how our body is built so that you can tell a diseased organ from a healthy one. And without a detailed grasp of anatomy, your surgeons are literally poking around in the dark. You need to have a comprehensive map of the body’s internal makeup, the normal forms and structures of each of its components; you need to understand their function and know the paths of major blood vessels and nerves so that you don’t accidentally sever them.
Asepsis is the principle of preventing microbes getting into the body in the first place during surgery, rather than trying to clean the wound later with antiseptics like iodine or ethanol solution (antiseptics are your only option for an accidental, dirty wound). To maintain aseptic conditions, scrupulously clean the operating theater and filter the air supply. The site of the operation can be cleansed with 70 percent ethanol solution before the incision is made, and the patient’s body covered with sterile drapes. Surgeons must wear clean surgical gowns and face masks, scrub their hands and forearms, and operate with heat-sterilized surgical instruments.
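The 70 percent figure is easy to hit from any stronger spirit using the dilution relation $C_1V_1 = C_2V_2$: to make one liter of 70 percent solution from a 95 percent distillate, measure out

$$V_1 = \frac{70}{95} \times 1\ \text{L} \approx 0.74\ \text{L}$$

of the spirit and top it up to the liter mark with clean, boiled water. (Around 70 percent is in fact the sweet spot: stronger solutions dehydrate the outer layer of a microbe so quickly that they shield its interior.)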
The third crucial element is anesthetics. These are drugs that don’t cure disease but do something just as valuable: they temporarily suspend all sensitivity to pain, or even induce complete unconsciousness. Without this, surgery is an abominably traumatic experience and can be attempted only as a last resort. The surgeon must work rapidly, slicing through tensed, spasming muscle as the patient writhes in agony, and only simple procedures can be considered: removal of a kidney stone or the brutish amputation of a gangrenous limb with a butcher’s saw. With an insensate patient, however, surgeons can afford to work much more slowly and carefully, and are able to risk invasive operations on the chest and abdomen, as well as exploratory surgery to see what might be the underlying causes of an ailment.
The first gas to be recognized for its anesthetic properties was nitrous oxide, or “laughing gas”: when it is inhaled at high enough doses, its exhilarating sensation can give way to true unconsciousness, suitable for surgery or dental work. Nitrous oxide is generated from the decomposition of ammonium nitrate as it is heated—be careful, though, as the compound is unstable and may explode if it gets much hotter than 240°C—and the anesthetic gas is then cooled and cleaned of impurities by bubbling through water. Ammonium nitrate can itself be produced by reacting ammonia and nitric acid (see Chapter 11). Nitrous oxide alone is good for dulling the sensation of pain, but it is not very powerful as an anesthetic. If, however, it is administered with other anesthetics, such as diethyl ether (often abbreviated to ether), it acts to potentiate them, enhancing their effectiveness. Ether can be produced by mixing ethanol with a strong acid, such as sulfuric acid, and then extracting ether from the reaction mixture by distillation. It is a reliable inhalation anesthetic, and although ether is relatively slow to work and can produce nausea, it is medically safe (although its vapor is explosive). The advantage of ether is that not only does it induce unconsciousness, but it also acts to relax muscles during surgery and provides pain relief.
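Both preparations can be summed up in a chemical equation apiece. Gentle heating decomposes ammonium nitrate into laughing gas and steam,

$$\mathrm{NH_4NO_3 \xrightarrow{\ \Delta\ } N_2O + 2\,H_2O},$$

while hot sulfuric acid stitches two ethanol molecules together into diethyl ether, releasing water:

$$\mathrm{2\,C_2H_5OH \xrightarrow{\ H_2SO_4\ } (C_2H_5)_2O + H_2O}.$$

The ether condensation is classically run at around 140°C; let the mixture get much hotter and the reaction overshoots, stripping each ethanol all the way down to ethylene gas instead.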