Unthinkable


by Helen Thomson




  Dedication

  For Mum

  Contents

  Cover

  Title Page

  Dedication

  Introduction—The Strange Life of the Brain

  Bob—Never Forgetting a Moment

  Sharon—Being Permanently Lost

  Rubén—Seeing Auras

  Tommy—Switching Personalities

  Sylvia—An Endless Hallucination

  Matar—Turning into a Tiger

  Louise—Becoming Unreal

  Graham—Waking Up Dead

  Joel—Feeling Other People’s Pain

  Conclusion—Nothing Is Unthinkable

  Acknowledgments

  Notes and Sources

  Index

  About the Author

  Copyright

  About the Publisher

  Introduction

  The Strange Life of the Brain

  It’s not something you easily forget, the first time you see a human head sitting upon a table. The worst part is the smell. The unforgettable stench of formaldehyde, the chemical fixative in which bits of the body are hardened and preserved. It gets up into your nostrils and really sticks around.

  It wasn’t the only head in the room; there were six, all severed at slightly different angles. This particular head had been lopped off just below the chin, and then sliced in half down the center of its face. It had belonged to an elderly gentleman—the deep wrinkles etched into his forehead held whispers of a long life. As I slowly circled the table, I saw a few gray hairs sticking out of a generous nose, an unruly eyebrow and a tiny purple bruise just above the cheekbone. And suddenly there it was, sitting in the middle of a thick bony skull—a human brain.

  It had a grayish yellow tinge and a texture that conjured up thoughts of a shiny panna cotta. The outermost layer swirled around like a walnut. There were lumps and hollows, strands that looked like chewed-up chicken and a region at the back resembling a shriveled cauliflower. I wanted to run my finger over its silky contours, but there was strictly no touching. I satisfied myself by placing my head just inches from his, wondering what life it had once held. I called him Clive.

  I HAVE ALWAYS BEEN interested in people’s lives. Perhaps this is why I was compelled to study the human brain at university. The two are, after all, inextricably linked. Everything that we feel, every story we experience or tell, we owe to that three-pound lump of mush in our heads.

  That may seem obvious today, but it wasn’t always so clear. The first mention of the brain was by the ancient Egyptians in a surgical scroll called the Edwin Smith Papyrus. They wrote that the way to identify the brain was to “probe a head wound and see whether it throbs and flutters under your fingertips.”1 But seemingly it was an organ of little interest. If a head wound had occurred, they would pour oil on it and take the patient’s pulse “to measure his heart . . . in order to learn the knowledge that comes from it.” For it was the heart, not the brain, that was believed to house our mind at the time. After death, the heart was carefully preserved inside the body to allow safe passage into the afterlife, while the brain was fished out piece by piece through the nose.

  It was only around 300 BC, when Plato began grappling with the idea that the brain was the seat of the immortal human soul, that it gained a greater significance in medical thought. But although his teachings would later influence many scholars, his own peers were not convinced. Even Plato’s best student, Aristotle, continued to argue that the mind was contained in the heart. Physicians at the time were reluctant to open human cadavers, fearful of preventing their owners’ souls from reaching the afterlife. So his arguments were largely based on dissections of the animal kingdom, which revealed that many creatures had no visible brain at all. How then could it have any vital role?

  Aristotle declared that the heart carried out the responsibilities of the rational soul, providing life to the rest of the body. The brain was there simply as a cooling system, tempering “the heat and seething” of the heart.2

  (Later we’ll find out that they may both have been correct—that you cannot think or feel without both your heart and brain communicating with each other.)

  The Greek anatomists Herophilus and Erasistratus finally got the chance to dissect the human brain in 322 BC. Less concerned with identifying the soul, they concentrated on basic physiology, discovering the network of fibers that run from the brain to the spine and out around the body—what we now refer to as the nervous system.

  It was down in the gladiator stadium, however, where the brain really came into its own. The philosopher, physician and writer Claudius Galen was forbidden by Roman law from dissecting the human brain for himself, so instead he would head to the dusty arena, where he could gain a snapshot of the brain’s anatomy by treating bloodied gladiators whose skulls had been torn apart in combat.

  But it was his experiments on live, squealing pigs that caused the biggest sensation. In front of a large crowd, he would slice through the laryngeal nerve connecting the pig’s voice box to its brain, and watch as the pig fell silent. The crowds would gasp—Galen had offered the first public demonstration that the mind, not the heart, controlled behavior.

  Galen also discovered four cavities within the human brain, later called ventricles. We now know that ventricles are spaces containing fluid that protects the brain against physical knocks and disease, but the view that prevailed from Galen onward was that all aspects of the immortal soul floated around in these ventricles. The soul then passed into “animal spirits,” which were pumped around the body. This explanation particularly suited those high up in the Christian church, who were growing increasingly concerned about the idea that the brain could provide a physical basis for the soul. How could something be immortal if it was present in such frail flesh? It was much more palatable to place our soul in these “empty” spaces instead.

  GALEN’S THEORIES OF THE BRAIN reigned for fifteen centuries, and religion continued to influence those who built upon his ideas. René Descartes, for instance, famously declared that the mind and the body were separate—what is now known as dualism. The mind was immaterial and did not follow the laws of physics. Instead, he said, it did its bidding via the pineal gland, a small pine-nut-shaped region in the center of the brain. The pineal gland would move, letting out the particular animal spirit required to carry out the soul’s needs. His purpose in showing this distinction was to rebut those “irreligious people” who would not believe in the soul’s immortality without a scientific demonstration of it.

  But it was in the dirty, smoke-filled streets of seventeenth-century Oxford where things really started to get interesting. Down in the bowels of the city’s university, a resourceful young physician called Thomas Willis was sharpening his scalpel.

  In front of a large audience of anatomists, philosophers and interested public, he would carve up the human body and brain, demonstrating its intricate anatomy to anyone who cared to watch. He had been given permission to do so by King Charles I, who allowed him to dissect any criminal sentenced to death within the city. It was thanks to this that he created meticulous illustrations of the human brain, and was said to have become “addicted . . . to the opening of heads.”3

  I mention Willis for it was he who really began to cement the idea that our human identity was connected to the brain. He started to match the altered behavior he observed during his patients’ lives to deformities he discovered during autopsy. For instance, he noted that people who had pains in the back of their head, near to an area of the brain called the cerebellum, also had pain in their heart. To prove that the two were linked, Willis opened up a live dog and clamped the nerves running between the two—the dog’s heart stopped and the animal died almost immediately. Willis went on to examine how the brain’s chemistry might produce other aspects of our lives: dreams, imagination and memories. It was a project that he called “neurologie.”

  In the nineteenth century, the German anatomist Franz Joseph Gall pulled us closer still to our modern understanding of the brain by advocating the idea of localization. The brain, he said, was composed of specific compartments, each responsible for a fundamental faculty or tendency, including a talent for poetry and an instinct for murder. He also thought that the shape of the skull could reveal personality. Gall had a friend who had big bulging eyes, and because his friend also had a fantastic memory and was great with languages, he believed that the brain regions responsible for these abilities must be located behind the eyes, and had grown so large that they were pushing the eyeballs outward. Despite phrenology, as this skull-reading practice became known, later being discredited, Gall’s idea of the brain being made up of discrete regions was prescient—in some cases he was even correct in pinpointing their responsibilities. His “organ of mirthfulness,” for instance, was placed toward the front of the head, just above the eyes. In later years, neurologists would come to stimulate this area and in doing so make a patient burst out laughing.

  Gall’s observations ushered in a new age of the brain—one that separated itself from the philosophy-driven science of prior centuries. Later, the acceptance of atoms and electricity allowed us finally to bid farewell to the animal spirits of the past. Nerves were no longer hollow conduits through which the soul’s desires were driven, but cells that crackled with electrical activity.

  Although scientists in the nineteenth century focused on using electrical stimulation to identify which bits of the brain carried out what functions (no doubt spurred on by the fact that they got to name the regions after themselves), those of the mid- to late-twentieth century placed more emphasis on the ways in which these areas communicate with each other. They discovered that communication between different regions of the brain was more important in bringing about complex behavior than the action of any one region alone. Functional MRI, EEG and CAT scans allowed us to view the brain in intimate detail, even examine its activity while hard at work.

  Through these tools, we now know that there are 180 distinct regions that lie within that three-pound lump of tissue that throbs and flutters within our skulls. And back in the anatomy room at the University of Bristol, I was tasked with gaining an intimate knowledge of each one.

  AS I STARED AT CLIVE, I could easily spot the most recognizable region of the human brain—the cerebral cortex. This forms the outside shell and is divided into two almost identical hemispheres. We tend to carve up each side of the cortex into four lobes, which together are responsible for all our most impressive mental functions. If you touch your forehead, the lobe closest to your finger is the frontal lobe, which allows us to make decisions, controls our emotions and helps us understand the actions of others. It gives us all sorts of aspects of our personality: our ambition, our foresight and our moral standards. If you were then to trace your finger around either side of your head toward your ear, you would find the temporal lobe, which helps us understand the meaning of words and speech and gives us the ability to recognize people’s faces. Run your finger up toward the crown of your head and you’ll reach the parietal lobe, which is involved in many of our senses, as well as certain aspects of language. Low down toward the nape of the neck is the occipital lobe, whose primary concern is vision.

  Hanging off the back of the brain we have a second “little brain,” that distinctive cauliflower-shaped mass. This is called the cerebellum and it is vital for our balance, movement and posture. Finally, if you were to gently pry open the two hemispheres (a bit like pulling apart a peach to reveal the stone), you would find the brain stem, the area that controls each breath and every heartbeat, as well as the thalamus, which acts as a grand central station, relaying information back and forth between all the other regions.

  The brain is full of cells called neurons, far too tiny to see with the naked eye. These cells act like wires in an old-fashioned telephone system, passing messages from one side of the brain to the other in the form of electrical impulses. Neurons branch out like twigs on a tree, each forming connections with its neighbors. There are so many of these connections that if you were to count one every second, it would take you three million years to finish.

  We now know that the mind arises from the precise physical state of these neurons at any one moment. It is from this chaotic activity that our emotions appear, our personalities are formed and our imaginations are stirred. It is arguably one of the most impressive and complex phenomena known to man.

  So it’s not surprising that sometimes it all goes wrong.

  JACK AND BEVERLY WILGUS, vintage-photography enthusiasts, don’t recall how they came by the nineteenth-century image of a handsome yet disfigured man. They called him “the Whaler” because they thought the pole he held in his hand was part of a harpoon. His left eye was closed, so they invented an encounter with an angry whale that left him with one eye stitched shut. Later, they discovered that it wasn’t a harpoon but an iron rod, and that the photo was the only one known of a man called Phineas Gage.

  In 1848, the twenty-five-year-old Gage was working on a railroad bed when he was distracted by some activity behind him. As he turned his head, the large rod he was using to pack powder explosives struck a rock, causing a spark that ignited the powder. The rod flew up through his jaw, traveled behind his eye, made its way through the left-hand side of his brain and shot out the other side. Despite his somewhat miraculous survival, Gage was never the same again. The once jovial, kind young man became aggressive, rude and prone to swearing at the most inappropriate times.

  As a toddler, Alonzo Clemons also suffered a traumatic head injury, after falling onto the bathroom floor. Left with severe learning difficulties and a low IQ, he was unable to read or write. Yet from that day on he showed an incredible ability to sculpt. He would use whatever materials he could get his hands on—Play-Doh, soap, tar—to mold a perfect image of any animal after the briefest of glances. His condition was diagnosed as acquired savant syndrome, a rare and complex disorder in which damage to the brain appears to increase people’s talent for art, memory or music.

  SM, as she is known to the scientific world, has been held at gunpoint and twice threatened with a knife. Yet she has never experienced an ounce of fear. In fact, she is physically incapable of such emotion. An unusual condition called Urbach-Wiethe disease has slowly calcified her amygdalae, two almond-shaped structures deep in the center of the brain that are responsible for the human fear response. Without fear, her innate curiosity sees her approach venomous spiders without a second’s thought. She talks to muggers with little regard for her own safety. When she comes across deadly snakes in her garden, she picks them up and throws them away.

  By the end of my degree, it had become clear to me that unfortunate accidents, maverick surgeries, disease and genetic mutations are often the reason we discover how different bits of the brain work. Gage showed us that our personalities were intimately tied up in the front regions of the brain. Studies on autistic savants like Clemons have propelled our understanding of creativity. Even today, scientists continue to try to scare SM, in the hope that they’ll have a better understanding of how to treat those who fear too much. I was enchanted by this concept: The strangest, most unique brains are often those that teach us the most about our own.

  OF COURSE, NOT SO LONG AGO, having an unusual brain would have seen you carted off to an asylum. “Mental illness” is a term that has been in use only for the past two hundred years; prior to that, any strange behavior would have been considered madness, and blamed on anything from curses to demons to an imbalance of humors in the body.4 If you lived in England and were suffering from such madness, you might have found yourself in Bethlem Hospital, popularly known as Bedlam. In his book This Way Madness Lies, Mike Jay refers to Bethlem as the stereotypical eighteenth-century madhouse, later a nineteenth-century lunatic asylum, and now a model example of a twenty-first-century psychiatric hospital.5

  The different incarnations of the hospital reflect how society has undergone a radical transformation in its treatment of the strange brain. When Bethlem was first founded, it specialized in keeping off the street those referred to as “lunaticke.” Its guests were violent or delusional, had lost their memory, speech or reason. They were locked up among vagrants, beggars and petty criminals.

  Patients were given general treatments aimed at restoring a healthy constitution. These included bloodletting, cold showers and emetics that made them vomit up anything that might be blocking their digestion.

  It was the madness of King George III that prompted a shift in this attitude. George had been taken ill with a stomach bug but soon started foaming at the mouth and showing signs of insanity. Clergyman Francis Willis was called; he had a formidable reputation for curing such illness. His approach was straightforward: he put George to work in the fields, dressed him well, made him exercise and encouraged “good cheer.” Over three months George’s mental health improved alongside his physical symptoms. The idea that madness was something that could be corrected began to percolate within the medical community.

  Through the nineteenth century, asylums progressed alongside increasingly rational explanations of how the mind worked. There were still a few bumps in the road—straitjackets were a common sight and many therapies would be considered barbaric by today’s standards—but doctors also began to think about how the wider family might help their patients, how interaction with the outside world could be established and what drugs might help ease pain and subdue anxieties. In the early twentieth century “insanity” was rebranded “mental disease,” and physicians began to conceive of a biological basis for disorders of the mind. Just as Thomas Willis predicted, they were able to look into the brain and start to pinpoint the exact changes that correspond to unusual behaviors and perceptions.

  Today we understand that mental illness, or in fact any mental anomaly, can be the result of small malfunctions in electrical activity, hormonal imbalances, lesions, tumors or genetic mutations—some of which we can fix, some we can’t, and some that we no longer see as a problem.

 
