Redemption has been—is—everywhere, everywhen, as close at hand as the unknown girl we spot out the car window picking up a maple leaf the size of a breakfast plate, as algorithmic as the yellow-gold fall carpet in our yard that we rake into ad hoc nests into which our grandchildren, Kanye and Aliya and Dante, plunge headlong like our children, Marissa and Newton, did in their time, Peter and I in ours, our forebears in theirs, and generations to come will do until the sun becomes a black dwarf and there are no trees to reconsider their leaves, conclude that their verdancy is the price of routing all nutrients to the trunks and roots so they may survive the winter, revoke the photosynthesizing privileges of the foliage (thereby unmasking the golden, orange, and yellow hues of carotenoid pigments hidden all summer by the shocking green of chlorophyll), dispatch a chemical Dear John letter that reads “Go away!” to every leaf, each of which stoically replies with a bumpy line of scissor cells that snips the leaf from the stem until it hangs on by a few threads of xylem so fine that the next breeze launches the golden glider upward on a small, looping whirlwind, which, like a lover on the rebound, soon casts it to the ground.
Where are you going, Gertjie?
I’m going to play in the sand.
Chapter Four
This Is Your Brain on the Fritz
FAMILY LEGEND HAS IT that on a winter day when it was pouring with rain in Cape Town, my parents sent me into our neighborhood café with the task of buying a loaf of bread. “One loaf of brown bread, please,” I was reported to have said. From the shop door Pappa beamed his pride over the cocoon in his arms that was my baby sister Lana, and Mamma formed the words with her lips just as we had practiced them at home. She gestured with her umbrella that I should stretch my arm toward the counter to hand over the money. Exchanging a conspiratorial smile with my parents, the café tannie handed me my purchase and the change. As I toddled toward my parents, shedding coins but keeping a determined grip on the still warm, uncut, and unwrapped loaf against my tummy, Pappa announced to the draggle of customers, in the informative tones of a National Geographic voiceover, that I would only turn two in nine weeks’ time.
Given the nature of legend, it is of course possible that my parents inadvertently misremembered the downpour as winter rain rather than a summer shower, which would have made the event nine weeks after my second birthday, or maybe it hadn’t been raining at all, but my hair was still wet and seaweed-smelling after a day on the beach, or maybe it was only my father who took me on the excursion while my mother stayed at home with the new baby. Whatever the eyewitness circumstances, the narrative drive of this family fable is clearly my parents’ desire to make known for posterity what they perceived as their first child’s extraordinary cleverness.
This and other similar stories, more familiar from their frequent retellings than my actual memories of the experiences themselves, formed the kernel of my “self” during my formative years. As I grew up, I mercifully got over the notion that a strong intellect in my case—or in others’, exceptional beauty, or athletic prowess, or musical ability—should constitute the core of any respectable identity. For who or what would you be if accident or injury, moth or rust, should destroy that God-given gift? What in you would be left to declare, “I am”?
However, early childhood impressions, particularly those laid down before the acquisition of language, are—as Freud rightly tells us—impossible to erase. Our “mature” conscious mind may rationalize them away, but our unconscious clings to them like Freud’s archetypal mothers to their penis envy. Accordingly, when I received the news that microvascular disease was clogging my brain and gumming up my memory, my first impulse was to use that old standby, my supposedly superior brain—or what was now left of it—to find out what was going on inside my skull. I plunged into a project to find out as much as I could about the human brain.
Note to self: Remember that the poet Virgil who guided Dante through the nine circles of hell did not thereby gain heaven for himself.
Note to readers: If your desire is fixed to follow me, see what is the fate of our cause: All the gods on whom this empire was stayed have gone forth, leaving neither shrine nor altar; the city you aid is in flames. Let us rush into the battle’s midst, and die!
PS from Virgil: The only hope of the vanquished is not to hope.
I began my brain project with a refresher on what I had learned in high school biology. In hindsight, “refresher” was a hubristic understatement—my memory, which feels so clear when recalling buying a loaf of bread sixty-five years ago, needed serious jogging to remember the p’s and q’s of the brain, from pineal gland to quadrigemina. And then there was the whole new alphabet of neuroscience, a branch of brain studies that was only just coming into existence during my high school years in the 1960s. Since science stands or falls by the details, bear with me as I share the know-how that helped me understand my unraveling mind.
The spinal cord, along with the brain, makes up the central nervous system of humans. It partners with the brain to send information around the body as electrical signals. It tells the body what the brain wants and the brain what the body is feeling. In evolutionary terms, we can think of the spinal cord as a proto-brain—it first developed in worms. As more complex organisms evolved, the spinal cord got longer and thicker, eventually needing protection in the form of the spinal column, a tunnel made of vertebrae. All vertebrate animals have spinal cords, from jawless fish to birds and mammals. As animal complexity increased, more nervous system components were needed to send information around larger and more specialized bodies. The resulting new brainware was packed on top of the spinal cord, forming increasingly complex brains. By the time the subspecies Homo sapiens sapiens emerged, our brain had grown to almost forty times the weight of the spinal cord, that is, almost three pounds to the spinal cord’s one ounce.
The modern human brain consists of three major structures, which, in the order they evolved, are the brain stem, the cerebellum, and the cerebrum. To imagine how this complex organ came about through additions to the spinal cord, let’s mimic the evolutionary steps in modeling clay: First, roll a chunk of red clay between your palms into a rope for the spinal cord. Next, make a green knob and stick it on top of the spinal cord for the brain stem. An orange wad of clay, shaped to look like a peach, becomes the cerebellum, sticking out at right angles from the brain stem. Then shape a large dollop of yellow clay into a mushroom-style umbrella big enough to top the brain stem and cerebellum like a helmet. This is the cerebrum. Last, let’s make the newest part of the cerebrum, a thin layer of cells known as the neocortex. Pinch off a plum-sized piece of purple clay and roll it out round and very thin, as you would for a pie crust, until it is almost twice as big as the top surface of the yellow cerebrum-mushroom. Now fit it over the cerebrum, crinkling it up to fit like a shower cap.
Voilà! This is your brain. Sort of.
Now let’s take an earth-to-moon step from kindergarten to anatomy 101.
The oldest structure in the human brain, the brain stem, first appeared in our vertebrate ancestors some 450 million years ago. It is located between the cerebellum and the larynx, right behind where we swallow—think of it as the extension of the nerves in your spinal cord into your brain. Given its evolutionary origins, the brain stem is also known as the reptilian brain. Carl Sagan wasn’t kidding when he said, “Deep inside the skull of every one of us there is something like a brain of a crocodile.” But let’s not dis our “crocodile”: it forms the link between the spinal cord (our worm brain), the cerebellum (our mammalian brain), and the cerebrum (our Homo sapiens sapiens brain). The reptilian brain controls the heart, lungs, and other vital organs. It enables our quick, involuntary reaction to immediate danger and contributes to the control of breathing, sleeping, and circulation.
The second oldest brain structure, the cerebellum or mammalian brain, sits tucked in behind the brain stem. (Remember our orange clay peach?) First evolved in mammals some 200 million years ago, roughly in parallel with the dinosaurs, the cerebellum coordinates movement, balance, and motor learning. For example, it combines sensory input from the inner ear and muscles to provide accurate control of one’s position and movement. The physical slowing associated with depression illustrates how deficits in the cerebellum result in malfunctions we can observe in people’s movements.
The newest part of the human brain is the cerebrum or cerebral cortex, which first appeared about 2.5 million years ago when the genus Homo evolved. The cerebral cortex (cortex is the Latin word for “bark,” as in tree bark—ah, our purple shower cap!) is associated with higher brain functions such as complex thought, perception, and action. The neocortex is the outermost layer of the cerebrum, and it’s what doctors see when they crack open someone’s skull to perform surgery or autopsy. Found only in mammals, the neocortex consists of a six-layered structure of neurons, or brain cells, that makes up the bulk of the cerebral cortex. It varies in thickness from two to four millimeters, and instead of lying flat as a single smooth sheet, the neocortex folds over on itself, again and again, forming wrinkles. Its surface area is much larger than the surface of the skull, which enables the brain to pack in millions of additional neurons while retaining a head volume that fits the size of the human body. For now, at least, we have not evolved into creatures of science fiction, with huge dome-like heads and withered bodies. Instead—courtesy of the cortex’s wrinkles—we grow from bobble-headed babies to adults capable of supporting and controlling our heads, even as our Homo sapiens sapiens brain enables intellectual functioning on a level never before achieved.
The highly wrinkled neocortex
The more recently a particular mammal species evolved, the more wrinkled its brain appears. In us latecomers, then, our voluminous neocortex has many fissures, grooves, and rounded prominences that bud on the hemisphere surfaces like cauliflower florets.
Comparison of the brain surfaces of various species (human, monkey, cat, rat, and frog). Note that the frog has no cerebral cortex at all.
The cortex consists of neurons and supporting cells known as glia. A neuron is an excitable cell that processes and transmits information through electrical and chemical signals. A typical neuron consists of a cell body, dendrites, and an axon. Dendrites are thin structures that arise from the cell body, often extending for hundreds of micrometers and branching multiple times, giving rise to a complex dendritic tree. Their job is to receive electrical messages from the axons of other neurons. An axon is a long threadlike extension of the cell body that conducts impulses away from the cell. In humans, it can extend as far as one meter.
In a living brain, the cell bodies of cortical neurons appear gray. Accordingly, clusters of these gray cell bodies on the surface of the cortex or inside it are known as gray matter. The brain tissue underneath the cortex looks white, since it mainly consists of axons protected with a white myelin sheath; bundles of these myelinated axons are therefore known as white matter. Islands of gray matter, consisting of neuron cell bodies rather than myelinated axons, also appear deep within the white matter. The lesions at the root of my dementia are (so far) located in the white matter of my frontal lobe.
The ability to pack more brain matter into the available space in the skull has allowed primates, and especially humans, to evolve new functional areas of the neocortex that support enhanced cognitive skills such as working memory, speech, and language. Working memory is the system that actively holds information in the mind for tasks such as reasoning and comprehension. To underwrite these functions, working memory must actively manipulate information rather than merely store it. Why am I telling you all this? This is where dementia comes into play. My own deficits in working memory are the bane of my life, causing my daily failure at tasks as simple as making a phone call.
[Typical progression of a Gerda phone call: (1) Find the phone number (I no longer remember any phone numbers, except now and then my own); is it in my electronic or paper address book? (2) If electronic, open email to search contacts. (3) Forget why email is open, start catching up with unanswered messages. (4) Eventually remember original intention (sometimes). (5) Retrieve number, go to the phone, hear the message beep. (6) Retrieve messages, return urgent calls. (7) Start cleaning the counter where the phone is plugged in. (8) Damn! Who was I going to call again?]
Damage to gray and white matter can cause profound impairment. One root cause of various kinds of brain damage is reduced oxygen flow, whether to the entire brain or to a particular area. Any extended oxygen deprivation, such as that caused by trauma, intoxication, stroke, or carbon monoxide poisoning, can be debilitating, even when the person survives. Deprived of oxygen, neurons will be unable to undergo cell division to produce new neurons to replace the damaged ones. In vascular dementia, the oxygen deficiency results from the blockage of microscopic blood vessels.
Until the 1990s, one of the commandments of neuroscience went “Thou shalt not grow new neurons once thou art born.” That changed with the discovery of neurogenesis, or the ability of adult human brains to produce brand new cells. According to our current understanding, each of us grows millions of new neurons during our lifetime, “even as we are elderly and dying of cancer.” In order to understand the limits of neurogenesis, however, one has to know that most of the new neurons in our daily dose do not live very long. To become part of the working brain, a new neuron needs not only support from neighboring glial cells and nutrients from blood, but also, and more importantly, connections with other neurons. Without these connections, neurons wither and die.
During puberty, millions of new neurons arrive in our brains and new connections are formed. Their growth is triggered by sex hormones, and the job of the new arrivals is to prepare the individual for the adult world. The explosion of new neuron growth occurs in the frontal lobe, whose main function is supervising the rear brain. In other words, the neurons that proliferate in the frontal lobe during puberty have the job of developing the so-called executive function, or our capacity to “make social judgments, weigh alternatives, plan for the future, and hold our behavior in check.”
On a biological level, the acquisition of neurons involves molecular rearrangements that enable the formation of miles of new neuronal connections. When we exit puberty, our brains will never again form new connections at the rate and to the extent they do while we are teenagers. Our transition to adulthood is heavily governed by the knowledge and skills the prepubescent child has already stored up and connected to related areas of control. If the prepubescent child has missed important peak phases of development—like obtaining language skills, forming social bonds, or acquiring basic social skills—the frontal lobes face an almost impossible task in rectifying those deficiencies. As adults we can only fully develop our executive function by building on connections we made in childhood and during adolescence. By the age of twenty-five, the executive function is pretty well set.
New research confirms what a lot of parents know intuitively: the teenage brain functions very differently from that of a mature adult. The prefrontal cortex—the very part of the brain that houses the executive function, including impulse control—is itself in a massive state of flux. (From a neurological perspective, trying teenagers as adults in our legal system makes little sense.)
The implication of all this for adults is a daunting fact: damage to the nervous system in adulthood is irreversible for most practical purposes. Our brains just can’t restructure brain matter as quickly or widely as they used to. But all may not yet be lost: research suggests that the most active area of adult neurogenesis is the hippocampus, an area in the medial temporal lobe that distinguishes new events, places, and stimuli from previously experienced ones and appropriately indexes and stores the new ones. Some scientists have proposed that those thousands of new cells produced in the hippocampus each day could be put to marvelous use in restoring the rapidly fading capabilities in my and others’ executive functions, but research has not yet shown how, or even if, they can indeed do so. In a radio interview, Michael Gazzaniga, one of the founders of contemporary neuroscience, recently stated that the idea of brain plasticity might have been “oversold.”
Dementia Field Notes
6-30-2013
Since our neighbor Bob’s stroke during the summer of 2011, his dementia has been getting steadily worse via a series of little strokes, some so small Diane doesn’t notice—but his MRI sees everything. Last week he had a “crazy” episode. While I was outside at the mailbox, he frantically waved me over. He said he was very mad at “her,” pointing to Diane, who was peeking out the front door. By then, Diane said, Bob had already been shouting abuse at her for over an hour. He was furious because he thought the paper had not been delivered. She had already finished reading it and had put it in the recycling bin. Even after she fetched the paper, he was not appeased. He told her to call the newspaper office to complain. I suggested that Diane go back inside, ostensibly to call the paper but really to call the doctor.
Bob and Diane Bond, my across-the-street neighbors. Peter took the photo during a friends-and-neighbors summer party at our house on July 27, 2009, two years before Bob suffered the stroke that triggered his dementia.
With Diane back inside, Bob calmed down a little bit. He let me hug him. I could feel him relax against my body. I got him to sit down so he could show me where Mary’s dog had bitten him a few days earlier. He pulled up his pants legs and pointed out each bandage. Then he remembered being mad, and told the story again. The only thing I understood was this remarkable sentence: “I would prefer that that woman should die.” He kept shouting his curse, so I walked him to our house to show Peter his dog bite. Diane returned with a dose of his “mood medicine,” which the doctor had told her to double up. Bob refused it. Under our combined cajoling, he reluctantly swallowed it—and slept through most of the night and the next day.