Thumbs, Toes, and Tears


by Chip Walter


  Or take another example: a troop comes up with separate words for “ripe apple,” “ripe plum,” and “ripe banana,” each word standing on its own. If, instead, separate words later evolved for “ripe” and for each of the fruits, then rather than three fruit-adjective words with limited uses, our predecessors would have had four words they could combine to say the same things: “ripe,” “banana,” “plum,” and “apple,” plus ripe anything else, including a “ripe old man” or a person “ripe” for the picking.

  Was the development of true language built on this scaffolding? We can only speculate, intelligently. Perhaps with the arrival of our species 195,000 years ago, the transition from gesture to the spoken word was made, and the blossoming of human speech and culture began to gather speed.22,23 Perhaps.

  But for that to happen, something else deeply bound to the emergence of language would have had to have surfaced first: an awareness of ourselves—the understanding that we existed.

  Chapter 6

  I Am Me: The Rise of Consciousness

  The first human wave was, however, a little wave, threatening to vanish. … Tremendous bodily adjustments were in process, and, in the low skull vault, a dream animal was in the process of development, a user of invisible symbols. In its beginnings, and ever more desperately, such a being walks the knife-edge of extinction.

  —Loren Eiseley, “The Angry Winter”

  The prefrontal cortex is the newest part of our brain, the part that sits right behind our foreheads. In evolutionary terms it developed at a remarkably rapid pace. Three hundred thousand years ago, when the last Homo erectus met his fate, it basically did not exist. Today every human has one, which means that the most complex part of our brains evolved and increased our overall brain size 25 to 30 percent in an evolutionary blink.

  The prefrontal cortex is where we do most of our high-end thinking. It is where we worry, symbolize, and process a sense of self and of time, where we recall complex memories and imagine events in the future that haven’t yet happened.1

  Cognitive scientist Terrence Deacon has pointed out that one of the many reasons why the prefrontal cortex is so central to the human experience is that it is deeply wired into every other area of the brain, even extremely ancient ones. This makes it a kind of general contractor, keeping the big picture in mind while staying in close touch with nitty-gritty work such as hearing, moving our limbs, and controlling our breathing. While other parts of the brain might tend to specialize in their fields, the prefrontal cortex is a generalist that plays a role in nearly all cerebral experience.

  This area also houses an ability that is far less developed in other mammals and primates, something scientists call “working memory.” We know that humans can symbolize thoughts and ideas, but working memory enables us to take a thought or a memory, set it aside, shift our attention to something else, then pick up where we left off. As I write this, I am thinking of examples I can provide that will explain what working memory is so I can get them down on paper. If the phone rings, I can set those ideas aside, answer the phone, have a conversation, and then haul the examples I was developing back out of my memory and further develop them.

  This might seem simple. After all, we all do it all the time. But it is not simple. Not only are we symbolizing, encoding, and cerebrally packaging those thoughts or experiences, but we are also then recalling them, reconnecting them with all of the thoughts with which they were previously linked, and then conceiving new ways to develop them. It is as though they were objects we have made and shaped that we literally set aside while we take another object in hand to shape, and then we connect them.
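  To make the set-aside-and-resume pattern concrete, here is a loose sketch in Python that models it as a simple stack. The task names and the stack itself are inventions for illustration, not a claim about how neurons actually store suspended thoughts.

```python
# A toy model of working memory's "set aside, switch, resume" behavior.
# The stack discipline (last shelved, first resumed) is an assumption
# made for illustration, not neuroscience.

suspended = []  # thoughts set aside, most recent on top

def interrupt(current_task, new_task):
    """Shelve the current thought and attend to the interruption."""
    suspended.append(current_task)
    print(f"Setting aside {current_task!r} to handle {new_task!r}")
    return new_task

def resume():
    """Haul the shelved thought back out and pick up where we left off."""
    task = suspended.pop()
    print(f"Resuming {task!r}")
    return task

task = "drafting examples of working memory"
task = interrupt(task, "answering the phone")  # the phone rings
# ... the conversation happens ...
task = resume()  # back to developing the examples
```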

  Several remarkable abilities flow from working memory. First, it helps us prioritize and take better charge of our lives. If we can set knowledge aside and then later recall and reuse it, it follows that we can also decide to deny ourselves something in the short term to accomplish something more important to us in the long term. We might want a second piece of pie after dinner, for example, but we might also understand that having that pie will thicken our waistline and raise our cholesterol. So for the sake of our health, pleasurable as pie would be in the short term, we forgo it. Or we might go into debt to earn a master’s degree, planning that later we will land a better-paying job that enables us to repay the debt and live a more fulfilling life in the bargain. Brain scans have shown that when we decide to delay one action in favor of inaction (in other words, when our working memory prioritizes concepts), sections of the prefrontal cortex activate.2

  The prefrontal cortex’s talent for prioritizing and inhibiting is one reason why we don’t walk into parties and begin sniffing one another the way dogs do at the park. The prefrontal cortex is subduing the part of the brain that wants to act like a dog and guiding it to do more socially acceptable things, like smiling and shaking hands.

  In addition to brain scans, we know that the forebrain plays this inhibitory role because scientists have studied people who have had this part of their neocortex injured. Neuroscientist Antonio Damasio and his colleagues at the University of Iowa, for example, have described the cases of two people whose forebrains were damaged in infancy: a man whose injury came from a tumor discovered when he was three months old, and a woman who was run over by a car when she was fifteen months old. Both survived their injuries and went on to grow up in stable homes, with educated parents and healthy siblings. Nevertheless, both eventually began to have problems. As teenagers they stole and lied and generally seemed to have lost their moral compass. Though they were often pleasant, they would do and say awful things and then show no remorse for their actions.3 Their forebrains seemed incapable of inhibiting some behaviors.

  The most celebrated case of forebrain damage is the story of Phineas Gage, a railroad foreman who was tamping an explosive charge with a metal rod in Cavendish, Vermont, in September 1848, when the charge accidentally exploded, propelling the inch-and-a-half-wide, thirteen-pound tamping iron through his left cheekbone, behind his eye, and completely through his skull, destroying the front, left side of his brain. Amazingly, Gage survived the accident and even remained conscious as his coworkers got him to his feet and then to a local doctor named John Harlow. Harlow treated him so successfully that Gage went home to New Hampshire ten weeks later.4

  Within a year, Gage actually felt well enough to work, but he was unable to land a job with his old employers, not because he was physically handicapped, but because his personality had changed. Before the accident, Gage had been one of the railroad’s best construction foremen: capable, a clever problem solver, pleasant with his workers, and blessed with a shrewd sense of business. But now he was surly and grossly profane, and he showed little respect for anyone he worked with. In fact, his fellow workers said he was “no longer Gage,” but some other man who was stubborn, capricious, and impatient.

  Lobotomies that sever prefrontal connections with the rest of the brain have sometimes had similar effects. They were performed by the thousands in the 1940s to treat everything from schizophrenia to criminal behavior. The operations often disposed of patients’ personalities along with their disabilities, and many times, rather than calming these unfortunate people down, they left them agitated and disruptive, like Phineas Gage.

  Drinking is a common disinhibitor that works against the prioritizing capabilities of the forebrain. Tests show that alcohol affects the levels of the neurotransmitters dopamine and gamma-aminobutyric acid (GABA) in the prefrontal cortex. After the first drink or two, dopamine levels rise and heighten feelings of elation and confidence, which helps transform quiet people into gregarious ones and can make the shy vivacious. GABA slows neurons from passing along signals that either excite or inhibit other neurons. Because it slows these activities, your prefrontal cortex becomes less likely to stop behavior like putting on a lampshade or dancing on the bar, which is why, after too much to drink, people sometimes do these things or get into brawls because someone looked at them the wrong way. Long-term research has found that chronic alcoholism degrades prefrontal cortical capabilities like problem-solving and prioritizing. The point is that without working memory and the ability of the prefrontal cortex to delay, inhibit, prioritize, and basically generate its own mental cues, we would not act in a way most of us call human.

  As it turns out, these same capabilities are central to our ability to symbolize and organize language. That, at least, is Terrence Deacon’s argument. The symbols we attach to our thoughts make them portable and modular so that they can be looked over and rearranged like LEGOs. But holding multiple symbols in mind at once means that they must also be compared and prioritized, for the simple reason that they cannot all be top of mind at the same time. Some have to be rapidly shuffled to the side and subordinated to others.

  All of this makes possible for us yet another ability that is extremely rare in other species. While the prefrontal cortex is busy processing sights, sounds, and smells that rattle inward along the myelin-encased highways it uses to reach into the deepest recesses of the brain, it is also manufacturing its own input, creating new symbols and new relationships between those symbols, without any stimulus from outside of our own mind. The brain is generating its own symbols.

  These form still more new patterns that we “see” before making a decision about what to do next. This is what most of us call thinking—the conscious, deliberate kind. It makes us very unlike other creatures, who, like my dog Jack, react instinctually and serially to whatever happens around them. Jack may be sleeping one minute; the next, when he sees me put on a jacket, he runs to the door to bolt outside with his nose to the wind, waiting for the next thing to react to. Jack does not juggle multiple experiences, comparing one to the other to decide whether he should run after the Frisbee, sniff the elm on the left, or urinate on the oak to the right. Jack’s mind is incapable of purposefully prioritizing, or as John Locke put it, “Beasts abstract not.” He just follows his nose and disposes of his experiences as they strike him. Deacon calls this indexical memory: a linear index of experiences or reactions that come in one lobe of the brain and out the other.

  The Brain Explained

  Of course, the brain can’t be explained—that’s what makes it so mysterious. It is a Rube Goldberg machine, a mishmash of primitive cognitive vestiges and new evolutionary additions that together can accomplish amazing things, but in mostly unknown ways. Generally we don’t look at the brain that way. We tend to think that the forces of evolution are terrifically efficient at rooting out all wastefulness to make the brain thoroughly optimized for smooth, clean operation. But the truth is that evolution feels its way toward success, tinkering and puttering until it stumbles across marvelously inventive solutions to the problems that the need for survival presents, and then shambles on. Our brain, amazing as it is, is not an efficient machine, but a maddeningly complicated organ that stubbornly resists analysis. We do know a few things, however.

  Most adult human brains weigh about three pounds, no matter the IQ of their owners. The brain has the consistency of firm gelatin, and scientists currently estimate that the most recently evolved part of it—the cerebral cortex—consists of about 30 billion neurons, each synaptically linked to a thousand other neurons.

  If we could count all of the particles in the known universe, physicists estimate we would come up with the number 10 followed by 79 zeros. But if you calculate the number of possible neural connections in the brain, you would arrive at the number 10 followed by 1 million zeros, at least. It would take us 32 million years to tick each one off. If you added up the length of myelinated nerve fibers in the average brain, it would come to about 100,000 miles. Most human brains have about 186 million more neurons residing in the left hemisphere than in the right, but considering the numbers we are dealing with, that difference is really trivial.

  But none of these calculations even begins to address the intricate structures and chemistry of the synapses, dendrites, and axons that link all of these cells to one another, or the complex interactions that take place each moment within each of the 50 or so varieties of neurons that perform specialized functions in the brain. In general, though, what neurons do best is store and move information. Each is a masterful engineering accomplishment in itself. A single neuron, for example, holds an average of 1 million sodium pumps in a space a fraction of the meager territory outlined by the period at the end of this sentence. Multiplied across 100 billion neurons, that comes to roughly 100 quadrillion sodium pumps per brain. If they didn’t exist, we couldn’t think a thought or feel a sensation, because they make it possible for each brain cell to receive and pass along impulses to the other neurons it is connected to.

  The brain generates its electrical impulses by shuttling ions back and forth like bargainers at a flea market. Add ions here and subtract them there, and the regions they move between develop positive or negative charges. When you move a muscle or see a light, neurons predisposed to react to those particular sensations activate ion channels inside themselves. Sodium is pumped through the conduits, and this positively charges a membrane in the neuron. If the sodium channels are highly activated, the electrical signal reaches a “threshold potential” and triggers a nerve impulse, something like the way a loud, insistent hotel customer gets the attention of the bell captain. The impulse is then passed on and travels to its next neuronal destination, eventually gathering up other impulses to generate a thought, a moved hand, or a crippling fear.
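  This threshold behavior can be caricatured in a few lines of code. What follows is a minimal leaky integrate-and-fire sketch in Python; every constant in it is an illustrative assumption, chosen only to show weak input leaking away while a strong burst crosses the threshold and fires.

```python
# Toy leaky integrate-and-fire neuron: a crude sketch of the
# "threshold potential" idea, not a biophysical model.
# All constants below are illustrative assumptions.

REST = -70.0       # resting membrane potential (mV)
THRESHOLD = -55.0  # "threshold potential" that triggers an impulse (mV)
LEAK = 0.1         # fraction of excess charge that leaks away each step

def run(inputs):
    v = REST
    for t, stim in enumerate(inputs):
        v += stim                # incoming sodium current nudges the voltage up
        v -= LEAK * (v - REST)   # weak input leaks away; the neuron "diffuses" it
        if v >= THRESHOLD:
            print(f"t={t}: impulse fired, passed downstream")
            v = REST             # reset after firing
        else:
            print(f"t={t}: still at rest, v={v:.1f} mV")

# Weak, scattered input never reaches threshold; a strong burst does.
run([2, 2, 2, 0, 0, 8, 8, 8])
```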

  But if the sensation doesn’t result in enough sodium to pass the impulse along, the neuron remains at rest, unaroused, and diffuses whatever sodium has been pumped its way. The sensation stops there. We don’t really know how many flags like this our senses wave at our brain because we aren’t aware of all of them. The brain filters them out, or more accurately, their weaknesses filter the sensations themselves out. Good thing, too. Otherwise we would all suffer from a monumental case of attention deficit disorder, incessantly bombarded by every sound, sight, smell, taste, feeling, and thought we experience.

  Complex as the individual machinery of each neuron is, the truly defining characteristic of brain cells is their outgoing nature. They live (and die) to be in touch with one another. As you read these words, each neuron in your head is exchanging information at the rate of 200 million operations per second. This, by the standards of the average desktop computer, is ponderous, but what neurons lack in speed they make up for in affability. On average, each of our 100 billion neurons reaches out to 1,000 others, often by long and rambling routes, which keep every sector of the brain in close touch.5

  This neuronal need to communicate turns out to be one of the reasons why our brains are so large. Though you could be forgiven for thinking that we have grown cerebrally overweight because our brains are packed cheek to jowl with brain cells—at least compared with other mammals—you would be wrong. A rat’s cortex, for example, is jammed with 100,000 neurons per cubic millimeter, whereas ours has as few as a tenth as many.6 But this shortage of neurons doesn’t make our brains simpler; it actually makes them more complex. The reason our neurons are not as densely packed is that they need more elbow room so their dendrites and axons can reach out and connect with other neurons (axons send impulses, dendrites receive them).

  The brain cells of a rat and a human are virtually identical—we pretty much do our cerebral organizing with the same cellular equipment. But if you were to remove one neuron from a rat’s brain and one from a human’s, each with all of its connections intact, and then roll the two into separate balls, the human ball would be ten times larger than the rat’s. Apparently in the human brain, as in the real world, it is all about whom you know. Amazingly, to conduct all of these chemical and electrical conversations, the brain burns about as much energy each day as a 20-watt lightbulb.

  This need neurons have to be in constant touch means that everything we do and every event we encounter is experienced as one great stream of consciousness. We have a sense that there is someone—ourselves, to be specific—who is seamlessly absorbing the world around us, and the world within us, rather than simply processing random sensations that have no connection. The source of our unique ability to be self-aware has very likely emerged from the rich chattering that our interconnected neurons constantly do. Just as a beehive emerges from the interactions of thousands of bees, or cities arise from the common needs of interacting humans and their environment, our humanness has materialized from the dense neuronal babbling of our brains.

  The shuffling and patterning of symbols, the subordination of one concept or memory in favor of another, and the ability to prioritize, says Deacon, are all crucial to language and, ultimately, speech because language is all about assembling information in organized, hierarchical ways. It is evident in nearly every sentence we utter.

  Linguists call this shifting and subordination process recursion. Recursion is the way we fold one concept inside another when we speak. It reflects the way our mind organizes symbols. And it is the source of the sorts of grammatical conundrums that drove us all crazy in middle school English class when we struggled to unravel concepts like prepositional phrases, dependent clauses, and participles. You can find simple examples in the sentences “She walked behind the building,” or “I think he thought of a great idea,” or “He realized George thought of a great idea in the morning behind the building.”
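  Because recursion is a natural fit for code, a toy example may help. The little grammar below is invented for illustration; it simply wraps one clause inside another, the same folding the sentences above perform.

```python
# A toy recursive grammar: each clause can embed another clause,
# mirroring the way speech folds one concept inside another.
# The word lists are invented for illustration.

import random

SUBJECTS = ["she", "he", "George"]
VERBS = ["thinks", "said", "realized"]
FACTS = ["the plum was ripe", "the idea was great"]

def clause(depth):
    """Base case: a simple fact. Recursive case: a clause folded
    inside a subject-verb frame, one level deeper."""
    if depth == 0:
        return random.choice(FACTS)
    return f"{random.choice(SUBJECTS)} {random.choice(VERBS)} that {clause(depth - 1)}"

sentence = clause(3)
print(sentence[0].upper() + sentence[1:] + ".")
# e.g. "He said that George realized that she thinks that the idea was great."
```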

 
