When We Are No More
Because emotions are core to assessing value, the healthy brain does not waste space by storing data raw. We always cook data, and the most important information is marinated in heavy solutions of emotional cues. As we process perceptions, we extract the most significant and map them into networks crisscrossing the nervous system. Most of what we learn bypasses our awareness altogether and goes straight to the emotional and instinctual centers of the brain. This is for our own safety. Experience modifies behavior “without requiring any conscious memory content or even the experience that memory is being used.” Directed attention requires effort and takes time we do not have when we are out and about in the world. As noted, our unconscious attention is invariably drawn to what is new, unfamiliar, bright and shiny; or what is in motion, shadowy and sinister; or what triggers some emotional shudder. Without our being aware of it, attention is caught in the half-second it takes for a perception to come into our consciousness. By then we have already “made up our mind” about what we perceive and how to react to it. We can slow down to consider and change our minds. But we seldom do. Our split-second judgments often pass for conscious decisions, but that is an illusion. We like to say that memory plays tricks on us, and often it will present the past differently from what we actually experienced at the time. But this is a reminder that memory has many jobs other than keeping track of the facts as they occur.
Emotions are supremely significant because of their determinative role in belief and decision making. As the neuroscientist R. J. Dolan writes, “Emotion exerts a powerful influence on reason and, in ways neither understood nor systematically researched, contributes to the fixation of belief” as well as memory. From the time of the Enlightenment onward, Western culture has deemed reason a stronger and more prestigious form of intelligence than emotion. Reason thinks slowly, not intuitively, and is effective as an instrument used to exert human will. But it is not the font of empathy and fellow feeling that social life requires. Computers, on the other hand, can reason with stunning speed, but they cannot simulate human decision-making processes with equal speed because they are not emotional. They can learn to simulate our behaviors by assessing the outcomes of past choices to make probabilistic predictions (“people who liked this also liked …”), and often that is good enough.
Reason is a rarefied creature, demanding concentration and concerted blocks of time. Consequently, we use it sparingly. We rarely expend the time and energy to sort out something unfamiliar, let alone something that creates dissonance with what we already “know.” Information that fits into preexisting categories and values is simple to incorporate. Acquiring new—let alone contradictory—information, on the other hand, is hard work. Evaluating what we perceive depends on deep processing that operates over long periods of time and can be neither rushed nor short-circuited. This explains why a tired brain is less likely to acquire new information than one that is well rested. It also explains what is known as confirmation bias, whereby we pay more attention to information that confirms our existing view of how the world works than to information that contradicts or complicates that mental model. This is the paradox we live with: The mind’s struggle to accommodate more and more information in our inelastic brain can end up making us less open to new information, not more. To a large degree, wisdom is a faculty we can acquire over time to compensate for this, developing a heightened ability to assess how valuable any new piece of information may be in the larger context of what we already know.
DISTRACTION: THE STORY OF S.
A man whose brain was far too elastic for his own good was the Russian known to us as S. His story is told with compassion and discretion by the psychologist Aleksandr Romanovich Luria in The Mind of a Mnemonist: A Little Book about a Vast Memory. (Luria refers to this man simply as S., and we shall observe his discretion.) Luria studied his subject over the first three decades after the Russian Revolution. S.’s problem was that he did not have what Luria calls “the art of forgetting,” a faculty of the healthy mind that creates order from excess by converting selected short-term memories into long-term memories and flushing away the rest. Long-term memories in turn align themselves into distinct patterns as they are abstracted into general categories and reused over time. S. remembered a prodigious amount of detailed information that never lost its vividness, but at the high price of never abstracting perceptions into greater patterns of significance.
As a consequence, S. suffered from a disorder of distraction: He could not make things dull, and had a hard time maintaining focus on anything for extended periods. He was unable to sort his impressions for value and emotional salience. To him the world was far too vivid far too much of the time. Luria reports his patient could remember everything but was unable to establish priorities for his memories. As a consequence, his speech was digressive and prolix. He would start out on one subject and end up somewhere very far away, often down a blind alley. He easily confused what he had remembered (because everything he encountered in his daily life triggered a chain of recollections) with what had actually transpired. Memories were so fresh in affect and spun out in his mind so rapidly that he mistook his recollections for reality. There were periods in his youth when he did not get up in the morning to go to school because even thinking about arising stimulated memories of having done so before. He thought that he had gone to school even as he lay still under the covers. After some period of time, S.’s inability to distinguish between what he had recollected and what had actually transpired led to a blurred sense of reality, an attenuated sense of actually being alive.
The proximate cause of his overarticulated memory was likely his synesthesia, an enigmatic neurological phenomenon whereby stimulation of one sense or perception provokes another unrelated one. When he recalled the word for beetle, zhuk, for example, he immediately thought of “a dented piece in the potty, a piece of rye bread,” and the entire sensation of turning the light on in the evening and having only a part of the room—the zhuk—in light. Once he thought of these, he could not let go of them. On the one hand, it was easy for S. to establish a series of associations that would prompt recall. He was able to make a living as a mnemonist, committing vast amounts of detailed information to his memory and recalling them on demand in a performance. On the other hand, the sheer density of sensory associations inadvertently set off by the slightest of provocations made everyday living distressing. Whatever other gifts he may or may not have had, synesthesia exacerbated the effect. Associations among senses effectively superseded any association among events and disconnected him from an ordinary sense of time passing. The net effect, Luria noted, was that S. lived dispossessed of the present, always in expectation of great things about to happen.
His affliction was destined to get worse with time as he accumulated more and more information and had trouble ridding himself of any of it. To compensate for the fact that he was forming memories in an uncontrollable way, he developed a technique to surround a specific memory with as little context as possible. He would narrow the frame of reference of a memory, tightening the context like a noose around a given fact he wished to recall, thus strangling all possible associations. This had the effect of making the connections between things opaque and invisible, dissociating them from one another. As a consequence, his sense of narrative broke down. When S. processed new information, everything went into Save and nothing into Delete because it was redundant, or merged into another file of similar memories. He could not consolidate the content of his memories, to “convert encounters with the particular into instances of the general.” The information he took in could not be compressed, abstracted, or generalized through pruning a memory of its redundant or irrelevant data points.
A telltale sign of his disability was his difficulty in reading. Every idea or concept called forth a torrent of imagery, most of it extremely vivid and tangential. “This makes for a tremendous amount of conflict and it becomes difficult for me to read. I’m slowed down, my attention is distracted, and I can’t get the important ideas in a passage.” Understanding poetry or in fact any figurative use of language was well-nigh impossible. He could not grasp synonyms or homonyms used in different contexts. He was perplexed by the use of the word “arm,” for example, to describe something both attached to a person and extending from an institution—the arm of his wife, say, versus the arm of the law. An expression such as “to weigh one’s words” confounded him. Poetry, by its very nature graphic and figurative, was beyond his comprehension because he could only understand the images literally. Paradoxically, poetic images so powerfully stimulated his own mental imagery that they could not evoke anything other than the very first meaning that came to mind, which was the literal meaning.
More serious were the implications of his inability to filter out distractions for his ability to follow a narrative. He could hardly track anything that changed over time, and that, according to Luria, was why he had a hard time remembering faces. He could not pick out any prominent or distinguishing features in a face because the face changed and was always so expressive. To him, every face was a narrative he could not follow.
The model of the world that S. carried in his head was never made coherent and continuous in the flow of time. It was simply flooded with more unassimilated data. He was incapable of making plans because he actually had no sense of how things happened. It was just one damned thing after another. “At one point I studied the stock market, and when I showed that I had a good memory for prices on the exchange, I became a broker. But it was just something I did for a while to make a living. As for real life—that’s something else again. It all took place in dreams, not in reality.”
S. was painfully aware of his problem. “I was passive for the most part, didn’t understand that time was moving on,” he said. “All the jobs I had were simply work I was doing ‘in the meantime.’ The feeling I had was: ‘I’m only twenty-five, only thirty—I’ve got my whole life ahead of me.’ … But even now I realize time’s passing and that I might have accomplished a great deal—but I don’t work. That’s the way I’ve always been.” Luria poignantly adds that “he had a family—a fine wife and a son who was a success—but this, too, he perceived as though through a haze. Indeed, one would be hard put to say which was more real for him: the world of imagination in which he lived, or the world of reality in which he was but a temporary guest.” S. felt there was some great and consequential drama transpiring somewhere, some great river of time and meaning that was nearby and he was sure to find at some point in time. But at what juncture in his life could he push off from shore and enter the mighty, relentlessly moving river of life? At the end of his life, S. felt detached from his own life, as if he had never really lived it.
DISTRACTION AND DISRUPTION
It is easy to read S.’s life story as a cautionary tale about the temptation to save all data because our capacity for digital storage keeps growing. The quantity of data we amass does not by itself add up to a grand narrative. Good memory is not data storage. Forgetting is necessary for true memory.
We have created a technologically advanced world that operates at a breathless pace, driven by a culture that demands more innovation (“disruption”). But science tells us that this disruptive and accelerated pace is self-defeating because our bodies and minds still operate at the same tempo as they did in ancient Sumer. Analog memory systems based on objects had a built-in friction that slowed us down and demanded depth of focus and concentration. Digital gets rid of all the friction, speeds things up, and taxes our powers of concentration and discrimination.
We are a culture obsessed with facts—the intrinsic value of a fact. But our brains do not share this reverence for facts. The human brain plays fast and loose with them. For facts are cultural, not natural, phenomena. What the mind seeks is meaning: a sense of order, of appropriateness, of measure and purpose. To that end, the brain finds impressions most useful for its business. From impressions, what Frith calls “the crude and ambiguous cues that impinge from the outside world onto my eyes and my ears and my fingers,” the mind will create a very faithful diorama of the external world, but it does not correspond to a factual representation. We have to interpret facts and impressions in the context of our environment in order to make sense of them. Facts have no intrinsic meaning. It is culture that creates expectations of what makes sense and what does not. If all the mind did was create literal representations of the external world, then all cultures in all times and all places would have exactly the same working models of the world. They do not. Each culture spawns its own interpretive framework by which people make sense of “the facts on the ground.”
The architect Le Corbusier (1887–1965), who self-confidently placed himself at the very vanguard of innovation, wrote that there are “living pasts and dead pasts. Some pasts are the liveliest instigators of the present and the best springboards into the future.” In the twenty-first century, as we live longer and the pace of change accelerates, our mental models of the world need to be increasingly flexible and easily updatable. Yet the faster things move, the harder it will be to maintain a strong sense of continuity. In periods of information inflation such as ours, it is all too easy to feel like S., constantly assailed by vivid images we have no time to digest, sort through, and disregard and discard if not of lasting value.
Le Corbusier had it right: The past is a plural noun. How do we distinguish between living pasts and dead, between what actually happened and a faux history, full of false facts and narrow windows into the future? With the pace of change accelerating, do we even need to bother with the past? If Thomas Jefferson, ardent revolutionary and futurist, is right, the more we care about the future, the more we need a rich, diverse, accessible record of the past. Because memory is not about the past, it is about the future.
CHAPTER EIGHT
IMAGINATION: MEMORY IN THE FUTURE TENSE
Time without consciousness—lower animal world; time with consciousness—man; consciousness without time—some still higher state.
—VLADIMIR NABOKOV, STRONG OPINIONS
In 2011, a team of scientists published a research report about “How to Grow a Mind: Statistics, Structure, and Abstraction.” They wanted to know how our minds build a world far richer than anything they know from their own experience. Couched in the language of information processing, they asked:
How do our minds get so much with so little? We build rich causal models, make strong generalizations, and construct powerful abstractions, where the input data are sparse, noisy, and ambiguous—in every way far too limited. A massive mismatch looms between the information coming into our senses and the outputs of cognition.
In 397, Augustine of Hippo asked the same question, using the classical metaphor of the memory palace.
Great is this power of memory, exceedingly great, O my God, a spreading limitless room within me. Who can reach its uttermost depths? Yet it is a faculty of my soul and belongs to my nature. In fact I cannot totally grasp all that I am. Thus the mind is not large enough to contain itself: but where can that part of it be which it does not contain? Is it outside itself and not within? How can it not contain itself? How can there be any of itself that is not in itself?
What is the missing element, then? The mismatch between the information coming into our senses and the outputs of cognition arises from the brain reflexively filling in fleeting perceptual gaps with material that our minds “know” from memory belongs in the picture. Our perception always tends toward prediction: It anticipates what it is seeing. Much of our knowledge is instinctual and comes preprogrammed with our genetic code. A large part is appropriated from the experience of others through culture. And the rest is our personal experience. What we learn in the dozen or more years we spend in school is all content provided to us from our collective memory. Reading, writing, arithmetic, history, music, drawing—these are gifts of the generations, “derived from people,” as Czeslaw Milosz writes, “but also from radiance, heights.” The “spreading limitless room” of memory within each of us is limitless because we have access to the memory of humanity.
If the great feat of memory is to construct a model of the world that approximates reality closely enough—however we do it—the genius of imagination lies in using that model to create alternative orders and models of reality. Memory records the world as so. Imagination transposes it into the key of as if, transforming experience into speculation. That is why to lose one’s memory means losing the future. Because imagination is memory in the future tense.
MAKE-BELIEVE
How does memory become imagination, and why? Imagination cannot come exclusively from memory and experience, as our scientists point out, because if it did, how do we account for the prodigal make-believe of children? Children have no experience to fill their imaginings with. During childhood the brain explodes with neuronal growth. The little neurons need to stretch and wriggle and find their place in the sun by rehearsing their future uses. These neurons spontaneously generate imaginary worlds as the mind builds its capacity to see patterns, make order and sense of a jumble of perceptions. Through observation and imitation, children begin to match their experiences of the world to what exists in their imaginary realms. They play dress-up, stepping into the imaginary clothes of imaginary adults. They act out what the clothes prompt them to do and work from a script they seem to know by heart without prompting. How this happens is one of the great puzzles of early development.
What we see in the imagination of children is a glimpse into the process of the nervous system learning the environment as the child grows. The nervous system rapidly builds extra mental and physical capacity just in case, and that capacity atrophies when not used. Use and experience begin to shape the consciousness of the individual. The neural connections that do not get used are radically pruned in early childhood. Some cognitive disorders such as autism may be caused in part by failures in the pruning process, so that some individuals maintain a physical sensitivity or model of thinking that is inappropriate for the world they grow up in. Windows of learning vary a great deal in their onset and duration. Children can learn many different languages fluently before puberty, but after puberty the capacity to learn a language diminishes. Learning a new language takes greater effort with age, and certain types of mastery may no longer be achievable. The same is true for many physical and artistic skills, such as gymnastics or playing the violin. We can learn these things at any age, but our minds and bodies are not as plastic as they were in youth.