How We Learn


by Benedict Carey


  Deep sleep, on the other hand, pools in the front half of a typical night’s slumber, as you can see from the diagram. That’s the slow wavelength you want when preparing for a test of retention, like new vocabulary, or filling in the periodic table. Arrange your studying so that you hit the sack at your regular time, get a strong dose of the deep stuff—and roll out of bed early for a quick review before dawn.

  All of this is to say that if you’re going to burn the candle, it helps to have some idea of which end to burn.

  Here’s the best part: You may not have to burn it at all.

  Napping is sleep, too. In a series of experiments over the past decade, Sara Mednick of the University of California, San Diego, has found that naps of an hour to an hour and a half often contain slow-wave deep sleep and REM. People who study in the morning—whether it’s words or pattern recognition games, straight retention or comprehension of deeper structure—do about 30 percent better on an evening test if they’ve had an hour-long nap than if they haven’t. “It’s changed the way I work, doing these studies,” Mednick told me. “It’s changed the way I live. With naps of an hour to an hour and a half, we’ve found in some experiments that you get close to the same benefits in learning consolidation that you would from a full eight-hour night’s sleep.”

  • • •

  Learning is hard. Thinking is hard. It’s as exhausting, though in a different way, as physical labor and wears most of us down at a similar rate. Yes, some people can spend fourteen hours a day doing grueling mental work and then relax by solving puzzles or attending poetry readings by some Eastern European exile. Good for them. Me, I fall more squarely in the Michael Gazzaniga camp of learning. Gazzaniga, the neuroscientist who discovered the right brain/left brain specialization we explored in chapter 1, worked long days and nights in the lab at Caltech on his landmark studies. “We had all these people at Caltech back then who became big names—Richard Feynman, Roger Sperry, Murray Gell-Mann, Sidney Coleman—but we weren’t working all the time,” Gazzaniga told me. “We weren’t intellectuals in the sense that we were going out to see people lecturing or cultural events in the evening. That was martini time.”

  And we’re almost there.

  Let’s return to Jerome Siegel’s theory of sleep, the one we described at the beginning of the chapter. He argues that sleep evolved to keep us safe when the hunting and gathering was scarce or too risky. We are awake when the foraging is good, when socializing in the group is important, and asleep when there’s no percentage in pursuing any of the above, when the costs are too high. Sleep occupies so much time because it’s so central to immediate, day-to-day survival.

  It’s no stretch to say, however, that learning—in school, at work, at practice—is equally crucial to the survival game. Mastering a subject or skill may not be as urgent as avoiding some saber-toothed cat, but over a lifetime our knowledge and skills become increasingly valuable—and need to be continually updated. Learning is how we figure out what we want to do, what we’re good at, how we might make a living when the time comes. That’s survival, too. Yet, especially when we’re young, we have a terrible time trying to sort out what’s important from what’s not. Life is confusing, it moves fast, we’re fielding all sorts of often conflicting messages and demands from parents, teachers, friends, and rivals. There aren’t enough hours in the day to think through what it all means.

  That’s reason enough to suspect that what the brain does at night is about more than safety. The sleep-wake cycle may have evolved primarily to help us eat and not be eaten, but if that downtime can be put to good use, then evolutionary theory tells us it will. What better way to sift the day’s perceptions and flag those that seem most important? A tracking skill. A pattern of movement in the bushes. An odd glance from a neighbor. A formula for calculating the volume of a cone. A new batting stance. A confounding plot in a Kafka novel. To sort all that variety, sleep might well have evolved distinct stages to handle different categories of learning, whether retention or comprehension, thermodynamics or Thucydides. I am not arguing that each stage of sleep is specialized, that only REM can handle math and only deep sleep can help store Farsi verbs. Anyone who’s pulled an all-nighter or two knows that we don’t need any sleep at all to learn a pile of new material, at least temporarily. I am saying the research thus far suggests that each of sleep’s five stages helps us consolidate learning in a different way.

  Siegel’s theory tells us that exhaustion descends when the costs of staying up outweigh its benefits. The Night Shift Theory gives us the reason why: because sleep has benefits, too—precisely for sorting through and consolidating what we’ve just been studying or practicing. Seen in this way, it’s yin and yang. Learning crests during waking hours, giving way to sleep at the moment of diminishing returns, when prolonged wakefulness is a waste of time. Sleep, then, finishes the job.

  I’ve always loved my sleep, but in the context of learning I assumed it was getting in the way. Not so. The latest research says exactly the opposite: that unconscious downtime clarifies memory and sharpens skills—that it’s a necessary step to lock in both. In a fundamental sense, that is, sleep is learning.

  No one is sure how the brain manages the sensory assault that is a day’s input, biologically. The science of sleep is still in its infancy. Yet one of its leading theorists, Giulio Tononi of the University of Wisconsin, has found evidence that sleep brings about a large-scale weakening of the neural connections made during the previous day. Remember all those linked neural networks forming every moment we’re awake? Tononi argues that the primary function of sleep is to shake off the trivial connections made during the day and “help consolidate the valuable inferences that were made.” The brain is separating the signal from the noise, by letting the noise die down, biologically speaking. Active consolidation is likely going on as well. Studies in animals have found direct evidence of “crosstalk” between distinct memory-related organs (the hippocampus and the neocortex, described in chapter 1) during sleep, as if the brain is reviewing, and storing, details of the most important events of the day—and integrating the new material with the old.

  I sure don’t know the whole story. No one does, and maybe no one ever will. The properties of sleep that make it such an unreliable companion—often shallow, elusive when most needed, or arriving when least wanted—also make it difficult to study in a controlled way over time. It’s likely that the sleep stages, arbitrarily defined by brain wave changes, may be replaced by more precise measures, like the chemical cocktails circulating during sleep states, or different types of “crosstalk.” My bet, though, is that the vast promise of tweaking sleep as a means to deepen learning will tempt someone into longer-term experiments, comparing the effects of different schedules on specific topics. Those effects will likely be highly individual, like so many others described in this book. Some night owls may find early morning study sessions torturously unproductive, and some early birds get their chakras bent out of joint after 10 P.M. At least with the Night Shift Theory, we have some basis on which to experiment on our own, to adjust our sleep to our advantage where possible.

  Put it this way: I no longer think of naps or knocking off early as evidence of laziness, or a waste of time, or, worst of all, a failure of will. I think of sleep as learning with my eyes closed.

  Conclusion

  The Foraging Brain

  I began this book with the allegation that most of our instincts about learning are misplaced, incomplete, or flat wrong. That we invent learning theories out of whole cloth, that our thinking is rooted more in superstition than in science, and that we misidentify the sources of our frustration: that we get in our own way, unnecessarily, all the time. In the chapters that followed, I demonstrated as much, describing landmark experiments and some of the latest thinking about how remembering, forgetting, and learning are all closely related in ways that are neither obvious nor intuitive. I also showed how those unexpected relationships can be exploited by using specific learning techniques.

  What I have not done is try to explain why we don’t know all this already.

  If learning is so critical to survival, why do we remain so ignorant about when, where, and how it happens? We do it naturally, after all. We think about how best to practice, try new approaches, ask others we think are smarter for advice. The drive to improve never really ends, either. By all rights, we should have developed pretty keen instincts about how best to approach learning. But we haven’t, and the reasons why aren’t at all apparent. No one that I know of has come forward with a convincing explanation, and the truth is, there may not be one.

  I do have one of my own, however, and it’s this: School was born yesterday. English class, Intro to Trig, study hall, soccer practice, piano lessons, social studies, art history, the Russian novel, organic chemistry, Zeno’s paradoxes, jazz trumpet, Sophocles and sophomore year, Josephus and gym class, Modern Poetry and Ancient Civilizations: All of it, every last component of what we call education, is a recent invention in the larger scheme of things. Those “ancient” civilizations we studied in middle school? They’re not so ancient, after all. They date from a few thousand years ago, no more. Humans have been around for at least a million years, and for the vast majority of that time we’ve been preoccupied with food, shelter, and safety. We’ve been avoiding predators, ducking heavy weather, surviving by our wits, foraging. And life for foragers, as the Harvard psychologist Steven Pinker so succinctly puts it, “is a camping trip that never ends.”

  Our foraging past had some not so obvious consequences for learning. Think for a moment about what it meant, that lifelong camping trip. Hunting and tracking were your reading and writing. Mapping the local environment—its every gully, clearing, and secret garden—was your geometry. The science curriculum included botany, knowing which plant had edible berries and which medicinal properties; and animal behavior, knowing the hunting routines of predators, the feeding habits of prey.

  Over the years you’d get an education, all right. Some of it would come from elders and peers, but most of it would be accumulated through experience. Listening. Watching. Exploring the world in ever-widening circles. That is how the brain grew up learning, piecemeal and on the fly, at all hours of the day, in every kind of weather. As we foraged for food, the brain adapted to absorb—at maximum efficiency—the most valuable cues and survival lessons along the way.

  It became a forager, too—for information, for strategies, for clever ways to foil other species’ defenses and live off the land. That’s the academy where our brains learned to learn, and it defines who we are and how we came to be human.

  Humans fill what the anthropologists John Tooby and Irven DeVore called the “cognitive niche” in evolutionary history. Species thrive at the expense of others, each developing defenses and weapons to try to dominate the niche it’s in. The woodpecker evolved an extraordinary bone structure to pound holes in tough bark and feed on the insects hidden in trees. The brown bat evolved an internal sonar, called echolocation, allowing it to hunt insects at dusk. We evolved to outwit our competitors, by observing, by testing our intuitions, by devising tools, traps, fishhooks, theories, and more.

  The modern institution of education, which grew out of those vestigial ways of learning, has produced generations of people with dazzling skills, skills that would look nothing less than magical to our foraging ancestors. Yet its language, customs, and schedules—dividing the day into chunks (classes, practices) and off-hours into “study time” (homework)—have come to define how we think the brain works, or should work. That definition is so well known that it’s taken for granted, never questioned. We all “know” we need to be organized, to develop good, consistent study routines, to find a quiet place and avoid distractions, to focus on one skill at a time, and above all, to concentrate on our work. What’s to question about that?

  A lot, it turns out. Take “concentration,” for example, that most basic educational necessity, that mental flow we’re told is so precious to learning. What is concentration, exactly? We all have an idea of what it means. We know it when we see it, and we’d like more of it. Yet it’s an ideal, a mirage, a word that blurs the reality of what the brain actually does while learning.

  I remember bringing my younger daughter to my newspaper office one weekend a few years ago when she was twelve. I was consumed with a story I had to finish, so I parked her at an empty desk near mine and logged her into the computer. And then I strapped in at my desk and focused on finishing—focused hard. Occasionally, I looked up and was relieved to see that she was typing and seemed engrossed, too. After a couple hours of intense work, I finished the story and sent it off to my editor. At which point, I asked my daughter what she’d been up to. She showed me. She’d been keeping a moment-to-moment log of my behavior as I worked. She’d been taking field notes, like Jane Goodall observing one of her chimpanzees:

  10:46—types

  10:46—scratches head

  10:47—gets papers from printer

  10:47—turns chair around

  10:48—turns chair back around

  10:49—sighs

  10:49—sips tea

  10:50—stares at computer

  10:51—puts on headset

  10:51—calls person, first word is “dude”

  10:52—hangs up

  10:52—puts finger to face, midway between mouth and chin, thinking pose?

  10:53—friend comes to desk, he laughs

  10:53—scratches ear while talking

  And so on, for three pages. I objected. She was razzing me, naturally, but the phone call wasn’t true, was it? Did I make a call? Hadn’t I been focused the whole time, locked in, hardly looking away from my screen? Hadn’t I come in and cranked out my story without coming up for air? Apparently not, not even close. The truth was, she could never have invented all those entries, all that detail. I did the work, all right, and I’d had to focus on it. Except that, to an outside observer, I looked fidgety, distracted—unfocused.

  The point is not that concentration doesn’t exist, or isn’t important. It’s that it doesn’t necessarily look or feel like we’ve been told it does. Concentration may, in fact, include any number of breaks, diversions, and random thoughts. That’s why many of the techniques described in this book might seem unusual at first, or out of step with what we’re told to expect. We’re still in foraging mode to a larger extent than we know. The brain has not yet adapted to “fit” the vocabulary of modern education, and the assumptions built into that vocabulary mask its true nature as a learning organ.

  The fact that we can and do master modern inventions like Euclidean proofs, the intricacies of bond derivatives, and the fret board hardly means those ancient instincts are irrelevant or outmoded. On the contrary, many scientists suspect that the same neural networks that helped us find our way back to the campsite have been “repurposed” to help us find our way through the catacombs of academic and motor domains. Once central to tracking our location in physical space, those networks adjusted to the demands of education and training. We don’t need them to get home anymore. We know our address. The brain’s internal GPS—it long ago evolved internal communities of so-called grid cells and place cells, to spare us the death sentence of getting lost—has retuned itself. It has adapted, if not yet perfectly.

  Scientists are still trying to work out how those cells help us find our way in modern-day learning. One encompassing theory is called the Meaning Maintenance Model, and the idea is this: Being lost, confused, or disoriented creates a feeling of distress. To relieve that distress, the brain kicks into high gear, trying to find or make meaning, looking for patterns, some way out of its bind—some path back to the campsite. “We have a need for structure, for things to make sense, and when they don’t, we’re so motivated to get rid of that feeling that our response can be generative,” Travis Proulx, a psychologist at Tilburg University in the Netherlands, told me. “We begin to hunger for meaningful patterns, and that can help with certain kinds of learning.”

  Which kinds? We don’t know for sure, not yet. In one experiment, Proulx and Steven J. Heine, a psychologist at the University of British Columbia, found that deliberately confusing college students—by having them read a nonsensical short story based on one by Franz Kafka—improved their performance by almost 30 percent on a test of hidden pattern recognition, similar to the colored egg test we discussed in chapter 10. The improvements were subconscious; the students had no awareness they were picking up more. “Kafka starts out normally, the first couple pages make you think it’s going to be a standard narrative and then it gets stranger and stranger,” Proulx told me. “Psychologists don’t really have a word for the feeling that he creates, but to me it goes back to the older existentialists, to a nostalgia for unity, a feeling of uncanniness. It’s unnerving. You want to find your way back to meaning, and that’s what we think helps you to extract these very complex patterns in this artificial grammar, and perhaps essential patterns in much more that we’re asked to study.”

  When we describe ourselves as being “lost” in some class or subject, that sentiment can be self-fulfilling, a prelude to failure or permission to disengage entirely, to stop trying. For the living brain, however, being lost—literally, in some wasteland, or figuratively, in The Waste Land—is not the same as being helpless. On the contrary, disorientation flips the GPS settings to “hypersensitive,” warming the mental circuits behind incubation, percolation, even the nocturnal insights of sleep. If the learner is motivated at all, he or she is now mentally poised to find the way home. Being lost is not necessarily the end of the line, then. Just as often, it’s a beginning.

  • • •

  I have been a science reporter for twenty-eight years, my entire working life, and for most of that time I had little interest in writing a nonfiction book for adults. It was too close to my day job. When you spend eight or nine hours a day sorting through studies, interviewing scientists, chasing down contrary evidence and arguments, you want to shut down the factory at the end of the day. You don’t want to do more of the same; you don’t want to do more at all. So I wrote fiction instead—two science-based mysteries for kids—adventures in made-up places starring made-up characters. As far from newspapering as I could get.

 
