by Todd Rose
Still, while many people intuitively grasp that failure is not only part of learning but also can help lead to long-term success, what’s harder to appreciate is why so many of us have trouble failing well. Part of the problem surely has to do with the prevalent human habit of denial. It’s sometimes so painful to accept unwelcome information that our brains literally prevent us from recognizing it.
Now, denial may sometimes be a useful habit of mind, at least in the evolutionary sense—like when it comes to people deciding to have a baby even after observing their friends’ teenagers. Usually, however, it’s worthwhile to recognize the habit and develop ways to open your mind to information even when it makes you uncomfortable. Otherwise you may easily miss learning something useful.
Legends are told about the many great discoveries that came to us by accident. From penicillin to Post-it notes, Popsicles to the microwave, a lot of things we take for granted today first snuck into view when people were looking for something else. A big reason we celebrate these accidents so enthusiastically may be our underlying awareness of how hard it is to accept the unexpected.
Consider the case of a large group of Stanford University biochemists who were closely observed in the 1990s by the social science researcher Kevin Dunbar. After monitoring four of the biochemists’ labs, Dunbar was able to document that more than 50 percent of the data collected was notably different from what the scientists had expected. It wasn’t uncommon for the researchers to spend as long as a month on a project and then discard the data because the findings didn’t seem to make sense. When results didn’t square with their initial hypotheses, they routinely at first suspected they had made a mistake, or used the wrong methods, and needed to try the experiment once again. Sometimes they even rejected information that appeared consistently, in multiple trials.
Dunbar began to worry about this. What possibly useful information—what new varieties of penicillin or Post-its—might these excellent scientists be chucking overboard? The problem with science, he suspected, wasn’t the number of experiments that failed, but the tendency to ignore all the products of those failures.
Whyever would successful scientists at one of America’s most prestigious institutions behave in such a self-defeating way?
The problem is at least partly biological, stemming from what you have learned (in chapter 3) is the highly variable capacity for perception.
Over the past few decades, brain scientists have found that, to varying degrees, we humans selectively (and habitually) edit what we learn through our senses, normally welcoming new data that substantiate what we think we already know, while often overlooking information that appears to contradict our beliefs. Researchers who study brain scans believe they have identified the culprit in this regard: the dorsolateral prefrontal cortex, located just behind the forehead. Oversimplification is always a risk when you write about brain regions being “responsible” for something or other, so it’s best to say that this brain area is known to play an important role, together with other parts of the brain, in suppressing unwelcome bits of reality.
Compounding all this is the role of the stress hormone cortisol, which, as I told you about in chapter 4, can sabotage clear thinking. Learning from mistakes is a highly demanding mental process, requiring the capacity to reflect coolly on your immediate experience, comparing it to what you had expected, and also to imagine alternatives. Yet when most people make mistakes, it triggers a cortisol release, thwarting short-term memory in particular, and making all of those mental tasks much harder, especially in people who are already anxious. Instead we may be diverted by thoughts of catastrophe (I’m a failure! I’m going to get fired! My family will be homeless!) or alternative explanations that fit with our established beliefs by absolving us of responsibility. (Roscoe forgot to wash the beakers again!) In short, on these occasions, when you need all your mental powers most—to learn from failure—you’re also at an unusually high risk of losing them.
In the case of the Stanford scientists, it’s likely that these biological factors were at play—combined with a strong cultural influence. They didn’t get their prestigious positions by being lousy students in grade school, which suggests that they’d trained in a system that explicitly teaches us that failure is not an option.
Despite a lot of routine talk by educators about the value of learning from mistakes, the truth is that we are very far from practicing what we preach in schools. In fact, the way students are taught and tested throughout America most emphatically gives the opposite message. The powerful lesson that young minds learn from our school system is that failure betrays an innate deficiency in the person who is failing, and it should be avoided at all costs.
This kind of instruction is not helpful for any child, but I think it’s particularly damaging for the kind of kid that I was then: one bound to make more than his share of embarrassing errors. Rather than being taught to “lean in” to such errors to learn from them, students more often learn to try to cover their tracks, even if that means lying or blaming other people—habits guaranteed to prevent any chance that a boneheaded mistake may lead to someplace fruitful.
The Trouble with Billy
* * *
I witnessed a vivid example of this sad state of affairs during a visit to a third-grade classroom in a southern U.S. city I’m not at liberty to name, due to the protocols of the research. I’d gone there with my colleague Gabrielle Rappolt-Schlichtmann to field-test a prototype of a new learning software program that she had helped design, based on notebooks that students have to keep in science class. I’ll tell you more about this digital notebook in the Epilogue, but basically it’s a high-tech way of helping a wide variety of students to study science by presenting material in lots of different ways, while offering supports to help develop skills such as note-taking, making predictions, and learning from wrong guesses.
In the classroom we visited, we researchers stood at the blackboard while the teacher and her aide introduced us and explained why we were there.
“Scientists love mistakes!” the teacher began. “That’s how they learn. They test what are called hypotheses, and each time one doesn’t work, they learn how to improve the next one.” Scientists use their mistakes to get smarter, she told the class. “Who’s ever made a mistake?” she asked the class. Hands shot up, slowly at first, but eventually every kid’s hand was in the air.
“Me too,” said the teacher. “And I don’t like to make them, but I try to learn from them.” She concluded her pep talk to the students by saying, “In our class, don’t you think we should try and be more like scientists?”
This is perfect, I thought, as I smiled at the teacher. But at that moment, a movement in the back of the room drew my attention to a wiry, disheveled-looking boy I’ll call Billy, whose behavior I immediately recognized. He was sitting at his desk, unable to keep still, kicking his shoes off and then spinning the mouse on his computer around. The soles of his shoes were all worn down, just like mine had been at his age. It was the telltale sign of a restless troublemaker.
And, sure enough, five minutes into our test, one of Billy’s wandering fingers hit the “back” button on the browser of the new software, which, unexpectedly, crashed the system.
It was soon clear to everyone in the room who was to blame. Billy’s face turned vermilion as he looked down at his stockinged feet and made small ducking motions, as if trying to figure out if he could hide under his desk. The students groaned, while Billy’s teacher and her aide shot him dark looks. This wasn’t the first time poor Billy had interrupted their routine.
My colleagues and I exchanged a glance. We understood right away that Billy had done us a great service, justifying our trip to the school. The issue with the “back” button was an obvious problem—although it hadn’t been obvious to us until then. If Billy hadn’t inadvertently alerted us to it, someone else surely would have, down the line, and maybe at a much greater cost.
Fortunately, we were able to explain this to Billy’s teacher and classmates in a way that seemed to get through. Billy wasn’t punished, and for the next three weeks the class devoted itself to finding other “bugs” in our system.
As that day in the third-grade classroom suggests, one reason we all deal so poorly with mistakes must certainly be that, in so many cases, they’re a sign that we need to do more work, both physically and neurologically. Seen in the most positive light, the immediate aftermath of any error is a sweet spot, where all kinds of learning and growth can take place. That said, however, it also requires extra energy. After Billy’s mistake, my colleagues and I had to spend time correcting the “bug” he’d discovered. Similarly, my C+ from Professor Gardner, several years earlier, pushed me to spend scores of hours I had never thought I would have time for doing catch-up work I hadn’t known I’d needed.
What buoyed me all through this process is a self-compassionate strategy known as reframing, which in this case refers to the art of talking to yourself about embarrassing adversity in a more positive light. For me, it meant reinterpreting Professor Gardner’s comments about my writing as his way of telling me not that I was a useless failure but that I simply needed a bit of remedial work.
Fortunately for me, I had learned about the magic of reframing through a previous mistake I’d made, back at Weber State University. At the time, I was working as a research assistant for my beloved mentor, Bill McVaugh. He had given me a data set and asked me to analyze it using a statistical technique called principal component analysis. I had no idea what principal component analysis was, but assumed I was supposed to be adept at it, because otherwise why would he have given me the job? So, under that assumption, I tried to fake my way through the work. When Professor McVaugh saw my results, he called me into his office and began by saying something on the order of “I should have realized that you couldn’t do this.” Those words set my teeth on edge and immediately caused me to argue and make excuses.
McVaugh, who, fortunately for me, was unusually patient and wise, didn’t argue back, but instead called for a halt, during which he explained that language is always imprecise, and that it was important for me to be open to multiple meanings, rather than automatically assuming the worst. What he had meant to tell me, he said, was that he should’ve realized I couldn’t do that kind of analysis, not because I wasn’t smart, but because Weber State had no courses that taught the method. Once I applied McVaugh’s principle to Gardner’s feedback, my next step was obvious. It was time to improve my writing. Over the next few days, I checked around on Harvard’s website and found that each department offered a free writing lab for its students. This made it clear that not only was I not alone in needing help, but that help was readily available. All I had to do was shed my pride and ask for it.
I went ahead and shed it—or at least some of it. I decided against taking the support offered by the writing lab in my own department, in favor of the relative anonymity of a workshop for undergraduates. The tutor, an undergrad student herself, turned out to be an exceptionally talented writer and teacher. Over the rest of the semester, I reported to her once a week to talk about my writing, to show her first drafts, and get tougher about taking criticism. She taught me that writing itself is a process of learning from mistakes. Very few writers produce polished prose in their first drafts. Just as with science experiments, the best writing is usually the product of many trials and errors.
My story’s happily-ever-after ending is that I recovered from the disastrous first paper and got an A in Gardner’s class. That was all the reward that I needed to convince me that this approach to failure was as useful as it was painful.
A Leading National Expert on Me
* * *
In the years since I bounced back from dropping out of high school, I’ve often been told that I am unusually “resilient.” While I appreciate the compliment, I have to say that I dislike this word just as much as I dislike other labels we use with kids. In the case of this one, it strikes me as a lot like someone going to a racetrack and picking a winner—after the end of the race. Resilience isn’t something a child has from birth, as is often implied, but more something that’s acquired. It wasn’t the reason I got to where I am today. Instead, I gradually became resilient both because of the sum of my experiences, which helped me acquire an unusual amount of self-knowledge, and because I had support from others when it mattered the most.
By the time I arrived in Cambridge, for instance, I had recognized that one of my “interesting variabilities” is impulsivity—meaning I’m a lot more prone than most other people I know to say or do something I’ll later regret, particularly in contexts where I am stressed out or otherwise emotionally charged. Yet because I’m convinced that impulsivity can have its advantages, I set out to manage it rather than try to “cure” it, making the most of the upside while minimizing the chance that one silly act or word might cause me or others irreparable harm. By remembering that context is key, I can at least try to anticipate when and where I’m likely to have problems, and plan accordingly, even if it means simply giving myself a little heads-up talk ahead of time.
I do this, for example, almost every time my walk home from work takes me past the gorgeous glass windows of an elegant restaurant called Harvest, which is tucked away under the linden trees in Harvard Square. Without fail, something about the fancy suits inside, and the lobsters being served, and the earth tones of the walls, and the outdoor heaters so the patrons don’t get chilled, all converge to bring out the troublemaker in me—and, I swear to you, it’s all I can do to stop myself from banging on the windows just to startle the rich people feasting inside.
Now, over the years I’ve examined this childish impulse far more times than any reasonable person should, wondering why this instinct is so strong, and warning myself just how stupid I’d look if I gave in to it. (After all, even I know that Harvard faculty members do not ordinarily bang on restaurant windows!) So far, this mature deliberation has kept me from following my heart. Still, whenever I’m in a really bad mood, or I know I’m particularly tired, I take care to choose another route home.
A similar, if more adult-like, calculation leads me to be hypercautious (compared to my peers) about the time I might spend using social media such as Facebook or Twitter. Don’t get me wrong; I love these tools, and I really wish I could spend more time interacting with family and friends. However, I also recognize that these are inordinately dangerous places for someone like me, not so much for the time I could waste, but because of the absence of a delete button that might save me from making some off-the-cuff remark that could dog me throughout my career.
In the nonvirtual world, meanwhile, I’ve somewhat similarly trained myself to notice early signs of a problem and remove myself from situations—for instance, with relatives, friends, or colleagues—in which I’m feeling angry or disrespected, on the understanding that I need time to give myself a chance to think reasonably. I also try to follow this practice whenever I make mistakes, with other people or in my work, as of course I continue to do. I’m not great at it, but I keep trying.
The bottom line is that it’s not as if I’ve managed to extinguish all the feelings of anger, sadness, shame, self-blame, and fear, which still reliably arise whenever I do something dumb—and much less that I’ve actually stopped doing dumb things. Nor am I convinced that I’d want to, since sometimes such feelings can be helpful. The difference is that I’m more prepared to cope with them when they do arise, and to remind myself, as often as I need to, that any one behavior or mistake does not define me.
Strategies and Tools
* * *
Besides helping to keep me out of at least some kinds of trouble, becoming aware of my particular strengths and weaknesses and how they play out in different contexts has helped me deal more effectively with information demands and, as a result, has helped me become more organized and productive. Remember how I told you about my appallingly weak working memory? Well, to survive at Weber State, and of course also at Harvard, I realized that I needed to build a system to compensate for it. Believe it or not, I relied for several years on writing my most important reminders on my hand. That was right up until a day in 2006 when I happened to share an elevator ride with my statistics professor, a teacher I particularly respected. She caught sight of what I used to call my “organizational tattoos” and snickered, “How old are you, Todd?”
From that point on, I determined to refine my self-management strategies, a quest no doubt aided by the fact that at the time I was focusing my research, and writing my dissertation, on the topic of working memory. In the process, I’ve been able to investigate what sorts of systems best meet my needs, and, by extension, those of millions of other people like me who may be having some trouble remembering what you just said.
One of the main things I’ve learned is how important it is for me to be able immediately to “capture” new information, rather than letting it float in the nether regions of my awareness, making me anxious as I keep trying to remember just what it was that I needed to do. This phenomenon is actually so prevalent that cognitive scientists have a special term for it: “open loops”—an image suggesting a nagging need for closure. To close those loops and keep my focus clear of pestering worries, I’ve programmed my cell phone, laptop, and desktop at home so that wherever I am, I can make a note that will be immediately available in all three places. This simple practice frees my mind to go on to the next task.