
Iconoclast: A Neuroscientist Reveals How to Think Differently


by Gregory Berns


  Reappraisal works well for short-term stressors, such as the fear of public speaking or even the fear of ambiguity. Sometimes it is difficult to implement these cognitive strategies on one’s own. Seeking the advice and counsel of a mentor or colleague will often do the trick. Such strategies need not be complicated. Simply relying on a neutral third party, who can reframe the circumstances in nonemotional terms, may be enough.

  Much of the problem with acute stressors derives from perception. Because perception is a product of the brain, reappraisal works well to change perception in such a way that the fear system is not activated. As we saw previously, fearful perception is also a statistical process. If someone consistently perceives public speaking as an unpleasant event, the brain will default to this interpretation. To extinguish this perception, the person must experience the conditions that lead to the stress response but without the unpleasantness. Reappraisal can help mitigate the unpleasantness. Sometimes more active measures are required to accelerate the extinction of unpleasant memories. Fear of public speaking, for example, can be effectively attenuated with practice. Programs like Toastmasters have proved time and again that any fear, even public speaking, can be managed through practice.

  Fear of the unknown, the other great inhibitor of innovation and iconoclasm, can also be managed through the same techniques of reappraisal and extinction. The Ellsberg paradox comes from the universal aversion to ambiguity. This is quite different from risk aversion. Risk aversion is a value judgment based on known probabilities and outcomes, which I will address in chapter 5. Ambiguity aversion comes straight from the fear of the unknown. This type of fear may be even more deeply entrenched than the fear of public speaking. Every study that has looked at this phenomenon in other animals has found evidence for ambiguity aversion. We are dealing with a deeply ingrained biological tendency. But that does not mean it can’t be inhibited. Humans possess a much larger prefrontal cortex than any other animal and therefore possess the brainpower to keep this fear in check.

  One technique that may be particularly effective is to convert ambiguity into risk. This is a form of reappraisal. For example, in the Ellsberg experiment, it might be useful to imagine the urn with the unknown ratio of marbles as if it had a known ratio. Without any further information, a reasonable guess would be a 50–50 ratio, as with the left-hand urn. This actually creates an opportunity to hone one’s estimate. If you chose the right-hand urn, then the marble that you drew would give you a great deal of information about the urn’s contents. For example, if you drew a black marble, then the chances of drawing another black marble from that urn would go up. This is called Bayesian updating, which is the statistical process of using new information to revise probability estimates. It is a mathematically sound principle that has been known for over two hundred years but is rarely used in daily decision making. The brain is not wired to think in Bayesian terms, but that does not mean that with some effort it can’t be done. The key reappraisal for ambiguous circumstances is to view ambiguity as an opportunity for gaining knowledge. If one has multiple opportunities for knowledge updating, then ambiguity can be converted very quickly into a risk judgment.
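  To make that updating process concrete, here is a minimal sketch in Python. The particulars are assumptions of mine, not the author’s: the ambiguous urn is taken to hold 100 marbles, every possible count of black marbles starts out equally likely, and draws are made with replacement.

```python
# A minimal sketch of Bayesian updating for the ambiguous (right-hand) urn.
# Assumptions not in the text: 100 marbles, a uniform prior over the number
# of black marbles, and draws with replacement.

def update(belief, drew_black):
    """Revise beliefs over possible urn compositions after one observed draw."""
    posterior = {}
    for black_count, prob in belief.items():
        likelihood = black_count / 100 if drew_black else 1 - black_count / 100
        posterior[black_count] = prob * likelihood
    total = sum(posterior.values())
    return {k: v / total for k, v in posterior.items()}

def prob_next_black(belief):
    """Predicted probability that the next marble drawn is black."""
    return sum(prob * black_count / 100 for black_count, prob in belief.items())

belief = {n: 1 / 101 for n in range(101)}      # complete ambiguity: uniform prior
print(round(prob_next_black(belief), 3))       # 0.5 -- the "reasonable guess"

belief = update(belief, drew_black=True)       # draw one black marble
print(round(prob_next_black(belief), 3))       # about 0.67: ambiguity shrinks toward risk
```

  Each further draw sharpens the estimate in the same way, which is the sense in which repeated knowledge updating converts ambiguity into an ordinary risk judgment.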

  Stress is unavoidable, and one cannot live one’s life running away from stress. The good news is that stress creates opportunity (think: reappraisal!). If individuals reappraise all sources of stress as an opportunity to discover something new or find a market niche that other people are afraid of, stress may itself decrease. If this is not possible, then the strategy of substituting a short-term stress for a chronic one may be very effective. Paradoxically, physical exercise, which is a short-term stressor, is perhaps the best remedy for chronic stress. Similarly, the individual who feels overwhelmed by uncertainty or social stresses in the workplace may benefit from taking on projects that have defined endings. Although these may increase stress in the short term, their completion may actually decrease overall stress.

  So although fear is the great inhibitor of action, its location in the brain is well known. Recent advances in neuroimaging show with increasing precision that cognitive strategies are highly effective at keeping the fear system under control, and these cognitive strategies have their origin in the prefrontal cortex. So rather than people needing to avoid the situations that cause fear or the circumstances that stress them out, neuroscience is showing how the rational part of the brain can regain control over toxic emotions such as fear.

  FOUR

  How Fear Distorts Perception

  The soft-minded man always fears change. He feels security in the status quo, and he has an almost morbid fear of the new. For him, the greatest pain is the pain of a new idea.

  —Martin Luther King Jr.

  IN THE LAST CHAPTER, we saw how fear can inhibit action. Fear also has another pernicious effect on potential iconoclasts: fear has the potential for interacting with the perceptual system and changing what a person sees (or thinks he sees). This is a far more dangerous scenario than the inhibitory effects of fear. In the last chapter, I focused on how fear prevents people from doing things. But when fear changes perception, the individual is not necessarily inhibited from action. Instead, he might choose the wrong course of action. Sometimes the results are deadly.

  When the space shuttle Challenger exploded shortly after launch on January 28, 1986, the world witnessed the fatal result of a chain of bad decisions. The independent commission that spearheaded the subsequent investigation laid the blame squarely on NASA for poor management practices and a culture that minimized the risks involved in space travel. Although the failure of an O-ring on the solid rocket booster was the immediate cause of the explosion, the commission came to the damning conclusion that the accident itself was “rooted in history.” Specifically: “The Space Shuttle’s Solid Rocket Booster problem began with the faulty design of its joint and increased as both NASA and contractor management first failed to recognize it as a problem, then failed to fix it and finally treated it as an acceptable flight risk.”1

  What is amazing about this statement is how it captures the gradual shift in perception about the shuttle’s design. Even after Morton Thiokol, the contractor that built the booster, discovered the design flaw in the O-rings, “they did not accept the implication of early tests.” NASA engineers, however, raised concerns about the design. One engineer, Leon Ray, submitted a report after a test firing revealed dangerous opening of the O-ring joint, recommending a complete redesign as the best long-term fix. Even so, NASA management minimized this concern in briefings with Thiokol. The commission concluded that costs were the primary concern of the NASA selection board, and that “cost consideration overrode any other objections.”

  Problems with the O-rings continued to mount. Temperature testing revealed that at 50°F the O-ring became so stiff that its seal was nonfunctional. (The air temperature at the Challenger’s launch was 31°F.) Even engineers at Thiokol started becoming afraid that the O-rings could lead to catastrophe: “It is my honest and very real fear that if we do not take immediate action to dedicate a team to solve the problem, … then we stand in jeopardy of losing a flight along with all the launch pad facilities.”2

  If so many engineers, within both Thiokol and NASA, were concerned about the O-rings, then one might reasonably ask why nothing was done. NASA had a safety program in place, but as the president’s commission found, it was largely ineffective. The unrelenting pressure to meet an accelerated flight schedule meant the safety program had to take a backseat. In fact, it was fear itself that changed the perception of risks within NASA management. Early in the booster development program, the O-rings were flagged as a problem of the highest level (criticality 1—potential for loss of vehicle and life if component fails). But somehow, by the time launches were occurring, this perception had changed.

  NASA management was under intense public pressure to maintain a high rate of launches. It had promised almost one a month. It was an unrealistic plan. Afraid of losing congressional funding as well as commercial, paying customers, NASA let this fear change the collective perception of risk.

  When the Emperor Has No Clothes

  If there was a single figure who clearly laid the blame for the Challenger disaster on NASA’s management practices, it was the Nobel Prize-winning physicist from Caltech, Richard Feynman. When Feynman demonstrated at a televised commission hearing what happens to a piece of O-ring when frozen, he became a public hero for his candor. Feynman, however, was already renowned within physics circles for his iconoclasm.

  Unlike many of the other people profiled in this book, Feynman didn’t learn to become an iconoclast. He was born that way. At first, when he was growing up on Long Island, it wasn’t immediately apparent. But by the time he was in high school, Feynman had surpassed most of his teachers in his mathematical ability. When solving mathematical word problems, while others furiously cranked through algebraic formulas with pencil and paper, Feynman would eschew these mechanical approaches and see the problems differently, often blurting out answers without lifting a pencil.3 But it was the exciting developments in European physics that really grabbed the teenager. Finally, after centuries of debate, physicists such as Bohr, Heisenberg, and Schrödinger had proven that matter was really composed of invisible, discrete particles called atoms. To Feynman, this was the single greatest discovery in all of human history. And it shaped his view of the world.

  It was a short-lived fascination. By the time Feynman was in graduate school at Princeton, his natural mathematical ability had forced him to abandon reading even the basic papers by Bohr and others. Studying physics at Princeton in the 1930s was a sink-or-swim operation. There were no required courses. You just had to pass a qualifying exam. Most students studied from an outline of basic physics: mechanics, electromagnetism, and atomic physics. Not Feynman. He chose to study things with no answers. In his “Notebook of Things I Don’t Know About,” he began to deconstruct every branch of physics, cataloging and analyzing gaps in the standard explanations for basic physical phenomena.4 Dissatisfied with algebraic explanations for atomic behavior, Feynman created his own graphical way of representing these types of problems. It was an approach that foreshadowed the work for which he would eventually win the Nobel Prize.

  And then came World War II. It was Robert Wilson, one of Feynman’s mentors, who let him in on the soon-to-be Manhattan Project. “Feynman’s persistent skepticism, his unwillingness to accept any assertion on authority, would be useful.”5 Indeed it was. Because of his mathematical prowess, he was put in charge of teams that had to crank through long calculations by hand. He developed a reputation for seeing differently, for being able to spot mistakes even when he didn’t know what the right answer was. He would look at calculations from unique vantage points, such as approaching from infinity or, from the other direction, from infinitesimally small numbers. Even J. Robert Oppenheimer took notice: “He is by all odds the most brilliant young physicist here.”6

  One thing about the Manhattan Project that did make a big impression on Feynman was how iconoclasts make decisions. “It was such a shock to me to see that a committee of men could present a whole lot of ideas, each one thinking of a new facet, while remembering what the other fella said, so that, at the end, the decision is made as to which idea was the best—summing it all up—without having to say it three times. These were very great men indeed.”7

  Feynman already had an innate sense of seeing things his way, and he refused to be intimidated by others. His description of the Trinity test was characteristic: he refused to let unfounded fears of blindness get in the way of seeing the first atomic explosion, and it is this refusal to let fear color his perception that makes him the most iconoclastic physicist ever:

  They gave out dark glasses that you could watch it with. Dark glasses! Twenty miles away, you couldn’t see a damn thing through dark glasses. So I figured the only thing that could really hurt your eyes is ultraviolet light. I got behind a truck windshield, because the ultraviolet can’t go through glass, so that would be safe, and so I could see the damn thing. Everybody else had dark glasses, and the people at 6 miles couldn’t see it because they were all told to lie on the floor. I’m probably the only guy who saw it with the human eye.8

  A Minority of One

  Although Feynman had no problem with his role as perennial iconoclast, for most people the willingness to stand alone for one’s opinion does not come easily. Like the fear of public speaking we saw in the last chapter, the fear of social isolation is deeply woven into the human brain. We readily discount our own perceptions for fear of being the odd one out.

  All our primate cousins, and even the earliest hominids, have depended on their clans for survival. As a result, a million years of mammalian evolution have produced a human brain that values social contact and communication above all else. The way in which we interact with each other is, in many ways, more important than what our own eyes and ears tell us. So much so that the human brain takes in information from other people and incorporates it with the information coming from its own senses. Many times, the group’s opinion trumps the individual’s before he even becomes aware of it. And while we humans readily ascribe our thoughts and feelings to ourselves, the truth is that many of our thoughts originate from other people.

  There is, of course, great value in belonging to a group. Safety in numbers, for one. But there is also a mathematical explanation for why the brain is so willing to give up its own opinions: a group of people is more likely to be correct about something than any individual. Both of these factors—social value and the statistical wisdom of the crowd—explain why so few people end up being true iconoclasts. Understanding these effects can encourage would-be iconoclasts and foster conditions for innovation within organizations. The story begins in the 1950s …
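  Before turning to that story, the statistical point is easy to illustrate. The sketch below is mine, not the author’s, and its numbers are arbitrary: if each of n people independently judges correctly with probability greater than one-half, a simple majority vote is right more often than any single member, and the advantage grows with the size of the group.

```python
# A rough illustration (numbers assumed, not from the text) of why a group's
# majority judgment tends to beat an individual's: the classic jury intuition.
from math import comb

def majority_correct(n, p):
    """Probability that a strict majority of n independent judges is correct."""
    return sum(comb(n, k) * p**k * (1 - p)**(n - k)
               for k in range(n // 2 + 1, n + 1))

p = 0.6                     # assume each person alone is right 60% of the time
for n in (1, 7, 75):
    print(n, round(majority_correct(n, p), 3))
# prints roughly: 1 -> 0.6, 7 -> 0.71, 75 -> 0.96
```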

  The men dress with conspicuous purpose.9 They have volunteered for a psychology experiment in visual acuity. Most of the men have never taken a course in psychology, sticking instead to a course of study in history, politics, and economics that is well known to the Wall Street recruiters each spring. At the designated time, a group of eight assemble in an ordinary classroom. Most of the men know each other in some fashion, for the campus is one of the smaller of the Ivies. All have been recruited by a friend, classmate, or fraternity brother, but the recruitment process is strictly sotto voce, lending an air of mystery to the whole enterprise.

  The professor enters the room. Solomon Asch wears a weathered tweed coat over a matching vest and woolen pants. He is shorter than most of the volunteers and points to the two rows of desks. With an Eastern European accent, he asks the men to please take their seats. There ensues some jockeying of position, but one subject of particular interest ends up in the middle of the second row.

  In fact, he is the only subject. The other seven are acting as stooges to go along with the professor’s experiment.

  The subject looks up from his desk and sees an artist’s easel holding a stack of white pieces of cardboard. He begins to wonder why he volunteered for this exercise. He had heard about the experiment from a fellow who lived in a neighboring suite of his dorm, but this dorm mate isn’t someone he would call a friend under most circumstances. The other men in the room seem so at ease with themselves, talking about the recent Dewey-Truman fiasco and smoking cigarettes.

  Asch clears his throat and explains the task.

  “Before you is a pair of cards. On the left is a card with one line. The card at the right has three lines differing in length.” He turns over the first pair of cards to illustrate his point. “They are numbered 1, 2, and 3, in order. One of the three lines at the right is equal to the standard line at the left. You will decide in each case which is the equal line. There will be eighteen such comparisons in all.”

  Asch pauses and looks around the room to make sure that his audience understands the task. He takes in this particular group and adds, “As the number of comparisons is few and the group small, I will call upon each of you in turn to announce your judgments, which I shall record here on a prepared form.” The professor points to the person nearest him in the front row and adds, “Suppose you give me your estimates in order, starting here in the first row, proceeding to the left, and then going to the second row.”

  The fellow sitting next to the subject plays the part of a wiry, nervous sort. He chain-smokes and fidgets constantly. He raises his hand and asks, “Will there always be a line that matches?”

  Asch assures him that there will. “Very well. Let’s begin with the card currently showing.”

  The card on the left has a single line, about ten inches long. The right-hand card has three lines, the middle line being substantially longer than the other two and also clearly of the same length as the target.

  The first person says, “Line Two is the correct answer.”

  Asch marks something down on a clipboard and motions to the second person to answer. The men (actors) proceed around the room until the subject—the only subject—must answer.

  The task seems straightforward, and by this point he is somewhat relieved that it isn’t difficult. “Two,” he says.

  The second set of cards contains lines that range in height from one to two inches, but as with the first set, it is not difficult to pick out the ones that match. Everyone gives the correct answer.

  The third set looks like what is shown in figure 4-1.

  The first person says, “One. It’s Line One.”

  The subject does a double take, and while he regards the lines with renewed intensity, the second person says, “One.” So does the third. And the fourth, and everyone in front of him.

 
