
You Are Not So Smart


by David McRaney


  Those who are deeply concerned with evacuation procedures—first responders, architects, stadium personnel, the travel industry—are aware of normalcy bias, and write about it in manuals and trade journals. In a 1985 paper published in the International Journal of Mass Emergencies and Disasters, sociologists Shunji Mikami and Ken’Ichi Ikeda at the University of Tokyo identified the steps you are likely to go through in a disaster. They said you have a tendency to first interpret the situation within the context of what you are familiar with and to greatly underestimate the severity. This is the moment, when seconds count, that normalcy bias costs lives. A predictable order of behaviors, they said, will then unfold. You will seek information from those you trust first and then move on to those nearby. Next, you’ll try to contact your family if possible, and then you’ll begin to prepare to evacuate or seek shelter. Finally, after all of this, you’ll move. Mikami and Ikeda say you are more likely to dawdle if you fail to understand the seriousness of the situation and have never been exposed to advice about what to do or been in a similar circumstance. Even worse, you stall longer if you fall back on the old compare-and-contrast tendencies where you try to convince yourself the encroaching peril is not much different than what you are used to—normalcy bias.

  They use a 1982 flood in Nagasaki as an example. Light flooding occurred there every year, and the residents assumed the heavy rainfall was part of a familiar routine. Soon, though, they realized the waters were getting higher and doing so faster than in years past. At 4:55 P.M., the government issued a flood warning. Still, some waited to see just how peculiar the flooding would be, how out of the ordinary. Only 13 percent of residents had evacuated by 9 P.M. In the end, 265 were killed.

  When Hurricane Katrina bore down on my home in Mississippi, I remember going to the grocery store for food, water, and supplies and being shocked by the number of people who had only a few loaves of bread and a couple of bottles of soda in their carts. I remember their frustration as they waited in line behind me with all my bottled water and canned goods. I told them, “Sorry, but you can never be too prepared.” Their response? “I don’t think it’s going to be a big deal.” I often wonder what those people did for the two weeks we were without electricity and the roads were impassable.

  Normalcy bias is a proclivity you can’t be rid of. Everyday life seems prosaic and mundane because you are wired to see it as such. If you weren’t, you would never be able to handle the information overload. Think of moving into a new apartment or home, or buying a new car or cell phone. At first, you notice everything and spend hours adjusting settings or arranging furniture. After a while, you get used to the normalcy and let things go. You may even forget certain aspects of your new home until a visitor points them out to you and you rediscover them. You acclimate to your surroundings so you can notice when things go awry; otherwise life would be all noise and no signal.

  Sometimes, though, this habit of creating background static and then ignoring it gets in the way. Sometimes you see static when you shouldn’t and yearn for normalcy when it cannot be found. Hurricanes and floods, for example, can be too big, slow, and abstract to startle you into action. You truly can’t see them coming. The solution, according to Mikami, Ikeda, and other experts, is repetition on the part of those who can help, those who can see the danger better than you. If enough warnings are given and enough instructions are broadcast, those things become the new normal, and you will spring into action.

  Normalcy bias can be scaled up to larger events as well. Global climate change, peak oil, obesity epidemics, and stock market crashes are good examples of larger, more complex events in which people fail to act because it is difficult to imagine just how abnormal life could become if the predictions are true. Regular media over-hyping and panic-building over issues like Y2K, swine flu, SARS, and the like help fuel normalcy bias on a global scale. Pundits on both sides of politics warn of crises that can be averted only by voting one way or the other. With so much crying wolf, it can be difficult to determine in the frenzied information landscape when to be alarmed, when it really is not a drill. The first instinct is to gauge how out of the norm the situation truly is and act only when the problem crosses a threshold past which it becomes impossible to ignore. Of course, this is often after it is too late to act.

  8

  Introspection

  THE MISCONCEPTION: You know why you like the things you like and feel the way you feel.

  THE TRUTH: The origin of certain emotional states is unavailable to you, and when pressed to explain them, you will just make something up.

  Imagine a painting the world considers beautiful, something like Starry Night by Van Gogh. Now imagine you have to write an essay on why it is popular. Go ahead, think of a reasonable explanation. No, don’t keep reading. Give it a shot. Explain why Van Gogh’s work is great.

  Is there a certain song you love, or a photograph? Perhaps there is a movie you keep returning to over the years, or a book. Go ahead and imagine one of those favorite things. Now, in one sentence, try to explain why you like it. Chances are, you will find it difficult to put into words, but if pressed you will probably be able to come up with something.

  The problem is, according to research, your explanation is probably going to be total bullshit. Tim Wilson at the University of Virginia demonstrated this in 1990 with the Poster Test. He brought a group of students into a room and showed them a series of posters. The students were told they could take any one they wanted as a gift and keep it. He then brought in another group and told them the same thing, but this time they had to explain why they wanted the poster they each picked. Wilson then waited six months and asked the two groups what they thought of their choices. The first group, the ones who just got to grab a poster and leave, all loved their choice. The second group, the ones who had to write out why they were choosing one over the others, hated theirs. The first group, the grab-and-go people, usually picked a nice, fancy painting. The second group, the ones who had to explain their choice, usually picked an inspirational poster with a cat clinging to a rope.

  According to Wilson, when you are faced with a decision in which you are forced to think about your rationale, you start to turn the volume in your emotional brain down and the volume in your logical brain up. You start creating a mental list of pros and cons that would never have been conjured up if you had gone with your gut. As Wilson noted in his research, “Forming preferences is akin to riding a bicycle; we can do it easily but cannot easily explain how.”

  Before Wilson’s work, the general consensus was to see careful deliberation as good, but he showed how the act of introspection can sometimes lead you to make decisions that look good on virtual paper but leave you emotionally lacking. Wilson knew previous research at Kent State had shown that ruminations about your own depression tend to make you more depressed, but distraction leads to an improved mood. Sometimes, introspection is simply counterproductive. Research into introspection calls into question the entire industry of critical analysis of art—video games, music, film, poetry, literature—all of it. It also makes things like focus groups and market analysis seem less about the intrinsic quality of the things being judged and more about what the people doing the judging find to be plausible explanations of their own feelings. When you ask people why they do or do not like things, they must then translate something from a deep, emotional, primal part of their psyche into the language of the higher, logical, rational world of words and sentences and paragraphs. The problem here is those deeper recesses of the mind are perhaps inaccessible and unconscious. The things that are available to consciousness might not have much to do with your preferences. Later, when you attempt to justify your decisions or emotional attachments, you start worrying about what your explanation says about you as a person, further tainting the validity of your inner narrative.

  In the Poster Test, most people truly preferred the nice painting to the inspirational cat, but they couldn’t conjure up a rational explanation of why, at least not in a way that would make logical sense on paper. On the other hand, you can write all sorts of bullshit about a motivational poster. It has a stated and tangible purpose.

  Wilson conducted another experiment in which people were shown two small photos of two different people and were asked which one was more attractive. They were then handed what they were told was a larger version of the photo they’d picked, but it was actually a picture of a completely different person. They were then asked why they’d chosen it. Each time, the person dutifully spun a yarn explaining his or her choice. The person had never seen the photo before, but that didn’t make the task of explaining why he or she had preferred it in an imaginary past any more difficult.

  Another of Wilson’s experiments had subjects rate the quality of jam. He placed before them five varieties that Consumer Reports had previously ranked as the first, eleventh, twenty-fourth, thirty-second, and forty-fourth best jams on the market. One group tasted the jams and ranked how good they thought they were. The other group had to write out what they did and did not like about each one as they tasted it. As with the posters, the people who didn’t have to explain themselves gravitated toward the same jams Consumer Reports said were best. The people forced to introspect rated the jams inconsistently and had varying preferences based on their explanations. Taste is difficult to quantify and put into words, so the explainers focused on other aspects like texture or color or viscosity, none of which, in the end, made much difference to the non-explainers.

  Believing you understand your motivations and desires, your likes and dislikes, is called the introspection illusion. You believe you know yourself and why you are the way you are. You believe this knowledge tells you how you will act in all future situations. Research shows otherwise. Time after time, experiments show introspection is not the act of tapping into your innermost mental constructs but is instead a fabrication. You look at what you did, or how you felt, and you make up some sort of explanation that you can reasonably believe. If you have to tell others, you make up an explanation they can believe too. When it comes to explaining why you like the things you like, you are not so smart, and the very act of having to explain yourself can change your attitudes.

  In this new era of Twitter and Facebook and blogs, just about everyone is broadcasting his or her love or hate of art. Just look at all the vitriol and praise being lobbed back and forth over Avatar or Black Swan. When Titanic earned its Oscars, some people were saying it might just be the greatest film ever made. Now it’s considered good but schmaltzy, a well-made film but decidedly melodramatic. What will people think in a hundred years?

  It would be wise to remember that many of the works we now consider classics were in their time critically panned. For instance, this is how one reviewer described Moby Dick in 1851:

  This is an ill-compounded mixture of romance and matter-of-fact. The idea of a connected and collected story has obviously visited and abandoned its writer again and again in the course of composition. The style of his tale is in places disfigured by mad (rather than bad) English; and its catastrophe is hastily, weakly, and obscurely managed. We have little more to say in reprobation or in recommendation of this absurd book. Mr. Melville has to thank himself only if his horrors and his heroics are flung aside by the general reader, as so much trash belonging to the worst school of Bedlam literature—since he seems not so much unable to learn as disdainful of learning the craft of an artist.

  —HENRY F. CHORLEY, in London Athenaeum

  This book is now considered one of a handful of great American novels and is held up as an example of the best pieces of literature ever written. Chances are, though, no one can truly explain why.

  9

  The Availability Heuristic

  THE MISCONCEPTION: With the advent of mass media, you understand how the world works based on statistics and facts culled from many examples.

  THE TRUTH: You are far more likely to believe something is commonplace if you can find just one example of it, and you are far less likely to believe in something you’ve never seen or heard of before.

  Do more words begin with “r” or have “r” as the third letter?

  Think about it for a second—rip, rat, revolver, reality, relinquish. If you are like most people, you think there are more that begin with “r”—but you’re wrong. More words in the English language have the letter in the third position than in the first—car, bar, farce, market, dart. It is much easier to believe the first option because it takes more concentration to think of words with “r” in the third position. Try it.
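  The claim is easy to check for yourself. Below is a minimal Python sketch, not from the book, that tallies both positions against a plain word list; the path /usr/share/dict/words is an assumption (it is common on Unix systems), and any one-word-per-line text file will do.

    # Rough check of the claim above: count English words that start
    # with "r" versus words with "r" as the third letter.
    # Assumes a one-word-per-line word list; the path is an assumption.
    def count_r_positions(path="/usr/share/dict/words"):
        first = third = 0
        with open(path) as f:
            for line in f:
                word = line.strip().lower()
                if len(word) >= 3:
                    if word[0] == "r":
                        first += 1
                    if word[2] == "r":
                        third += 1
        return first, third

    first, third = count_r_positions()
    print(f"'r' first: {first}   'r' third: {third}")

  On a typical word list the third-position count comes out well ahead; words that begin with “r” only feel more numerous because they are easier to retrieve.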

  If someone you know gets sick from taking a flu shot, you will be less likely to get one even if it is statistically safe. In fact, if you see a story on the news about someone dying from the flu shot, that one isolated case could be enough to keep you away from the vaccine forever. On the other hand, if you hear a news story about how eating sausage leads to anal cancer, you will be skeptical, because it has never happened to anyone you know, and sausage, after all, is delicious. The tendency to react more rapidly and to a greater degree when considering information you are familiar with is called the availability heuristic.

  The human mind is generated by a brain that was formed under far different circumstances than the modern world offers up on a daily basis. Over the last few million years, much of our time was spent with fewer than 150 people, and what we knew about the world was based on examples from our daily lives. Mass media, statistical data, scientific findings—these things are not digested as easily as something you’ve seen with your own eyes. The old adage “I’ll believe it when I see it” is the availability heuristic at work.

  Politicians use this all the time. Whenever you hear a story that begins with “I met a mother of two in Michigan who lost her job because of a lack of funding for . . .” or something similar, the politician hopes the anecdote will sway your opinion. He or she is betting that the availability heuristic will influence you to assume that this one example is indicative of a much larger group of people.

  It’s simply easier to believe something if you are presented with examples than it is to accept something presented in numbers or abstract facts.

  School shootings were considered a dangerous new phenomenon after Columbine. That event fundamentally changed the way kids are treated in American schools, and hundreds of books, seminars, and films have been produced in an attempt to understand the sudden epidemic. The truth, however, was that there hadn’t been an increase in school shootings. According to research by Barry Glassner, author of The Culture of Fear, during the period when Columbine and other school shootings got major media attention, violence in schools was down more than 30 percent. Kids were more likely to get shot in school before Columbine, but the media back then hadn’t given you many examples. A typical schoolkid is three times more likely to be struck by lightning than to be shot by a classmate, yet schools continue to guard against shootings as if one could happen at any second.

  Amos Tversky and Daniel Kahneman pinpointed the availability heuristic in their 1973 research. Their subjects listened to a tape recording of names being read aloud that included nineteen famous men and twenty men the subjects had never heard of before. They repeated the study with names of women as well. After they heard the names, subjects had to either recall as many names as they could or identify them from a word bank. About 66 percent of the people recalled the famous names more often than the unfamiliar ones, and 80 percent said the lists contained more famous names than non-famous ones. The word test about how often “r” appears in the third position was Tversky and Kahneman’s idea too. In both studies they showed that the more available a bit of information is, the faster you process it. The faster you process it, the more you believe it and the less likely you become to consider other bits of info.

  When you buy a lottery ticket, you imagine yourself winning like those people on television who get suddenly famous when their numbers are chosen, because people who don’t win don’t get interviewed. You are far more likely to die in a car crash on the way to buy the ticket than you are to win, but this information isn’t as available. You don’t think in statistics, you think in examples, in stories. When it comes to buying lottery tickets, fearing the West Nile virus, looking for child molesters, and so on, you use the availability heuristic first and the facts second. You decide the likelihood of a future event based on how easily you can imagine it, and if you’ve been bombarded by reports or have filled your head with fears, those images will overshadow new information that might contradict your beliefs.
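  A rough back-of-envelope comparison makes the point concrete. The numbers below are approximations and assumptions, not figures from the book: jackpot odds of about 1 in 195 million (a Powerball-style lottery of the era) and a U.S. rate of roughly 1.1 traffic deaths per 100 million vehicle miles.

    # Back-of-envelope comparison using approximate public figures;
    # all values here are assumptions, not from the book.
    p_jackpot = 1 / 195_000_000          # odds of a Powerball-style jackpot
    deaths_per_mile = 1.1 / 100_000_000  # approx. U.S. traffic deaths per vehicle mile
    trip_miles = 4                       # assumed round trip to the store
    p_crash_death = deaths_per_mile * trip_miles
    print(p_crash_death / p_jackpot)     # ~8.6: the fatal crash is far more likely

  Under these assumptions, dying on the drive is several times more likely than winning, yet the jackpot winner is the image that comes to mind, because that is the one the news makes available.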

  10

  The Bystander Effect

  THE MISCONCEPTION: When someone is hurt, people rush to their aid.

  THE TRUTH: The more people who witness a person in distress, the less likely it is that any one person will help.

  If your car were to break down and your cell phone had no service, where do you think you would have a better chance of getting help—a country road or a busy street? To be sure, more people will see you on a busy street. On the country road, you might have to wait a long time before someone comes by. So which one?

  Studies show you have a better chance on the country road. Why?

  Have you ever seen someone broken down on the side of the road and thought, “I could help them, but I’m sure someone will be along.” Everyone thinks that. And no one stops. This is called the bystander effect.

  In 1968, Eleanor Bradley fell and broke her leg in a busy department store. For forty minutes, people just stepped over and around her until one man finally stopped to see what was wrong. In 2000, a group of young men attacked sixty women at a Central Park parade in New York City. Thousands of people looked on. No one used a cell phone to call police. The culprit in both cases was the bystander effect. In a crowd, your inclination to rush to someone’s aid fades, as if diluted by the potential of the group. Everyone thinks someone is going to eventually do something, but with everyone waiting together, no one does.

 
