The Art of Thinking Clearly


by Rolf Dobelli


  11

  Why We Prefer a Wrong Map to None at All

  Availability Bias

  “Smoking can’t be that bad for you: My grandfather smoked three packs of cigarettes a day and lived to be more than a hundred.” Or: “Manhattan is really safe. I know someone who lives in the middle of the Village and he never locks his door. Not even when he goes on vacation, and his apartment has never been broken into.” We use statements like these to try to prove something, but they actually prove nothing at all. When we speak like this, we succumb to the availability bias.

  Are there more English words that start with a k or more words with k as their third letter? Answer: More than twice as many English words have k in the third position as start with a k. Why do most people believe the opposite is true? Because we can think of words beginning with a k more quickly. They are more available to our memory.

  The availability bias says this: We create a picture of the world using the examples that most easily come to mind. This is idiotic, of course, because in reality, things don’t happen more frequently just because we can conceive of them more easily.

  Thanks to the availability bias, we travel through life with an incorrect risk map in our heads. Thus, we systematically overestimate the risk of being the victims of a plane crash, a car accident, or a murder. And we underestimate the risk of dying from less spectacular means, such as diabetes or stomach cancer. Bomb attacks are much rarer than we think, and the risk of suffering depression is much higher. We attach too much likelihood to spectacular, flashy, or loud outcomes. Anything silent or invisible we downgrade in our minds. Our brains imagine showstopping outcomes more readily than mundane ones. We think dramatically, not quantitatively.

  Doctors often fall victim to the availability bias. They have their favorite treatments, which they use for all possible cases. More appropriate treatments may exist, but these are in the recesses of the doctors’ minds. Consequently, they practice what they know. Consultants are no better. If they come across an entirely new case, they do not throw up their hands and sigh: “I really don’t know what to tell you.” Instead, they turn to one of their more familiar methods, whether or not it is ideal.

  If something is repeated often enough, it gets stored at the forefront of our minds. It doesn’t even have to be true. How often did the Nazi leaders have to repeat the term “the Jewish question” before the masses began to believe that it was a serious problem? You simply have to utter the words “UFO,” “life energy,” or “karma” enough times before people start to credit them.

  The availability bias has an established seat at the corporate board’s table, too. Board members discuss what management has submitted—usually quarterly figures—instead of more important things, such as a clever move by the competition, a slump in employee motivation, or an unexpected change in customer behavior. They tend not to discuss what’s not on the agenda. In addition, people prefer information that is easy to obtain, be it economic data or recipes. They make decisions based on this information rather than on more relevant but harder-to-obtain information—often with disastrous results. For example, we have known for ten years that the so-called Black-Scholes formula for the pricing of derivative financial products does not work. But we don’t have another solution, so we carry on with an incorrect tool. It is as if you were in a foreign city without a map, and then pulled out one for your hometown and simply used that. We prefer wrong information to no information. Thus, the availability bias has presented the banks with billions in losses.

  What was it that Frank Sinatra sang—something about loving the girl I’m near when I’m not near the girl I love? A perfect example of the availability bias. Fend it off by spending time with people who think differently than you do—people whose experiences and expertise are different from yours. We require others’ input to overcome the availability bias.

  12

  Why “No Pain, No Gain” Should Set Alarm Bells Ringing

  The It’ll-Get-Worse-Before-It-Gets-Better Fallacy

  A few years ago, I was on vacation in Corsica and fell sick. The symptoms were new to me, and the pain was growing by the day. Eventually I decided to seek help at a local clinic. A young doctor began to inspect me, prodding my stomach, gripping my shoulders and knees, and then poking each vertebra. I began to suspect that he had no idea what my problem was, but I wasn’t really sure so I simply endured the strange examination. To signal its end, he pulled out his notebook and said: “Antibiotics. Take one tablet three times a day. It’ll get worse before it gets better.” Glad that I now had a treatment, I dragged myself back to my hotel room with the prescription in hand.

  The pain grew worse and worse—just as the doctor had predicted. The doctor must have known what was wrong with me after all. But, when the pain hadn’t subsided after three days, I called him. “Increase the dose to five times a day. It’s going to hurt for a while more,” he said. After two more days of agony, I finally called the international air ambulance. The Swiss doctor diagnosed appendicitis and operated on me immediately. “Why did you wait so long?” he asked me after the surgery.

  I replied: “It all happened exactly as the doctor said, so I trusted him.”

  “Ah, you fell victim to the it’ll-get-worse-before-it-gets-better fallacy. That Corsican doctor had no idea. Probably just the same type of stand-in you find in all the tourist places in high season.”

  Let’s take another example: A CEO is at his wit’s end: Sales are in the toilet, the salespeople are unmotivated, and the marketing campaign sank without a trace. In his desperation, he hires a consultant. For $5,000 a day, this man analyzes the company and comes back with his findings: “Your sales department has no vision, and your brand isn’t positioned clearly. It’s a tricky situation. I can fix it for you—but not overnight. The measures will require sensitivity, and, most likely, sales will fall further before things improve.” The CEO hires the consultant. A year later, sales fall, and the same thing happens the next year. Again and again, the consultant stresses that the company’s progress corresponds closely to his prediction. As sales continue their slump in the third year, the CEO fires the consultant.

  A mere smoke screen, the it’ll-get-worse-before-it-gets-better fallacy is a variant of the so-called confirmation bias. If the problem continues to worsen, the prediction is confirmed. If the situation improves unexpectedly, the customer is happy, and the expert can attribute it to his prowess. Either way he wins.

  Suppose you are president of a country and have no idea how to run it. What do you do? You predict “difficult years” ahead, ask your citizens to “tighten their belts,” and then promise to improve the situation only after this “delicate stage” of “cleansing,” “purification,” and “restructuring.” Naturally you leave the duration and severity of the period open.

  The best evidence of this strategy’s success is the religious zealot who believes that before we can experience heaven on earth, the world must be destroyed. Disasters, floods, fires, death—they are all part of the larger plan and must take place. These believers will view any deterioration of the situation as confirmation of the prophecy and any improvement as a gift from God.

  In conclusion: If someone says, “It’ll get worse before it gets better,” you should hear alarm bells ringing. But beware: Situations do exist where things first dip, then improve. For example, a career change requires time and often incorporates loss of pay. The reorganization of a business also takes time. But in all these cases, we can see relatively quickly if the measures are working. The milestones are clear and verifiable. Look to these rather than to the heavens.

  13

  Even True Stories Are Fairy Tales

  Story Bias

  Life is a muddle, as intricate as a Gordian knot. Imagine an invisible Martian decides to follow you around with an equally invisible notebook, recording what you do, think, and dream. The rundown of your life would consist of entries such as “drank coffee, two sugars,” “stepped on a thumbtack and swore like a sailor,” “dreamed that I kissed the neighbor,” “booked vacation, Maldives, now nearly out of money,” “found hair sticking out of ear, plucked it right away,” and so on. We like to knit this jumble of details into a neat story. We want our lives to form a pattern that can be easily followed. Many call this guiding principle “meaning.” If our story advances evenly over the years, we refer to it as “identity.” “We try on stories as we try on clothes,” said Max Frisch, a famous Swiss novelist.

  We do the same with world history, shaping the details into a consistent story. Suddenly we “understand” certain things, for example, why the Treaty of Versailles led to the Second World War, or why Alan Greenspan’s loose monetary policy created the collapse of Lehman Brothers. We comprehend why the Iron Curtain had to fall or why Harry Potter became a bestseller. Here, we speak about “understanding,” but these things cannot be understood in the traditional sense. We simply build the meaning into them afterward. Stories are dubious entities. They simplify and distort reality and filter things that don’t fit. But apparently we cannot do without them. Why remains unclear. What is clear is that people first used stories to explain the world, before they began to think scientifically, making mythology older than philosophy. This has led to the story bias.

  In the media, story bias rages like wildfire. For example: A car is driving over a bridge when the structure suddenly collapses. What do we read the next day? We hear the tale of the unlucky driver, where he came from, and where he was going. We read his biography: born somewhere, grew up somewhere else, earned a living as something. If he survives and can give interviews, we hear exactly how it felt when the bridge came crashing down. The absurd thing: Not one of these stories explains the underlying cause of the accident. Skip past the driver’s account—and consider the bridge’s construction: Where was the weak point? Was it fatigue? If not, was the bridge damaged? If so, by what? Was a proper design even used? Where are there other bridges of the same design? The problem with all these questions is that, though valid, they just don’t make for a good yarn. Stories attract us; abstract details repel us. Consequently, entertaining side issues and backstories are prioritized over relevant facts. (On the upside, if it were not for this, we would be stuck with only nonfiction books.)

  Here are two stories from the English novelist E. M. Forster. Which one would you remember better? (a) “The king died, and the queen died.” (b) “The king died, and the queen died of grief.” Most people will retain the second story more easily. Here, the two deaths don’t just take place successively; they are emotionally linked. Story A is a factual report, but story B has “meaning.” According to information theory, we should be able to hold on to A better: It is shorter. But our brains don’t work that way.

  Advertisers have learned to capitalize on this, too. Instead of focusing on an item’s benefits, they create a story around it. Objectively speaking, narratives are irrelevant. But still we find them irresistible. Google illustrated this masterfully in its Super Bowl commercial from 2010, “Google Parisian Love.” Take a look at it on YouTube.

  From our own life stories to global events, we shape everything into meaningful stories. Doing so distorts reality and affects the quality of our decisions, but there is a remedy: Pick these apart. Ask yourself: What are they trying to hide? Visit the library and spend half a day reading old newspapers. You will see that events that today look connected weren’t so at the time. To experience the effect once more, try to view your life story out of context. Dig into your old journals and notes, and you’ll see that your life has not followed a straight line leading to today, but has been a series of unplanned, unconnected events and experiences, as we will see in the next chapter.

  Whenever you hear a story, ask yourself: Who is the sender, what are his intentions, and what did he hide under the rug? The omitted elements might not be relevant. But, then again, they might be even more relevant than the elements featured in the story, such as when “explaining” a financial crisis or the “cause” of war. The real issue with stories: They give us a false sense of understanding, which inevitably leads us to take bigger risks and urges us to take a stroll on thin ice.

  14

  Why You Should Keep a Diary

  Hindsight Bias

  I came across the diaries of my great-uncle recently. In 1932, he emigrated from a tiny Swiss village to Paris to seek his fortune in the movie industry. In August 1940, two months after Paris was occupied, he noted: “Everyone is certain that the Germans will leave by the end of year. Their officers also confirmed this to me. England will fall as fast as France did, and then we will finally have our Parisian lives back—albeit as part of Germany.” The occupation lasted four years.

  In today’s history books, the German occupation of France seems to form part of a clear military strategy. In retrospect, the actual course of the war appears the most likely of all scenarios. Why? Because we have fallen victim to the hindsight bias.

  Let’s take a more recent example: In 2007, economic experts painted a rosy picture for the coming years. However, just twelve months later, the financial markets imploded. Asked about the crisis, the same experts enumerated its causes: monetary expansion under Greenspan, lax validation of mortgages, corrupt rating agencies, low capital requirements, and so forth. In hindsight, the reasons for the crash seem painfully obvious.

  The hindsight bias is one of the most persistent fallacies of all. We can aptly describe it as the “I told you so” phenomenon: In retrospect, everything seems clear and inevitable. If a CEO becomes successful due to fortunate circumstances, he will, looking back, rate the probability of his success a lot higher than it actually was. Similarly, following Ronald Reagan’s massive election victory over Jimmy Carter in 1980, commentators announced his win to be foreseeable, even though the election lay on a knife edge until a few days before the final vote. Today, business journalists opine that Google’s dominance was predestined, even though each of them would have snorted had such a prediction been made in 1998. One particularly blundering example: Nowadays it seems tragic, yet completely plausible, that a single shot in Sarajevo in 1914 would totally upturn the world for thirty years and cost fifty million lives. Every child learns this historical detail in school. But back then, nobody would have dreamed of such an escalation. It would have sounded too absurd.

  So why is the hindsight bias so perilous? Well, it makes us believe we are better predictors than we actually are, causing us to be arrogant about our knowledge and consequently to take too much risk. And not just with global issues: “Have you heard? Sylvia and Chris aren’t together anymore. It was always going to go wrong, they were just so different.” Or: “They were just so similar.” Or: “They spent too much time together.” Or even: “They barely saw one another.”

  Overcoming the hindsight bias is not easy. Studies have shown that people who are aware of it fall for it just as much as everyone else. So, I’m very sorry, but you’ve just wasted your time reading this chapter.

  If you’re still with me, I have one final tip, this time from personal rather than professional experience: Keep a journal. Write down your predictions—for political changes, your career, your weight, the stock market, and so on. Then, from time to time, compare your notes with actual developments. You will be amazed at what a poor forecaster you are. Don’t forget to read history, too—not the retrospective, compacted theories compiled in textbooks, but the diaries, oral histories, and historical documents from the period. If you can’t live without news, read newspapers from five, ten, or twenty years ago. This will give you a much better sense of just how unpredictable the world is. Hindsight may provide temporary comfort to those overwhelmed by complexity, but as for providing deeper revelations about how the world works, you’ll benefit by looking elsewhere.

  15

  Why You Systematically Overestimate Your Knowledge and Abilities

  Overconfidence Effect

  My favorite musician, Johann Sebastian Bach, was anything but a one-hit wonder. He composed numerous works. How many there were I will reveal at the end of this chapter. But for now, here’s a small assignment: How many concertos do you think Bach composed? Choose a range, for example, between one hundred and five hundred, such that you are 98 percent confident the true number falls inside it and only 2 percent likely to be wrong.

  How much confidence should we have in our own knowledge? Psychologists Howard Raiffa and Marc Alpert, wondering the same thing, have interviewed hundreds of people in this way. Sometimes they have asked participants to estimate the total egg production in the United States or the number of physicians and surgeons listed in the Yellow Pages of the phone directory for Boston or the number of foreign automobiles imported into the United States, or even the toll collections of the Panama Canal in millions of dollars. Subjects could choose any range they liked, with the aim of being wrong no more than 2 percent of the time. The results were amazing. In the final tally, instead of just 2 percent, they proved incorrect 40 percent of the time. The researchers dubbed this amazing phenomenon the overconfidence effect.

  The overconfidence effect also applies to forecasts, such as stock market performance over a year or your firm’s profits over three years. We systematically overestimate our knowledge and our ability to predict—on a massive scale. The overconfidence effect does not deal with whether single estimates are correct or not. Rather, as Taleb puts it, “it measures the difference between what people actually know and how much they think they know.” What’s surprising is this: Experts suffer even more from the overconfidence effect than laypeople do. If asked to forecast oil prices in five years’ time, an economics professor will be as wide of the mark as a zookeeper will. However, the professor will offer his forecast with certitude.
