The Art of Thinking Clearly


by Rolf Dobelli

You invest money in the stock market. Year in, year out, the Dow Jones rises and falls a little. Gradually, you grow accustomed to this gentle up and down. Then, suddenly, a day like October 19, 1987, comes around and the stock market tumbles 22 percent. With no warning. This event is a Black Swan, as described by Nassim Taleb in his book of the same name.

  A Black Swan is an unthinkable event that massively affects your life, your career, your company, your country. There are positive and negative Black Swans. The meteorite that flattens you, Sutter’s discovery of gold in California, the collapse of the Soviet Union, the invention of the transistor, the Internet browser, the overthrow of Egyptian dictator Mubarak, or a chance encounter that upturns your life completely—all are Black Swans.

  Think what you like of former U.S. secretary of defense Donald Rumsfeld, but at a press conference in 2002, he expressed a philosophical thought with exceptional clarity when he offered this observation: There are things we know (“known facts”), there are things we do not know (“known unknowns”), and there are things we do not know that we do not know (“unknown unknowns”).

  How big is the universe? Does Iran have nuclear weapons? Does the Internet make us smarter or dumber? These are “known unknowns.” With enough effort, we can hope to answer these one day. Unlike the “unknown unknowns.” No one foresaw Facebook mania ten years ago. It is a Black Swan.

  Why are Black Swans important? Because, as absurd as it may sound, they are cropping up more and more frequently and they tend to become more consequential. Though we can continue to plan for the future, Black Swans often destroy our best-laid plans. Feedback loops and nonlinear influences interact and cause unexpected results. The reason: Our brains are designed to help us hunt and gather. Back in the Stone Age, we hardly ever encountered anything truly extraordinary. The deer we chased was sometimes a bit faster or slower, sometimes a little bit fatter or thinner. Everything revolved around a stable mean.

  Today is different. With one breakthrough, you can increase your income by a factor of ten thousand. Just ask Larry Page, Usain Bolt, George Soros, J. K. Rowling, or Bono. Such fortunes did not exist previously; peaks of this size were unknown. Only in the most recent stretch of human history has this been possible—hence our problem with extreme scenarios. Since probabilities can never fall below zero, and our judgments of rare events are prone to error, you should assume that anything conceivable has a probability above zero.

  So, what can be done? Put yourself in situations where you can catch a ride on a positive Black Swan (as unlikely as that is). Become an artist, inventor, or entrepreneur with a scalable product. If you sell your time (e.g., as an employee, dentist, or journalist), you are waiting in vain for such a break. But even if you stay in such a job, avoid surroundings where negative Black Swans thrive. This means: Stay out of debt, invest your savings as conservatively as possible, and get used to a modest standard of living—no matter whether your big breakthrough comes or not.

  76

  Knowledge Is Nontransferable

  Domain Dependence

  Writing books about clear thinking brings with it many pluses. Business leaders and investors invite me to give talks for good money. (Incidentally, this is in itself poor judgment on their part: books are much cheaper.) At a medical conference, the following happened to me. I was speaking about base-rate neglect and illustrated it with a medical example: In a forty-year-old patient, stabbing chest pain (among other things) may indicate heart problems as well as stress. Stress is much more frequent (with a higher base rate), so it is advisable to test the patient for this first. All this is very reasonable, and the doctors understood it intuitively. But when I used an example from economics, most faltered.
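  To see the base-rate logic in numbers, here is a rough Bayes'-rule sketch. Every figure below is an assumption chosen for illustration, not clinical data:

```python
# Hypothetical base-rate comparison using Bayes' rule.
# All numbers are illustrative assumptions, not clinical data.

p_stress = 0.20            # assumed base rate of stress among forty-year-olds
p_heart = 0.01             # assumed base rate of heart disease
p_pain_if_stress = 0.30    # assumed chance of stabbing chest pain given stress
p_pain_if_heart = 0.80     # assumed chance of the same pain given heart disease

# Unnormalized posterior weight for each explanation of the symptom
w_stress = p_stress * p_pain_if_stress   # 0.060
w_heart = p_heart * p_pain_if_heart      # 0.008

total = w_stress + w_heart
print(f"P(stress | pain) = {w_stress / total:.2f}")   # about 0.88
print(f"P(heart  | pain) = {w_heart / total:.2f}")    # about 0.12
```

  Even though the pain is more typical of heart disease, the much higher base rate of stress makes stress the likelier explanation, which is why testing for it first is reasonable.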

  The same thing happens when I speak in front of investors. If I illustrate fallacies using financial examples, most catch on immediately. However, if I take instances from biology, many are lost. The conclusion: Insights do not pass well from one field to another. This effect is called domain dependence.

  In 1990, Harry Markowitz received the Nobel Prize in Economics for his theory of “portfolio selection.” It describes the optimum composition of a portfolio, taking into account both risk and return prospects. When it came to Markowitz’s own portfolio—how he should allocate his savings between stocks and bonds—he simply opted for a fifty-fifty distribution: half in shares, the other half in bonds. The Nobel Prize winner was incapable of applying his ingenious process to his own affairs. A blatant case of domain dependence: He failed to transfer knowledge from the academic world to the private sphere.
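  For the curious, here is what “portfolio selection” trades off, reduced to the two-asset case Markowitz faced privately. The return, risk, and correlation figures below are assumptions for illustration only; the full theory optimizes over many assets at once:

```python
# Minimal two-asset risk/return sketch in the spirit of portfolio selection.
# All inputs are illustrative assumptions, not real market data.
import math

r_stocks, r_bonds = 0.08, 0.03   # assumed expected annual returns
s_stocks, s_bonds = 0.20, 0.05   # assumed volatilities (standard deviations)
rho = 0.1                        # assumed stock-bond correlation

for w in [i / 10 for i in range(11)]:    # w = fraction of savings in stocks
    exp_return = w * r_stocks + (1 - w) * r_bonds
    variance = ((w * s_stocks) ** 2 + ((1 - w) * s_bonds) ** 2
                + 2 * w * (1 - w) * rho * s_stocks * s_bonds)
    print(f"stocks {w:.0%}: return {exp_return:.1%}, risk {math.sqrt(variance):.1%}")
```

  Each mix of stocks and bonds implies an expected return and a risk; the theory picks the mix with the best trade-off for a given risk appetite. A fifty-fifty split is just one row of this table, chosen without running the numbers.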

  A friend of mine is a hopeless adrenaline junkie, scaling overhanging cliffs with his bare hands and launching himself off mountains in a wingsuit. He explained to me last week why starting a business is dangerous: Bankruptcy can never be ruled out. “Personally, I’d rather be bankrupt than dead,” I replied. He didn’t appreciate my logic.

  As an author, I realize just how difficult it is to transfer skills to a new area. For me, devising plots for my novels and creating characters are a cinch. A blank page doesn’t daunt me. It’s quite a different story with, say, an empty apartment. When it comes to interior decor, I can stand in the room for hours, hands in my pockets, devoid of a single idea.

  Business is teeming with domain dependence. A software company recruits a successful consumer-goods salesman. The new position blunts his talents; transferring his sales skills from products to services is exceedingly difficult. Similarly, a presenter who is outstanding in front of small groups may well tank when his audience reaches one hundred people. Or a talented marketing mind may be promoted to CEO and suddenly find that he lacks any strategic creativity.

  With the Markowitz example, we saw that the transfer from the professional realm to the private realm is particularly difficult to navigate. I know CEOs who are charismatic leaders in the office and hopeless duds at home. Similarly, it would be a hard task to find a heavier-smoking profession than the prophets of health themselves, the doctors. Police officers are twice as violent at home as civilians. Literary critics’ novels get the poorest reviews. And, almost proverbially, the marriages of couples’ therapists are frequently more fragile than those of their clients. Mathematics professor Persi Diaconis tells this story: “Some years ago I was trying to decide whether or not I should move from Stanford to Harvard. I had bored my friends silly with endless discussion. Finally, one of them said, ‘You’re one of our leading decision theorists. Maybe you should make a list of the costs and benefits and try to roughly calculate your expected utility.’ Without thinking, I blurted out, ‘Come on, Sandy, this is serious.’ ”

  What you master in one area is difficult to transfer to another. Especially daunting is the transfer from academia to real life—from the theoretically sound to the practically possible. Of course, this also counts for this book. It will be difficult to transfer the knowledge from these pages to your daily life. Even for me as the writer, that transition proves to be a tough one. Book smarts don’t transfer to street smarts easily.

  77

  The Myth of Like-Mindedness

  False-Consensus Effect

  Which do you prefer: music from the ’60s or music from the ’80s? How do you think the general public would answer this question? Most people tend to extrapolate their preferences onto others. If they love the ’60s, they will automatically assume that the majority of their peers do, too. The same goes for ’80s aficionados. We frequently overestimate unanimity with others, believing that everyone else thinks and feels exactly like we do. This fallacy is called the false-consensus effect.

  Stanford psychologist Lee Ross hit upon this in 1977. He fashioned a sandwich board emblazoned with the slogan “Eat at Joe’s” and asked randomly selected students to wear it around campus for thirty minutes. They also had to estimate how many other students would put themselves forward for the task. Those who declared themselves willing to wear the sign assumed that the majority (62 percent) would also agree to it. On the other hand, those who politely refused believed that most people (67 percent) would find it too stupid to undertake. In both cases, the students imagined themselves to be in the popular majority.

  The false-consensus effect thrives in interest groups and political factions that consistently overrate the popularity of their causes. An obvious example is global warming. However critical you consider the issue to be, you probably believe that the majority of people share your opinion. Similarly, if politicians are confident of election, it’s not just blind optimism: They cannot help overestimating their popularity.

  Artists are even worse off. In 99 percent of new projects, they expect to achieve more success than ever before. A personal example: I was completely convinced that my novel Massimo Marini would be a resounding success. It was at least as good as my previous books, I thought, and those had done very well. But the public was of a different opinion and I was proven wrong: false-consensus effect.

  Of course, the business world is equally prone to such false conclusions. Just because an R & D department is convinced of its product’s appeal doesn’t mean consumers will think the same way. Companies with tech people in charge are especially affected. Inventors fall in love with their products’ sophisticated features and mistakenly believe that these will bowl customers over, too.

  The false-consensus effect is fascinating for yet another reason. If people do not share our opinions, we categorize them as “abnormal.” Ross’s experiment also corroborated this: The students who wore the sandwich board considered those who refused to be stuck up and humorless, whereas the other camp saw the sign-wearers as idiots and attention seekers.

  Perhaps you remember the fallacy of social proof, the notion that an idea is better the more people believe in it. Is the false-consensus effect identical? No. Social proof is an evolutionary survival strategy. Following the crowd has saved our butts more often in the past hundred thousand years than striking out on our own. With the false-consensus effect, no outside influences are involved. Despite this, it still has a social function, which is why evolution didn’t eliminate it. Our brain is not built to recognize the truth; instead, its goal is to leave behind as many offspring as possible. Whoever seemed courageous and convincing (thanks to the false-consensus effect) created a positive impression, attracted a disproportionate amount of resources, and thus increased their chances of passing on their genes to future generations. Doubters were less sexy.

  In conclusion: Assume that your worldview is not shared by the public. More than that: Do not assume that those who think differently are idiots. Before you distrust them, question your own assumptions.

  78

  You Were Right All Along

  Falsification of History

  Winston Smith, a frail, brooding, thirty-nine-year-old office employee, works in the Ministry of Truth. His job is to update old newspaper articles and documents so that they agree with new developments. His work is important. Revising the past creates the illusion of infallibility and helps the government secure absolute power.

  Such historical misrepresentation, as witnessed in George Orwell’s classic 1984, is alive and well today. It may shock you but a little Winston is scribbling away in your brain, too. Worse still: Whereas in Orwell’s novel, he toiled unwillingly and eventually rebelled against the system, in your brain he is working with the utmost efficiency and according to your wishes and goals. He will never rise up against you. He revises your memories so effortlessly—elegantly, even—that you never notice his work. Discreet and reliable, Winston disposes of your old, mistaken views. As they vanish one by one, you start to believe you were right all along.

  In 1973, U.S. political scientist Gregory Markus asked three thousand people to share their opinions on controversial political issues, such as the legalization of drugs. Their responses ranged from “fully agree” to “completely disagree.” Ten years later, he interviewed them again on the same topics, and also asked what they had replied ten years previously. The result: What they recalled disclosing in 1973 was almost identical to their present-day views—and a far cry from their original responses.

  By subconsciously adjusting past views to fit present ones, we avoid any embarrassing proof of our fallibility. It’s a clever coping strategy because, no matter how tough we are, admitting mistakes is emotionally difficult. But it is also preposterous. Shouldn’t we let out a whoop of joy every time we realize we are wrong? After all, such admissions would ensure that we never make the same mistake twice; each one is essentially a step forward. But we do not see it that way.

  So does this mean our brains contain no accurately etched memories? Surely not! After all, you can recall the exact moment when you met your partner as if it were captured in a photo. And you can remember exactly where you were on September 11, 2001, when you learned of the terrorist attack in New York, right? You recall to whom you were speaking and how you felt. Your memories of 9/11 are extraordinarily vivid and detailed. Psychologists call these “flashbulb memories”: They feel as incontestable as photographs.

  They are not. Flashbulb memories are as flawed as regular recollections. They are the product of reconstruction. Ulric Neisser, one of the pioneers in the field of cognitive science, investigated them: In 1986, the day after the explosion of the Challenger space shuttle, he asked students to write essays detailing their reactions. Three years later, he interviewed them again. Less than 7 percent of the new data correlated with the initial submissions. In fact, 50 percent of the recollections were incorrect in two-thirds of the points, and 25 percent failed to match even a single detail. Neisser took one of these conflicting papers and presented it to its owner. Her answer: “I know it’s my handwriting, but I couldn’t have written this.” The question remains: Why do flashbulb memories feel so real? We don’t know yet.

  It is safe to assume that half of what you remember is wrong. Our memories are riddled with inaccuracies, including the seemingly flawless flashbulb memories. Our faith in them can be harmless—or lethal. Consider the widespread use of eyewitness testimony and police lineups to identify criminals. To trust such accounts without additional investigation is reckless, even if the witnesses are adamant that they would easily recognize the perpetrator again.

  79

  Why You Identify with Your Football Team

  In-Group Out-Group Bias

  When I was a child, a typical wintery Sunday looked like this: My family sat in front of the TV watching a ski race. My parents cheered for the Swiss skiers and wanted me to do the same. I didn’t understand the fuss. First, why zoom down a mountain on two planks? It makes as little sense as hopping up the mountain on one leg, while juggling three balls and stopping every hundred feet to hurl a log as far as possible. Second, how can one-hundredth of a second count as a difference? Common sense would say that if people are that close together, they are equally good skiers. Third, why should I identify with the Swiss skiers? Was I related to any of them? I didn’t think so. I didn’t even know what they thought or read, and if I lived a few feet over the Swiss border, I would probably (have to) cheer for another team altogether.

  This brings us to the question: Does identifying with a group—a sports team, an ethnicity, a company, a state—represent flawed thinking?

  Over thousands of years, evolution has shaped every behavioral pattern, including attraction to certain groups. In times past, group membership was vital. Fending for yourself was close to impossible. As people began to form alliances, all had to follow suit. Individuals stood no chance against collectives. Whoever rejected membership or got expelled forfeited their place not only in the group, but also in the gene pool. No wonder we are such social animals—our ancestors were, too.

  Psychologists have investigated different group effects. These can be neatly categorized under the term in-group out-group bias. First, groups often form based on minor, even trivial, criteria. With sports affiliations, a random birthplace suffices, and in business it is where you work. To test this, the British psychologist Henri Tajfel split strangers into groups, tossing a coin to choose who went to which group. He told the members of one group it was because they all liked a particular type of art. The results were impressive: Although (a) they were strangers, (b) they were allocated a group at random, and (c) they were far from art connoisseurs, the group members found each other more agreeable than members of other groups. Second, you perceive people outside your own group to be more similar than they actually are. This is called the “out-group homogeneity bias.” Stereotypes and prejudices stem from it. Have you ever noticed that, in science-fiction movies, only the humans have different cultures and the aliens do not? Third, since groups often form on the basis of common values, group members receive a disproportionate amount of support for their own views. This distortion is dangerous, especially in business: It leads to the infamous organizational blindness.

  Family members helping one another out is understandable. If you share half your genes with your siblings, you are naturally interested in their well-being. But there is such a thing as “pseudo-kinship.” It evokes the same emotions without blood relationship. Such feelings can lead to the most idiotic cognitive error of all: laying down your life for a random group—also known as going to war. It is no coincidence that “motherland” suggests kinship. And it’s not by chance that the goal of any military training is to forge soldiers together as “brothers.”

  In conclusion: Prejudice and aversion are biological responses to anything foreign. Identifying with a group has been a survival strategy for hundreds of thousands of years. Not any longer. Identifying with a group distorts your view of the facts. Should you ever be sent to war, and you don’t agree with its goals, desert.
