The Art of Thinking Clearly

by Rolf Dobelli


  80

  The Difference between Risk and Uncertainty

  Ambiguity Aversion

  Two boxes. Box A contains one hundred balls: fifty red and fifty black. Box B also holds one hundred balls, but you don’t know how many are red and how many are black. If you reach into one of the boxes without looking and draw out a red ball, you win $100. Which box will you choose: A or B? The majority will opt for A.

  Let’s play again, using exactly the same boxes. This time, you win $100 if you draw out a black ball. Which box will you go for now? Most likely you’ll choose A again. But that’s illogical! In the first round, you assumed that B contained fewer red balls (and more black balls), so, rationally, you would have to opt for B this time around.

  Don’t worry; you’re not alone in this error—quite the opposite. This result is known as the “Ellsberg Paradox”—named after Daniel Ellsberg, a Harvard-trained economist. (As a side note, he later leaked the top-secret Pentagon Papers to the press, which contributed to the downfall of President Nixon.) The Ellsberg Paradox offers empirical proof that we favor known probabilities (box A) over unknown ones (box B).

  Thus we come to the topics of risk and uncertainty (or ambiguity), and the difference between them. Risk means that the probabilities are known. Uncertainty means that the probabilities are unknown. On the basis of risk, you can decide whether or not to take a gamble. In the realm of uncertainty, though, it’s much harder to make decisions. The terms “risk” and “uncertainty” are as frequently mixed up as “cappuccino” and “latte macchiato”—with much graver consequences. You can make calculations with risk, but not with uncertainty. The three-hundred-year-old science of risk is called statistics. A host of professors deal with it, but not a single textbook exists on the subject of uncertainty. Because of this, we try to squeeze ambiguity into risk categories, but it doesn’t really fit. Let’s look at two examples: one from medicine (where it works) and one from the economy (where it does not).

  There are billions of humans on earth. Our bodies do not differ dramatically. We all reach a similar height (no one will ever be one hundred feet tall) and a similar age (no one will live for ten thousand years—or for only a millisecond). Most of us have two eyes, four heart valves, thirty-two teeth. Another species would consider us to be homogeneous—as similar to one another as we consider mice to be. For this reason, diseases follow similar patterns across people, and it makes sense to say, for example: “There is a 30 percent risk you will die of cancer.” On the other hand, the following assertion is meaningless: “There is a 30 percent chance that the euro will collapse in the next five years.” Why? The economy resides in the realm of uncertainty. There are not billions of comparable currencies from whose history we can derive probabilities. The difference between risk and uncertainty also illustrates the difference between life insurance and credit default swaps. A credit default swap is an insurance policy against specific defaults, a particular company’s inability to pay. In the first case (life insurance), we are in the calculable domain of risk; in the second (credit default swap), we are dealing with uncertainty. This confusion contributed to the chaos of the financial crisis in 2008. If you hear phrases such as “the risk of hyperinflation is x percent” or “the risk to our equity position is y,” start worrying.

  To avoid hasty judgment, you must learn to tolerate ambiguity. This is a difficult task and one that you cannot influence actively. Your amygdala plays a crucial role. This is an almond-sized structure deep in the brain responsible for processing memory and emotions. Depending on how it is built, you will tolerate uncertainty with greater ease or difficulty. This is evident not least in your political orientation: The more averse you are to uncertainty, the more conservatively you will vote. Your political views have a partial biological underpinning.

  Either way, whoever hopes to think clearly must understand the difference between risk and uncertainty. Only in very few areas can we count on clear probabilities: casinos, coin tosses, and probability textbooks. Often we are left with troublesome ambiguity. Learn to take it in stride.

  81

  Why You Go with the Status Quo

  Default Effect

  In a restaurant the other day I scanned the wine list in desperation. Irouléguy? Harslevelü? Susumaniello? I’m far from an expert, but I could tell that a sommelier was trying to prove his worldliness with these selections. On the last page, I found redemption: “Our French house wine: Réserve du Patron, Bourgogne,” $52. I ordered it right away; it couldn’t be that bad, I reasoned.

  I’ve owned an iPhone for several years now. The gadget allows me to customize everything—data usage, app synchronization, phone encryption, even how loud I want the camera shutter to sound. How many of these have I set up so far? You guessed it: not one.

  In my defense, I’m not technically challenged. Rather, I’m just another victim of the so-called default effect. The default setting is as warm and welcoming as a soft pillow, into which we happily collapse. Just as I tend to stick with the house wine and factory cell-phone settings, most people cling to the standard options. For example, new cars are often advertised in a certain color; in every catalog, video, and ad, you see the new car in the same color, although the car is available in a myriad of colors. The percentage of buyers who select this default color far exceeds the percentage of car buyers who bought this particular color in the past. Many opt for the default.

  In their book Nudge, economist Richard Thaler and law professor Cass Sunstein illustrate how a government can direct its citizens without unconstitutionally restricting their freedom. The authorities simply need to provide a few options—always including a default choice for indecisive individuals. This is how New Jersey and Pennsylvania presented two car-insurance policies to their inhabitants. The first policy was cheaper but waived certain rights to compensation should an accident take place. New Jersey advertised this as the standard option, and most people were happy to take it. In Pennsylvania, however, the second, more expensive option was touted as the standard and promptly became the bestseller. This outcome is quite remarkable, especially when you consider that both states’ drivers cannot differ all that much in what they want covered or in what they want to pay.

  Or consider this experiment on organ donation. Donors are in short supply: when people must actively opt in, only about 40 percent sign up. Scientists Eric Johnson and Dan Goldstein instead asked people whether, in the event of death, they wanted to actively opt out of organ donation. Making donation the default in this way increased take-up from 40 percent to more than 80 percent of participants, a huge difference between an opt-in and an opt-out default.

  The default effect is at work even when no standard option is mentioned. In such cases, we make our past the default setting, thereby prolonging and sanctifying the status quo. People crave what they know. Given the choice of trying something new or sticking to the tried-and-tested option, we tend to be highly conservative, even if a change would be beneficial. My bank, for example, charges an annual fee of $60 for mailing out account statements. I could save myself this amount if I downloaded the statements online. However, though the pricey (and paper-guzzling) service has bothered me for years, I still can’t bring myself to get rid of it once and for all.

  So where does the “status-quo bias” come from? In addition to sheer convenience, loss aversion plays a role. Recall that losses upset us twice as much as similar gains please us. For this reason, tasks such as renegotiating existing contracts prove very difficult. Regardless of whether the contracts are private or professional, each concession you make weighs twice as heavily as any you receive, so such exchanges end up feeling like net losses.

  Both the default effect and the status-quo bias reveal that we have a strong tendency to cling to the way things are, even if this puts us at a disadvantage. By changing the default setting, you can change human behavior.

  “Maybe we live our lives according to some grand hidden default idea,” I suggested to a dinner companion, hoping to draw him into a deep philosophical discussion. “Maybe it just needs a little time to develop,” he said after trying the Réserve du Patron.

  82

  Why “Last Chances” Make Us Panic

  Fear of Regret

  Two stories: Paul owns shares in company A. During the year, he considered selling them and buying shares in company B. In the end, he didn’t. Today he knows that if he had done so, he would have been up $1,200. Second story: George had shares in company B. During the year, he sold them and bought shares in company A. Today he also knows that if he had stuck with B, he would have netted an extra $1,200. Who feels more regret?

  Regret is the feeling of having made the wrong decision. You wish someone would give you a second chance. When asked who would feel worse, 8 percent of respondents said Paul, whereas 92 percent chose George. Why? Considered objectively, the situations are identical. Both Paul and George were unlucky, picked the wrong stock, and missed out on exactly the same amount. The only difference: Paul already possessed the shares in A, whereas George went out and bought them. Paul was passive, George active. Paul embodies the majority—most people leave their money lying where it is for years—and George represents the exception. It seems that whoever does not follow the crowd experiences more regret.

  It is not always the one who acts who feels more regret. Sometimes, choosing not to act can constitute an exception. An example: A venerable publishing house stands alone in its refusal to publish trendy e-books. Books are made of paper, asserts the owner, and he will stick by this tradition. Shortly afterward, ten publishers go bankrupt. Nine of them attempted to launch e-book strategies and faltered. The final victim is the conventional paper-only publisher. Who will regret the series of decisions most, and who will gain the most sympathy? Right, the stoic e-grumbler.

  Here is an example from Daniel Kahneman’s book Thinking, Fast and Slow: After every plane crash, we hear the story of one unlucky person who actually wanted to fly a day earlier or later, but for some reason he changed his booking at the last minute. Since he is the exception, we feel more sympathy for him than for the other “normal” passengers who were booked on the ill-fated flight from the outset.

  The fear of regret can make us behave irrationally. To dodge the terrible feeling in the pit of our stomachs, we tend to act conservatively, so as not to deviate from the crowd too much. No one is immune to this, not even supremely self-confident traders. Statistics show that each year on December 31 (D-day for performance reviews and bonus calculations), they tend to off-load their more exotic stocks and conform to the masses. Similarly, fear of regret (and the endowment effect) prevents you from throwing away things you no longer require. You are afraid of the remorse you will feel in the unlikely event that you need those worn-out tennis shoes after all.

  The fear of regret becomes really irksome when combined with a “last chance” offer. A safari brochure promises “the last chance to see a rhino before the species is extinct.” If you never cared about seeing one before today, why would you fly all the way to Tanzania to do so now? It is irrational.

  Let’s say you have long dreamed of owning a house. Land is becoming scarce. Only a handful of plots with sea views are left. Three remain, then two, and now just one. It’s your last chance! This thought racing through your head, you give in and buy the last plot at an exorbitant price. The fear of regret tricked you into thinking this was a onetime offer, when in reality, real estate with a sea view will always come on the market. The sale of stunning property isn’t going to stop anytime soon. “Last chances” make us panic-stricken, and the fear of regret can overwhelm even the most hardheaded deal makers.

  83

  How Eye-Catching Details Render Us Blind

  Salience Effect

  Imagine the issue of marijuana has been dominating the media for the past few months. Television programs portray potheads, clandestine growers, and dealers. The tabloid press prints photos of twelve-year-old girls smoking joints. Broadsheets roll out the medical arguments and illuminate the societal, even philosophical aspects of the substance. Marijuana is on everyone’s lips. Let’s assume for a moment that smoking marijuana does not affect driving in any way. Just as anyone can wind up in an accident, a driver with a joint is also involved in a crash every now and then—purely coincidentally.

  Kurt is a local journalist. One evening, he happens to drive past the scene of an accident. A car is wrapped around a tree trunk. Since Kurt has a very good relationship with the local police, he learns that they found marijuana in the backseat of the car. He hurries back to the newsroom and writes this headline: “Marijuana Kills Yet Another Motorist.”

  As stated above, we are assuming that the statistical relationship between marijuana and car accidents is zero. Thus, Kurt’s headline is unfounded. He has fallen victim to the salience effect. Salience refers to a prominent feature, a stand-out attribute, a particularity, something that catches your eye. The salience effect ensures that outstanding features receive much more attention than they deserve. Since marijuana is the salient feature of this accident, Kurt believes that it is responsible for the crash.

  A few years later, Kurt moves into business journalism. One of the largest companies in the world has just announced it is promoting a woman to CEO. This is big news! Kurt snaps open his laptop and begins to write his commentary: The woman in question, he types, got the post simply because she is female. In truth, the promotion probably had nothing to do with gender, especially since men fill most top positions. If it were so important to have women as leaders, other companies would have acted by now. But in this news story, gender is the salient feature, and thus it earns undue weight.

  Not only journalists fall prey to the salience effect. We all do. Two men rob a bank and are arrested shortly after. It transpires that they are Nigerian. Although no ethnic group is responsible for a disproportionate number of bank robberies, this salient fact distorts our thinking. Lawless immigrants at it again, we think. If an Armenian commits rape, it is attributed to “the Armenians” rather than to factors that exist just as much among native-born Americans. Thus, prejudices form. That the vast majority of immigrants live lawful lives is easily forgotten. We always recall the undesirable exceptions—they are particularly salient. Therefore, whenever immigrants are involved, it is the striking, negative incidents that come to mind first.

  The salience effect influences not only how we interpret the past but also how we imagine the future. Daniel Kahneman and his fellow researcher Amos Tversky found that we place unwarranted emphasis on salient information when we are forecasting. This explains why investors are more sensitive to sensational news (e.g., the dismissal of a CEO) than they are to less striking information (such as the long-term growth of a company’s profits). Even professional analysts cannot always evade the salience effect.

  In conclusion: Salient information has an undue influence on how you think and act. We tend to neglect hidden, slow-to-develop, discreet factors. Do not be blinded by irregularities. A book with an unusual, fire-engine red jacket makes it onto the bestseller list. Your first instinct is to attribute the success of the book to the memorable cover. Don’t. Gather enough mental energy to fight against seemingly obvious explanations.

  84

  Why Money Is Not Naked

  House-Money Effect

  A windy fall day in the early 1980s. The wet leaves swirled about the sidewalk. Pushing my bike up the hill to school, I noticed a strange leaf at my feet. It was big and rust-brown, and only when I bent down did I realize it was a 500-Swiss-franc bill! That was the equivalent of about $250 back then, an absolute fortune for a high school student. The money spent little time in my pocket: I soon bought myself a top-of-the-range bike with disc brakes and Shimano gears, one of the best models around. The funny thing was that my old bike worked fine.

  Admittedly, I wasn’t completely broke back then: I had managed to save up a few hundred francs by mowing lawns in the neighborhood. However, it never crossed my mind to spend this hard-earned money on something so unnecessary. The most I treated myself to was a trip to the movies every now and then. It was only upon reflection that I realized how irrational my behavior had been. Money is money, after all. But we don’t see it that way. Depending on how we get it, we treat it differently. Money is not naked; it is wrapped in an emotional shroud.

  Two questions. You’ve worked hard for a year. At the end of the twelve months, you have $20,000 more in your account than you had at the beginning. What do you do? (A) Leave it sitting in the bank. (B) Invest it. (C) Use it to make necessary improvements, such as renovating your moldy kitchen or replacing old tires. (D) Treat yourself to a luxury cruise. If you think like most people, you’ll opt for A, B, or C.

  Second question: You win $20,000 in the lottery. What do you do with it? Choose from A, B, C, or D above. Most people now take C or D. And of course, by doing so, they exhibit flawed thinking. You can count it any way you like; $20,000 is still $20,000.

  We witness similar delusions in casinos. A friend places $1,000 on the roulette table—and loses everything. When asked about this, he says: “I didn’t really gamble away a thousand dollars. I won all that earlier.” “But it’s the same amount!” “Not for me,” he says, laughing.

  We treat money that we win, discover, or inherit much more frivolously than hard-earned cash. The economist Richard Thaler calls this the house-money effect. It leads us to take bigger risks and, for this reason, many lottery winners end up worse off after they’ve cashed in their winnings. That old platitude—win some, lose some—is a feeble attempt to downplay real losses.

  Thaler divided his students into two groups. The first group learned they had won $30 and could choose to take part in the following coin toss: If it was tails, they would win $9. If heads, they would lose $9. Seventy percent of students opted to risk it. The second group learned they had won nothing but that they could choose between receiving $30 or taking part in a coin toss in which heads won them $21 and tails secured $39. The second group behaved more conservatively. Only 43 percent were prepared to gamble—even though the expected value for both options was the same: $30.
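  A quick check of the arithmetic, using only the figures given above, confirms that every option carries the same expected value:

  \[
  \underbrace{\$30 + \tfrac{1}{2}(+\$9) + \tfrac{1}{2}(-\$9)}_{\text{group 1, gamble}} = \$30,
  \qquad
  \underbrace{\tfrac{1}{2}(\$21) + \tfrac{1}{2}(\$39)}_{\text{group 2, gamble}} = \$30.
  \]

  Declining the gamble leaves each group with a sure $30 as well; only the framing (house money versus a plain choice) differs.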

 
