The Art of Thinking Clearly
38
Why Attractive People Climb the Career Ladder More Quickly
Halo Effect
Cisco, the Silicon Valley firm, was once a darling of the new economy. Business journalists gushed about its success in every discipline: its wonderful customer service, perfect strategy, skillful acquisitions, unique corporate culture, and charismatic CEO. In March 2000, it was the most valuable company in the world.
When Cisco’s stock plummeted 80 percent the following year, the journalists changed their tune. Suddenly the company’s competitive advantages were reframed as destructive shortcomings: poor customer service, a woolly strategy, clumsy acquisitions, a lame corporate culture, and an insipid CEO. All this—and yet neither the strategy nor the CEO had changed. What had changed, in the wake of the dot-com crash, was demand for Cisco’s product—and that was through no fault of the firm.
The halo effect occurs when a single aspect dazzles us and affects how we see the full picture. In the case of Cisco, its halo shone particularly bright. Journalists were astounded by its stock prices and assumed the entire business was just as brilliant—without closer investigation.
The halo effect always works the same way: We take a simple-to-obtain or remarkable fact or detail, such as a company’s financial situation, and extrapolate conclusions from there that are harder to nail down, such as the merit of its management or the feasibility of its strategy. We often ascribe success and superiority where little is due, such as when we favor products from a manufacturer simply because of its good reputation. Another example of the halo effect: We believe that CEOs who are successful in one industry will thrive in any sector—and furthermore that they are heroes in their private lives, too.
The psychologist Edward Lee Thorndike discovered the halo effect nearly one hundred years ago. His conclusion was that a single quality (e.g., beauty, social status, age) produces a positive or negative impression that outshines everything else, and the overall effect is disproportionate. Beauty is the best-studied example. Dozens of studies have shown that we automatically regard good-looking people as more pleasant, honest, and intelligent. Attractive people also have it easier in their professional lives—and that has nothing to do with the myth of (women) “sleeping their way to the top.” The effect can even be detected in schools, where teachers unconsciously give good-looking students better grades.
Advertising has found an ally in the halo effect: Just look at the number of celebrities smiling at us from TV ads, billboards, and magazines. What makes a professional tennis player like Roger Federer a coffee machine expert is still open for debate, but this hasn’t detracted from the success of the campaign. We are so used to seeing celebrities promoting arbitrary products that we never stop to consider why their support should be of any importance to us. But this is exactly the sneaky part of the halo effect: It works on a subconscious level. All that needs to register is the attractive face, dream lifestyle—and that product.
Sticking with negative effects, the halo effect can lead to great injustice and even stereotyping when nationality, gender, or race becomes the all-encompassing feature. One need be neither racist nor sexist to fall victim to this. The halo effect clouds our view, just as it does journalists, educators, and consumers.
Occasionally, this effect has pleasant consequences—at least in the short term. Have you ever been head over heels in love? If so, you know how flawless a person can appear. Your Mr. or Ms. Perfect seems to be the whole package: attractive, intelligent, likable, and warm. Even when your friends might point out obvious failings, you see nothing but endearing quirks.
The halo effect obstructs our view of true characteristics. To counteract this, go beyond face value. Factor out the most striking features. World-class orchestras achieve this by making candidates play behind a screen, so that sex, race, age, and appearance play no part in their decision. To business journalists I warmly recommend judging a company by something other than its easily obtainable quarterly figures (the stock market already delivers that). Dig deeper. Invest the time to do serious research. What emerges is not always pretty, but almost always educational.
39
Congratulations! You’ve Won Russian Roulette
Alternative Paths
You arrange to meet with a Russian oligarch in a forest just outside your city. He arrives shortly after you, carrying a suitcase and a gun. Placing the suitcase on the hood of his car, he opens it so you can see it is filled to the brim with stacks of money—$10 million in total. “Want to play Russian roulette?” he asks. “Pull the trigger once, and all this is yours.” The revolver contains a single bullet; the other five chambers are empty. You consider your options. Ten million dollars would change your life. You would never have to work again. You could finally move from collecting stamps to collecting sports cars!
You accept the challenge. You put the revolver to your temple and squeeze the trigger. You hear a faint click and feel adrenaline flood your body. Nothing happens. The chamber was empty! You have survived. You take the money, move to the most beautiful city you know, and upset the locals by building a luxurious villa there.
One of these neighbors, whose home now stands in the shadow of yours, is a prominent lawyer. He works twelve hours a day, three hundred days a year. His rates are impressive, but not unusual: $500 per hour. Each year he can put aside half a million dollars net after taxes and living expenses. From time to time, you wave to him from your driveway, laughing on the inside: He will have to work for twenty years to catch up with you.
Suppose that, after twenty years, your hardworking neighbor has saved up $10 million. A journalist comes along one day and puts together a piece on the more affluent residents in the area—complete with photos of the magnificent buildings and the beautiful second wives that you and your neighbor have accrued. He comments on the interior design and the exquisite landscaping. However, the crucial difference between the two of you remains hidden from view: the risk that lurks behind each of the $10 million. For this he would need to recognize the alternative paths.
But journalists are not the only ones who fall short at this skill. We all do, as Nassim Taleb makes clear with the Russian roulette vignette.
Alternative paths are all the outcomes that could have happened but did not. With the game of Russian roulette, four alternative paths would have led to the same result (winning the $10 million) and the fifth alternative to your death. A huge difference. In the case of the lawyer, the possible paths lie much more closely together. In a village, he would have earned perhaps just $200 per hour. In the heart of New York working for one of the major investment banks, maybe it would have been $600 per hour. But, unlike you, he risked no alternative path that would have cost him his fortune—or his life.
Alternative paths are invisible, so we contemplate them very rarely. Those who speculate on junk bonds, options, and credit default swaps, thus making millions, should never forget that they flirt with many alternative paths that lead straight to ruin. To a rational mind, $10 million that comes about through a huge risk is worth less than the same sum earned by years of drudgery. (An accountant might disagree, though.)
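The arithmetic behind the vignette can be made explicit. The following is a minimal simulation sketch, using only the figures from the story (a one-in-six bullet, a $10 million prize, and the lawyer’s $500,000 saved per year over twenty years):

```python
import random

random.seed(42)  # fixed seed so the illustration is reproducible

TRIALS = 100_000
PRIZE = 10_000_000

# Russian roulette: one bullet in six chambers, so five of six
# alternative paths win the prize and one ends in death.
survived = sum(1 for _ in range(TRIALS) if random.randrange(6) != 0)

# The lawyer's path: $500,000 saved per year for twenty years,
# with no alternative path that ends in ruin.
lawyer_total = 500_000 * 20

print(f"Gambler survives on about {survived / TRIALS:.1%} of simulated paths")
print(f"Lawyer ends with ${lawyer_total:,} on every path")
```

The two fortunes are numerically identical, but roughly one path in six for the gambler leads to the grave, while every one of the lawyer’s paths ends at the same $10 million.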
In Fooled by Randomness, Taleb recounts how he had dinner with a friend in a bar in New York. “We flipped a coin to see who was going to pay for the meal. I lost and paid. He was about to thank me when he abruptly stopped and said that he paid for half of it probabilistically.” The friend was considering alternative paths.
In conclusion: Risk is not directly visible. Therefore, always consider what the alternative paths are. Success that comes about through risky dealings is, to a rational mind, worth less than success achieved the “boring” way (for example, with laborious work as a lawyer, a dentist, a ski instructor, a pilot, a hairdresser, or a consultant). Yes, looking at alternative paths from the outside is a difficult task; looking at them from the inside is an almost impossible one. Your brain will do everything to convince you that your success is warranted—no matter how risky your dealings are—and will obscure any thought of paths other than the one you are on.
40
False Prophets
Forecast Illusion
“Facebook to be number one entertainment platform in three years.”
“Regime shift in North Korea in two years.”
“Sour grapes for France as Argentinian wines expected to dominate.”
“Euro collapse likely.”
“Low-cost space flights by 2025.”
“No more crude oil in fifteen years.”
Every day, experts bombard us with predictions, but how reliable are they? Until a few years ago, no one bothered to check. Then along came Philip Tetlock. Over a period of ten years, he evaluated 28,361 predictions from 284 self-appointed professionals. The result: In terms of accuracy, the experts fared only marginally better than a random forecast generator. Ironically, the media darlings were among the poorest performers; and of those, the worst were the prophets of doom and disintegration. Examples of their far-fetched forecasts included the collapse of Canada, Nigeria, China, India, Indonesia, South Africa, Belgium, and the EU. None of these countries has imploded.
“There are two kinds of forecasters: those who don’t know, and those who don’t know they don’t know,” wrote Harvard economist John Kenneth Galbraith. With this he made himself a figure of hatred in his own guild. Fund manager Peter Lynch summed it up even more cuttingly: “There are 60,000 economists in the U.S., many of them employed full-time trying to forecast recessions and interest rates, and if they could do it successfully twice in a row, they’d all be millionaires by now. . . . As far as I know, most of them are still gainfully employed, which ought to tell us something.” That was ten years ago. Today, the United States could employ three times as many economists—with little or no effect on the quality of their forecasts.
The problem is that experts enjoy free rein with few negative consequences. If they strike it lucky, they enjoy publicity, consultancy offers, and publication deals. If they are completely off the mark, they face no penalties—neither in terms of financial compensation nor in loss of reputation. This win-win scenario virtually incentivizes them to churn out as many prophecies as they can muster. Indeed, the more forecasts they generate, the more will be coincidentally correct. Ideally, they should have to pay into some sort of “forecast fund”—say, $1,000 per prediction. If the forecast is correct, the expert gets his money back with interest. If he is wrong, the money goes to charity.
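The incentive to churn out predictions can be sketched with a toy simulation. The 10 percent hit rate below is an assumption for illustration, not a figure from the text; the point is only that lucky hits scale with sheer volume:

```python
import random

random.seed(7)  # fixed seed so the illustration is reproducible

HIT_RATE = 0.10  # assumed chance that any single bold prediction comes true

# Count coincidentally correct predictions for pundits of varying output.
hits = {n: sum(1 for _ in range(n) if random.random() < HIT_RATE)
        for n in (10, 100, 1000)}

for n, h in hits.items():
    print(f"{n} forecasts -> {h} lucky hits")
```

The prolific pundit always ends up with more “successes” to point to, purely by chance, even though the per-prediction accuracy never improves.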
So what is predictable and what is not? Some things are fairly simple. For example, I have a rough idea of how many pounds I will weigh in a year’s time. However, the more complex a system, and the longer the time frame, the more blurred the view of the future will be. Global warming, oil prices, or exchange rates are almost impossible to foresee. Inventions are not at all predictable because if we knew what technology we would invent in the future, we would already have invented it.
So, be critical when you encounter predictions. Whenever I hear one, I make sure to smile, no matter how bleak it is. Then I ask myself two questions. First, what incentive does the expert have? If he is an employee, could he lose his job if he is always wrong? Or is he a self-appointed guru who earns a living through books and lectures? The latter type of forecaster relies on the media’s attention, so, predictably, his prophecies tend to be sensational. Second, how good is his success rate? How many predictions has he made over the past five years? Out of these, how many have been right and how many have not? This information is vital, yet often goes unreported. I implore the media: Please don’t publish any more forecasts without giving the pundit’s track record.
Finally, since it is so fitting, a quote from former British prime minister Tony Blair: “I don’t make predictions. I never have, and I never will.”
41
The Deception of Specific Cases
Conjunction Fallacy
Chris is thirty-five. He studied social philosophy and has had an interest in developing countries since he was a teenager. After graduation, he worked for two years with the Red Cross in West Africa and then for three years in its Geneva headquarters, where he rose to head of the African aid department. He then completed an MBA, writing his thesis on corporate social responsibility. What is more likely? (a) Chris works for a major bank or (b) Chris works for a major bank, where he runs its Third World foundation. A or B?
Most people will opt for B. Unfortunately, it’s the wrong answer. Option B says not only that Chris works for a major bank but also that an additional condition has been met. Employees who work specifically within a bank’s Third World foundation comprise a tiny subset of bankers. Therefore, option A is much more likely. The conjunction fallacy is at play when such a subset seems larger than the entire set—which by definition cannot be the case. Amos Tversky and Nobel laureate Daniel Kahneman have studied this extensively.
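The rule underlying the right answer is elementary probability: a conjunction can never be more likely than either of its parts, because P(A and B) = P(A) × P(B given A), and multiplying by a probability can only shrink the number. A sketch with made-up illustrative figures for the Chris vignette:

```python
# Made-up illustrative probabilities (not data from the text).
p_banker = 0.05                    # P(Chris works for a major bank)
p_foundation_given_banker = 0.01   # P(runs the Third World foundation | banker)

# The conjunction "banker AND runs the foundation" multiplies the two.
p_both = p_banker * p_foundation_given_banker

print(p_banker, p_both)
assert p_both <= p_banker  # the subset can never outnumber the whole set
```

Whatever numbers you plug in, the combined scenario comes out no more likely than the plain one; only its story is more vivid.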
We are easy prey for the conjunction fallacy because we have an innate attraction to “harmonious” or “plausible” stories. The more convincingly, impressively, or vividly that Chris the aid worker is portrayed, the greater the risk of false reasoning. If I had put it a different way, you would have recognized the extra details as overly specific, for example: “Chris is thirty-five. What is more likely? (a) Chris works for a bank or (b) Chris works for a bank in New York, where his office is on the twenty-fourth floor, overlooking Central Park.”
Here’s another example: What is more likely? (a) “Seattle airport is closed. Flights are canceled,” or (b) “Seattle airport is closed due to bad weather. Flights are canceled.” A or B? This time, you have it: A is more likely since B implies that an additional condition has been met, namely, bad weather. It could be that a bomb threat, accident, or strike closed the airport; however, when faced with a “plausible” story, we don’t stop to consider such things. Now that you are aware of this, try it out with friends. You will see that most pick B.
Even experts are not immune to the conjunction fallacy. In 1982, at an international conference for future research, experts—all of them academics—were divided into two groups. To group A, Daniel Kahneman presented the following forecast for 1983: “Oil consumption will decrease by 30 percent.” Group B heard: “A dramatic rise in oil prices will lead to a 30 percent reduction in oil consumption.” Both groups had to indicate how likely they considered the scenarios. The result was clear: Group B felt much more strongly about its forecast than group A did.
Kahneman believes that two types of thinking exist: The first kind is intuitive, automatic, and direct. The second is conscious, rational, slow, laborious, and logical. Unfortunately, intuitive thinking draws conclusions long before the conscious mind does. For example, I experienced this after the 9/11 attacks on the World Trade Center. I wanted to take out travel insurance and came across a firm that offered special “terrorism cover.” Although other policies protected against all possible incidents (including terrorism), I automatically fell for the offer. The high point of the whole farce was that I was willing to pay even more for this enticing yet redundant add-on.
In conclusion: Forget about left brains and right brains: The difference between intuitive and conscious thinking is much more significant. With important decisions, remember that, at the intuitive level, we have a soft spot for plausible stories. Therefore, be on the lookout for convenient details and happy endings. Remember: If an additional condition has to be met, no matter how plausible it sounds, it will become less, not more, likely.
42
It’s Not What You Say, but How You Say It
Framing
Consider these two statements:
“Hey, the trash can is full!”
“It would be really great if you could empty the trash, honey.”
C’est le ton qui fait la musique: it’s not what you say but how you say it. If a message is communicated in different ways, it will also be received in different ways. In psychologists’ jargon, this technique is called framing.
We react differently to identical situations, depending on how they are presented. Kahneman and Tversky conducted a survey in the 1980s in which they put forward two options for an epidemic-control strategy. The lives of six hundred people were at stake, they told participants. “Option A saves two hundred lives. Option B offers a 33 percent chance that all six hundred people will survive, and a 66 percent chance that no one will survive.” Although options A and B were equivalent in expectation (two hundred survivors each), the majority of respondents chose A—remembering the adage: A bird in the hand is worth two in the bush. It became really interesting when the same options were reframed: “Option A kills four hundred people. Option B offers a 33 percent chance that no one will die, and a 66 percent chance that all six hundred will die.” This time, only a fraction of respondents chose A and the majority picked B. The researchers observed a complete U-turn from almost all involved. Depending on the phrasing—survive or die—the respondents made completely different decisions.
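That the two framings describe exactly the same lottery is quickly confirmed with an expected-value calculation (using the exact one-third and two-thirds probabilities behind the rounded 33 and 66 percent):

```python
TOTAL = 600  # lives at stake

# Gain frame: A saves 200 for certain; B saves everyone with probability 1/3.
expected_saved_a = 200
expected_saved_b = (1 / 3) * TOTAL

# Loss frame: A kills 400 for certain; B kills everyone with probability 2/3.
expected_dead_a = 400
expected_dead_b = (2 / 3) * TOTAL

print(round(expected_saved_a), round(expected_saved_b))  # 200 expected survivors either way
print(round(expected_dead_a), round(expected_dead_b))    # 400 expected deaths either way
```

The numbers are identical in both frames; only the wording shifts, and with it the majority’s choice.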
Another example: Researchers presented a group of people with two kinds of meat, “99 percent fat free” and “1 percent fat,” and asked them to choose which was healthier. Can you guess which they picked? Bingo: Respondents ranked the first type of meat as healthier, even though both were identical. Next came the choice between “98 percent fat free” and “1 percent fat.” Again, most respondents chose the first option—despite its higher fat content.
“Glossing” is a popular type of framing. Under its rules, a tumbling share price becomes a “correction.” An overpaid acquisition price is branded “goodwill.” In every management course, a problem magically transforms into an “opportunity” or a “challenge.” A person who is fired is “reassessing his career.” A fallen soldier—regardless of how much bad luck or stupidity led to his death—turns into a “war hero.” Genocide translates to “ethnic cleansing.” A successful emergency landing, for example on the Hudson River, is celebrated as a “triumph of aviation.” (Shouldn’t a textbook landing on a runway count as an even bigger triumph of aviation?)