The Art of Thinking Clearly
So does it mean Jack is a fool if he turns up, hammer in hand, on Saturday morning? Not necessarily. There is one group exempt from volunteer’s folly: celebrities. If Bono, Kate Winslet, and Mark Zuckerberg pose for photos while making birdhouses, cleaning oil-stained beaches, or digging for earthquake victims, they lend something priceless to the situation: publicity. Therefore, Jack must critically assess whether he is famous enough to make his participation worthwhile. The same applies to you and me: If people don’t double-take when they pass you on the street, the best way to contribute is with greenbacks rather than greenhorn labor.
66
Why You Are a Slave to Your Emotions
Affect Heuristic
What do you think of genetically modified wheat? It’s a complex issue. You don’t want to answer too hastily. A rational approach would be to consider the controversial technology’s pros and cons separately. Write down the possible benefits, weight them in terms of importance, and then multiply them by the probability that they will occur. Doing so, you get a list of expected values. Next, do the same with the cons. List all the disadvantages, estimate their potential damage, and multiply them by the likelihood of them happening. The positive sum minus the negative sum equals the net expected value. If it is above zero, you are in favor of genetically modified wheat. If the sum is below zero, you are against it. More than likely you have already heard of this approach. It is called “expected value,” and it features in most literature on decision theory. But just as probable is that you’ve never bothered to carry out such an evaluation. And without a doubt, none of the professors who wrote the textbooks turned to this method to select their spouses.
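To make the arithmetic concrete, here is a minimal sketch of that textbook calculation in Python. Every item, value, and probability below is invented purely to illustrate the mechanics; none of it is a claim about genetically modified wheat.

```python
# Illustrative expected-value calculation: hypothetical items and made-up numbers.

benefits = [
    # (description, value if it happens, probability it happens)
    ("higher crop yields", 100, 0.7),
    ("lower pesticide use", 40, 0.5),
]
drawbacks = [
    ("ecological side effects", 120, 0.2),
    ("market rejection", 60, 0.3),
]

expected_gain = sum(value * prob for _, value, prob in benefits)   # 100*0.7 + 40*0.5 = 90
expected_loss = sum(value * prob for _, value, prob in drawbacks)  # 120*0.2 + 60*0.3 = 42
net_expected_value = expected_gain - expected_loss                 # 90 - 42 = 48

# A positive net value means "in favor"; a negative one means "against."
print(expected_gain, expected_loss, net_expected_value)
```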
Truth be told, no one uses this method to make decisions. First of all, we lack enough imagination to list all the possible pros and cons. We are limited by what springs to mind; we can only conjure up what we have seen in our modest experience. It is hard to imagine a storm of the century if you’re only thirty years old. Second, calculating small probabilities is impossible because we do not have enough data on rare events. The smaller the probability, the fewer data points we have and the higher the error rate on the exact probability—a vicious effect. Third, our brain is not built for such calculations. They require time and effort—not our preferred state. In our evolutionary past, whoever thought too long and hard vanished inside a predator’s jaws. We are the descendants of quick decision makers, and we rely on mental shortcuts called heuristics.
One of the most popular is the affect heuristic. An affect is a momentary judgment: something you like or dislike. The word “gunfire” triggers a negative affect. The word “luxury” produces a positive one. This automatic, one-dimensional impulse prevents you from considering risks and benefits to be independent variables, which indeed they are. Instead, the affect heuristic puts risks and benefits on the same sensory thread.
Your emotional reactions to issues such as nuclear power, organic vegetables, private schools, or motorbikes determine how you assess their risks and benefits. If you like something, you believe that the risks are smaller and the benefits greater than they actually are. If you don’t like something, the opposite is true. Risks and benefits appear to be dependent. Of course, in reality, they are not.
Even more impressive: Suppose you own a Harley-Davidson. If you come across a study that states that driving one is riskier than previously thought, you will subconsciously tweak how you rate the benefits, deeming the experience “an even greater sense of freedom.”
But how does an affect—the initial, spontaneous emotion—come to be? Researchers at the University of Michigan flashed one of three images for less than one hundredth of a second in front of participants: a smiling face, an angry face, or a neutral figure. The subjects then had to indicate whether they liked a randomly selected Chinese character or not (the participants didn’t speak Chinese). Most preferred symbols that immediately followed the smiling face. Seemingly insignificant factors influence our emotions. Here is another example of such a factor at work: Researchers David Hirshleifer and Tyler Shumway tested the relationship between the amount of morning sun and daily market performance in twenty-six major stock exchanges between 1982 and 1997. They found a correlation that reads much like a farmer’s adage: If the sun is shining in the morning, the stock market will rise during the day. Not always, but often. Who would have thought that sunshine can move billions? The morning sun obviously has the same effect as a smiley face.
Whether we like it or not, we are puppets of our emotions. We make complex decisions by consulting our feelings, not our thoughts. Against our best intentions, we replace the question “What do I think about this?” with “How do I feel about this?” So, smile! Your future depends on it.
67
Be Your Own Heretic
Introspection Illusion
Bruce is in the vitamin business. His father founded the company when supplements were not yet a lifestyle product; a doctor had to prescribe them. When Bruce took over the operation in the early ’90s, demand skyrocketed. Bruce seized the opportunity with both hands and took out huge loans to expand production. Today, he is one of the most successful people in the business and president of a national association of vitamin manufacturers. Since childhood, hardly a day has passed without him swallowing at least three multivitamins. A journalist once asked him if they do anything. He replied: “I’m sure of it.” Do you believe him?
I have another question for you: Take any idea you are 100 percent sure of: Perhaps that gold will rise over the next five years. Perhaps that God exists. Perhaps that your dentist is overcharging you. Whatever the belief, write it down in one sentence. Do you believe yourself?
I bet you consider your conviction more valid than Bruce’s, right? Here’s why: Yours is an internal observation, whereas Bruce’s is external. Crudely put, you can peek into your own soul, but not into his.
In Bruce’s case, you might think: “Come on, it’s obviously in his interest to believe that vitamins are beneficial. After all, his wealth and social status depend on the success of the company. He has to maintain a family tradition. All his life he has gulped down pills, so he’ll never admit that it was a waste of time.” For you, however, it’s a different story: You have searched deep inside. You are completely impartial.
But how pure and honest is internal reflection? The Swedish psychologist Petter Johansson allowed test subjects to glimpse two portrait photos of random people and choose which face was more attractive. Then he showed them the preferred photo up close and asked them to describe the most attractive features. However, with a sleight of hand, he switched the pictures. Most participants failed to notice and proceeded to justify, in detail, why they favored the image. The results of the study: Introspection is not reliable. When we soul-search, we contrive the findings.
The belief that reflection leads to truth or accuracy is called the introspection illusion. This is more than sophistry. Because we are so confident of our beliefs, we experience three reactions when someone fails to share our views. Reaction 1: Assumption of ignorance. The other party clearly lacks the necessary information. If he knew what you know, he would be of the same opinion. Political activists think this way: They believe they can win others over through enlightenment. Reaction 2: Assumption of idiocy. The other person has the necessary information, but his mind is underdeveloped. He cannot draw the obvious conclusions. In other words, he’s a moron. This reaction is particularly popular with bureaucrats who want to protect “stupid” consumers from themselves. Reaction 3: Assumption of malice. Your counterpart has the necessary information—he even understands the debate—but he is deliberately confrontational. He has evil intentions. This is how many religious leaders and followers treat disbelievers: If they don’t agree, they must be servants of the devil!
In conclusion: Nothing is more convincing than your own beliefs. We believe that introspection unearths genuine self-knowledge. Unfortunately, introspection is, in large part, fabrication posing two dangers: First, the introspection illusion creates inaccurate predictions of future mental states. Trust your internal observations too much and too long, and you might be in for a very rude awakening. Second, we believe that our introspections are more reliable than those of others, which creates an illusion of superiority. Remedy: Be all the more critical with yourself. Regard your internal observations with the same skepticism as claims from some random person. Become your own toughest critic.
68
Why You Should Set Fire to Your Ships
Inability to Close Doors
Next to my bed, two dozen books are stacked high. I have dipped in and out of all of them but am unable to part with even one. I know that sporadic reading won’t help me achieve any real insights, despite the many hours I put in, and that I should really devote myself to one book at a time. So why am I still juggling all twenty-four?
I know a man who is dating three women. He is in love with all three and can imagine starting a family with any of them. However, he simply doesn’t have the heart to choose just one because then he would be passing up on the other two for good. If he refrains from deciding, all options remain open. The downside is that no real relationship will develop.
In the third century BC, General Xiang Yu sent his army across the Yangtze River to take on the Qin dynasty. While his troops slept, he ordered all the ships to be set alight. The next day he told them: “You now have a choice: Either you fight to win or you die.” By removing the option of retreat, he switched their focus to the only thing that mattered: the battle. Spanish conquistador Cortés used the same motivational trick in the sixteenth century. After landing on the east coast of Mexico, he sank his own ships.
Xiang Yu and Cortés are exceptions. We mere mortals do everything we can to keep open the maximum number of options. Psychology professors Dan Ariely and Jiwoong Shin demonstrated the strength of this instinct using a computer game. Players started with one hundred points, and on the screen in front of them, three doors appeared—a red one, a blue one, and a green one. Opening a door cost a point, but for every room they entered, they could accrue more points. The players reacted logically: They found the most fruitful room and holed up there for the whole session. Ariely and Shin then changed the rules. If doors were not opened within twelve moves, they started shrinking on the screen and eventually vanished. Players now rushed from door to door to secure access to all potential treasure troves. All this unproductive scrambling meant they scored 15 percent fewer points than in the previous game. The organizers then added another twist: Opening doors now cost three points. The same anxiety kicked in: Players frittered away their points trying to keep all doors open. Even when the subjects learned how many points were hidden in each room, nothing changed. Sacrificing options was a price they were not willing to pay.
Why do we act so irrationally? Because the downside to such behavior is not always apparent. In the financial markets, things are clear: A financial option on a security always costs something. There is no such thing as a free option. In most other realms, however, options seem to be free. But this is an illusion. They also come at a price, but the price tag is often hidden and intangible: Each decision costs mental energy and eats up precious time for thinking and living. CEOs who examine every possible expansion option often choose none in the end. Companies that aim to address all customer segments end up addressing no one. Salespeople who chase every single lead close no deals.
We are obsessed with having as many irons as possible in the fire, ruling nothing out, and being open to everything. However, this can easily destroy success. We must learn to close doors. A business strategy is primarily a statement on what not to engage in. Adopt a life strategy similar to a corporate strategy: Write down what not to pursue in your life. In other words, make calculated decisions to disregard certain possibilities, and when an option shows up, test it against your not-to-pursue list. It will not only keep you from trouble but also save you lots of thinking time. Think hard once and then just consult your list instead of having to make up your mind whenever a new door cracks open. Most doors are not worth entering, even when the handle seems to turn so effortlessly.
69
Disregard the Brand New
Neomania
How will the world look in fifty years? What will your everyday life be like? With which items will you surround yourself?
People who pondered this question fifty years ago had fanciful notions of how “the future” would look: Highways in the skies. Cities that resemble glass worlds. Bullet trains winding between gleaming skyscrapers. We would live in plastic capsules, work in underwater cities, vacation on the moon, and consume everything in pill form. We wouldn’t conceive offspring anymore; instead we would choose children from a catalog. Our best friends would be robots, death would be cured, and we would have exchanged our bikes for jet packs long ago.
But hang on a second. Take a look around. You’re sitting in a chair, an invention from ancient Egypt. You wear pants, developed about five thousand years ago and adapted by Germanic tribes around 750 BC. The idea behind your leather shoes comes from the last ice age. Your bookshelves are made of wood, one of the oldest building materials in the world. At dinnertime, you use a fork, a well-known “killer app” from Roman times, to shovel chunks of dead animals and plants into your mouth. Nothing has changed.
How will the world look in fifty years? In his latest book, Antifragile, Nassim Taleb gives us a clue: Assume that most of the technology that has existed for the past fifty years will serve us for another half century. And assume that recent technology will be passé in a few years’ time. Why? Think of these inventions as if they were species: Whatever has held its own throughout centuries of innovation will probably continue to do so in the future, too. Old technology has proven itself; it possesses an inherent logic even if we do not always understand it. If something has endured for epochs, it must be worth its salt. You can take this to heart the next time you are in a strategy meeting. Fifty years into the future will look a lot like today. Of course, you will witness the birth of many flashy gadgets and magic contraptions. But most will be short-lived.
When contemplating the future, we place far too much emphasis on flavor-of-the-month inventions and the latest “killer apps” while underestimating the role of traditional technology. In the 1960s, space travel was all the rage, so we imagined ourselves on school trips to Mars. In the ’70s, plastic was in, so we mulled over how we would furnish our see-through houses. Taleb, who uses the above-mentioned examples of new and old technologies, coined a word for this: neomania, the mania for all things shiny and new.
In the past, I sympathized with so-called early adopters, the breed of people who cannot survive without the latest iPhone. I thought they were ahead of their time. Now I regard them as irrational and suffering from a kind of sickness: neomania. To them, it is of minor importance if an invention provides tangible benefits; novelty matters more.
So don’t go out on a limb when forecasting the future. Stanley Kubrick’s cult movie 2001: A Space Odyssey illustrates why you shouldn’t. Made in 1968, the movie predicted that, at the turn of the millennium, the United States would have a thousand-strong colony on the moon and that Pan Am would operate the commuter flights there and back. With this fanciful forecast in mind, I suggest this rule of thumb: Whatever has survived for X years will last another X years. Taleb wagers that the “bullshit filter of history” will sort the gimmicks from the game changers. And that’s one bet I’m willing to back.
70
Why Propaganda Works
Sleeper Effect
During World War II, every nation produced propaganda movies. These were devised to fill the population, especially soldiers, with enthusiasm for their country and, if necessary, to embolden them to lay down their lives. The United States spent so much money on propaganda that the War Department decided to find out whether the expense was really worth it. A number of studies were carried out to investigate how the movies affected regular soldiers. The result was disappointing: They did not intensify the privates’ enthusiasm for war in the slightest.
Was it because they were poorly made? Hardly. Rather, the soldiers were aware that the movies were propaganda, which discredited their message even before the reels started rolling. Even if the movie argued a point reasonably or managed to stir the audience, it didn’t matter; its content was deemed hollow from the outset and dismissed.
Nine weeks later, something unexpected happened. The psychologists measured the soldiers’ attitudes a second time. The result: Whoever had seen the movie expressed much more support for the war than those who had not viewed it. Apparently, propaganda did work after all!
The scientists were baffled, especially since they knew that an argument’s persuasiveness decreased over time. It has a half-life like a radioactive substance. Surely you have experienced this yourself: Let’s say you read an article on the benefits of gene therapy. Immediately after reading it you are a zealous convert, but after a few weeks, you don’t really remember why. More time passes until finally only a tiny fraction of enthusiasm remains.
Amazingly, just the opposite is true for propaganda. If it strikes a chord with someone, this influence will only increase over time. Why? Psychologist Carl Hovland, who led the study for the War Department, named this phenomenon the sleeper effect. To date, the best explanation is that, in our memories, the source of the argument fades faster than the argument. In other words, your brain quickly forgets where the information came from (e.g., from the Department of Propaganda). Meanwhile, the message itself (i.e., war is necessary and noble) fades only slowly or even endures. Therefore, any knowledge that stems from an untrustworthy source gains credibility over time. The discrediting force melts away faster than the message does.