The Art of Thinking Clearly


by Rolf Dobelli


  SURVIVORSHIP BIAS

  Survivorship bias in funds and stock market indices, see: Edwin J. Elton, Martin J. Gruber, and Christopher R. Blake, “Survivorship Bias and Mutual Fund Performance,” The Review of Financial Studies 9, no. 4 (1996): 1097–1120.

  Statistically relevant results by coincidence (self-selection), see: John P. A. Ioannidis, “Why Most Published Research Findings Are False,” PLoS Med 2, no. 8 (2005): e124.

  SWIMMER’S BODY ILLUSION

  Nassim Nicholas Taleb, The Black Swan: The Impact of the Highly Improbable (New York: Random House, 2007), 109–10.

  “Ideally, the comparison should be made between people who went to Harvard and people who were admitted to Harvard but chose instead to go to Podunk State. Unfortunately, this is likely to produce samples too small for statistical analysis.” Thomas Sowell, Economic Facts and Fallacies (New York: Basic Books, 2008), 106.

  David Lykken and Auke Tellegen, “Happiness Is a Stochastic Phenomenon,” Psychological Science 7, no. 3 (May 1996): 189.

  In his book Good to Great, Jim Collins cites the CEO of Pitney Bowes, Dave Nassef: “I used to be in the Marines, and the Marines get a lot of credit for building people’s values. But that’s not the way it really works. The Marine Corps recruits people who share the corps’ values, then provides them with training required to accomplish the organization’s mission.”

  CLUSTERING ILLUSION

  The random sequence OXXXOXXXOXXOOOXOOXXOO: Thomas Gilovich, How We Know What Isn’t So: The Fallibility of Human Reason in Everyday Life (New York: Free Press, 1993), 16.

  Daniel Kahneman and Amos Tversky, “Subjective Probability: A Judgment of Representativeness,” in Daniel Kahneman, Paul Slovic, and Amos Tversky, Judgment under Uncertainty: Heuristics and Biases (New York: Cambridge University Press, 1982), 32–47.

  This paper caused controversy because it destroyed many athletes and sports commentators’ belief in the “hot hand”—in lucky streaks: Thomas Gilovich, Robert Vallone, and Amos Tversky, “The Hot Hand in Basketball: On the Misperception of Random Sequences,” Cognitive Psychology 17 (1985): 295–314.

  The Virgin Mary on toast, as reported by the BBC: accessed November 1, 2012, http://news.bbc.co.uk/2/hi/4034787.stm.

  The clustering illusion has been recognized for centuries. In the eighteenth century, David Hume commented in The Natural History of Religion: “We see faces on the moon and armies in the clouds.”

  “The ‘Nun Bun’ was a cinnamon pastry whose twisty rolls eerily resembled the nose and jowls of Mother Teresa. It was found in a Nashville coffee shop in 1996, but was stolen on Christmas in 2005. ‘Our Lady of the Underpass’ was another appearance by the Virgin Mary, this time in the guise of a salt stain under Interstate 94 in Chicago that drew huge crowds and stopped traffic for months in 2005. Other cases include Hot Chocolate Jesus, Jesus on a shrimp tail dinner, Jesus in a dental x-ray, and Cheesus (a Cheeto purportedly shaped like Jesus).” Christopher Chabris and Daniel Simons, The Invisible Gorilla: And Other Ways Our Intuitions Deceive Us (New York: Crown, 2010), 155.

  “almost immediately after you see an object that looks anything like a face, your brain treats it like a face and processes it differently than other objects.” Ibid., 156.

  Recognizing faces in objects, such as clocks, the fronts of cars, or the moon, is called “pareidolia.”

  The brain processes different categories of objects in different regions. As soon as an object resembles a face, the brain treats it as a face and processes it quite differently from other objects.

  SOCIAL PROOF

  Robert B. Cialdini, Influence: The Psychology of Persuasion, rev. ed. (New York: William Morrow, 1993), 114–65.

  Solomon E. Asch, “Effects of Group Pressure upon the Modification and Distortion of Judgment,” in H. Guetzkow (ed.), Groups, Leadership and Men (Pittsburgh: Carnegie Press, 1951), 177–90.

  Canned laughter works especially well if it’s in-group laughter. “Participants laughed and smiled more, laughed longer, and rated humorous material more favorably when they heard in-group laughter rather than out-group laughter or no laughter at all.” See: Michael J. Platow et al., “It’s Not Funny If They’re Laughing: Self-Categorization, Social Influence, and Responses to Canned Laughter,” Journal of Experimental Social Psychology 41, no. 5 (2005): 542–50.

  The storm of enthusiasm for Goebbels’s speech did not stem from social proof alone. What you do not see in the YouTube video is a banner above the speaker declaring “Total War = Shortest War,” an argument that made sense to many. After the Stalingrad debacle, people were sick of the war. Thus, the population had to be won back with this argument: The more aggressively it was fought, the quicker it would be over. Thanks to Johannes Grützig (Germany) for this insight. My comment: I don’t think that before the speech the Hitler regime was interested in waging war for longer than was necessary. In this respect, Goebbels’s argument is not convincing.

  Besides the vacation restaurant, there’s another case where social proof is of value: if you have tickets to a football game in a foreign city and don’t know where the stadium is. Here, it makes sense to follow the people who look like football fans.

  German philosopher Friedrich Nietzsche warned half a century before the Goebbels craze: “Madness is a rare thing in individuals—but in groups, parties, peoples, and ages it is the rule.”

  SUNK COST FALLACY

  The classic research on the sunk cost fallacy is: H. R. Arkes and C. Blumer, “The Psychology of Sunk Cost,” Organizational Behavior and Human Decision Processes 35 (1985): 124–40. In this research, Arkes and Blumer asked subjects to imagine that they had purchased tickets for a ski trip to Michigan (at a price of $100) and to Wisconsin (at a price of $50)—for the same day. The tickets are nonrefundable. Which ticket are you going to keep, assuming that you prefer the Wisconsin trip? Most subjects picked the less preferred trip to Michigan because of its higher ticket price.

  On the Concorde, see: P. J. Weatherhead, “Do Savannah Sparrows Commit the Concorde Fallacy?,” Behavioral Ecology and Sociobiology 5 (1979): 373–81.

  It is a striking finding that lower animals and young children do not exhibit the sunk cost fallacy; only in later years do we begin to display this irrational behavior. Read: Hal R. Arkes and Peter Ayton, “The Sunk Cost and Concorde Effects: Are Humans Less Rational than Lower Animals?,” Psychological Bulletin 125 (1999): 591–600.

  RECIPROCITY

  Robert B. Cialdini, Influence: The Psychology of Persuasion, rev. ed. (New York: HarperCollins, 1993), 17–56.

  Robert Trivers published the theory of reciprocal altruism in 1971, which shed light on all kinds of human behavior: Robert L. Trivers, “The Evolution of Reciprocal Altruism,” The Quarterly Review of Biology 46, no. 1 (1971): 35–57. Thus, reciprocity is the basis for biological cooperation—besides kinship. See any basic biology textbook since 1980.

  For evolutionary psychology’s justification of reciprocity, see: David M. Buss, Evolutionary Psychology: The New Science of the Mind (Boston: Allyn and Bacon, 1999). Also: Roy F. Baumeister, The Cultural Animal: Human Nature, Meaning, and Social Life (Oxford, UK: Oxford University Press, 2005).

  CONFIRMATION BIAS (PART 1)

  How Darwin handled the confirmation bias, in: Charles T. Munger, Poor Charlie’s Almanack, expanded 3rd ed. (Virginia Beach, VA: The Donning Company Publishers, 2006), 462.

  “What Keynes was reporting is that the human mind works a lot like the human egg. When one sperm gets into a human egg, there’s an automatic shut-off device that bars any other sperm from getting in. The human mind tends strongly toward the same sort of result. And so, people tend to accumulate large mental holdings of fixed conclusions and attitudes that are not often reexamined or changed, even though there is plenty of good evidence that they are wrong.” In: Munger, Poor Charlie’s Almanack, 461.

  “What the human being is best at doing is interpreting all new information so that their prior conclusions remain intact.” Warren Buffett at the Berkshire Hathaway annual meeting, 2002, quoted in Peter Bevelin, Seeking Wisdom: From Darwin to Munger (Malmö, Sweden: PCA Publications, 2007), 56.

  Nassim Nicholas Taleb, The Black Swan: The Impact of the Highly Improbable (New York: Random House, 2007), 58–59.

  For the experiment with the sequence of numbers, see: Peter C. Wason, “On the Failure to Eliminate Hypotheses in a Conceptual Task,” Quarterly Journal of Experimental Psychology 12, no. 3 (1960): 129–40.

  “Faced with the choice between changing one’s mind and proving there is no need to do so, almost everyone gets busy on the proof.” John Kenneth Galbraith, The Essential Galbraith (New York: Houghton Mifflin, 2001), 241.

  CONFIRMATION BIAS (PART 2)

  Stereotyping as a special case of the confirmation bias, see: Roy F. Baumeister, The Cultural Animal: Human Nature, Meaning, and Social Life (Oxford, UK: Oxford University Press, 2005), 198–200.

  AUTHORITY BIAS

  Robert B. Cialdini, Influence: The Psychology of Persuasion, rev. ed. (New York: HarperCollins, 1993), 208–36.

  For the track record of doctors before 1900 and a beautiful exposition on the authority of doctors and their strange theories, see: Noga Arikha, Passions and Tempers: A History of the Humours (New York: Harper Perennial, 2008).

  “Iatrogenic” conditions and injuries are those caused by medical treatment, for example, bloodletting.

  After the 2008 financial crisis, two unexpected events of global proportions (Black Swans) took place: the Arab uprisings (2011) and the tsunami/nuclear disaster in Japan (2011). Not one of the world’s estimated 100,000 political and security authorities foresaw (or even could have foreseen) these events. This should be reason enough to distrust them—particularly if they are “experts” in all things social (fashion trends, politics, economics). These people are not stupid. They are simply unfortunate enough to have chosen a career in which they cannot win. Two alternatives are open to them: (a) to admit they don’t know (not the best choice if you have a family to feed) or (b) to spout hot air.

  Stanley Milgram, Obedience to Authority: An Experimental View (New York: Harper and Row, 1974). There is also a great DVD entitled Obedience (1969).

  “If a CEO is enthused about a particularly foolish acquisition, both his internal staff and his outside advisors will come up with whatever projections are needed to justify his stance. Only in fairy tales are emperors told that they are naked.” In: Warren Buffett, letter to shareholders of Berkshire Hathaway, 1998.

  CONTRAST EFFECT

  Robert B. Cialdini, Influence: The Psychology of Persuasion, rev. ed. (New York: HarperCollins, 1993), 11–16.

  Charlie Munger calls the contrast effect the “Contrast-Misreaction Tendency.” See: Charles T. Munger, Poor Charlie’s Almanack, expanded 3rd ed. (Virginia Beach, VA: The Donning Company Publishers, 2006), 483.

  Dan Ariely refers to the effect as the “relativity problem.” See: Dan Ariely, Predictably Irrational: The Hidden Forces That Shape Our Decisions, rev. and expanded ed. (New York: Harper, 2009), chapter 1.

  Contrasting factors may lead you to take the long way around. See: Daniel Kahneman and Amos Tversky, “Prospect Theory: An Analysis of Decision under Risk,” Econometrica 47, no. 2 (1979): 263–92.

  AVAILABILITY BIAS

  The example with the letter “k”: Amos Tversky and Daniel Kahneman, “Availability: A Heuristic for Judging Frequency and Probability,” Cognitive Psychology 5 (1973): 207–32.

  The availability bias leads to a wrong risk map in our mind. Tornadoes, airplane crashes, and electrocutions are widely reported in the media, which makes them easily available in our minds. On the other hand, deaths resulting from asthma, vaccinations, and glucose intolerance are underestimated because they are usually not reported. Read: Sarah Lichtenstein et al., “Judged Frequency of Lethal Events,” Journal of Experimental Psychology: Human Learning and Memory 4 (1978): 551–78.

  Another great quote from Charlie Munger on the availability bias: “You see that again and again—that people have some information they can count well and they have other information much harder to count. So they make the decision based only on what they can count well. And they ignore much more important information because its quality in terms of numeracy is less—even though it’s very important in terms of reaching the right cognitive result. All I can tell you is that around Wesco [Charlie Munger’s investment firm, comment RD] and Berkshire, we try not to be like that. We have Lord Keynes’ attitude, which Warren quotes all the time: ‘We’d rather be roughly right than precisely wrong.’ In other words, if something is terribly important, we’ll guess at it rather than just make our judgment based on what happens to be easily countable.” In: Peter Bevelin, Seeking Wisdom: From Darwin to Munger (Malmö, Sweden: PCA Publications, 2007), 176.

  Another way of stating the availability bias by Charlie Munger: “An idea or a fact is not worth more merely because it is easily available to you.” In: Charles T. Munger, Poor Charlie’s Almanack, expanded 3rd ed. (Virginia Beach, VA: The Donning Company Publishers, 2006), 486. Quoted from Wesco Financial annual meeting, 1990, Outstanding Investor Digest, June 28, 1990, 20–21.

  The availability bias is the reason why, when it comes to risk management, firms focus primarily on risks in the financial market: There is plenty of data on this. With operational risk, however, there is almost no data. It’s not public. You would have to painstakingly cobble it together from many companies and that’s expensive. For this reason, we create theories using material that is easy to find.

  “The medical literature shows that physicians are often prisoners of their first-hand experience: their refusal to accept even conclusive studies is legendary.” Robyn M. Dawes, Everyday Irrationality: How Pseudo-Scientists, Lunatics, and the Rest of Us Systematically Fail to Think Rationally (New York: Westview Press, 2001), 102.

  Confidence in the quality of your own decisions depends solely on the number of decisions (predictions) made, regardless of how accurate or inaccurate they were. This is the chief problem with consultants. They make tons of decisions and predictions, but seldom validate them after the fact. They are on to the next projects, the next clients, and if something went wrong, well, it was a faulty implementation of their ideas and strategies. See: Hillel J. Einhorn and Robin M. Hogarth, “Confidence in Judgment: Persistence of the Illusion of Validity,” Psychological Review 85, no. 5 (September 1978): 395–416.

  THE IT’LL-GET-WORSE-BEFORE-IT-GETS-BETTER FALLACY

  No reference literature. This error in thinking is obvious.

  STORY BIAS

  “The king died and then the queen” is a story. “The king died and then the queen died of grief” is a plot. The difference between the two is causality. The English novelist E. M. Forster proposed this distinction in 1927.

  Scientists still debate which version of the king/queen example is easier to recall from memory. The results of one study point in the following direction: If it takes a lot of mental effort to link two propositions, recall is poor. If it takes zero mental effort to link two propositions, recall is poor, too. But if it takes an intermediate level of mental work, then recall is best. In other words, take these two pairs of sentences: “Joey’s big brother punched him again and again. The next day his body was covered by bruises.” “Joey’s crazy mother became furiously angry with him. The next day his body was covered by bruises.” To understand the second pair of sentences, you must make an extra logical inference. By putting in this extra work you form a richer memory for what you’ve read. The study showed that recognition and recall memory was poorest for the most and least related causes and best for causes of intermediate levels of relatedness. Janice E. Keenan et al., “The Effects of Causal Cohesion on Comprehension and Memory,” Journal of Verbal Learning and Verbal Behavior 23, no. 2 (April 1984): 115–26.

  Robyn M. Dawes, Everyday Irrationality: How Pseudo-Scientists, Lunatics, and the Rest of Us Systematically Fail to Think Rationally (New York: Westview Press, 2001), 111–13.

  “Narrative imagining—story—is the fundamental instrument of thought.” Mark Turner, The Literary Mind: The Origins of Thought and Language (New York: Oxford University Press, 1998), 4.

  The vignette of the car driving over the bridge, from Nassim Nicholas Taleb, personal communication.

  HINDSIGHT BIAS

  On Reagan’s election: John F. Stacks, “Where the Polls Went Wrong,” Time magazine, December 1, 1980.

  One of the classic studies is from Baruch Fischhoff. He asked people to judge the outcome of a war they knew little about (British forces against the Nepalese Gurkhas in Bengal in 1814). Those who knew the outcome judged that outcome as much more probable. See: Baruch Fischhoff, “Hindsight ≠ Foresight: The Effect of Outcome Knowledge on Judgment under Uncertainty,” Journal of Experimental Psychology: Human Perception and Performance 104 (1975): 288–99.

  H. Blank, J. Musch, and R. Pohl, “Hindsight Bias: On Being Wise after the Event,” Social Cognition 25, no. 1 (2007): 1–9.

  OVERCONFIDENCE EFFECT

  The original research paper on overconfidence: Sarah Lichtenstein and Baruch Fischhoff, “Do Those Who Know More Also Know More about How Much They Know?,” Organizational Behavior and Human Performance 20 (1977): 159–83.

  Marc Alpert and Howard Raiffa, “A Progress Report on the Training of Probability Assessors,” in Daniel Kahneman, Paul Slovic, and Amos Tversky, Judgment under Uncertainty: Heuristics and Biases (New York: Cambridge University Press, 1982), 294–305.

  Ulrich Hoffrage, “Overconfidence,” in Rüdiger Pohl, Cognitive Illusions: A Handbook on Fallacies and Biases in Thinking, Judgment and Memory (Hove, UK: Psychology Press, 2004), 235–54.

  Dale Griffin and Amos Tversky, “The Weighing of Evidence and the Determinants of Confidence,” in Thomas Gilovich, Dale Griffin, and Daniel Kahneman (eds.), Heuristics and Biases: The Psychology of Intuitive Judgment (Cambridge, UK: Cambridge University Press, 2002), 230–49. See also Nassim Nicholas Taleb, The Black Swan, 153.

 
