The Art of Thinking Clearly
Here is another great study that shows the inability of experts to forecast. Gustav Torngren and Henry Montgomery asked participants to select, from a pair of stocks, the one that would outperform each month. The stocks were well-known blue-chip names, and the players were given the prior twelve months’ performance for each stock. Participants included laypeople (undergraduates in psychology) and professional investors. Both groups performed worse than sheer luck; they would have fared better by tossing a coin. Overall, the laypeople were 59 percent confident in their stock-picking abilities, the experts 65 percent. See: Gustav Torngren and Henry Montgomery, “Worse Than Chance? Performance and Confidence among Professionals and Laypeople in the Stock Market,” Journal of Behavioral Finance 5, no. 3 (2004): 148–53.
CONJUNCTION FALLACY
The Chris story is a modified version of the so-called Bill story and Linda story by Tversky and Kahneman: Amos Tversky and Daniel Kahneman, “Extensional versus Intuitive Reasoning: The Conjunction Fallacy in Probability Judgment,” Psychological Review 90, no. 4 (October 1983): 293–315. Thus, the conjunction fallacy is often referred to as the “Linda problem.”
The example using oil consumption: Ibid., 308. Another interesting example of the conjunction fallacy can be found in the same paper. What is more probable? (a) “a complete suspension of diplomatic relations between the US and the Soviet Union, sometime in 1983,” or (b) “a Russian invasion of Poland, and a complete suspension of diplomatic relations between the US and the Soviet Union, sometime in 1983.” Many more people opted for the more plausible scenario (b), although it is less likely.
On the two types of thinking—intuitive versus rational, or system 1 versus system 2, see: Daniel Kahneman, “A Perspective on Judgment and Choice: Mapping Bounded Rationality,” American Psychologist 58 (September 2003): 697–720. Or you can read Kahneman’s Thinking, Fast and Slow (New York: Farrar, Straus and Giroux, 2011), which is all about system 1 versus system 2.
A much simpler version of the conjunction fallacy is the following question that has been posed to children: “In summer at the beach are there more women or more tanned women?” Most children fell for (the more representative or available) “tanned women.” See: Franca Agnoli, “Development of Judgmental Heuristics and Logical Reasoning: Training Counteracts the Representativeness Heuristic,” Cognitive Development 6, no. 2 (April–June 1991): 195–217.
Tversky and Kahneman asked: Which is more likely, that a seven-letter word randomly selected from a novel ends in ing, or that it has the letter “n” as its sixth letter? This highlights both the availability bias and the conjunction fallacy. All seven-letter words ending in ing have the letter “n” as their sixth letter, but not all words with the letter “n” as their sixth letter end in ing. Again, the driving force behind the conjunction fallacy is the availability bias: words ending in ing come to mind more easily. See: Tversky and Kahneman, “Extensional versus Intuitive Reasoning: The Conjunction Fallacy in Probability Judgment,” 295.
The story with the terrorism insurance is adapted from Nassim Nicholas Taleb, The Black Swan: The Impact of the Highly Improbable (New York: Random House, 2007), 76–77.
FRAMING
Amos Tversky and Daniel Kahneman, “The Framing of Decisions and the Psychology of Choice,” Science 211, no. 4481 (January 30, 1981): 453–58.
On the framing effect in medicine, see: Robyn M. Dawes, Everyday Irrationality: How Pseudo-Scientists, Lunatics, and the Rest of Us Systematically Fail to Think Rationally (New York: Westview Press, 2001), 3–8.
R. Shepherd, P. Sparks, S. Bellier, and M. M. Raats, “The Effects of Information on Sensory Ratings and Preferences: The Importance of Attitudes,” Food Quality and Preference 3, no. 3 (1992): 147–55.
ACTION BIAS
Michael Bar-Eli, Ofer H. Azar, Ilana Ritov, Yael Keidar-Levin, and Galit Schein, “Action Bias among Elite Soccer Goalkeepers: The Case of Penalty Kicks,” Journal of Economic Psychology 28, no. 5 (2007): 606–21.
The quote from Charlie Munger: “We’ve got great flexibility and a certain discipline in terms of not doing some foolish thing just to be active—discipline in avoiding just doing any damn thing just because you can’t stand inactivity.” In: Wesco Financial annual meeting, 2000, Outstanding Investor Digest, December 18, 2000, 60.
Warren Buffett successfully avoids the action bias: “We don’t get paid for activity, just for being right. As to how long we’ll wait, we’ll wait indefinitely.” Warren Buffett, 1998 Berkshire Hathaway annual meeting.
“The stock market is a no-called-strike game. You don’t have to swing at everything—you can wait for your pitch. The problem when you’re a money manager is that your fans keep yelling, ‘Swing, you bum!’ ” Warren Buffett, 1999 Berkshire Hathaway annual meeting.
“It takes character to sit there with all that cash and do nothing. I didn’t get to where I am by going after mediocre opportunities.” Charlie Munger, Poor Charlie’s Almanack, expanded 3rd ed. (Virginia Beach, VA: The Donning Company Publishers, 2006), 61.
“Charlie realizes that it is difficult to find something that is really good. So, if you say ‘No’ ninety percent of the time, you’re not missing much in the world.” Otis Booth in ibid., 99.
Charlie Munger: “There are huge advantages for an individual to get into a position where you make a few great investments and just sit on your ass: You’re paying less to brokers. You’re listening to less nonsense.” Ibid., 209.
The example with the police officers in: “Action Bias in Decision Making and Problem Solving,” Ambiguity Advantage, blog, February 21, 2008.
OMISSION BIAS
Jonathan Baron, Thinking and Deciding (Cambridge, UK: Cambridge University Press, 2000), 407–8 and 514.
To get around the omission bias, put yourself in the shoes of the harmed individual. If you were the baby about to be vaccinated, which would you prefer: a 10/10,000 chance of death from the disease or a 5/10,000 chance of death from the vaccine? And does it matter whether these chances are a matter of commission or omission? Ibid., 407.
D. A. Asch, Jonathan Baron, J. C. Hershey, H. Kunreuther, J. R. Meszaros, Ilana Ritov, and M. Spranca, “Omission Bias and Pertussis Vaccination,” Medical Decision Making 14, no. 2 (April–June 1994): 118–23.
There is some confusion as to whether a behavior is due to the omission bias, the status quo bias, or social norms. Baron and Ritov disentangle these questions in this paper: Jonathan Baron and Ilana Ritov, “Omission Bias, Individual Differences, and Normality,” Organizational Behavior and Human Decision Processes 94 (2004): 74–85.
The following paper deals with the omission bias in legal practice in Switzerland. It is only available in German: Mark Schweizer, “Der Unterlassungseffekt,” chapter from “Kognitive Täuschungen vor Gericht” (PhD dissertation, University of Zurich, 2005), 108–23.
SELF-SERVING BIAS
Just as in the “taking out the garbage” example, Ross and Sicoly asked husbands and wives what percentage of responsibility each held for activities like cleaning the house, making breakfast, and causing arguments. Each spouse overestimated his or her role; the answers always added up to more than 100 percent. Read: Ross and Sicoly, “Egocentric Biases in Availability and Attribution.”
Barry R. Schlenker and Rowland S. Miller, “Egocentrism in Groups: Self-Serving Biases or Logical Information Processing?,” Journal of Personality and Social Psychology 35, no. 10 (October 1977): 755–64.
The following research modifies the view that we always attribute failure to outside factors: Dale T. Miller and Michael Ross, “Self-Serving Biases in the Attribution of Causality: Fact or Fiction?,” Psychological Bulletin 82 (1975): 213–25.
Roy F. Baumeister, The Cultural Animal: Human Nature, Meaning, and Social Life (Oxford, UK: Oxford University Press, 2005), 214–19.
“Of course you also want to get the self-serving bias out of your mental routines. Thinking that what’s good for you is good for the wider civilization, and rationalizing foolish or evil conduct, based on your subconscious tendency to serve yourself, is a terrible way to think.” Charles T. Munger, Poor Charlie’s Almanack, expanded 3rd ed. (Virginia Beach, VA: The Donning Company Publishers, 2006), 432.
Joel T. Johnson, Lorraine M. Cain, Toni L. Falke, Jon Hayman, and Edward Perillo, “The ‘Barnum Effect’ Revisited: Cognitive and Motivational Factors in the Acceptance of Personality Descriptions,” Journal of Personality and Social Psychology 49, no. 5 (November 1985): 1378–91.
This is an example of a study with school grades: Robert M. Arkin and Geoffrey M. Maruyama, “Attribution, Affect and College Exam Performance,” Journal of Educational Psychology 71, no. 1 (February 1979): 85–93.
See this video on grades on TED.com: Dan Ariely, Why We Think It’s OK to Cheat and Steal (Sometimes).
The self-serving bias is sometimes also called “egocentric bias.” Sometimes, the scientific literature differentiates between the two, especially when it comes to group settings. The self-serving bias claims credit for positive outcomes only. The egocentric bias, however, claims credit even for negative outcomes. It is suggested that the egocentric bias is simply an availability bias in disguise because your own actions and contributions are more available to you (in memory) than the actions and contributions of the other group members. See: Ross and Sicoly, “Egocentric Biases in Availability and Attribution.”
HEDONIC TREADMILL
The classic paper on the hedonic treadmill effect: Philip Brickman and D. T. Campbell, “Hedonic Relativism and Planning the Good Society,” in M. H. Appley (ed.), Adaptation-Level Theory: A Symposium (New York: Academic Press, 1971), 278–301. It focuses not just on income but also on improvements in consumer electronics and gadgets: we quickly adjust to the latest gadgets, and their “happiness effect” soon fades.
Daniel T. Gilbert et al., “Immune Neglect: A Source of Durability Bias in Affective Forecasting,” Journal of Personality and Social Psychology 75, no. 3 (1998): 617–38.
Daniel T. Gilbert and Jane E. Ebert, “Decisions and Revisions: The Affective Forecasting of Changeable Outcomes,” Journal of Personality and Social Psychology 82, no. 4 (2002): 503–14.
Daniel T. Gilbert, Stumbling on Happiness (New York: Alfred A. Knopf, 2006).
Major life dramas have almost no long-term impact on happiness. Daniel T. Gilbert, Why Are We Happy?, video on TED.com (http://www.youtube.com/watch?v=LTO_dZUvbJA).
Nassim Nicholas Taleb, The Black Swan: The Impact of the Highly Improbable (New York: Random House, 2007), 91.
Bruno S. Frey and Alois Stutzer, Happiness and Economics: How the Economy and Institutions Affect Human Well-Being (Princeton, NJ: Princeton University Press, 2002).
Subjective well-being (happiness) seems to be heavily influenced by genetics. In other words, it’s chance! Socioeconomic status, educational attainment, family income, marital status, or religious commitment can account for no more than about 3 percent of the variance in subjective well-being. See: David Lykken and Auke Tellegen, “Happiness Is a Stochastic Phenomenon,” Psychological Science 7, no. 3 (May 1996): 186–89.
Life satisfaction seems to be extremely stable over time, although it can be more volatile in the short term. See: Frank Fujita and Ed Diener, “Life Satisfaction Set Point: Stability and Change,” Journal of Personality and Social Psychology 88, no. 1 (2005): 158–64.
In case you are looking for more research on the topic: hedonic treadmill is also called “hedonic adaptation.”
SELF-SELECTION BIAS
On incubation of funds: “A more deliberate form of self selection bias often occurs in measuring the performance of investment managers. Typically, a number of funds are set up that are initially incubated: kept closed to the public until they have a track record. Those that are successful are marketed to the public, while those that are not successful remain in incubation until they are. In addition, persistently unsuccessful funds (whether in an incubator or not) are often closed, creating survivorship bias. This is all the more effective because of the tendency of investors to pick funds from the top of the league tables regardless of the performance of the manager’s other funds.” Quoted from Moneyterms, http://moneyterms.co.uk/self-selection-bias/.
“It is not uncommon for someone watching a tennis game on television to be bombarded by advertisements for funds that did (until that minute) outperform others by some percentage over some period. But, again, why would anybody advertise if he didn’t happen to outperform the market? There is a high probability of the investment coming to you if its success is caused entirely by randomness. This phenomenon is what economists and insurance people call adverse selection.” Nassim Nicholas Taleb, Fooled by Randomness: The Hidden Role of Chance in Life and in the Markets, 2nd updated ed. (New York: Random House, 2004), 158.
ASSOCIATION BIAS
The story with the gas leak, see: Roy F. Baumeister, The Cultural Animal: Human Nature, Meaning, and Social Life (Oxford, UK: Oxford University Press, 2005), 280.
Buffett wants to hear the bad news—in plain terms. “Always tell us the bad news promptly. It is only the good news that can wait.” In: Charles T. Munger, Poor Charlie’s Almanack, expanded 3rd ed. (Virginia Beach, VA: The Donning Company Publishers, 2006), 472.
“Don’t shoot the messenger” appears in Shakespeare’s Henry IV, Part 2 (act 1, scene 1).
In the eighteenth century, many states, including those in New England, employed town criers. Their task was to disseminate news, often bad news such as tax increases. To beat the “kill the messenger” syndrome, the states adopted laws (probably read aloud by the town criers themselves) whereby injury or abuse of a crier earned the harshest penalty. Today we are no longer as civilized; we try to lock up the loudest criers. One such example is Julian Assange, founder of WikiLeaks.
BEGINNER’S LUCK
Nassim Nicholas Taleb, The Black Swan: The Impact of the Highly Improbable (New York: Random House, 2007), 109.
COGNITIVE DISSONANCE
Scott Plous, The Psychology of Judgment and Decision Making (New York: McGraw-Hill, 1993), 22–25.
The classic paper on cognitive dissonance: Leon Festinger and James M. Carlsmith, “Cognitive Consequences of Forced Compliance,” Journal of Abnormal and Social Psychology 58 (1959): 203–10.
There is a French version of the sour-grapes rationalization: The fox wrongly believes the grapes to be green instead of vermillion and sweet. See: Jon Elster, Sour Grapes: Studies in the Subversion of Rationality (Cambridge, UK: Cambridge University Press, 1983), 123–24.
One of investor George Soros’s strengths, according to Taleb, is his complete lack of cognitive dissonance. Soros can change his mind from one second to the next—without the slightest sense of embarrassment. See: Nassim Nicholas Taleb, Fooled by Randomness: The Hidden Role of Chance in Life and in the Markets, 2nd updated ed. (New York: Random House, 2004), 239.
HYPERBOLIC DISCOUNTING
A range of research papers cover this topic. This is the first: Richard H. Thaler, “Some Empirical Evidence on Dynamic Inconsistency,” Economics Letters 8 (1981): 201–7.
For the marshmallow test, see: Yuichi Shoda, Walter Mischel, and Philip K. Peake, “Predicting Adolescent Cognitive and Self-Regulatory Competencies from Preschool Delay of Gratification: Identifying Diagnostic Conditions,” Developmental Psychology 26, no. 6 (1990): 978–86.
“ . . . the ability to delay gratification is very adaptive and rational, but sometimes it fails and people grab for immediate satisfaction. The effect of the immediacy resembles the certainty effect: People prefer the immediate gain just as they prefer the guaranteed gain. And both of these suggest that underneath the sophisticated thinking process of the cultural animal there still lurk the simpler needs and inclinations of the social animal. Sometimes these win out.” Roy F. Baumeister, The Cultural Animal: Human Nature, Meaning, and Social Life (Oxford, UK: Oxford University Press, 2005), 321.
What about very long periods of time? Suppose you run a restaurant and a diner makes the following suggestion: Instead of paying his check of $100 today, he will pay you $1,700 in thirty years’ time—that’s a nice interest rate of 10 percent. Would you go for it? Probably not. Who knows what will happen in the next thirty years? So have you just committed a thinking error? No. Unlike in hyperbolic discounting, demanding higher interest rates over long periods of time is quite sensible. In Switzerland (before Fukushima), there was debate about a plan to build a nuclear power plant with a payback period of thirty years. An idiotic idea. Who knows what new technologies will come on the market during those thirty years? A payback period of ten years would be justified, but not thirty years—and that’s not even mentioning the risks.
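The diner’s arithmetic can be checked with a quick compound-interest calculation (a minimal sketch in Python; the $100, 10 percent, and thirty-year figures come from the example above):

```python
# Compound growth: what $100 today is worth after 30 years at 10% per year.
principal = 100.0
rate = 0.10
years = 30

future_value = principal * (1 + rate) ** years
print(f"${future_value:,.2f}")  # roughly $1,744.94, close to the diner's offered $1,700
```

In other words, the offered $1,700 corresponds to an annual rate just under 10 percent, which is why the offer sounds generous on paper even though the thirty-year uncertainty makes refusing it entirely rational.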
“BECAUSE” JUSTIFICATION
The Xerox experiment by Ellen Langer cited in Robert B. Cialdini, Influence: The Psychology of Persuasion, rev. ed. (New York: HarperCollins, 1993), 4.
The “because” justification works beautifully as long as the stakes are small (making copies). As soon as the stakes are high, people mostly listen attentively to the arguments. Noah Goldstein, Steve Martin, and Robert Cialdini, Yes!—50 Scientifically Proven Ways to Be Persuasive (New York: Free Press, 2008), 150–53.
DECISION FATIGUE
“The problem of decision fatigue affects everything from the careers of CEOs to the prison sentences of felons appearing before weary judges. It influences the behavior of everyone, executive and nonexecutive, every day.” Roy Baumeister and John Tierney, Willpower: Rediscovering the Greatest Human Strength (New York: Penguin Press, 2011), 90.
The student experiment with the “deciders” and “non-deciders”: Ibid., 91, 92.
The example with the judges: Ibid., 96–99.
The detailed paper on the judges’ decisions: Shai Danziger, Jonathan Levav, and Liora Avnaim-Pesso, “Extraneous Factors in Judicial Decisions,” Proceedings of the National Academy of Sciences 108, no. 17 (April 26, 2011): 6889–92.