Thinking in Bets

by Annie Duke


  Chess, for all its strategic complexity, isn’t a great model for decision-making in life, where most of our decisions involve hidden information and a much greater influence of luck. This creates a challenge that doesn’t exist in chess: identifying the relative contributions of the decisions we make versus luck in how things turn out.

  Poker, in contrast, is a game of incomplete information. It is a game of decision-making under conditions of uncertainty over time. (Not coincidentally, that is close to the definition of game theory.) Valuable information remains hidden. There is also an element of luck in any outcome. You could make the best possible decision at every point and still lose the hand, because you don’t know what new cards will be dealt and revealed. Once the game is finished and you try to learn from the results, separating the quality of your decisions from the influence of luck is difficult.

  In chess, outcomes correlate more tightly with decision quality. In poker, it is much easier to get lucky and win, or get unlucky and lose. If life were like chess, nearly every time you ran a red light you would get in an accident (or at least receive a ticket). If life were like chess, the Seahawks would win the Super Bowl every time Pete Carroll called that pass play.

  But life is more like poker. You could make the smartest, most careful decision in firing a company president and still have it blow up in your face. You could run a red light and get through the intersection safely—or follow all the traffic rules and signals and end up in an accident. You could teach someone the rules of poker in five minutes, put them at a table with a world champion player, deal a hand (or several), and the novice could beat the champion. That could never happen in chess.

  Incomplete information poses a challenge not just for split-second decision-making, but also for learning from past decisions. Imagine my difficulty as a poker player in trying to figure out if I played a hand correctly when my opponents’ cards were never revealed. If the hand concluded after I made a bet and my opponents folded, all I know is that I won the chips. Did I play poorly and get lucky? Or did I play well?

  If we want to improve in any game—as well as in any aspect of our lives—we have to learn from the results of our decisions. The quality of our lives is the sum of decision quality plus luck. In chess, luck is limited in its influence, so it’s easier to read the results as a signal of decision quality. That more tightly tethers chess players to rationality. Make a mistake and your opponent’s play points it out, or analysis afterward can reveal it. There is always a theoretically right answer out there. If you lose, there is little room to off-load losing to any other explanation than your inferior decision-making. You’ll almost never hear a chess player say, “I was robbed in that game!” or, “I played perfectly and caught some terrible breaks.” (Walk the hallways during a break in a poker tournament and you’ll hear a lot of that.)

  That’s chess, but life doesn’t look like that. It looks more like poker, where all that uncertainty gives us the room to deceive ourselves and misinterpret the data. Poker gives us the leeway to make mistakes that we never spot because we win the hand anyway and so don’t go looking for them, or the leeway to do everything right, still lose, and treat the losing result as proof that we made a mistake. Resulting, assuming that our decision-making is good or bad based on a small set of outcomes, is a pretty reasonable strategy for learning in chess. But not in poker—or life.

  Von Neumann and Morgenstern understood that the world doesn’t easily reveal the objective truth. That’s why they based game theory on poker. Making better decisions starts with understanding this: uncertainty can work a lot of mischief.

  A lethal battle of wits

  In one of the more famous scenes in The Princess Bride, the Dread Pirate Roberts (the love-besotted Westley) catches up to Vizzini, the mastermind who kidnapped Princess Buttercup. Having vanquished Fezzik the Giant in a battle of strength and having outdueled swordsman Inigo Montoya, the Dread Pirate Roberts proposes he and Vizzini compete in a lethal battle of wits, which provides a great demonstration of the peril of making decisions with incomplete information. The pirate produces a packet of deadly iocane powder and, hiding two goblets of wine from view, he empties the packet, and puts one goblet in front of himself and the other in front of Vizzini. Once Vizzini chooses a goblet, they will both drink “and find out who is right and who is dead.”

  “It’s all so simple,” Vizzini scoffs. “All I have to do is deduce, from what I know of you, the way your mind works. Are you the kind of man who would put the poison into his own glass, or into the glass of his enemy?” He provides a dizzying series of reasons why the poison can’t (or must) be in one cup, and then in the other. His rant accounts for cleverness, anticipating cleverness, iocane’s origin (the criminal land of Australia), untrustworthiness, anticipating untrustworthiness, and dueling presumptions related to Westley defeating the giant and the swordsman.

  While explaining all this, Vizzini diverts Westley’s attention, switches the goblets, and declares that they should drink from the goblets in front of them. Vizzini pauses for a moment and, when he sees Westley drink from his own goblet, confidently drinks from the other.

  Vizzini roars with laughter. “You fell victim to one of the classic blunders. The most famous is ‘Never get involved in a land war in Asia,’ but only slightly less well known is this: ‘Never go in against a Sicilian when death is on the line.’”

  In the midst of laughing, Vizzini falls over, dead. Buttercup says, “To think, all that time it was your cup that was poisoned.”

  Westley tells her, “They were both poisoned. I’ve spent the last two years building up immunity to iocane powder.”

  Like all of us, Vizzini didn’t have all the facts. He considered himself a genius without equal: “Let me put it this way. Have you ever heard of Plato, Aristotle, Socrates? Morons.” But, also like all of us, he underestimated the amount and effect of what he didn’t know.

  Suppose someone says, “I flipped a coin and it landed heads four times in a row. How likely is that to occur?”

  It feels like that should be a pretty easy question to answer. Once we do the math on the probability of heads on four consecutive 50-50 flips, we can determine that it would happen 6.25% of the time (.50 × .50 × .50 × .50).
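  That arithmetic can be checked in a couple of lines. This is a minimal sketch that bakes in exactly the assumption being made: a fair, two-sided coin.

```python
# Probability of four heads in a row, assuming a fair two-sided coin
# (the hidden assumption: p_heads really is 0.5).
p_heads = 0.5
p_four_heads = p_heads ** 4
print(p_four_heads)  # 0.0625, i.e. 6.25% of the time
```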

  That’s making the same mistake as Vizzini. The problem is that we came to this answer without knowing anything about the coin or the person flipping it. Is it a two-sided coin or three-sided or four? If it is two-sided, is it a two-headed coin? Even if the coin is two-sided (heads and tails), is the coin weighted to land on heads more often than tails (but maybe not always)? Is the flipper a magician who is capable of influencing how the coin lands? This information is all incomplete, yet we answered the question as if we had examined the coin and knew everything about it. We never considered that both goblets might be poisoned. (“Inconceivable” would have been Vizzini’s term, had he been able to comment on his own death.)

  Now if that person flipped the coin 10,000 times, giving us a sufficiently large sample size, we could figure out, with some certainty, whether the coin is fair. Four flips simply isn’t enough to determine much about the coin.
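  The sample-size point can be illustrated with a small simulation (the function name `estimate_heads_rate` is mine, not from the text): with a coin secretly weighted to land heads 70% of the time, four flips tell us almost nothing, while 10,000 flips pin down the bias closely.

```python
import random

random.seed(1)  # fixed seed so the sketch is reproducible

def estimate_heads_rate(p_true, n_flips):
    """Flip a coin that lands heads with probability p_true, n_flips times,
    and return the observed fraction of heads."""
    heads = sum(random.random() < p_true for _ in range(n_flips))
    return heads / n_flips

# A coin secretly weighted to land heads 70% of the time.
few = estimate_heads_rate(0.7, 4)        # four flips: could easily come out 0.25 or 1.0
many = estimate_heads_rate(0.7, 10_000)  # ten thousand flips: lands near 0.70
print(few, many)
```

  With four flips the estimate is dominated by noise; with ten thousand, the observed rate sits close enough to 0.70 to expose the weighting.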

  We make this same mistake when we look for lessons in life’s results. Our lives are too short to collect enough data from our own experience to make it easy to dig down into decision quality from the small set of results we experience. If we buy a house, fix it up a little, and sell it three years later for 50% more than we paid, does that mean we are smart at buying and selling property, or at fixing up houses? It could, but it could also mean there was a big upward trend in the market and buying almost any piece of property would have made just as much money. Or maybe buying that same house and not fixing it up at all might have resulted in the same (or even better) profit. A lot of previously successful house flippers had to face that real possibility between 2007 and 2009.

  When someone asks you about a coin they flipped four times, there is a correct answer: “I’m not sure.”

  “I’m not sure”: using uncertainty to our advantage

  Just as we have problems with resulting and hindsight bias, when we evaluate decisions solely on how they turn out, we have a mirror-image problem in making prospective decisions. We get only one try at any given decision—one flip of the coin—and that puts great pressure on us to feel we have to be certain before acting, a certainty that necessarily will overlook the influences of hidden information and luck.

  Famed novelist and screenwriter William Goldman (who wrote The Princess Bride, as well as Misery and Butch Cassidy and the Sundance Kid) reflected on his experiences working with actors like Robert Redford, Steve McQueen, Dustin Hoffman, and Paul Newman at the height of their successful careers. What did it mean to be a “movie star”? He quoted an actor who explained the type of characters he wanted to play: “I don’t want to be the man who learns. I want to be the man who knows.”

  We are discouraged from saying “I don’t know” or “I’m not sure.” We regard those expressions as vague, unhelpful, and even evasive. But getting comfortable with “I’m not sure” is a vital step to being a better decision-maker. We have to make peace with not knowing.

  Embracing “I’m not sure” is difficult. We are trained in school that saying “I don’t know” is a bad thing. Not knowing in school is considered a failure of learning. Write “I don’t know” as an answer on a test and your answer will be marked wrong.

  Admitting that we don’t know has an undeservedly bad reputation. Of course, we want to encourage acquiring knowledge, but the first step is understanding what we don’t know. Neuroscientist Stuart Firestein’s book Ignorance: How It Drives Science champions the virtue of recognizing the limits of our knowledge. (You can get a taste of the book by watching his TED Talk, “The Pursuit of Ignorance.”) In the book and the talk, Firestein points out that in science, “I don’t know” is not a failure but a necessary step toward enlightenment. He backs this up with a great quote from physicist James Clerk Maxwell: “Thoroughly conscious ignorance is the prelude to every real advance in science.” I would add that this is a prelude to every great decision that has ever been made.

  What makes a decision great is not that it has a great outcome. A great decision is the result of a good process, and that process must include an attempt to accurately represent our own state of knowledge. That state of knowledge, in turn, is some variation of “I’m not sure.”

  “I’m not sure” does not mean that there is no objective truth. Firestein’s point is, in fact, that acknowledging uncertainty is the first step in executing on our goal to get closer to what is objectively true. To do this, we need to stop treating “I don’t know” and “I’m not sure” like strings of dirty words.

  What if we shifted our definition of “I don’t know” from the negative frame (“I have no idea” or “I know nothing about that,” which feels like we lack competence or confidence) to a more neutral frame? What if we thought about it as recognizing that, although we might know something about the chances of some event occurring, we are still not sure how things will turn out in any given instance? That is just the truth. If we accept that, “I’m not sure” might not feel so bad.

  What good poker players and good decision-makers have in common is their comfort with the world being an uncertain and unpredictable place. They understand that they can almost never know exactly how something will turn out. They embrace that uncertainty and, instead of focusing on being sure, they try to figure out how unsure they are, making their best guess at the chances that different outcomes will occur. The accuracy of those guesses will depend on how much information they have and how experienced they are at making such guesses. This is part of the basis of all bets.

  To be sure, an experienced poker player is more likely to make a better guess than a novice player at determining the chances they will win or lose a hand. The experienced player knows the math better and is better able to narrow down what their opponents’ cards might be based on how players behave with certain types of hands. They will also be better at figuring out the choices their opponents are likely to make with those cards. So, yes, more experience will allow the player to narrow down the possibilities. None of that experience, however, makes it possible for a poker player to know how any given hand will turn out.

  This is true in any field. An expert trial lawyer will be better than a new lawyer at guessing the likelihood of success of different strategies and picking a strategy on this basis. Negotiating against an adversary whom we have previously encountered gives us a better guess at what our strategy should be. An expert in any field will have an advantage over a rookie. But neither the veteran nor the rookie can be sure what the next flip will look like. The veteran will just have a better guess.

  It is often the case that our best choice doesn’t even have a particularly high likelihood of succeeding. A trial lawyer with a tough case could be choosing among strategies that are all more likely to fail than to succeed. The goal of a lawyer in that situation is to identify the different possible strategies, figure out their best guess of the chance of success for each unpromising alternative, and pick the least awful one to maximize the quality of the outcome for their client. That’s true in any business. Start-ups have very low chances of succeeding but they try nonetheless, attempting to find the best strategy to achieve the big win, even though none of the strategies is highly likely to create success for the company. This is still worthwhile because the payoff can be so large.

  There are many reasons why wrapping our arms around uncertainty and giving it a big hug will help us become better decision-makers. Here are two of them. First, “I’m not sure” is simply a more accurate representation of the world. Second, and related, when we accept that we can’t be sure, we are less likely to fall into the trap of black-and-white thinking.

  Imagine you’re stepping on a traditional medical scale. It has two weight bars, one with notches at fifty-pound intervals and the other with notches at one-pound intervals. This allows the user to measure their weight down to the pound. What would happen if your doctor used a scale with only one bar with just two notches, one at fifty pounds and one at five hundred pounds, with no way to measure anything in between? Good luck getting medical advice after the person weighing you writes one or the other on your chart. You could only be morbidly obese or severely underweight. It would be impossible to make good decisions about your weight with such a poor model.

  The same holds true for just about all of our decisions. If we misrepresent the world at the extremes of right and wrong, with no shades of grey in between, our ability to make good choices—choices about how we are supposed to be allocating our resources, what kind of decisions we are supposed to be making, and what kind of actions we are supposed to be taking—will suffer.

  The secret is to make peace with walking around in a world where we recognize that we are not sure and that’s okay. As we learn more about how our brains operate, we recognize that we don’t perceive the world objectively. But our goal should be to try.

  Redefining wrong

  When I attend charity poker tournaments, I will often sit in as the dealer and provide a running commentary at the final table. The atmosphere at these final tables is fun and raucous. Everyone running the event has had a long night and is breathing a sigh of relief. There is typically a big crowd around the table including friends and families of the players, rooting them on (or vocally rooting against them). If people have been drinking, then . . . people have been drinking. Everyone is having a good time.

  When players have put all their chips in the pot, there is no more betting on the hand. After an all-in situation, the players in the hand turn their cards faceup on the table so that everyone can see them before I deal the remaining cards. This makes it fun for the audience, because they get to see each player’s position in the hand and the drama mounts. With the cards faceup, I can determine the likelihood each player will win the hand, and announce the percentage of the time each hand will win in the long run.

  At one such tournament, I told the audience that one player would win 76% of the time and the other would win 24% of the time. I dealt the remaining cards, the last of which turned the 24% hand into the winner. Amid the cheers and groans, someone in the audience called out, “Annie, you were wrong!”

  In the same spirit that he said it, I explained that I wasn’t. “I said that would happen 24% of the time. That’s not zero. You got to see part of the 24%!”

  A few hands later, almost the same thing happened. Two players put all of their chips in the pot and they turned their cards faceup. One player was 18% to win and the other 82% to win the hand. Again, the player with the worse hand when they put in their chips hit a subsequent lucky card to win the pot.

  This time that same guy in the crowd called out, “Look, it was the 18%!” In that aha moment, he changed his definition of what it meant to be wrong. When we think in advance about the chances of alternative outcomes and make a decision based on those chances, it doesn’t automatically make us wrong when things don’t work out. It just means that one event in a set of possible futures occurred.
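  The crowd member’s aha moment can be replayed in code. In this sketch (the variable names are mine), the 24% hand is treated as nothing more than an event with probability 0.24: over many all-ins it wins about one time in four, so any single underdog win is consistent with the estimate rather than evidence against it.

```python
import random

random.seed(7)  # fixed seed so the sketch is reproducible

# Simulate many all-in situations where the underdog is 24% to win.
# Observing one underdog win says nothing about whether "24%" was wrong;
# over enough hands, the underdog wins roughly one time in four.
trials = 100_000
underdog_wins = sum(random.random() < 0.24 for _ in range(trials))
underdog_rate = underdog_wins / trials
print(underdog_rate)  # close to 0.24
```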

 
