Thinking in Bets


by Annie Duke


  Unfortunately, this is just the way evolution built us. We are wired to protect our beliefs even when our goal is to truthseek. This is one of those instances where being smart and aware of our capacity for irrationality alone doesn’t help us refrain from biased reasoning. As with visual illusions, we can’t make our minds work differently than they do no matter how smart we are. Just as we can’t unsee an illusion, intellect or willpower alone can’t make us resist motivated reasoning.

  So far, this chapter has mainly been bad news. We bet on our beliefs. We don’t vet those beliefs well before we form them. We stubbornly refuse to update our beliefs. Now I’ve piled on by telling you that being smart doesn’t help—and can make it worse.

  The good news starts here.

  Wanna bet?

  Imagine taking part in a conversation with a friend about the movie Citizen Kane. Best film of all time, introduced a bunch of new techniques by which directors could contribute to storytelling. “Obviously, it won the best-picture Oscar,” you gush, as part of a list of superlatives the film unquestionably deserves.

  Then your friend says, “Wanna bet?”

  Suddenly, you’re not so sure. That challenge puts you on your heels, causing you to back off your declaration and question the belief that you just declared with such assurance. When someone challenges us to bet on a belief, signaling their confidence that our belief is inaccurate in some way, ideally it triggers us to vet the belief, taking an inventory of the evidence that informed us.

  How do I know this?

  Where did I get this information?

  Who did I get it from?

  What is the quality of my sources?

  How much do I trust them?

  How up to date is my information?

  How much information do I have that is relevant to the belief?

  What other things like this have I been confident about that turned out not to be true?

  What are the other plausible alternatives?

  What do I know about the person challenging my belief?

  What is their view of how credible my opinion is?

  What do they know that I don’t know?

  What is their level of expertise?

  What am I missing?

  Remember the order in which we form abstract beliefs:

  We hear something;

  We believe it;

  Only sometimes, later, if we have the time or the inclination, we think about it and vet it, determining whether or not it is true.

  “Wanna bet?” triggers us to engage in that third step that we only sometimes get to. Being asked if we are willing to bet money on it makes it much more likely that we will examine our information in a less biased way, be more honest with ourselves about how sure we are of our beliefs, and be more open to updating and calibrating our beliefs. The more objective we are, the more accurate our beliefs become. And the person who wins bets over the long run is the one with the more accurate beliefs.

  Of course, in most instances, the person offering to bet isn’t actually looking to put any money on it. They are just making a point—a valid point that perhaps we overstated our conclusion or made our statement without including relevant caveats. Most people aren’t like poker players, around whom there is always the potential that someone might propose a bet and they will mean it.

  Next thing you know, someone moves to Des Moines and there’s $30,000 at stake.

  It’s a shame the social contract for poker players is so different than for the rest of us in this regard because a lot of good can result from someone saying, “Wanna bet?” Offering a wager brings the risk out in the open, making explicit what is already implicit (and frequently overlooked). The more we recognize that we are betting on our beliefs (with our happiness, attention, health, money, time, or some other limited resource), the more we are likely to temper our statements, getting closer to the truth as we acknowledge the risk inherent in what we believe.

  Expecting everyone to start throwing down the gauntlet, challenging each other to bet on any opinion, is impractical if you aren’t hanging out in a poker room. (Even in poker rooms, this generally happens only among players who know each other well.) I imagine that if you went around challenging everyone with “Wanna bet?” it would be difficult to make friends and you’d lose the ones you have. But that doesn’t mean we can’t change the framework for ourselves in the way we think about our decisions. We can train ourselves to view the world through the lens of “Wanna bet?”

  Once we start doing that, we are more likely to recognize that there is always a degree of uncertainty, that we are generally less sure than we thought we were, that practically nothing is black and white, 0% or 100%. And that’s a pretty good philosophy for living.

  Redefining confidence

  Not much is ever certain. Samuel Arbesman’s The Half-Life of Facts is a great read about how practically every fact we’ve ever known has been subject to revision or reversal. We are in a perpetual state of learning, and that can make any prior fact obsolete. One of many examples he provides is about the extinction of the coelacanth, a fish from the Late Cretaceous period. A mass-extinction event (such as a large meteor striking the Earth, a series of volcanic eruptions, or a permanent climate shift) ended the Cretaceous period. That was the end of dinosaurs, coelacanths, and a lot of other species. In the late 1930s and independently in the mid-1950s, however, coelacanths were found alive and well. A species becoming “unextinct” is pretty common. Arbesman cites the work of a pair of biologists at the University of Queensland who made a list of all 187 species of mammals declared extinct in the last five hundred years. More than a third of those species have subsequently been rediscovered.

  Given that even scientific facts can have an expiration date, we would all be well-advised to take a good hard look at our beliefs, which are formed and updated in a much more haphazard way than those in science. We don’t need someone challenging us to an actual bet to do this. We can think like a bettor on our own, purposefully, treating it as a game even when no one has challenged us.

  We would be better served as communicators and decision-makers if we thought less about whether we are confident in our beliefs and more about how confident we are. Instead of thinking of confidence as all-or-nothing (“I’m confident” or “I’m not confident”), our expression of our confidence would then capture all the shades of grey in between.

  When we express our beliefs (to others or just to ourselves as part of our internal decision-making dialogue), they don’t generally come with qualifications. What if, in addition to expressing what we believe, we also rated our level of confidence about the accuracy of our belief on a scale of zero to ten? Zero would mean we are certain a belief is not true. Ten would mean we are certain that our belief is true. A zero-to-ten scale translates directly to percentages. If you think the belief rates a three, that means you are 30% sure the belief is accurate. A nine means you are 90% sure. So instead of saying to ourselves, “Citizen Kane won the Oscar for best picture,” we would say, “I think Citizen Kane won the Oscar for best picture but I’m only a six on that.” Or “I’m 60% that Citizen Kane won the Oscar for best picture.” That means your level of certainty is such that 40% of the time it will turn out that Citizen Kane did not win the best-picture Oscar. Forcing ourselves to express how sure we are of our beliefs brings to plain sight the probabilistic nature of those beliefs, that what we believe is almost never 100% or 0% accurate but, rather, somewhere in between.
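  For readers who want to see the arithmetic spelled out, here is a minimal sketch in Python (an illustration of the scale, not anything from the book): each point on the zero-to-ten scale maps to ten percentage points of confidence.

    def express_belief(statement: str, rating: int) -> str:
        """Format a belief with its 0-10 confidence rating, where each
        point on the scale translates to ten percentage points."""
        if not 0 <= rating <= 10:
            raise ValueError("rating must be between 0 and 10")
        return f"I'm {rating * 10}% that {statement}"

    print(express_belief("Citizen Kane won the Oscar for best picture", 6))
    # -> I'm 60% that Citizen Kane won the Oscar for best picture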

  In a similar vein, the number can reflect several different kinds of uncertainty. “I’m 60% confident that Citizen Kane won best picture” reflects that our knowledge of this past event is incomplete. “I’m 60% confident the flight from Chicago will be late” incorporates a mix of our incomplete knowledge and the inherent uncertainty in predicting the future (e.g., the weather might intervene or there might be an unforeseen mechanical issue).

  We can also express how confident we are by thinking about the number of plausible alternatives and declaring that range. For example, if I am stating my belief about what age Elvis died, I might say, “Somewhere between age forty and forty-seven.” I know he died in his forties and I remember that it was his earlier forties, so for me this is the range of plausible alternatives. The more we know about a topic, the better the quality of information we have, the tighter the range of plausible alternatives. (When it comes to predictions, the plausible range of outcomes would also be tighter when there is less luck involved.) The less we know about a topic or the more luck involved, the wider our range.

  We can declare how sure we are whether we are thinking about a particular fact or set of facts (“dinosaurs were herd animals”), a prediction (“I think there is life on other planets”), or how the future will turn out given some decision we might make (“I think I will be happier if I move to Des Moines than I am where I live now” or “I think the company will be better off if we fire the president”). These are all beliefs of differing sorts.

  Incorporating uncertainty into the way we think about our beliefs comes with many benefits. By expressing our level of confidence in what we believe, we are shifting our approach to how we view the world. Acknowledging uncertainty is the first step in measuring and narrowing it. Incorporating uncertainty in the way we think about what we believe creates open-mindedness, moving us closer to a more objective stance toward information that disagrees with us. We are less likely to succumb to motivated reasoning since it feels better to make small adjustments in degrees of certainty instead of having to grossly downgrade from “right” to “wrong.” When confronted with new evidence, it is a very different narrative to say, “I was 58% but now I’m 46%.” That doesn’t feel nearly as bad as “I thought I was right but now I’m wrong.” Our narrative of being a knowledgeable, educated, intelligent person who holds quality opinions isn’t compromised when we use new information to calibrate our beliefs, compared with having to make a full-on reversal. This shifts us away from treating information that disagrees with us as a threat, as something we have to defend against, making us better able to truthseek.
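  That 58%-to-46% shift is the kind of small recalibration that Bayes’ rule produces. As a hedged sketch (not from the book), the Python below shows a belief held at 58% sliding to about 46% after evidence arrives that is somewhat less likely if the belief is true; the likelihood ratio of 0.62 is a made-up number chosen purely for illustration.

    def update_belief(prior: float, likelihood_ratio: float) -> float:
        """Bayes' rule in odds form: posterior odds equal prior odds times
        the likelihood ratio P(evidence | true) / P(evidence | false)."""
        prior_odds = prior / (1 - prior)
        posterior_odds = prior_odds * likelihood_ratio
        return posterior_odds / (1 + posterior_odds)

    # Evidence with a likelihood ratio of 0.62 nudges 58% down to about 46%,
    # a change in degree rather than a flip from "right" to "wrong".
    print(round(update_belief(0.58, 0.62), 2))  # 0.46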

  When we work toward belief calibration, we become less judgmental of ourselves. Incorporating percentages or ranges of alternatives into the expression of our beliefs means that our personal narrative no longer hinges on whether we were wrong or right but on how well we incorporate new information to adjust the estimate of how accurate our beliefs are. There is no sin in finding out there is evidence that contradicts what we believe. The only sin is in not using that evidence as objectively as possible to refine that belief going forward.

  Declaring our uncertainty in our beliefs to others makes us more credible communicators. We assume that if we don’t come off as 100% confident, others will value our opinions less. The opposite is usually true. If one person expresses a belief as absolutely true, and someone else expresses a belief by saying, “I believe this to be true, and I’m 80% on it,” who are you more likely to believe? The fact that the person is expressing their confidence as less than 100% signals that they are trying to get at the truth, that they have considered the quantity and quality of their information with thoughtfulness and self-awareness. And thoughtful and self-aware people are more believable.

  Expressing our level of confidence also invites people to be our collaborators. As I said, most of us don’t live our lives in poker rooms, where it is more socially acceptable to challenge a peer to a wager when they express an opinion we believe to be inaccurate. Outside of the poker room, when we declare something as 100% fact, others might be reluctant to offer up new and relevant information that would inform our beliefs for two reasons. First, they might be afraid they are wrong and so won’t speak up, worried they will be judged for that, by us or themselves. Second, even if they are very confident their information is high quality, they might be afraid of making us feel bad or judged. By saying, “I’m 80%” and thereby communicating we aren’t sure, we open the door for others to tell us what they know. They realize they can contribute without having to confront us by saying or implying, “You’re wrong.” Admitting we are not sure is an invitation for help in refining our beliefs, and that will make our beliefs much more accurate over time as we are more likely to gather relevant information.

  Expressing our beliefs this way also serves our listeners. We know that our default is to believe what we hear, without vetting the information too carefully. If we communicate to our listeners that we are not 100% on what we are saying, they are less likely to walk away having been infected by our beliefs. Expressing the belief as uncertain signals to our listeners that the belief needs further vetting, that step three is still in progress.

  When scientists publish results of experiments, they share with the rest of their community their methods of gathering and analyzing the data, the data itself, and their confidence in that data. That makes it possible for others to assess the quality of the information being presented, systematized through peer review before publication. Confidence in the results is expressed through both p-values, the probability of getting a result at least as extreme as the one actually observed if chance alone were at work (akin to declaring your confidence on a scale of zero to ten), and confidence intervals (akin to declaring ranges of plausible alternatives). Scientists, by institutionalizing the expression of uncertainty, invite their community to share relevant information and to test and challenge the results and explanations. The information that gets shared back might confirm, disconfirm, or refine published hypotheses. The goal is to advance knowledge rather than affirm what we already believe. This is why science advances at a fast clip.*
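  To make those two devices concrete, here is a rough sketch (a toy example, not anything from the book or from any particular study): a hypothetical experiment of 100 coin flips that comes up heads 61 times, with a p-value testing the chance-alone hypothesis and a 95% confidence interval around the observed rate.

    from math import comb, sqrt

    n, heads = 100, 61  # hypothetical data, chosen for illustration

    # Two-sided p-value under the chance-alone (fair coin) hypothesis: the
    # probability of a result at least as extreme as the one observed.
    p_value = sum(comb(n, k) * 0.5 ** n for k in range(n + 1)
                  if abs(k - n / 2) >= abs(heads - n / 2))

    # 95% confidence interval for the true heads rate (normal approximation),
    # a declared range of plausible alternatives around the observed 61%.
    p_hat = heads / n
    half_width = 1.96 * sqrt(p_hat * (1 - p_hat) / n)

    print(f"p-value = {p_value:.3f}")  # about 0.035
    print(f"95% CI = ({p_hat - half_width:.2f}, {p_hat + half_width:.2f})")  # about (0.51, 0.71)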

  By communicating our own uncertainty when sharing beliefs with others, we are inviting the people in our lives to act like scientists with us. This advances our beliefs at a faster clip because we miss out on fewer opportunities to get new information, information that would help us to calibrate the beliefs we have.

  Acknowledging that decisions are bets based on our beliefs, getting comfortable with uncertainty, and redefining right and wrong are integral to a good overall approach to decision-making. But I don’t expect that, having dumped all these concepts in your lap, you should somehow know the best way to use them. These patterns are so ingrained in our thinking that it takes more than knowing the problem or even having the right outlook to overcome the irrationalities that hold us back. What I’ve done so far, really, is identify the target; now that we are facing the right direction, thinking in bets is a tool for getting somewhat better at hitting it.

  CHAPTER 3

  Bet to Learn: Fielding the Unfolding Future

  Nick the Greek, and other lessons from the Crystal Lounge

  When I first started playing poker, I lived in Columbus, Montana, population 1,200. The nearest poker game was forty miles away in downtown Billings, in the basement of a bar called the Crystal Lounge. Every day, I drove those forty miles, arriving by early afternoon. I would play until the evening and drive home.

  The game was filled with characters out of a clichéd vision of Montana: ranchers and farmers killing time in the off-season, filling the basement room with smoke wafting over the brims of their cowboy hats. It was 1992 but it could have just as easily been 1952 from the décor and the grizzled countenances of the locals. The only thing suggesting that John Wayne wasn’t going to mosey on in was a handful of misfits, including me (a woman and the youngest player by a few decades, on the lam from defending my dissertation at the University of Pennsylvania) and a player named “Nick the Greek.”

  If your name is Nick, you come from Greece, and you gamble, they’re going to call you Nick the Greek, as sure as they’ll call you Tiny if you weigh more than 350 pounds. (And, yes, there was a guy named Tiny, real name Elwood, who regularly played in the game.) This Nick the Greek was of the small-time Billings variety. He was the general manager of the hotel across the street, having gotten transferred from Greece by the hotel chain. He left work for a couple of hours every afternoon like clockwork to play in the game.

  Nick the Greek had formed an unusual set of beliefs that guided his poker decisions. I knew this because he described them to me and the other players, at length, using the results of particular hands to punctuate his points. He fixated on the relatively common belief that the element of surprise was important in poker. (Don’t be predictable, mix up your play—that sort of stuff.) Then he jacked it up on steroids. To him, a starting pair of aces, the mathematically best two cards you can get dealt, was the worst hand because everyone predictably played it.

  “They always expect you to have aces. You get killed with that hand.”

  By that logic, he explained, the very best two starting cards to play were the mathematically weakest two cards you could receive, a seven and a deuce of different suits—a hand almost any player avoids.

  “I bet you never saw it coming,” he would say when he turned over that hand and won a pot. And because he played seven-deuce all the time, occasionally the stars would line up and he’d win. I also remember times when he threw away a pair of aces, faceup, at the first sign of a bet. (Never mind that he was compromising a vital element of his strategy of subterfuge by constantly showing and telling us he was doing this. Given that he had such an entrenched set of beliefs, it’s not surprising that he didn’t see the incongruity.)

  Nick the Greek, needless to say, rarely came out ahead. Yet he never changed his strategy, often complaining about his bad luck when he lost, though never in a bitter way. He was a friendly guy, pleasant to play with—the perfect poker opponent. I tried to time my daily arrivals so I’d be in the game when he made his afternoon appearance.

 
