From a psychological standpoint, craps is an ingeniously designed game. The odds are certainly rigged in the casino’s favor, but they are not rigged too heavily in that direction. That would be no fun at all. Players need a sense of reward, even if it’s just the illusion of winning once in a while. That’s what makes the game so addictive. I soon notice that we do win rolls, sometimes several in a row, but in all but a few cases, the money we win never quite adds up to the money we spend placing even the minimum bets. The result: at best, playing craps is a slow bleed that is easy to miss, because the small amounts won between losses blind players to the long-term financial hemorrhage taking place.
Yet somehow we walk away one hour later with combined winnings of $145. (I win $45, S-Money wins $100—a neat illustration of how the various odds play out, albeit purely anecdotal.) My interpretation is that we hit a streak of good luck, and had the sense to quit playing while we were still ahead. Ever the physicist, S-Money loftily informs me that, according to probability theory, “hot” or “cold” streaks are merely a perception. Each roll is independent of the previous and subsequent rolls; that is the nature of true randomness. So the odds are the same for each roll, even if the shooter has rolled the point twenty or two hundred times in a row; the outcome of the last roll does not affect what happens next. There is no such thing as being “due” for a win (or a loss).
Still, it is possible to figure out how often we are likely to have a winning session of craps. Translating this concept into actual calculus is tricky, in part because throwing dice falls into the realm of discrete events—analyzing event probabilities under repeated trials—whereas calculus, by definition, deals with continuous things. If you plot the probability of the outcomes of individual rolls, you’ll get a shape resembling a pyramid—a perfectly good shape, but not one that represents a continuous function.
However, if you throw the dice two thousand times (or more), add up how much you win and how much you lose each time, and plot it all out on a Cartesian grid, the result is a standard bell curve, also known as a normal distribution curve. For any random sample—say, many random rolls of the dice—you will get a distribution of values clustered around an average (or “mean”) value. That mean value is the peak, or highest point, of the curve, where the most data points cluster together; there are fewer and fewer data points as we move out to the edges. In craps, big wins or big losses happen very infrequently, and would be found at the extreme edges of the bell curve, while outcomes with smaller wins and losses would cluster near the peak of the curve.
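The emergence of that bell shape is easy to see in a quick simulation. The sketch below (Python; the per-bet win probability of 0.493 is an assumed stand-in for pass-line-style odds, not the full rules of craps) plays thousands of two-thousand-roll sessions and collects the totals:

```python
import random
import statistics

def session_total(rolls=2000, bet=1, p_win=0.493):
    # One simplified session: each roll is an even-money bet that
    # wins with probability p_win (an assumption standing in for
    # the pass-line odds, not the full rules of craps).
    return sum(bet if random.random() < p_win else -bet
               for _ in range(rolls))

random.seed(1)
totals = [session_total() for _ in range(5000)]
mean_total = statistics.mean(totals)   # slightly negative: the house edge
spread = statistics.stdev(totals)      # the width of the bell curve
```

A histogram of `totals` traces out the bell curve described above, with its peak sitting a little to the left of zero because each bet is, on average, a small loss.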
Now that we have a pretty bell curve, we can use calculus to determine how often we will have a winning session in craps. First we need to understand what it is we are calculating; we have to set up the story. Every time we throw the dice, the probabilities for rolling a specific number don’t change from the odds outlined above; there is still a 1 in 6 chance of rolling a 7 with each and every roll, as represented by the pyramid. We are asking a different question: Given that we know we will throw the dice two thousand times, what is the probability that we will win or lose?
This reduces the question to an either-or option, assuming even odds—and remember that the game of craps is not even odds; we’re just making that assumption for the sake of simplicity. In this case, for every roll, there is a 50/50 chance that we will win or lose, and each outcome is separate from those before and after. It’s known as a random walk or, as many mathematicians like to call it, the drunkard’s walk. The probability that any one session of craps is an overall win or a loss approaches the distribution of this smooth bell curve the more times we throw the dice. If we throw the dice an infinite number of times, our win-loss rate will match the bell curve exactly.
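A drunkard’s walk takes only a few lines of code. This sketch (Python; the step counts are arbitrary illustrative numbers) flips a fair coin two thousand times per walk, stepping +1 on a win and −1 on a loss, and looks at where many such walks end up: clustered around zero, with a spread that grows like the square root of the number of steps.

```python
import random

def drunkards_walk(steps, seed=None):
    # Each step is an independent 50/50 win (+1) or loss (-1).
    rng = random.Random(seed)
    position = 0
    for _ in range(steps):
        position += 1 if rng.random() < 0.5 else -1
    return position

finals = [drunkards_walk(2000, seed=i) for i in range(2000)]
average = sum(finals) / len(finals)                          # near 0: no drift
spread = (sum(x * x for x in finals) / len(finals)) ** 0.5   # near sqrt(2000)
```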
How do we determine that likelihood? We take an integral. The integral from one point to another on a bell curve has no formula in terms of elementary functions; it is usually calculated numerically, with a computer. But here is the gist of the concept. Imagine a number line that runs from negative infinity (on the left) to infinity (on the right), with 0 smack in the middle, and a standard bell curve peaking at 0. This represents the distribution of outcomes for a 50/50 chance of winning or losing. The probability of losing will be the area under the curve that spans from minus infinity to 0, while the probability of winning will be the area under the curve from 0 to infinity. Each is equal to one half in this simplified example. The more times we roll the dice, the closer we will come to matching those probabilities. With 50/50 odds, for an infinite number of rolls, we will break even.
We can be even more specific by picking a random point on the x axis—say, 500—to determine the likelihood that we will either lose money or win up to $500. The answer will be the area under that portion of the curve that runs from negative infinity to 500. If we want to know the likelihood that we will win more than $500, we determine the area under that portion of the curve that runs from 500 to infinity.
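Those areas can be computed with the standard error function, which is how software evaluates the bell-curve integral in practice. A sketch (Python; the $400 spread is a made-up number purely for illustration):

```python
import math

def normal_cdf(x, mu=0.0, sigma=1.0):
    # Area under the bell curve from minus infinity to x. The
    # integrand has no elementary antiderivative, so we lean on
    # the error function, which math.erf evaluates numerically.
    return 0.5 * (1.0 + math.erf((x - mu) / (sigma * math.sqrt(2.0))))

p_lose = normal_cdf(0)                 # 0.5: the symmetric 50/50 case
# Chance of winding up anywhere below +$500, assuming (hypothetically)
# that session totals are bell-shaped with mean $0 and spread $400:
p_below_500 = normal_cdf(500, mu=0, sigma=400)
p_above_500 = 1.0 - p_below_500
```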
The biggest problem when it comes to craps is that the odds are not 50/50. Let’s say the house has a slight edge, making the odds 49/51. Now our bell curve is shifted slightly to the left on our grid, making it slightly more likely that we will lose; and the longer we play, the closer we will get to that distribution. We also need to specify the size and type of bet for each roll, because the probabilities in craps are linked not just to the outcomes of the rolls of the dice, but also to the payoff rates for different kinds of bets.
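The shift is easy to quantify with the same bell-curve machinery. Assuming (hypothetically) $1 even-money bets at 49/51 over two thousand rolls, the expected total is 2000 × (0.49 − 0.51) = −$40, with a spread of about √2000 ≈ $45, so the chance that a session ends in the red is well above one half:

```python
import math

def normal_cdf(x, mu=0.0, sigma=1.0):
    # Area under the bell curve from minus infinity to x.
    return 0.5 * (1.0 + math.erf((x - mu) / (sigma * math.sqrt(2.0))))

mu = 2000 * (0.49 - 0.51)    # expected total: -$40, the leftward shift
sigma = math.sqrt(2000)      # spread of the curve, roughly $45
p_session_loses = normal_cdf(0, mu, sigma)   # roughly 0.81
```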
GAMING THE SYSTEM
We won at craps because we got lucky in the short term: We hit a probabilistic sweet spot by pure random chance and had the sense to quit while we were ahead. Vegas notoriously attracts gamblers convinced they have discovered a “system”—a perfect strategy to beat the house. They are deluding themselves. Even assuming these perennial optimists have taken every single variable into account for their calculations, it takes only the tiniest house advantage to tip the scales irrevocably. We played for just one hour. Play the game long enough, and eventually you will lose everything. The occasional perceived hot streak or lucky break doesn’t alter that fact. The casinos are very up front about this. Another craps dealer in the New York, New York casino—let’s call him Vito—didn’t mince any words on that score: “Everyone thinks they got a system. You think you’re gonna beat this table? Go ahead and try. We got ATMs all over the casino, just for people like you.” Listen to the wisdom of Vito, my friends. Forewarned is forearmed.
Even if the odds are in your favor, there’s no guarantee you’ll win. Let’s imagine the situation were reversed, and the players had the slightest advantage; it wouldn’t necessarily translate into an automatic win. You must pay just as much attention to your bankroll as to the odds of winning; if the odds are good but you are betting a substantial portion of your bankroll on each roll of the dice, a short run of bad luck can wipe out any advantage pretty quickly. That’s the essence of a little exercise called gambler’s ruin. It’s a favorite of University of Washington physicist Dave Bacon—better known to the blogosphere as the Quantum Pontiff—who became fascinated with cataloging the outcomes of repeated throws of dice as a child. He admits this made him an übergeekazoid, but it probably saved him a lot of money in the long run.
Gambler’s ruin begins with the assumption—a false one, when it comes to craps—that the player has a slight advantage in a game of chance and should win slightly more than half the time. Say you have a bankroll of x dollars. For every dollar you bet, you win another dollar or lose the original dollar, depending on the outcome. (This is the same payoff rate as the pass and don’t-pass bets in craps.) What is the probability that you will run out of money, even with that slight advantage, rather than increase your bankroll by, say, doubling your money?
Drawing on that childhood fascination, Bacon devised a handy formula and plugged in a few values to see if a pattern emerged. He found that even with a fairly large advantage—say, 55/45—if you start with only $10 and make a fixed bet each time, there is an 11.8 percent chance of being ruined before you succeed in doubling your money. If you have a 51/49 advantage and a starting bankroll of $178, your chances of ruin before doubling up decrease to 0.1 percent, or 1 in 1,000. In craps, of course, you don’t have an advantage. Bacon has crunched those numbers, too. If the house has the usual edge of 1.42 percent, and you start with $100 and want to double it, your probability of ruin is 98.2 percent. That’s why casinos make such a killing.
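The classic gambler’s-ruin formula behind those first two figures is compact enough to sketch in a few lines of Python (unit even-money bets assumed):

```python
def ruin_probability(p, start, target):
    # Probability of going broke before growing a bankroll of
    # `start` units to `target` units, betting one unit at a time
    # on an even-money wager won with probability p.
    q = 1.0 - p
    if p == 0.5:
        return 1.0 - start / target
    r = q / p
    return 1.0 - (1.0 - r ** start) / (1.0 - r ** target)

ruin_probability(0.55, 10, 20)    # ~0.118: the 11.8 percent chance
ruin_probability(0.51, 178, 356)  # ~0.001: about 1 in 1,000
```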
So the first rule of gambling, for those who have studied the odds, is simply, Don’t. Still, craps is quite a lot of fun, provided you view it as harmless entertainment, rather than a get-rich-quick scheme to pad your 401(k). A good rule of thumb is to budget a set amount you are willing to lose and just chalk it up to the price of a day’s entertainment. Once you lose that amount, suck it up and walk away, and maybe explore a few of the other delights of Vegas.
Admittedly, this is easier said than done. For one thing, casinos employ a stickman at every craps table, whose job is to talk up the game and encourage players to make the riskier bets. For another, a 2008 paper in the Journal of Marketing Research reported on a study by two professors at the University of California. They found that even if people went into the casino determined to stay within their gambling budget, the pain of losing would usually cause them to bet more money in hopes of recouping their losses. Those who won tended to keep to their budget.
If someone develops a gambling addiction, the problem is even worse. In 2007, a Nebraska businessman named Terrance Watanabe lost nearly $127 million in a yearlong binge at the Caesars Palace and Rio casinos, blowing most of his personal fortune. When parent company Harrah’s Entertainment sued him for nonpayment of his gambling debt, Watanabe counter-sued, claiming casino staff plied him with drinks and encouraged him to gamble while intoxicated, thereby impairing his judgment. There could be an element of truth to that: High rollers like Watanabe—“whales” in the jargon of casino staff—are a lucrative source of income for casinos. As such, casinos treat them very well, doling out all manner of luxuries, free of charge, to keep them happy. But there are rules: Nevada gaming regulations stipulate that someone who is clearly intoxicated should not be allowed to gamble. In fact, Watanabe claimed he was barred from the Wynn casino for compulsive drinking and gambling; the Harrah’s establishments welcomed him with open arms.
Watanabe’s spectacular downfall is a rare occurrence. It is certainly possible to maximize your fun playing craps in a casino without breaking the bank. Just follow this basic principle: You want to play as long as possible with a fixed amount of money. That means losing as little as possible with each bet by choosing those bets with the most favorable odds and payouts. It’s easy to determine the optimal percentage of your bankroll to bet in order to maximize your long-term return without busting out, using something called the Kelly criterion.
Born in Texas, John L. Kelly was a Naval Air Force pilot during World War II who survived a plane crash into the ocean and eventually earned a PhD in physics from the University of Texas-Austin. He found work in the oil industry, using his scientific training to identify likely oil sites. But his employer’s instincts were better than Kelly’s models, so Kelly decided the oil business was best left to those with a nose for hidden deposits and found himself working for Bell Labs, one of the most prestigious research centers in the United States. He cut a colorful figure among his fellow physicists, with his Texas drawl, passion for guns, and penchant for taking calculated risks.
It was a hugely popular television game show called The $64,000 Question that inspired Kelly to devise his famous formula in the 1950s. People would place bets on the most likely contestants to win. But there is a three-hour time difference between New York City—where the show was produced and aired live—and the West Coast. Kelly heard a rumor that one gambler on the West Coast had a partner back east tell him the winners by phone so that he could place bets before the show aired in the West, giving said gambler an inside track. This spurred Kelly to ponder probabilities and gambling. He reasoned that if a gambler with an inside track bets everything he or she has on the basis of those tips, the gambler will lose everything the first time he or she gets a bad tip. But if the same gambler makes just the minimum bet for each tip, that insider information no longer confers much of an advantage. Recognizing the importance of how much someone bets in fashioning a winning strategy, Kelly determined that dividing your edge by the odds tells you what percentage of your bankroll you should bet each time.
The odds determine how much profit you make if you win; the edge describes the amount you expect to win on average if you make the same wager repeatedly under the same probabilities. Remember the lesson of gambler’s ruin: Even if the odds are in your favor, you still don’t want to bet your entire bankroll in one fell swoop; your odds of losing everything on one roll are much higher. Play it safe and bet too little, however, and your return won’t be sufficient to make up for the inevitable losses. Kelly’s formula reveals the optimal betting strategy for maximizing long-term returns. For a bet with even odds, Kelly tells us to bet a fraction of our bankroll that is determined by 2p − 1, where p is our probability of winning.
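For an even-money bet, the criterion collapses to a one-liner. A sketch in Python (the clamp at zero reflects the advice never to bet when the edge is zero or negative):

```python
def kelly_fraction(p):
    # Fraction of bankroll to bet on an even-money wager won with
    # probability p: the edge, 2p - 1, clamped at zero because a
    # zero or negative edge means the best bet is no bet at all.
    return max(0.0, 2.0 * p - 1.0)

kelly_fraction(0.55)   # 0.10: bet 10 percent of the bankroll
kelly_fraction(0.49)   # 0.0: don't play
```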
When it comes to playing craps in a Vegas casino, it will be a discouraging answer unless you have the good fortune to be the house. Players usually have an edge of zero at best (a 50/50 chance) and more often it is slightly less. In either case, the Kelly criterion says that the best way to maximize your long-term return in craps is to bet 0 percent of your bankroll—that is, not to play. But that is just a detached, mathematical analysis that doesn’t take into account the fun factor, the sheer pleasure one derives from playing craps.
We can tweak this problem a little to take that subjective quality into account by assigning it a quantitative value: Let’s say the odds are 49/51, giving the house a 2 percent edge, but the fun factor is 3 percent, giving us a net edge over the casino of 1 percent. That corresponds to an effective winning probability of 0.505, so the Kelly criterion tells us to bet 1 percent of our bankroll. Now, we can place our bets accordingly to optimize our fun—that is, play as long as possible by maximizing our long-term gains. We’ll still most likely lose in the end, but we will be getting the most bang for our buck.
There is a downside to the Kelly criterion, or rather, a kind of trade-off: Following the Kelly criterion exactly leads to a lot of volatility in the outcomes. In the long term, it works; in the short term, it can lead to intense anxiety over the wild fluctuations in one’s fortunes. For those who prefer a bit less drama in their gambling, a popular middle-ground strategy is to bet half of what the Kelly criterion recommends. This yields about three fourths of the Kelly criterion’s long-term return while greatly reducing the volatility. It seems apt that the optimal formula for long-term gain would increase short-term risk, considering that the man himself was a bit of a daredevil. Ironically, Kelly never actually put his method to the test: He died of a brain hemorrhage in 1965 at the age of forty-one while walking down a Manhattan sidewalk. What were the odds of that?
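The three-fourths figure comes out of the expected log-growth of the bankroll. A sketch (Python; p = 0.55 is an arbitrary illustrative edge) compares full Kelly with half Kelly:

```python
import math

def log_growth(p, f):
    # Expected log-growth per even-money bet when wagering a
    # fraction f of the bankroll, winning with probability p.
    return p * math.log(1.0 + f) + (1.0 - p) * math.log(1.0 - f)

p = 0.55
full = 2.0 * p - 1.0                                  # Kelly fraction, 0.10
ratio = log_growth(p, full / 2) / log_growth(p, full)
# ratio ~ 0.75: half Kelly keeps about three fourths of the growth rate
```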
NEEDLE IN A HAYSTACK
We round out our wild Vegas weekend with several hours of good old-fashioned poker—a game of skill and strategy, as opposed to pure random chance, wherein the casino takes a cut of the pot instead of relying on a built-in house edge. What have I learned? “Craps” is an apt moniker. Also? I can’t bluff worth a damn at Texas hold ’em. But in the end, we emerge from the weekend with our wallets relatively unscathed.
Relaxing over cocktails at the Bellagio that evening, I ponder the fact that probability theory and gambling are also linked to fortune-telling and one of the most famous “natural” numbers, π. In Philip Pullman’s The Amber Spyglass, fictional Oxford physicist Mary Malone finds she can communicate with the mysterious, conscious particles collectively named Dust using the yarrow-stick casting methods of the I Ching (it’s also possible to use coins). For those who scoff that a physicist would never express any appreciation for a “supernatural” method of divination, consider this: When he was knighted, Niels Bohr included the yin-yang symbol in the design for his coat of arms, to reflect his appreciation for the I Ching’s ingenious use of probabilistic concepts.
Mary Malone’s divination method has a real-world counterpart in one of the oldest problems in geometrical probability, known as Buffon’s needle. This experiment was the brainchild of a French naturalist and mathematician named Georges-Louis Leclerc, Comte de Buffon. Born and raised on the Côte d’Or, the young Georges-Louis started off studying law before getting sidetracked by mathematics and science. It’s not clear that he ever earned a degree, because he was forced to leave the university after getting tangled up in a duel. He then toured Europe, only returning when he heard his father had remarried—not so much out of familial devotion as concern over collecting his inheritance.
Buffon fils is best known for writing the Histoire naturelle, a whopping forty-four volumes of encyclopedic knowledge that covered everything known at that time about the natural world. A full hundred years before Charles Darwin’s Origin of Species, Buffon noted the similarities between humans and apes and mused on the possibility of a common ancestry, concluding that species must have evolved since that common point. He never proposed an actual mechanism for this evolution, but his tome was translated into numerous languages and certainly influenced Darwin, who described Buffon—in the foreword to the sixth edition of Origin—as “the first author who in modern times has treated it in a scientific spirit.”
Buffon’s quirky contribution to probability theory lies in a paper he published in 1777 entitled Sur le jeu de franc-carreau [On the Game of Open-Tile]. He first considered a small coin—an écu, for all you crossword puzzle buffs—thrown randomly on a square-tiled floor. It was all the rage in Buffon’s social circles to place bets on whether the coin would land entirely within the bounds of a single tile or across the boundaries of two tiles right next to each other. Buffon had a bit of an advantage over his peers thanks to his mathematical interests. He realized he could figure out the odds of the wager using calculus, making him the first person to introduce calculus into probability theory.
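The franc-carreau odds follow from simple geometry: the coin lands entirely within a tile exactly when its center falls inside a smaller square whose side is the tile’s side minus the coin’s diameter. A sketch (Python; the tile and coin dimensions are hypothetical) checks that formula against a simulation:

```python
import random

def franc_carreau_exact(tile, diameter):
    # The coin sits inside one tile iff its center lands in an
    # inner square of side (tile - diameter), so the probability
    # is the ratio of the two squares' areas.
    return ((tile - diameter) / tile) ** 2

def franc_carreau_simulated(tile, diameter, trials=100_000, seed=0):
    # Monte Carlo check: drop the coin's center uniformly on a tile
    # and count how often the whole coin stays inside.
    rng = random.Random(seed)
    r = diameter / 2.0
    hits = 0
    for _ in range(trials):
        x = rng.uniform(0.0, tile)
        y = rng.uniform(0.0, tile)
        if r <= x <= tile - r and r <= y <= tile - r:
            hits += 1
    return hits / trials

franc_carreau_exact(2.0, 1.0)        # 0.25
franc_carreau_simulated(2.0, 1.0)    # close to 0.25
```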
The Calculus Diaries: How Math Can Help You Lose Weight, Win in Vegas, and Survive a Zombie Apocalypse Page 8