A Beautiful Math


by Tom Siegfried


  5. Strictly speaking, utility theory can be used without game theory to make economic predictions, and it often is. But before game theory came along, the mathematical basis of utility was less than solid. In formulating game theory, von Neumann and Morgenstern developed a method to compute utility with mathematical rigor. Utility theory on its own can be used by individuals making solitary decisions, but when one person's choice depends on what others are choosing, game theory is then necessary to calculate the optimum decision.

  6. See Ulrich Schwalbe and Paul Walker, "Zermelo and the Early History of Game Theory," Games and Economic Behavior, 34 (January 2001): 123–137.

  7. The term "minimax" refers to the game theory principle that you should choose the strategy that minimizes the maximum loss you could suffer, no matter what strategy your opponent plays. In a zero-sum game this is equivalent to maximizing the minimum gain you can guarantee, which is why the rule is also called "maximin."
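The rule in this note can be illustrated with a small sketch (mine, not the book's). For the row player in a zero-sum game, the payoff matrix below is hypothetical; the code picks the row whose worst case is best (maximin) and the column whose best case for the opponent is smallest (minimax). In this matrix the two worst cases coincide, a "saddle point."

```python
# Hypothetical zero-sum payoff matrix: entries are the row player's gains.
payoffs = [
    [3, -1, 2],
    [1,  0, 4],
]

# Row player: maximize the minimum gain (maximin).
row_choice = max(range(len(payoffs)), key=lambda r: min(payoffs[r]))

# Column player: minimize the maximum amount the row player can win (minimax).
cols = list(zip(*payoffs))
col_choice = min(range(len(cols)), key=lambda c: max(cols[c]))

print(row_choice, col_choice)  # row 1 (worst case 0), column 1 (worst case 0)
```

Here row 1's guaranteed minimum (0) equals column 1's capped maximum (0), so neither player can do better by deviating.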

  8. In 1937, von Neumann published another influential paper, not specifically linked to game theory, that presented a new view on the nature of growth and equilibrium in economic systems. That paper was another major element of von Neumann's contribution to economic science. See Norman Macrae, John von Neumann, Pantheon Books, New York, 1991, pp. 247–256.

  9. In a footnote, he did mention possible parallels to economic behavior.

  10. In the story, Moriarty appears in Victoria station just as Holmes and Watson's train departs for Dover, where a ferry will transport them to France. Watson believes they have successfully escaped from the villain, but Holmes points out that Moriarty will now do what Holmes himself would have done—engage a special train to speed him to Dover before the ferry departs. But anticipating this move by Moriarty, Holmes decides to get off the train in Canterbury and catch another train to Newhaven, site of another ferry to France. Sure enough, Moriarty hired a special train and went to Dover. But a game theorist would wonder why Moriarty would not have anticipated the fact that Holmes would have anticipated Moriarty's move, etc. See Leslie Klinger, ed., The New Annotated Sherlock Holmes, Vol. 1, W.W. Norton, New York, 2005, pp. 729–734.

  11. Oskar Morgenstern, "The Collaboration between Oskar Morgenstern and John von Neumann on the Theory of Games," Journal of Economic Literature, 14 (September 1976), reprinted in John von Neumann and Oskar Morgenstern, Theory of Games and Economic Behavior, Sixtieth-Anniversary Edition, Princeton University Press, Princeton, N.J., 2004.

  12. Robert J. Leonard, "From Parlor Games to Social Science: Von Neumann, Morgenstern, and the Creation of Game Theory, 1928–1944," Journal of Economic Literature, 33 (1995): 730–761.

  13. John von Neumann and Oskar Morgenstern, Theory of Games and Economic Behavior, Sixtieth-Anniversary Edition, Princeton University Press, Princeton, N.J., 2004, p. 2.

  14. Ibid., p. 4.

  15. Ibid., p. 2.

  16. Ibid., p. 6.

  17. Samuel Bowles, telephone interview, September 11, 2003.

  18. Von Neumann and Morgenstern, Theory of Games, p. 11.

  19. Ibid., p. 11.

  20. Ibid., p. 12.

  21. Ibid., p. 14.

  22. Ibid., p. 13.

  23. If you really want to get technical, you have to subtract the bus fare from the winnings (or add it to the cost) when calculating the payoffs for this game. But that makes it too complicated, so let's assume they live in a "free ride" zone.

  24. Thus a mixed strategy is a "probability distribution" of pure strategies. The concept of probability distribution will become increasingly important in later chapters.

  25. J.D. Williams, The Compleat Strategyst: Being a Primer on the Theory of Games of Strategy, McGraw-Hill, New York, 1954.

  26. The actual math for calculating the optimal strategies for this game matrix is given in the Appendix.

  27. In the original formulation of game theory, von Neumann insisted on treating games as if they were only one-shot affairs—no repetitions. In that case, a mixed strategy could not be implemented by choosing different strategies different percentages of the time. You could make only one choice. If your minimax solution was a mixed strategy, you had to use the random-choice device to choose which of the possible pure strategies you should play.

  28. A similar version of this game is presented in a book on game theory by Morton Davis, which in turn was modified from a somewhat more complex version of "simplified" poker described by von Neumann and Morgenstern.

  29. See Morton Davis, Game Theory: A Nontechnical Introduction, Dover, Mineola, N.Y., 1997 (1983), pp. 36–38.

  30. Von Neumann and Morgenstern, Theory of Games, p. 43.

  NASH'S EQUILIBRIUM

  1. Roger Myerson, "Nash Equilibrium and the History of Economic Theory," 1999. Available online at http://home.uchicago.edu/~rmyerson/research/jelnash.pdf.

  2. Paul Samuelson, "Heads I Win, Tails You Lose," in von Neumann and Morgenstern, Theory of Games, p. 675.

  3. Leonid Hurwicz, "Review: The Theory of Economic Behavior," American Economic Review, 35 (December 1945). Reprinted in von Neumann and Morgenstern, Theory of Games, p. 664.

  4. Ibid., p. 662.

  5. Arthur H. Copeland, "Review," Bulletin of the American Mathematical Society, 51 (July 1945): 498–504. Reprinted in von Neumann and Morgenstern, Theory of Games.

  6. Hurwicz, "Review," p. 647.

  7. Herbert Simon, "Review," American Journal of Sociology, 50 (May 1945). Reprinted in von Neumann and Morgenstern, Theory of Games, p. 640.

  8. In the film version of A Beautiful Mind, the math is garbled beyond any resemblance to what Nash actually did.

  9. John Nash, "The Bargaining Problem," Econometrica, 18 (1950): 155–162. Reprinted in Harold Kuhn and Sylvia Nasar, eds., The Essential John Nash, Princeton University Press, Princeton, N.J., 2002, pp. 37–46.

  10. John Nash, "Non-Cooperative Games," dissertation, May 1950. Reprinted in Kuhn and Nasar, The Essential John Nash, p. 78.

  11. Ibid., p. 59.

  12. Erica Klarreich, "The Mathematics of Strategy," PNAS Classics, http://www.pnas.org/misc/classics5.shtml.

  13. Samuel Bowles, telephone interview, September 11, 2003.

  14. Ibid.

  15. John Nash, "Non-cooperative Games," Annals of Mathematics, 54 (1951). Reprinted in Kuhn and Nasar, The Essential John Nash, p. 85. I have corrected "collaboration of communication" as printed there to "collaboration or communication"; it is clearly a typo, since Nash's dissertation reads "or."

  16. Kuhn, The Essential John Nash, p. 47.

  17. As one reviewer of the manuscript for this book pointed out, it is not necessarily true that all economic systems converge to equilibrium; in some cases a chaotic physical system might be a better analogy than a chemical equilibrium system. The idea of equilibrium is nevertheless a fundamental concept, and much of modern economics involves efforts to understand when it works and when it doesn't.

  18. This observation (in a slightly different form) has been attributed to the physicist Murray Gell-Mann.

  19. Quoted in William Poundstone, Prisoner's Dilemma, Anchor Books, New York, 1992, p. 124.

  20. Mathematically, Tucker's game was the same as one invented earlier by Merrill Flood and Melvin Dresher. Tucker devised the Prisoner's Dilemma as a way of illustrating the payoff principles in Flood and Dresher's game. See Poundstone, Prisoner's Dilemma, pp. 103ff.

  21. Charles Holt and Alvin Roth, "The Nash Equilibrium: A Perspective," Proceedings of the National Academy of Sciences USA, 101 (March 23, 2004): 4000.

  22. Robert Kurzban and Daniel Houser, "Experiments Investigating Cooperative Types in Humans: A Complement to Evolutionary Theory and Simulations," Proceedings of the National Academy of Sciences USA, 102 (February 1, 2005): 1803–1807.

  23. R. Duncan Luce and Howard Raiffa, Games and Decisions, John Wiley & Sons, New York, 1957, p. 10.

  24. Ariel Rubinstein, Afterword, in von Neumann and Morgenstern, Theory of Games, p. 633.

  25. Ibid., p. 634.

  26. Ibid., p. 636.

  27. Colin Camerer, Behavioral Game Theory, Princeton University Press, Princeton, N.J., 2003, p. 5.

  28. Ibid., pp. 20–21.

  29. The Royal Swedish Academy of Sciences, "Press Release: The Bank of Sweden Prize in Economic Sciences in Memory of Alfred Nobel 2005," October 10, 2005.

  SMITH'S STRATEGIES

  1. D.G.C. Harper, "Competitive Foraging in Mallards—'Ideal Free' Ducks," Animal Behaviour, 30 (1982): 575–584.

  2. Of course, you could conclude that animals are in fact rational, or at least more rational than they are generally considered to be.

  3. Martin Nowak, interview in Princeton, N.J., October 19, 1998.

  4. Rosie Mestel, The Los Angeles Times, April 24, 2004, p. B21.

  5. Maynard Smith's first paper on evolutionary game theory was written in collaboration with Price; it appeared in Nature in 1973. The story is told in John Maynard Smith, "Evolution and the Theory of Games," American Scientist, 64 (January–February 1976): 42. Price died in 1975.

  6. John Maynard Smith, "Evolutionary Game Theory," Physica D, 22 (1986): 44.

  7. Ibid.

  8. The relationship between Nash equilibria and evolutionarily stable strategies can get extremely complicated, and a full discussion would include considerations of the equations governing the reproductive rate of competing species (what is known as the "replicator dynamic"). A good place to explore these issues is Herbert Gintis, Game Theory Evolving, Princeton University Press, Princeton, N.J., 2000.

  9. For the calculation of the Nash equilibrium giving this ratio, see the Appendix.

  10. This equivalence of a mixed population—two-thirds doves and one-third hawks—with mixed behavior of the same birds holds only in the simple case of a two-strategy game. In more complicated games, the exact math depends on whether you're talking about mixtures of populations or mixtures of strategies.
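The two-thirds dove, one-third hawk mix in this note can be checked numerically. The sketch below is mine, with hypothetical payoff numbers (resource value V = 2, fighting cost C = 6, not necessarily the book's) chosen so the equilibrium hawk fraction V/C comes out to 1/3; at that mix, hawks and doves earn the same expected payoff, so neither type can spread at the other's expense.

```python
# Hypothetical hawk-dove payoffs: V = resource value, C = cost of a fight.
V, C = 2.0, 6.0
payoff = {
    ("hawk", "hawk"): (V - C) / 2,  # escalated fight: split value minus cost
    ("hawk", "dove"): V,            # hawk takes the whole resource
    ("dove", "hawk"): 0.0,          # dove retreats, gets nothing
    ("dove", "dove"): V / 2,        # two doves share
}

def expected(strategy, p_hawk):
    """Expected payoff of a strategy in a population with hawk fraction p_hawk."""
    return (p_hawk * payoff[(strategy, "hawk")]
            + (1 - p_hawk) * payoff[(strategy, "dove")])

# Setting E_hawk(p) = E_dove(p) and solving gives p = V / C.
p_star = V / C
print(p_star)  # 1/3 hawks, i.e., two-thirds doves
```

Because only two strategies are in play, the same numbers describe either a population that is one-third hawks or individual birds playing hawk one-third of the time, exactly the equivalence the note describes.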

  11. Rufus Johnstone, "Eavesdropping and Animal Conflict," Proceedings of the National Academy of Sciences USA, 98 (July 31, 2001): 9177–9180.

  12. John M. McNamara and Alasdair I. Houston, "If Animals Know Their Own Fighting Ability, the Evolutionarily Stable Level of Fighting is Reduced," Journal of Theoretical Biology, 232 (2005): 1–6.

  13. Martin Nowak, interview in Princeton, October 19, 1998.

  14. Ibid.

  15. In all, 15 strategies participated in the round-robin tournament. Axelrod added a strategy that chose defect or cooperate at random.

  16. Martin Nowak, lecture in Quincy, Mass., May 18, 2004.

  17. A paper describing the results Nowak discussed in Quincy appeared the following year: Lorens A. Imhof, Drew Fudenberg, and Martin Nowak, "Evolutionary Cycles of Cooperation and Defection," Proceedings of the National Academy of Sciences USA, 102 (August 2, 2005): 10797–10800.

  18. Herbert Gintis and Samuel Bowles, "Prosocial Emotions," Santa Fe Institute working paper 02-07-028, June 21, 2002.

  FREUD'S DREAM

  1. Von Neumann was in fact deeply interested in the brain, and his last book was a series of lectures (never delivered) comparing the brain to a computer. But I found no hint that he saw an explicit connection between neuroscience and game theory.

  2. P. Read Montague and Gregory Berns, "Neural Economics and the Biological Substrates of Valuation," Neuron, 36 (October 10, 2002): 265.

  3. Colin Camerer, interview in Santa Monica, Calif., June 17, 2003.

  4. Read Montague, interview in Houston, Tex., June 24, 2003.

  5. The earliest MRI technologies were good for showing anatomical detail, but did not track changes in brain activity corresponding to behaviors. By the early 1990s, though, advances in MRI techniques led to fMRI—functional magnetic resonance imaging—which could record changes in activity over time in a functioning brain.

  6. Read Montague, interview in Houston, June 24, 2003.

  7. A Web page that tracks new words claimed that its first use was in the Spring 2002 issue of a publication called The Flame.

  8. M.L. Platt and P.W. Glimcher, "Neural Correlates of Decision Variables in Parietal Cortex," Nature, 400 (1999): 233–238.

  9. Read Montague, interview in Houston, June 24, 2003.

  10. A.G. Sanfey et al., "The Neural Basis of Economic Decision-Making in the Ultimatum Game," Science, 300 (2003): 1756.

  11. Read Montague, interview in Houston, Tex., June 24, 2003.

  12. Paul Zak, interview in Claremont, Calif., August 4, 2003.

  13. Aldo Rustichini, "Neuroeconomics: Present and Future," Games and Economic Behavior, 52 (2005): 203–204.

  14. James Rilling et al., "A Neural Basis for Social Cooperation," Neuron, 35 (July 18, 2002): 395–405.

  15. In this version of the game, Players A and B both get 10 "money units" and Player A chooses whether to give his 10 to Player B. If he does, the experimenter quadruples the amount to make 40, so Player B now has 50 (40 plus the original 10). Player B then chooses to give some amount back to A, or keep the whole 50. If Player A doesn't think B returned a fair amount, Player A is given the option to "punish" B by assessing "punishment points." Every punishment point subtracts one monetary unit from B's payoff, but it costs A one monetary unit for every two punishment points assessed. See Dominique J.-F. de Quervain et al., "The Neural Basis of Altruistic Punishment," Science, 305 (August 27, 2004): 1254–1258.
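The payoff arithmetic in this note can be sketched in a few lines. The function below is illustrative only (the names and structure are mine, not de Quervain et al.'s): A may pass his 10 units, which are quadrupled on B's side; B returns some amount; then every punishment point A assesses costs B one unit and costs A one unit per two points.

```python
# Illustrative payoff accounting for the punishment game described above.
def payoffs(a_gives, b_returns, punishment_points):
    a, b = 10, 10                   # both players start with 10 money units
    if a_gives:
        a -= 10
        b += 40                     # A's 10 is quadrupled to 40
    a += b_returns                  # B returns some portion to A
    b -= b_returns
    b -= punishment_points          # each point subtracts one unit from B
    a -= punishment_points // 2     # each two points cost A one unit
    return a, b

# Example: A gives, B keeps everything, A retaliates with 20 punishment points.
print(payoffs(True, 0, 20))  # -> (-10, 30)
```

The example shows why the punishment is called "altruistic": A ends up worse off (-10) than if he had never given at all, yet still pays to penalize B.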

  16. Colin Camerer, interview in Santa Monica, June 17, 2003.

  17. Paul Zak, interview in Claremont, Calif., August 4, 2003.

  SELDON'S SOLUTION

  1. Werner Güth, Rolf Schmittberger, and Bernd Schwarze, "An Experimental Analysis of Ultimatum Bargaining," Journal of Economic Behavior and Organization, 3 (December 1982): 367–388.

  2. Jörgen Weibull, "Testing Game Theory," p. 2. Available online at http://swopec.hhs.se/hastef/papers/hastef0382.pdf.

  3. Ibid., p. 5.

  4. Ibid., p. 17.

  5. Steven Pinker, The Blank Slate, Viking, New York, 2002, p. 102.

  6. Isaac Asimov, Prelude to Foundation, Bantam Books, New York, 1989, p. 10.

  7. Ibid., pp. 11–12.

  8. Ibid., p. 12.

  9. Robert Boyd, interview in Los Angeles, Calif., April 14, 2004.

  10. Some of the ultimatum game results were especially perplexing, in particular the first round of games played in Mongolia. Francisco J. Gil-White, of the University of Pennsylvania, was confused by the pattern of offers and rejections—until he discovered that some players didn't believe they would actually receive real money. In another incident, he was puzzled by the rejection of a generous offer. It turned out the player thought Gil-White was an impoverished graduate student. By rejecting all offers, the player reasoned, he would ensure all the money was given back to Gil-White.

  11. Joseph Henrich, telephone interview, May 13, 2004.

  12. Colin Camerer, interview in Pasadena, Calif., March 12, 2004.

  13. Ibid.

  14. Robert Boyd, interview in Los Angeles, Calif., April 14, 2004.

 
