
Against the Gods: The Remarkable Story of Risk


by Peter L. Bernstein


  De Moivre's advance in the resolution of these problems ranks among the most important achievements in mathematics. Drawing on both the calculus and on the underlying structure of Pascal's Triangle, known as the binomial theorem, de Moivre demonstrated how a set of random drawings, as in Jacob Bernoulli's jar experiment, would distribute themselves around their average value. For example, assume that you drew a hundred pebbles in succession from Jacob's jar, always returning each pebble drawn, and noted the ratio of white to black. Then assume you made a series of successive drawings, each of a hundred balls. De Moivre would be able to tell you beforehand approximately how many of those ratios would be close to the average ratio of the total number of drawings and how those individual ratios would distribute themselves around the grand average.

  De Moivre's distribution is known today as a normal curve, or, because of its resemblance to a bell, as a bell curve. The distribution, when traced out as a curve, shows the largest number of observations clustered in the center, close to the average, or mean, of the total number of observations. The curve then slopes symmetrically downward, with an equal number of observations on either side of the mean, descending steeply at first and then exhibiting a flatter downward slope at each end. In other words, observations far from the mean are less frequent than observations close to the mean.

  The shape of de Moivre's curve enabled him to calculate a statistical measure of its dispersion around the mean. This measure, now known as the standard deviation, is critically important in judging whether a set of observations comprises a sufficiently representative sample of the universe of which they are just a part. In a normal distribution, approximately 68% of the observations will fall within one standard deviation of the mean of all the observations, and 95% of them will fall within two standard deviations of the mean.
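The 68% and 95% regularities are easy to check empirically. The sketch below is my own illustration rather than anything in the text: it assumes a jar that is two-thirds white, simulates many series of a hundred draws with replacement, and tallies how many observed ratios fall within one and two standard deviations of their mean.

```python
import random
import statistics

random.seed(42)

# Hypothetical jar: two-thirds of the pebbles are white.
TRUE_RATIO = 2 / 3
DRAWS_PER_SERIES = 100
SERIES = 10_000

# Each series draws 100 pebbles with replacement and records the
# observed ratio of white pebbles.
ratios = []
for _ in range(SERIES):
    whites = sum(random.random() < TRUE_RATIO for _ in range(DRAWS_PER_SERIES))
    ratios.append(whites / DRAWS_PER_SERIES)

mean = statistics.mean(ratios)
sd = statistics.stdev(ratios)

# Count the ratios falling within one and two standard deviations.
within_1sd = sum(abs(r - mean) <= sd for r in ratios) / SERIES
within_2sd = sum(abs(r - mean) <= 2 * sd for r in ratios) / SERIES

print(f"mean ratio = {mean:.3f}, standard deviation = {sd:.3f}")
print(f"within 1 sd: {within_1sd:.0%}, within 2 sd: {within_2sd:.0%}")
```

Because the draw counts are whole numbers, the simulated fractions land near, not exactly on, the 68% and 95% figures quoted for a continuous normal distribution.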

  The standard deviation can tell us whether we are dealing with a case of the head-in-the-oven-feet-in-the-refrigerator, where the average condition of this poor man is meaningless in telling us how he feels. Most of the readings would be far from the average of how he felt around his middle. The standard deviation can also tell us that Jacob's 25,550 draws of pebbles would provide an extremely accurate estimate of the division between the black and white pebbles inside the jar, because relatively few observations would be outliers, far from the average.

  De Moivre was impressed with the orderliness that made its appearance as the numbers of random and unconnected observations increased; he ascribed that orderliness to the plans of the Almighty. It conveys the promise that, under the right conditions, measurement can indeed conquer uncertainty and tame risk. Using italics to emphasize the significance of what he had to say, de Moivre summarized his accomplishment: "[A]tho' Chance produces Irregularities, still the Odds will be infinitely great, that in process of Time, those Irregularities will bear no proportion to recurrency of that Order which naturally results from ORIGINAL DESIGN."13

  De Moivre's gift to mathematics was an instrument that made it possible to evaluate the probability that a given number of observations will fall within some specified bound around a true ratio. That gift has provided many practical applications.

  For example, all manufacturers worry that defective products may slip through the assembly line and into the hands of customers. One hundred percent perfection is a practical impossibility in most instances; the world as we know it seems to have an incurable habit of denying us perfection.

  Suppose the manager of a pin factory is trying to hold down the number of defective pins to no more than 10 out of every 100,000 produced, or 0.01% of the total.14 To see how things are going, he takes a random sample of 100,000 pins as they come off the assembly line and finds 12 pins without heads-two more than the average of 10 defectives that he had hoped to achieve. How important is that difference? What is the probability of finding 12 defective pins out of a sample of 100,000 if, on the average, the factory would be turning out 10 defective pins out of every 100,000 produced? De Moivre's normal distribution and standard deviation provide the answer.
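The text leaves the arithmetic to the reader. A minimal sketch of the calculation, assuming the standard binomial model for the sampling and de Moivre's normal approximation to it:

```python
import math

N = 100_000          # pins in the sample
P = 10 / 100_000     # target defect rate: 10 per 100,000, i.e. 0.01%

# Exact binomial probability of seeing exactly 12 defective pins.
exact = math.comb(N, 12) * P**12 * (1 - P)**(N - 12)

# De Moivre-style normal approximation (with a continuity correction):
# the binomial has mean N*P = 10 and standard deviation sqrt(N*P*(1-P)).
mu = N * P
sigma = math.sqrt(N * P * (1 - P))

def phi(x):
    """Standard normal cumulative distribution function."""
    return 0.5 * (1 + math.erf(x / math.sqrt(2)))

approx = phi((12.5 - mu) / sigma) - phi((11.5 - mu) / sigma)

print(f"exact binomial: {exact:.4f}")   # ~ 0.095
print(f"normal approx:  {approx:.4f}")  # ~ 0.103
```

Either way, roughly one sample in ten would show exactly 12 defectives even when the factory is meeting its 0.01% target, so the manager has little reason for alarm.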

  But that is not the sort of question that people usually want answered. More often, they do not know for certain before the fact how many defective units the factory is going to produce on the average. Despite good intentions, the true ratio of defectives could end up higher than 10 per 100,000 on the average. What does that sample of 100,000 pins reveal about the likelihood that the average ratio of defectives will exceed 0.01% of the total? How much more could we learn from a sample of 200,000? What is the probability that the average ratio of defectives will fall between 0.009% and 0.011%? Between 0.007% and 0.013%? What is the probability that any single pin I happen to pick up will be defective?

  In this scenario, the data are given-10 pins, 12 pins, 1 pin-and the probability is the unknown. Questions put in this manner form the subject matter of what is known as inverse probability: with 12 defective pins out of 100,000, what is the probability that the true average ratio of defectives to the total is 0.01%?

  One of the most effective treatments of such questions was proposed by a minister named Thomas Bayes, who was born in 1701 and lived in Kent.15 Bayes was a Nonconformist; he rejected most of the ceremonial rituals that the Church of England had retained from the Catholic Church after their separation in the time of Henry VIII.

  Not much is known about Bayes, even though he was a Fellow of the Royal Society. One otherwise dry and impersonal textbook in statistics went so far as to characterize him as "enigmatic."16 He published nothing in mathematics while he was alive and left only two works that were published after his death but received little attention when they appeared.

  Yet one of those papers, Essay Towards Solving A Problem In The Doctrine Of Chances, was a strikingly original piece of work that immortalized Bayes among statisticians, economists, and other social scientists. This paper laid the foundation for the modern method of statistical inference, the great issue first posed by Jacob Bernoulli.

  When Bayes died in 1761, his will, dated a year earlier, bequeathed the draft of this essay, plus one hundred pounds sterling, to "Richard Price, now I suppose a preacher at Newington Green."17 It is odd that Bayes was so vague about Richard Price's location, because Price was more than just a preacher in Islington in north London.

  Richard Price was a man with high moral standards and a passionate belief in human freedom in general and freedom of religion in particular. He was convinced that freedom was of divine origin and therefore was essential for moral behavior; he declared that it was better to be free and sin than to be someone's slave. In the 1780s, he wrote a book on the American Revolution with the almost endless title of Observations on the Importance of the American Revolution and the Means of Making it a Benefit to the World in which he expressed his belief that the Revolution was ordained by God. At some personal risk, he cared for the American prisoners of war who had been transferred to camps in England. Benjamin Franklin was a good friend, and Adam Smith was an acquaintance. Price and Franklin read and criticized some of the draft chapters of The Wealth of Nations as Smith was writing it.

  One freedom bothered Price: the freedom to borrow. He was deeply concerned about the burgeoning national debt, swollen by the wars against France and by the war against the colonies in North America. He complained that the debt was "funding for eternity" and dubbed it the "Grand National Evil."18

  But Price was not just a minister and a passionate defender of human freedom. He was also a mathematician whose work in the field of probability was impressive enough to win him membership in the Royal Society.

  In 1765, three men from an insurance company named the Equitable Society called on Price for assistance in devising mortality tables on which to base their premiums for life insurance and annuities. After studying the work of Halley and de Moivre, among others, Price published two articles on the subject in Philosophical Transactions; his biographer, Carl Cone, reports that Price's hair is alleged to have turned gray in one night of intense concentration on the second of these articles.

  Price started by studying records kept in London, but the life expectancies in those records turned out to be well below actual mortality rates.19 He then turned to the shire of Northampton, where records were more carefully kept than in London. He published the results of his study in 1771 in a book titled Observations on Reversionary Payments, which was regarded as the bible on the subject until well into the nineteenth century. This work has earned him the title of the founding father of actuarial science-the complex mathematical work in probability that is performed today in all insurance companies as the basis for calculating premiums.

  And yet Price's book contained serious, costly errors, in part because of an inadequate data base that omitted the large number of unregistered births. Moreover, he overestimated death rates at younger ages and underestimated them at later ages, and his estimates of migration into and out of Northampton were flawed. Most serious, he appears to have underestimated life expectancies, with the result that the life-insurance premiums were much higher than they needed to be. The Equitable Society flourished on this error; the British government, using the same tables to determine annuity payments to its pensioners, lost heavily.20

  Two years later, after Bayes had died, Price sent a copy of Bayes's "very ingenious" paper to a certain John Canton, another member of the Royal Society, with a cover letter that tells us a good deal about Bayes's intentions in writing the paper. The Royal Society published the essay in Philosophical Transactions in 1764, but even then Bayes's innovative work languished in obscurity for another twenty years.

  Here is how Bayes put the problem he was trying to solve:

  PROBLEM

  Given that the number of times in which an unknown event has happened and failed: Required the chance that the probability of its happening in a single trial lies somewhere between any two degrees of probability that can be named.21

  The problem as set forth here is precisely the inverse of the problem as defined by Jacob Bernoulli some sixty years earlier (page 118). Bayes is asking how we can determine the probability that an event will occur under circumstances where we know nothing about it except that it has occurred a certain number of times and has failed to occur a certain number of other times. In other words, a pin could be either defective or it could be perfect. If we identify ten defective pins out of a sample of a hundred, what is the probability that the total output of pins-not just any sample of a hundred-will contain between 9% and 11% defectives?
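The machinery Bayes introduced can answer exactly this question. The sketch below is my own illustration, assuming a uniform prior on the unknown defect rate (the same assumption Bayes made for his billiard table, described further on): the posterior for the true rate after seeing 10 defectives in 100 is then a Beta(11, 91) distribution, which can be integrated numerically between 9% and 11%.

```python
import math

# Ten defective pins observed in a sample of one hundred.
successes, failures = 10, 90

def posterior_density(p):
    """Beta(successes + 1, failures + 1) density: the posterior for the
    true defect rate p under a uniform prior."""
    log_norm = (math.lgamma(successes + failures + 2)
                - math.lgamma(successes + 1)
                - math.lgamma(failures + 1))
    return math.exp(log_norm
                    + successes * math.log(p)
                    + failures * math.log(1 - p))

def prob_between(lo, hi, steps=10_000):
    """Crude trapezoidal integration of the posterior over [lo, hi]."""
    h = (hi - lo) / steps
    total = 0.5 * (posterior_density(lo) + posterior_density(hi))
    for i in range(1, steps):
        total += posterior_density(lo + i * h)
    return total * h

answer = prob_between(0.09, 0.11)
print(f"P(9% <= true rate <= 11% | 10 of 100 defective) = {answer:.3f}")
```

The answer comes out to only about one chance in four: a sample of a hundred is far too small to pin the true rate down to so narrow a band.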

  Price's cover letter to Canton reflects how far the analysis of probability had advanced into the real world of decision-making over just a hundred years. "Every judicious person," Price writes, "will be sensible that the problem now mentioned is by no means a curious speculation in the doctrine of chances, but necessary to be solved in order to [provide] a sure foundation for all our reasonings concerning past facts, and what is likely to be hereafter."22 He goes on to say that neither Jacob Bernoulli nor de Moivre had posed the question in precisely this fashion, though de Moivre had described the difficulty of reaching his own solution as "the hardest that can be proposed on the subject of chance."

  Bayes used an odd format to prove his point, especially for a dissenting minister: a billiard table. A ball is rolled across the table, free to stop anywhere and thereafter to remain at rest. Then a second ball is rolled repeatedly in the same fashion, and a count is taken of the number of times it stops to the right of the first ball. That number is "the number of times in which an unknown event has happened." Failure-the number of times the event does not happen-occurs when the ball lands to the left. The probability of the location of the first ball-a single trial-is to be deduced from the "successes" and "failures" of the second.23
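A simulation makes the setup concrete. The sketch below re-creates the table numerically; the uniform prior and the (s + 1)/(n + 2) posterior-mean formula are standard results for this setup (the latter is Laplace's later "rule of succession"), not details given in the text.

```python
import random

random.seed(7)

# Positions run from 0 (left edge) to 1 (right edge).  The first
# ball's resting place is hidden from the observer.
first_ball = random.random()

# Roll the second ball n times; a "success" is stopping to the right
# of the first ball, which happens with probability 1 - first_ball.
n = 1_000
successes = sum(random.random() > first_ball for _ in range(n))

# With a uniform prior on the first ball's position, the posterior for
# the probability of success is Beta(successes + 1, failures + 1),
# whose mean is (successes + 1) / (n + 2).
estimate = (successes + 1) / (n + 2)

print(f"true P(right of first ball) = {1 - first_ball:.3f}")
print(f"posterior mean estimate     = {estimate:.3f}")
```

After a thousand rolls the posterior mean sits very close to the true probability, which is just what Bayes's scheme promises: the hidden position of the first ball is recovered from nothing but the tally of successes and failures.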

  The primary application of the Bayesian system is in the use of new information to revise probabilities based on old information, or, in the language of the statisticians, to compare posterior probability with the priors. In the case of the billiard balls, the first ball represents the priors and the continuous revision of estimates as to its location as the second ball is repeatedly thrown represents the posterior probabilities.

  This procedure of revising inferences about old information as new information arrives springs from a philosophical viewpoint that makes Bayes's contribution strikingly modern: in a dynamic world, there is no single answer under conditions of uncertainty. The mathematician A.F.M. Smith has summed it up well: "Any approach to scientific inference which seeks to legitimise an answer in response to complex uncertainty is, for me, a totalitarian parody of a would-be rational learning process."24

  Although the Bayesian system of inference is too complex to recite here in detail, an example of a typical application of Bayesian analysis appears in the appendix to this chapter.

  The most exciting feature of all the achievements mentioned in this chapter is the daring idea that uncertainty can be measured. Uncertainty means unknown probabilities; to reverse Hacking's description of certainty, we can say that something is uncertain when our information is correct and an event fails to happen, or when our information is incorrect and an event does happen.

  Jacob Bernoulli, Abraham de Moivre, and Thomas Bayes showed how to infer previously unknown probabilities from the empirical facts of reality. These accomplishments are impressive for the sheer mental agility demanded, and audacious for their bold attack on the unknown. When de Moivre invoked ORIGINAL DESIGN, he made no secret of his wonderment at his own accomplishments. He liked to turn such phrases; at another point, he writes, "If we blind not ourselves with metaphysical dust we shall be led by a short and obvious way, to the acknowledgment of the great MAKER and GOUVERNOUR of all."25

  We are by now well into the eighteenth century, when the Enlightenment identified the search for knowledge as the highest form of human activity. It was a time for scientists to wipe the metaphysical dust from their eyes. There were no longer any inhibitions against exploring the unknown and creating the new. The great advances in the efforts to tame risk in the years before 1800 were to take on added momentum as the new century approached, and the Victorian era would provide further impulse.

  APPENDIX: AN EXAMPLE OF THE BAYESIAN SYSTEM OF STATISTICAL INFERENCE IN ACTION

  We return to the pin-manufacturing company. The company has two factories, the older of which produces 40% of the total output. This means that a pin picked up at random has a 40% probability of coming from the old factory, whether it is defective or perfect; this is the prior probability. We find that the older factory's defective rate is twice that found in the newer factory. If a customer calls and complains about finding a defective pin, which of the two factories should the manager call?

  The prior probability would suggest that the defective pin was most likely to have come from the new plant, which produces 60% of the total. On the other hand, that plant produces only one-third of the company's total of defective pins. When we revise the priors to reflect this additional information, the probability that the new plant made the defective pin turns out to be only 42.8%; there is a 57.2% probability that the older plant is the culprit. This new estimate becomes the posterior probability.
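The arithmetic behind these figures is a one-line application of Bayes' theorem; the exact fractions are 4/7 (about 57.1%) for the old plant and 3/7 (about 42.9%) for the new. A minimal sketch:

```python
# Prior: a random pin comes from the old factory with probability 0.40.
p_old = 0.40
p_new = 0.60

# The old factory's defect rate is twice the new factory's.  The actual
# rate cancels out of the ratio, so any placeholder value works.
d = 1.0
defect_rate_old = 2 * d
defect_rate_new = d

# Bayes' theorem: P(old | defective) is the old factory's share of all
# defective pins produced.
evidence = p_old * defect_rate_old + p_new * defect_rate_new
posterior_old = p_old * defect_rate_old / evidence
posterior_new = p_new * defect_rate_new / evidence

print(f"P(old factory | defective) = {posterior_old:.1%}")  # 57.1%
print(f"P(new factory | defective) = {posterior_new:.1%}")  # 42.9%
```

Seeing a defective pin shifts the odds from 40-60 in the new plant's favor to roughly 57-43 against the old plant, so the manager should call the old factory first.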

  During the last 27 years of his life, which ended at the age of 78 in 1855, Carl Friedrich Gauss slept only once away from his home in Gottingen.1 Indeed, he had refused professorships and had declined honors from the most distinguished universities in Europe because of his distaste for travel.

  Like many mathematicians before and after him, Gauss also was a childhood genius-a fact that displeased his father as much as it seems to have pleased his mother. His father was an uncouth laborer who despised the boy's intellectual precocity and made life as difficult as possible for him. His mother struggled to protect him and to encourage his progress; Gauss remained deeply devoted to her for as long as she lived.

  Gauss's biographers supply all the usual stories of mathematical miracles at an age when most people can barely manage to divide 24 by 12. His memory for numbers was so enormous that he carried the logarithmic tables in his head, available on instant recall. At the age of eighteen, he made a discovery about the geometry of a seventeen-sided polygon; nothing like this had happened in mathematics since the days of the great Greek mathematicians 2,000 years earlier. His doctoral thesis, "A New Proof That Every Rational Integer Function of One Variable Can Be Resolved into Real Factors of the First or Second Degree," is recognized by the cognoscenti as the fundamental theorem of algebra. The concept was not new, but the proof was.

  Gauss's fame as a mathematician made him a world-class celebrity. In 1807, as the French army was approaching Gottingen, Napoleon ordered his troops to spare the city because "the greatest mathematician of all times is living there."2 That was gracious of the Emperor, but fame is a two-sided coin. When the French, flushed with victory, decided to levy punitive fines on the Germans, they demanded 2,000 francs from Gauss. That was the equivalent of $5,000 in today's money and purchasing power-a heavy fine indeed for a university professor.* A wealthy friend offered to help out, but Gauss rebuffed him. Before Gauss could say no a second time, the fine was paid for him by a distinguished French mathematician, Marquis Pierre Simon de Laplace (1749-1827). Laplace announced that he did this good deed because he considered Gauss, 29 years his junior, to be "the greatest mathematician in the world,"3 thereby ranking Gauss a few steps below Napoleon's appraisal. Then an anonymous German admirer sent Gauss 1,000 francs to provide partial repayment to Laplace.

  Laplace was a colorful personality who deserves a brief digression here; we shall encounter him again in Chapter 12.

  Gauss had been exploring some of the same areas of probability theory that had occupied Laplace's attention for many years. Like Gauss, Laplace had been a child prodigy in mathematics and had been fascinated by astronomy. But as we shall see, the resemblance ended there. Laplace's professional life spanned the French Revolution, the Napoleonic era, and the restoration of the monarchy. It was a time that required unusual footwork for anyone with ambitions to rise to high places. Laplace was indeed ambitious, had nimble footwork, and did rise to high places.4

 
