Chances Are

by Michael Kaplan


  As science grew into a shared, international discipline, its focus shifted from the individual to the pluralistic. Astronomers and geographers were learning to collate and condense observations of many things by many people. One of the first great joint projects was the mass observation in 1761 of the transit of Venus—an attempt to use the many known ratios in the solar system to establish one absolute number: our distance from the sun. The measurement required simultaneous observations; on mountains and shores from Siberia to St. Helena, scholars stood by with their chronometers. Some found their long voyages made utterly useless by a passing cloud—but enough came up with results to make their disparity worth examining.

  The great mathematician and promoter of international science Leibniz had fretted that error compounds itself; the message of astronomical measurement appeared to be the opposite: error tends to balance out. This suggested that all observations, no matter how distant from the anticipated true value, should be added to a fair assessment; that there was a shape to error, which, if judiciously traced, would point out the hiding place of the goddess. The problem was: what shape?

  As we have seen, fitting curves to spots on a graph, approximating wild traces by combinations of more easily constructed household shapes, was the great mathematical obsession of the eighteenth and early nineteenth centuries. Laplace and his German contemporary Gauss were both adepts in this art—and both were practical astronomers, keen to correct the existing tables and rid the sky of error. As mathematicians, they knew how difficult it is to generate a curve to pass through a given set of points; as astronomers, though, they had a pretty clear idea of what sort of shape that curve should have: it should be symmetrical, since any single error was equally likely to be too big or too small. It should rise to a maximum, since readings ought to cluster around the truth, as the number of observations increases; it should drop quickly toward zero on either side, since few observations will differ grossly from the majority view. Laplace fiddled with a variety of mountain shapes to fit this requirement—downs, alps, volcanoes—but found the calculations needed to fit them to his data too complex. Gauss boldly started from what he wanted: a curve that would justify Tycho Brahe’s method. If the arithmetic mean of careful observations was the most probable “true” value, what curve would make that value most probable, while at the same time scattering error symmetrically around, with the least total deviation from the mean? How does Design relate to Chance? We remember: it relates through de Moivre’s Normal curve, the bell shape that reveals how answers to yes-or-no questions emerge through the number of times they are asked.
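  In modern notation (not Gauss's own), the curve he arrived at is the Normal density; a brief sketch, with μ standing for the true value and σ for the spread of error:

$$
\varphi(x) \;=\; \frac{1}{\sigma\sqrt{2\pi}}\,\exp\!\left(-\frac{(x-\mu)^{2}}{2\sigma^{2}}\right),
\qquad
\hat{\mu} \;=\; \arg\max_{\mu}\,\prod_{i=1}^{n}\varphi(x_i) \;=\; \frac{1}{n}\sum_{i=1}^{n}x_i .
$$

  Maximizing the joint probability of the observations over μ singles out the arithmetic mean, which is exactly the property Gauss demanded of his curve.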

  The best practical demonstration of how the Normal curve can govern error was an eccentric form of pinball machine devised by the Englishman Francis Galton. Called “the Quincunx,” this was a board studded with a diagonal arrangement of evenly spaced pins, through which a quantity of lead shot dropped from a small central chute, rattling down to an array of slots at the bottom.

  Think of each shot as an observation and each pin as a potential source of error. Starting at the center (which you can take as, say, the true position of the supernova in Cassiopeia), truth falls and strikes the first pin (your false nose slips); this could send it left or right, making your observation greater or less than the truth. Next, it hits another source of error: your assistant watching the pendulum is sleepy; he may miss a beat, sending your observation further off the beam; or he may snap awake and call too early, unknowingly sending it back toward the center.

  Does this sound familiar? Are you and the demon of error not actually playing repeated games of coin-toss? Is this not a binomial distribution, for which de Moivre’s Normal curve is the more easily calculated approximation? In fact, if you build and run a Quincunx (or go see the very large and satisfying one in the Paris science museum), you will find the slots at the bottom fill with shot in a perfect bell curve, with the highest point aligned under the true position of the chute and one or two stray pellets out at the tails.
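  A minimal simulation makes the point concrete (the function and its parameters below are illustrative, not Galton's specification): each pellet makes a left-or-right decision at every row of pins, so its final slot is a binomial count, and the tallies trace out de Moivre's bell.

```python
import random
from collections import Counter

def quincunx(rows: int = 12, pellets: int = 10_000) -> Counter:
    """Drop pellets through `rows` of pins; each pin knocks a pellet
    left or right with equal probability. The slot a pellet lands in
    is the number of rightward bounces -- a binomial count, which the
    Normal curve approximates."""
    slots = Counter()
    for _ in range(pellets):
        slot = sum(random.random() < 0.5 for _ in range(rows))
        slots[slot] += 1
    return slots

if __name__ == "__main__":
    tallies = quincunx()
    for slot in range(13):
        bar = "#" * (tallies[slot] // 50)
        print(f"{slot:2d} | {bar}")
```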

  The normal distribution of observational error gave scientists two useful tools: a way to postulate a true position for something, even though it had never been seen exactly there; and a way to gauge whether the mass of observations was behaving as one would expect—whether our fallibility was normal.

  The great power of orderly arrangements is that they allow you to see quickly if something’s missing; so, if error is subject to the laws of probability, then it must be significant if error behaves improbably. Observers will always err, but if they do so with a marked tendency one way or the other, there must be a cause. So, for example, the planet Neptune was discovered—not because some new sphere swam into our ken, entrancing the lone surveyor with its soft blue radiance—but because the error in observations of the orbit of Uranus was not normally distributed. There is a science to being wrong.
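  A sketch of that logic with invented numbers (nothing here is the real Uranus data): errors that are mere chance should scatter symmetrically about zero, while a hidden cause leaves a drift that the chance model cannot explain away.

```python
import random
import statistics

# Purely illustrative: residuals = observed minus predicted positions.
# Chance error alone should average out near zero; a hidden cause
# (here, a small constant pull) leaves a mean far outside what chance allows.
random.seed(1)
chance_only = [random.gauss(0.0, 1.0) for _ in range(500)]
with_hidden_cause = [random.gauss(0.0, 1.0) + 0.4 for _ in range(500)]

for label, residuals in [("chance only", chance_only),
                         ("hidden cause", with_hidden_cause)]:
    mean = statistics.mean(residuals)
    sem = statistics.stdev(residuals) / len(residuals) ** 0.5
    print(f"{label}: mean residual {mean:+.3f} "
          f"({mean / sem:+.1f} standard errors from zero)")
```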

  We should stop for a moment here to take note of a huge mental leap taken by Laplace and his contemporaries—one that, like an army passing through a town by night, was both momentous and surreptitious. Remember that de Moivre was talking about mechanisms of probability: games of chance. These have pre-existing rules of behavior that generate patterns of results. Laplace, however, was interested in guessing the rules, given the pattern of play: what’s called inverse probability. The connection between probability and inverse probability is a fraught one, with tensions persisting to the present day. Laplace, though, glided confidently from one to the other through the intervening medium of astronomy: because Newtonian mechanics fit the observed reality of the solar system so well and because so many astronomical events repeat without variation, the question of priority between rule and observation seemed moot. In a clockwork universe, there’s little difference between saying “The minute hand going around once makes the hour hand advance one step” and “since the hour hand has advanced one step, I conclude the minute hand has gone around once.” It’s a finesse that rarely works in more earthly matters; the fault is not in our stars, but in ourselves.

  Adolphe Quetelet was born in Ghent in 1796, a citizen of the Batavian Republic, a fictitious country invented by the French Revolution. He came of age in the Kingdom of the Netherlands, an equally fictitious construct of the counter-revolution that briefly and unsuccessfully amalgamated Belgium and Holland. His early love was art, but practical concerns soon persuaded him to teach mathematics and learn meteorology; he was eventually commissioned to head the Royal Observatory at Brussels. While he was in Paris in 1823, gathering instruments and techniques for the new institution, he came across Laplace’s methods for reducing error in observations. Their overlapping interest was the weather; Laplace had been trying to use the Normal curve to squeeze variation out of barometric observations and see if the moon caused tides in the atmosphere as it did in the ocean.

  The weather, though, would have to wait for Quetelet. Revolution and war once again overtook Belgium in 1830; even his half-built observatory became a temporary fortress. Quetelet, therefore, turned to social numbers for his data, taking the deluge of raw information gathered by newly powerful states as the equivalent of weather readings: definite facts without known causes. He started in a methodical way with physical measurements, taking as his first data a list of the chest-circumferences of 5,000 Scottish soldiers, which had been published in the Edinburgh Medical Journal. Looking at these uncommunicative inches, he made a mental leap that mirrored Galileo’s: If many people measuring one thing is like one person measuring it many times—perhaps measuring many examples of one thing is also like measuring the same thing many times. So instead of looking at these measurements individually, he considered them to be many varying observations of a type: The Scottish Soldier. And when plotted in these terms, lo! the measurements were distributed in a normal curve around a mean value of just below 40 inches. Suddenly there was a way to generalize about people, to bridge the philosophical gap between human nature and the mass. As individuals, we are as full of variation as an observation is full of error. As members of society, though, we approximate the mean.
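  A hedged sketch of Quetelet's move, using synthetic chest girths rather than the Edinburgh figures (the mean and spread below are assumptions): treat the soldiers as so many error-laden observations of one type, and set the tallies beside a fitted Normal curve.

```python
import math
import random
from collections import Counter

random.seed(2)
# Synthetic stand-in for the chest girths: a "type" just under 40 inches,
# blurred by individual variation (the spread of 2 inches is assumed).
chests = [round(random.gauss(39.8, 2.0)) for _ in range(5000)]

mean = sum(chests) / len(chests)
var = sum((x - mean) ** 2 for x in chests) / len(chests)
sd = math.sqrt(var)
counts = Counter(chests)

print(f"fitted type: {mean:.1f} in, spread {sd:.1f} in")
for girth in range(33, 47):
    observed = counts[girth]
    # Normal-curve prediction for a one-inch bin centred on this girth.
    expected = len(chests) * math.exp(-(girth - mean) ** 2 / (2 * var)) \
               / (sd * math.sqrt(2 * math.pi))
    print(f"{girth:2d} in: observed {observed:4d}  expected {expected:6.0f}")
```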

  It was Quetelet who gave us our ambivalent idea of “normal.” Looking at the rapidly growing stacks of publicly available data, he found the curve imposing its miraculous order on records of births, marriages, deaths, crimes, methods of crime, suicides, methods of suicide, and more. All humanity seemed huddled willy-nilly under the bell curve as it once sheltered under the cloak of the Virgin.

  He found that marriage increases in line with the price of grain; that your best chance of acquittal in a French court was to be female, over 30, well educated, charged with an offense against the person, and appearing in court of your own accord—and that lilacs were most likely to bloom in Brussels if the sum of the squares of the mean daily temperature since the last frost added up to (4,264°C)². He believed in and loved the regularity of averages because they clarified our otherwise baffling variety.
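  As a toy version of the lilac rule (the daily temperatures are invented, and the quoted figure is read here as 4,264 squared degrees, an interpretive assumption): accumulate the squared mean daily temperatures since the last frost and flag the day the running total crosses the line.

```python
# Invented spring warming since the last frost; only the accumulate-and-
# compare logic comes from the text, and the threshold interpretation is
# an assumption.
THRESHOLD = 4264.0  # sum of squared daily means, in squared degrees

daily_means = [2.0 + 0.3 * day for day in range(120)]  # °C, made up

running = 0.0
for day, temp in enumerate(daily_means, start=1):
    running += temp ** 2
    if running >= THRESHOLD:
        print(f"Lilacs predicted to bloom on day {day} after the last frost "
              f"(sum of squares = {running:.0f})")
        break
else:
    print("Threshold not reached in this record")
```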

  “What we call an anomaly deviates in our eyes from the general law only because we are incapable of embracing enough things in a single glance.” No one before had described us to ourselves in this way. The thought that, despite our impression of individual freedom, we were collectively subject to some higher law—that somewhere among us was a center of social gravity, plowing relentlessly through the ether of history, was both exhilarating and horrifying. How could our perceived free will be so illusory? Because we are subject to a mass of conflicting causes—habits, wants, social relations, economic circumstances. They pull us back and forth, but always gravitating toward the normal for our time and place. As with error, it is possible to take pigheaded obstinacy far out toward either tail of the curve—but those who do so are few and have none to follow them.

  Quetelet had begun with art—and it would be unfair to him to forget the aesthetic element in his view of the normal. Since the eighteenth century, art critics had been torn between a humiliating devotion to the antique and a hope that the modern could produce an art to surpass it. Those perfect bodies unearthed in Rome and Athens were at once a wonder and a reproach—but Quetelet had the solution: they were perfect because they were Platonic ideals representing Man without variation from the normal.

  I have endeavored to compare the proportions of the models, which, in the opinion of the artists of Paris, Rome, Belgium, and other places, united the most perfect graces of form; and I have been surprised to find how little variety of opinion exists, in different places, regarding what they concurred in terming the beautiful.

  So beauty is not what is in the eye of the beholder; it is innate in the normal. Alone among social phenomena, it has an absolute standard from country to country. In this respect, Quetelet is still with us: the Body Mass Index, the universal standard for obesity, is his invention.
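  The index survives in exactly this computational form; a minimal sketch of the standard definition, with made-up example figures (the function name is mine):

```python
def body_mass_index(weight_kg: float, height_m: float) -> float:
    """Quetelet's index: weight in kilograms divided by height in metres squared."""
    return weight_kg / height_m ** 2

# Illustrative figures only.
print(f"BMI: {body_mass_index(70.0, 1.75):.1f}")  # -> 22.9
```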

  Gradually, we are beginning to see where the idea of “perfectly normal” comes from. Indeed, Quetelet’s apotheosis of the normal went even further:

  An individual who should comprise in himself (in his own person), at a given period, all the qualities of the average man, would at the same time represent all which is grand, beautiful, and excellent. . . . It is in this manner that he is a great man, a great poet, a great artist. It is because he is the best representative of his age, that he is proclaimed to be the greatest genius.

  We may be seeing here the heroic age of the bourgeoisie: farewell to the lone Romantic monster, perched on his cliff and daring the lightning to strike him! The true hero embodies the spirit of all, shuns extremes, achieves consensus (and, possibly, wears spectacles, carries an umbrella, and dozes after dinner while his daughter practices the pianoforte). To be bourgeois, of course, means to be both complacent and afraid; Quetelet, who had seen the revolutionaries wreck his observatory, had reason to fear the extreme.

  In Quetelet we see the original of the Eurocrat: a liberal in believing that society had its own momentum, and primarily interested in legislation as a means of smoothing out local perturbations, avoiding disorder and social turmoil. Individual freedom, although desirable, should not include a right to reject the average: that would be ignoring the laws of social physics.

  At the same time, Quetelet was a firm believer in perfectibility: the effect of wealth and civilization was to tighten society’s curve, bringing its outer limits closer and closer to the mean. The frightening, irrational extremes would destroy themselves, and we would all come to embody l’homme moyen, the mass individual who represented our collective spirit; the great poet with the ideal body of our time and place.

  Unlike the many utopian schemers of his time, Quetelet did not think his new day would dawn automatically. History was not an ineluctable Germanic process, the Idea lumbering toward Realization; it was a human science, in which our self-awareness was vital. All we needed were more facts. His great message to humanity was: Gather data! Know yourselves! “I consider this work as but a sketch of a vast plan, to be completed only by infinite care and immense researches.”

  Quetelet’s two big insights—statistical stability and the normal distribution of social phenomena—remained unproven, despite a lifetime’s passionate work of gathering, tabulating, and tracing. Yet this was exactly what assured the spread of his ideas: the ease with which they could be used to explain anything and the comfort of knowing that those explanations could not yet be falsified.

  Burdened as we are with self-consciousness, it is natural that humans should constantly ask: “How are we doing?” The oldest comprehensible writing, Linear B, is an inventory; Moses numbered the children of Israel; the Iliad lists the ships of the Achaeans; Caesar Augustus sent out his decree of census; Domesday took account of every pig in the kingdom. Gathering data with clarity and accuracy is by no means a modern phenomenon; one could even say that the true mark of the Dark Ages was its inability to keep lists.

  There is, however, a big difference between accounting and inference, having a list and using it. Double-entry bookkeeping gave Renaissance merchants a way to assess business continuously, gauging the total state of their fortunes as on the day of reckoning; it operates “as if,” creating an instantaneous, fictional balance of assets and liabilities. A similar treatment of social data had to wait until 1662, when John Graunt, draper of London, published his Natural and Political Observations upon the Bills of Mortality. In the same decade that mathematical probability arrived, in the work of Pascal, statistics appeared—like its twin planet.

  London’s weekly Bills of Mortality were an artifact of the city’s susceptibility to plague. They were compiled parish by parish and stated how many babies had been christened, how many people had died, and—as far as the authorities could determine—what people had died of. The problems with the Bills of Mortality as a data set were numerous: they covered only members of the Church of England; they listed only burials in parish graveyards; and although they listed an impressive variety of causes of death, from bursting to lethargy, classification was left to ignorant and ill-paid “searchers,” whose diagnostic skill was not even up to the standards of contemporary doctors.

  Graunt’s simplest goal was to estimate the population of his city: to draw from its mortality an accurate sense of its vitality. He began with the 12,000 recorded christenings every year. Graunt estimated one christening for every two years of a woman’s childbearing life, so there should be some 24,000 women of childbearing age. He guessed that there were twice as many married women as childbearing women—so, 48,000 families; and assumed that each family (counting children, servants, and lodgers) would have eight members: London’s population was therefore roughly 384,000.
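  Graunt's chain of estimates can be written out directly; a short sketch reproducing the arithmetic above (the variable names are mine):

```python
christenings_per_year = 12_000
# One christening for every two years of a woman's childbearing life.
childbearing_women = christenings_per_year * 2          # 24,000
# Twice as many married women as childbearing women -> one family each.
families = childbearing_women * 2                        # 48,000
# Eight members per family, counting children, servants and lodgers.
population = families * 8                                # 384,000
print(f"Estimated population of London: {population:,}")
```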

  “Estimate,” “guess,” “assume”—these words are never far away in social statistics. The challenge from the very beginning was to find ways to reduce error. Graunt did this using two very modern techniques: sampling and confirmation from unrelated data. He took three representative parishes and actually counted the number of families in them along with the number of deaths per family, to come up with a ratio of three deaths for every eleven families: families/deaths = 11/3. Multiplying the total number of deaths in the Bills of Mortality by 11/3 gave a figure of 47,666 families for the whole city—a good fit to his previous estimate. He also looked at the map and counted the number of families in a 100-yard square in London’s most uniformly settled area: the city within the walls. He multiplied his figure of 54 families by the 220 squares in this walled city to get a figure of 11,880 families, then checked the Bills of Mortality to discover that the parishes within the walls accounted for a quarter of all deaths in London. 11,880 × 4 = 47,520. Graunt’s estimate fits in three dimensions: he had found a vital way to rid numbers of error by cross-examining them.
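  The cross-checks are equally mechanical; a sketch using only the figures given in the text:

```python
# Check 2: map counting. 54 families per 100-yard square, and 220 such
# squares within the walls.
families_within_walls = 54 * 220                 # 11,880
# The walled parishes account for a quarter of all deaths in London.
families_whole_city = families_within_walls * 4  # 47,520

estimates = {
    "christenings chain": 48_000,
    "deaths x 11/3 (as quoted)": 47_666,
    "map count x 4": families_whole_city,
}
for method, figure in estimates.items():
    print(f"{method:>26}: {figure:,} families")
```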

  Graunt lost his stock-in-trade in the Great Fire, and subsequently became bankrupt, Catholic, and dead in short order—but not before leaving us two further types of information on which great pyramids of industry and speculation have since been built: the mortality table and the odd discrepancy in human births.

  Children rarely burst or succumb to lethargy; old people rarely die of thrush, convulsions, or being “Overlaid” by their parents. Distributing these causes of death to their proper ages and assuming a constant rate of risk through life for the expected adult diseases, Graunt devised a table of the number of survivors from a random group of 100 Londoners at ages from 0 to 76. The 64 (only 64!) six-year-olds playing at pitch-and-toss in the narrow street or dawdling to their lessons became the 40 who married at sixteen in their half-built Wren parish church, the 25 who brought their first-born for christening (if it had not been Overlaid), and the 16 who, in the prime of life, ran the shop and business inherited from their parents. By the age of 56, six of them occasionally met at the feasts of their trade or at its elections; three, never the best of friends, remained at 66 to complain about the young, and one at 76 sat by the fire, a pipkin of gruel on his knees, as the lethargy crept upon him.
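  The figures in this passage line up as a simple survival table; a sketch using only the numbers mentioned above (the ages 26 and 36 are implied by the narrative rather than stated):

```python
# Survivors from an initial 100 Londoners at the ages the passage walks
# through (26 and 36 are inferred from the narrative).
survivors = {0: 100, 6: 64, 16: 40, 26: 25, 36: 16, 56: 6, 66: 3, 76: 1}

ages = list(survivors)
for earlier, later in zip(ages, ages[1:]):
    died = survivors[earlier] - survivors[later]
    print(f"ages {earlier:2d}-{later:2d}: {survivors[earlier]:3d} -> "
          f"{survivors[later]:3d} ({died} deaths)")
```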

 
