Chances Are

by Michael Kaplan


  His company has been in business since 1863, two years after a great fire in Glarus revealed to the Swiss, as London’s had to the English, how financially insecure the physical world could be. “We’ve been in business for some time, it’s true,” says Mr. Hess, “but even more than a century is not such a long period to compare risks and be sure of probability. Catastrophe losses—hurricanes, earthquakes—are getting bigger; but is that because these phenomena are worse or because there are more people and more of them buy insurance? Think of how much has been built in Florida since 1920: all in the path of hurricanes and all insured. The probability of disaster may not change, but the exposure does.”

  The best diversification plan balances risks as they are perceived at the moment—but what if one force is driving all risks in the same direction? “Climate change is reality. There will be effects on living conditions, agriculture, business—it’s certain. It’s not just catastrophe cover: conditions could change for health insurance or credit insurance—all because temperatures rise,” says Mr. Hess. “Global change is just that—global. These are not independent events; we can’t completely diversify between them. That’s why we prefer yearly contracts—if you can’t quantify risk over the long term, don’t insure it over the long term.”

  The independence of risks is an a posteriori matter, derived from observation—but human action can suddenly combine what had seemed discrete. Life, aviation, and building insurance were considered separate lines of business with low correlation between them. Even in the same country, they counted as diversified risks, helping to balance out the reinsurer’s portfolio—until the morning of September 11, 2001, when all three came together with such horrible results. Three thousand lives; $45 billion; the biggest insurance loss in history. The insurance industry’s assumptions had been shaped by limited experience. What it defined as the total loss of a skyscraper by fire was simply damage to the interior of ten floors: the worst that could happen before the firefighters put out the flames. Insurance was priced on those assumptions—no one thought total loss could mean . . . total loss.

  We may talk of things as simply happening, obeying their own laws—but our own involvement changes the conditions so radically that we would be far more accurate talking about “beliefs” rather than “events,” and “degrees of certainty” rather than “degrees of likelihood.” As the man from Swiss Re says: “Reality is never based solely on the probable; it is often based on the possible and, time and time again, on that which was not even perceived to be conceivable beforehand.” Probability, once applied to the human world, ceases to be the study of occurrence; it becomes the study of ourselves.

  6

  Figuring

  Where is the Life we have lost in living?

  Where is the wisdom we have lost in knowledge?

  Where is the knowledge we have lost in information?

  —T. S. Eliot, The Rock

  It’s a familiar anxiety: sitting on a chair either too small or too hard, we await the expert’s assessment, trying to read the results upside down from the clipboard. A medical test, a child’s exams, an employment profile: hurdles that, once leapt, allow life to continue. Then come the magic words—“perfectly normal”—bringing with them an inward sigh of relief.

  Perfectly normal? The phrase is a modern cliché but also, when examined closely, a very odd idea. What makes the normal perfect? Do all kinds of normality imply perfection? Would you be as relieved if your health, child, or employment prospects were described as “perfectly mediocre”? And yet the two words, mathematically, are the same.

  Normal is safe; normal is central; normal is unexceptional. Yet it also means the pattern from which all others are drawn, the standard against which we measure the healthy specimen. In its simplest statistical form, normality is represented by the mean (often called the “average”) of a group of measurements: if you measure, say, the height of everyone on your street, add up all the heights, and then divide by the number of people, you will have a “normal” height for your street—even if no particular neighbor is exactly that tall. Normality can also be thought of as the highest point on de Moivre’s bell curve: we saw how, given enough trials, events like rolling a die “normally” represent their inherent probability. Normality, in modern society, stands for an expectation: the measure of a quality that we would consider typical for a particular group; and, since we naturally seek to belong, we have elevated that expectation to an aspiration. Man is born free but is everywhere on average.
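The "normal" height described above is nothing more than the arithmetic mean, and the paradox at the end of the passage is easy to demonstrate: the mean of a street need not match any actual neighbor. A minimal sketch (the street and its heights are invented for illustration):

```python
# Compute the "normal" (mean) height for a street, as described above.
# The heights are invented for illustration; they are not real data.
heights_cm = [158, 163, 170, 171, 175, 180, 184]

mean_height = sum(heights_cm) / len(heights_cm)
print(round(mean_height, 1))  # the "normal" height for this street: 171.6

# No particular neighbour is exactly that tall:
print(mean_height in heights_cm)  # False
```

The point survives the arithmetic: normality here is a property of the group, not of any individual in it.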

  Society recognizes five basic qualitative distinctions: gender, nationality, skin color, employment, and religion (some might add sexual orientation; others insist on class). Almost everything else we measure numerically, basing our sense of what’s normal on the distribution curve generated from quantified observations repeated over time or across a population—all phrases taken from statistics. This means the normal can drift: without having changed ourselves, we can find we are no longer average (much as the normal Floridian, born Latino, dies Jewish). The United Kingdom Census for 2001, for instance, tells us that 40 percent of children are born to single mothers, a great leap from the qualitative expectations of fifty years ago: working dad, housewife mom, couple of kids—all white. The same census also reveals that 390,000 people state their religion as “Jedi.”

  There is a temptation to think of this numerical approach as inevitable: we have social statistics because that is the scientific way to do things. In fact, the advance of statistics into the territory of human affairs—the invention of the social sciences—is a much more human than scientific story, based on human qualities: the force of self-confidence, the delight in order, and the susceptibility to undelivered promises.

  The Enlightenment took as its foundation the idea that Nature (its very capitalization is an Enlightenment trope) established just proportion in all things, from the most distant star to the most distinguished sentiment. As Alexander Pope said in the Essay on Man:

  All nature is but art unknown to thee,

  All chance, direction which thou canst not see;

  All discord, harmony not understood;

  All partial evil, universal good;

  And, spite of pride, in erring reason’s spite,

  One truth is clear, Whatever is, is right.

  Nothing true could be random. Belief in chance was a form of vulgar error, despised by the pious because it cheapened Providence, and by the skeptical because it denied Reason. Whether God’s or Newton’s laws took priority, things didn’t simply happen. Even David Hume, usually so suspicious of accepted principles, said: “It is universally allowed that nothing exists without a cause of its existence, and that chance, when strictly examined, is a mere negative word, and means not any real power which has anywhere a being in nature.”

  But if Nature and Reason were not random, where did this leave people? If the stars in their courses proclaim a mighty order, why were human affairs so messy? “Everything must be examined,” exclaimed Diderot in the Encyclopédie; “everything must be shaken up, without exception and without circumspection. We must ride roughshod over all these ancient puerilities, overturn the barriers that reason never erected, give back to the arts and sciences the liberty that is so precious to them.” The Enlightenment glowed with outrage at the indefensible. Its battle was against superstition, against prejudice, and against tradition—against, that is, the qualitative tradition of the Middle Ages. Once this was exploded, natural Newtonian proportions would reappear in human affairs: people would enjoy the protection of the state without its interference; laws would regulate but not compel; education would invite minds to explore the pleasures of science, not beat Latin into them with a rod. The free, virtuous yet pleasurable lives that French writers ascribed to Persians, Chinese, Indians, and the amorous Polynesians would appear just as naturally at home. “It will come! It will assuredly come!” cried Lessing. As the inevitable drew near, France was clearly the place where the question of human nature would move from a matter of philosophical speculation to one of political urgency.

  One person who saw the inevitable coming with the impatient joy of the dawn watcher was Marie-Jean-Antoine-Nicolas de Caritat, Marquis de Condorcet. A model of impulsive, heartfelt humanity, with the face of a sensitive boxer under his peruke, Condorcet had abandoned Christianity but retained its force of emotion and desire for certainty. He resisted being confined to any one specialty: mathematics, law, literature, biography, philosophy, and social improvement all called to him.

  Condorcet was certain that there could be a moral physics. All we lacked were facts; he was sure this deficit would be made up soon—but then hope was the essence of his nature:

  Those sciences, created almost in our own days, the object of which is man himself, the direct goal of which is the happiness of man, will enjoy a progress no less sure than that of the physical sciences; and this sweet idea—that our nephews will surpass us in wisdom as in enlightenment—is no longer an illusion.

  He had read and absorbed Laplace’s ideas on probability as early as 1781; in particular, he was interested in applying these laws to the criminal courts. His warm heart was wrung at the thought of the innocent being condemned—as happened far too often in a capricious and distorted legal system. France had no tradition of common-law rights, but it had a surplus of mathematical genius—so could there be a calculus to prevent injustice?

  Condorcet’s method was to redesign tribunals as equations, balancing the fraction of individual liberty against authority. He estimated the maximum risk of someone’s being convicted wrongly and tried to set it equal to a risk that all would accept without a second thought (he chose the example of taking the Calais-Dover packet boat). This should represent maximum allowable error, to which he applied the number of judges, their individual “degree of enlightenment,” and therefore the minimum plurality to guarantee that margin of safety. He sent the results to Frederick the Great of Prussia, who represented, before the Revolution, the only hope a liberal Frenchman could have of seeing his ideas put into practice.
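The kernel of Condorcet's tribunal arithmetic survives in what is now called the Condorcet jury theorem. A hedged, modern restatement, not his actual figures: if each of n independent judges has a "degree of enlightenment" p (the probability of judging correctly), the chance that a plurality reaches the right verdict is a binomial tail sum, and it climbs toward certainty as judges are added.

```python
from math import comb

def majority_correct(n, p):
    """Probability that a majority of n independent judges, each
    correct with probability p, reaches the right verdict.
    A modern restatement of Condorcet's jury calculus; the values
    of n and p below are illustrative, not historical."""
    k_min = n // 2 + 1  # smallest winning plurality
    return sum(comb(n, k) * p**k * (1 - p)**(n - k)
               for k in range(k_min, n + 1))

# Judges only slightly better than chance still yield a reliable
# tribunal, provided there are enough of them:
print(majority_correct(3, 0.6))   # 0.648
print(majority_correct(25, 0.6))  # ~0.85
```

Condorcet's own problem ran the calculation in reverse: fix the acceptable risk of wrongful conviction (his Calais-Dover packet boat), then solve for the minimum plurality that keeps the tribunal inside that margin.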

  Four years later, the long-awaited cleansing deluge washed over France. To begin with, all happened as the men of reason would have wished. The Tennis Court Oath, at which the representatives of all those neither noble nor clerical vowed to remain in permanent session until France had a constitution, was an event so pure, so ancient-Roman, that David’s first sketch of it showed all participants in the nude.

  But then the people rose again—this time to slaughter the prisoners in Paris jails and sack the Archbishop’s palace, throwing his library of precious medieval manuscripts into the Seine. The reaction of the leaders of the Revolution was that combination of guilt and outrage felt by overindulgent parents whose children have trashed the house. Human nature clearly needed more than freedom; it needed rules and responsibilities.

  As the Republic of Virtue slid into the Reign of Terror, Condorcet, too, followed the sinuous path of hope and despair. At first, the Revolution gave him the chance to see his ideas put into practice: he wrote constitutions, declarations to the nations of the world, plans for universal education—indeed, the very plans which now form the basis of the French system (including the École Normale—“normal,” in this case, meaning perfect, establishing the norm for all to follow). Soon, however, his position became less secure: his natural independence, his moral rigor, and his staunch resistance to violence all served to isolate him. In July 1793, he was accused of conspiracy and outlawed.

  In hiding, Condorcet continued to send helpful suggestions to the very Committee of Public Safety that had condemned him. He wrote an arithmetic textbook for his planned national schools and sketched the development of humanity in ten stages from the eras of darkness and superstition to the age of freedom and enlightenment that was just about to begin. He foresaw a universal scientific discourse, “sweeping away our murderous contempt for men of another color or language,” with government by charts, whose arrangement of isolated facts would point to general consequences. All enemies would soon be reconciled, as mankind advanced ever closer toward the limit of his function of development.

  Worrying that his presence was compromising his friends, Condorcet fled Paris, only to be caught in a rural inn because, though claiming to be a carpenter, he had soft hands, a volume of Horace in his pocket, and believed you needed a dozen eggs to make an omelette. He died that night in prison.

  Qualities had caught out this early statistician. He could not survive within the mass, the people, because he did not know his station. He had failed at being normal.

  Meanwhile, the people were also changing their role. No longer the sovereigns of the Revolution, they were about to become its fodder. Pressed by foreign foes, the revolutionary government took an unprecedented step: recruitment of the entire country—the levée en masse.

  Within six months, more than three-quarters of a million men were under arms. The cellars in which generations of Frenchmen had discreetly urinated were dug up to extract the saltpeter from their soil for gunpowder; even the Bayeux tapestry was sliced up to make wagon covers; the pieces were rescued just in time by a sharp-eyed passer-by.

  War was to occupy, with only the briefest of intervals, the next twenty-two years. Danger required action on the largest scale; and action on a large scale requires the ability to handle numbers with at least six digits. Almost unwittingly, through force of circumstance, France left the Age of Reason and entered the Age of Statistics.

  At his zenith, Napoleon controlled a population of 83 million and an army that could muster 190,000 men for a single battle—and he exercised much of that control himself. During his career as Emperor, he wrote or dictated more than 80,000 letters. It was he who, from a tent at Tilsit, specified the improvement of the Paris water supply; it was he who, trusting Probability over Providence, ordered lightning rods for St. Peter’s in Rome. Three hundred young men, his auditeurs, had the task of gathering information, and reporting directly to him.

  As an artillery officer, Napoleon understood numbers; and as a field general, he understood supply. Facts captivated him, and numbers gave him control over facts. “If you want to interest him,” said a contemporary, “quote a statistic.” The uniformity in society established by the Revolution in the name of fraternity became a uniformity of administration, the lever with which Napoleon—or any government—could move the world. Hydra-headed, multifarious, legion—it is significant that philosophers have used the same adjectives to castigate the unthinking mass of people and the equally varied and dangerous body of error. Truth, in this image, is singular, individual, and unchanging, as remote from our murky earthbound struggles as a bright star beaming from a cold sky.

  In 1572, a supernova flared up in the constellation Cassiopeia, and, in the three years of its unusual brightness, illuminated the nature of error. The question was whether this new star was nearer the Earth than is the moon—an assumption made necessary because it was well known that the fixed stars were eternal and unchanging, so new lights must be on some inner sphere along with comets and other untidy visitors. The controversy was so great that Galileo himself joined in. But his contribution marked a distinct change from the previous view of error, which had been that error is deductively wrong, error is sinful. He pointed out that even “with the same instruments, in the same place, by the same observer who has repeated the observation a thousand times,” there would be a variation in results. Now, two honest people looking at something sufficiently distant is not actually very different from one person looking at it twice. And if one person makes an error, it is as likely to be above as below the “true value”; so opposite variations in observation between different scholars are not necessarily a sign that either is a knave or a fool. A wise man will look at the totality of measurements and see how they cluster; and the more measurements, the more likely they are to assemble around the unseen but perfect truth.

  Science now had license to become right through error: in Polonius’s phrase, “by indirections find directions out.” It was like diving off that rocky perch on which the medieval mind had imagined the perfect to stand—and striking out into the stream, working with the flow to find the middle channel.

  The Danish astronomer Tycho Brahe had also seen the new star: his publication De nova stella attracted the generosity of the king, who gave him an island in the Baltic and the means to found the first specialist observatory since the days of Ptolemy. The business done here was the making of tables, generating solid data based on painstaking observation to check the excesses of theory.

  Tycho had lost his nose in a duel over a since-forgotten mathematical dispute, and wore a silver replacement. It is only to be expected that, in the freezing Baltic nights, a slight nasal instability might have corrupted his observations; and that would be only one of many possible sources of error. So if Galileo’s answer to error was to combine observations, how—thought Tycho—should they be combined?

  He averaged his data, and did it in a clever and conscious way: first taking the average (arithmetic mean) of paired readings made under similar conditions, then averaging pairs of averages, then taking the average of the whole group to produce an assumed true reading. The results justified the procedure: Brahe’s six-year observation of the ascension of Alpha Arietis in relation to Venus had generated readings that varied by as much as 16’30”, but averaging produced a result only 15” out from modern calculations—a more accurate reading than could be made in one observation with any instrument of the time.
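Tycho's procedure, averaging paired readings, then pairs of averages, then the whole group, can be sketched as follows. The readings are invented, not his data; note that with equal-sized pairs the hierarchy reproduces the flat mean, so the real value of his scheme lay in grouping observations made under similar conditions before combining them.

```python
def pairwise_average(values):
    """Average adjacent pairs of readings, as Tycho did with
    observations made under similar conditions."""
    return [(a + b) / 2 for a, b in zip(values[::2], values[1::2])]

# Invented readings (deviations in arcseconds) -- not Tycho's actual data.
readings = [12.0, 14.0, 11.0, 15.0, 13.0, 13.0, 10.0, 16.0]

stage1 = pairwise_average(readings)       # averages of paired readings
stage2 = pairwise_average(stage1)         # averages of those averages
assumed_true = sum(stage2) / len(stage2)  # average of the whole group

print(stage1)        # [13.0, 13.0, 13.0, 13.0]
print(assumed_true)  # 13.0 -- equals the flat mean of all eight readings
```

Individual readings scatter by as much as six units here, yet the combined value is stable: the same effect that let Tycho beat the precision of any single instrument of his day.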

 
