The best way to determine whether changes in stock prices are in fact independent is to find out whether they fall into a normal distribution. Impressive evidence exists to support the case that changes in stock prices are normally distributed. That should come as no surprise. In capital markets as fluid and as competitive as ours, where each investor is trying to outsmart all the others, new information is rapidly reflected in the price of stocks. If General Motors posts disappointing earnings or if Merck announces a major new drug, stock prices do not stand still while investors contemplate the news. No investor can afford to wait for others to act first. So they tend to act in a pack, immediately moving the price of General Motors or Merck to a level that reflects this new information. But new information arrives in random fashion. Consequently, stock prices move in unpredictable ways.
Interesting evidence in support of this view was reported during the 1950s by Harry Roberts, a professor at the University of Chicago.14 Roberts drew random numbers by computer from a series that had the same average and the same standard deviation as price changes in the stock market. He then drew a chart showing the sequential changes of those random numbers. The results produced patterns that were identical to those that stock-market analysts depend on when they are trying to predict where the market is headed. The real price movements and the computer-generated random numbers were indistinguishable from each other. Perhaps it is true that stock prices have no memory.
The accompanying charts show monthly, quarterly, and annual percentage changes in the Standard & Poor's Index of 500 stocks, the professional investor's favorite index of the stock market. The data run from January 1926 through December 1995, for 840 monthly observations, 280 quarterly observations, and 70 annual observations.*
Although the charts differ from one another, they have two features in common. First, as J.P. Morgan is reputed to have said, "The market will fluctuate." The stock market is a volatile place, where a lot can happen in either direction, upward or downward. Second, more observations fall to the right of zero than to the left: the stock market has gone up, on the average, more than it has gone down.
The normal distribution provides a more rigorous test of the random-walk hypothesis. But one qualification is important. Even if the random walk is a valid description of reality in the stock market-even if changes in stock prices fall into a perfect normal distribution-the mean will be something different from zero. The upward bias should come as no surprise. The wealth of owners of common stocks has risen over the long run as the economy and the revenues and profits of corporations have grown. Since more stock-price movements have been up than down, the average change in stock prices should work out to more than zero.
In fact, the average increase in stock prices (excluding dividend income) was 7.7% a year. The standard deviation was 19.3%; if the future will resemble the past, this means that two-thirds of the time stock prices in any one year are likely to move within a range of +27.0% and -12.1%. Although only 2.5% of the years-one out of forty-are likely to result in price changes greater than +46.4%, there is some comfort in finding that only 2.5% of the years will produce bear markets worse than -31.6%.
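These bands follow directly from the mean and the standard deviation. As a quick check, here is a minimal Python sketch using the figures just quoted; small differences from the text's rounded numbers reflect rounding in the underlying data.

```python
# A back-of-the-envelope check of the bands quoted above, using the
# reported annual mean of 7.7% and standard deviation of 19.3%.
mean, sd = 7.7, 19.3  # annual % change, S&P 500, 1926-1995, ex dividends

one_sigma = (mean - sd, mean + sd)          # ~two-thirds of years
two_sigma = (mean - 2 * sd, mean + 2 * sd)  # ~95% of years

print(f"1-sigma band: {one_sigma[0]:+.1f}% to {one_sigma[1]:+.1f}%")
print(f"2-sigma band: {two_sigma[0]:+.1f}% to {two_sigma[1]:+.1f}%")
```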
Stock prices went up in 47 of the 70 years in this particular sample of history, or in two out of every three years. That still leaves stocks falling in 23 of those years; and in 10 of those 23 years, or nearly half, prices plummeted by more than one standard deviation-by more than 12.1%. Indeed, losses in those 23 bad years averaged -15.2%.
[Three charts: distributions of monthly, quarterly, and annual percentage changes in the Standard & Poor's Index of 500 stocks, January 1926 through December 1995.]
Note that the three charts have different scales. The number of observations, shown on the vertical scales, will obviously differ from one another-in any given time span, there are more months than quarters and more quarters than years. The horizontal scales, which measure the range of outcomes, also differ, because stock prices move over a wider span in a year than in a quarter and over a wider span in a quarter than during a month. Each number on the horizontal scale measures price changes between the number to the left and that number.
Let us look first at the 840 monthly changes. The mean monthly change was +0.6%. If we deduct 0.6% from each of the observations in order to correct for the natural upward bias of the stock market over time, the average change becomes +0.00000000000000002%, with 50.6% of the months plus and 49.4% of the months minus. The first quartile observation, 210 below the mid-point, was -2.78%; the third quartile observation, 210 above the mid-point, was +2.91%. The symmetry of a normal distribution appears to be almost flawless.
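The drift correction and the quartile check are straightforward to reproduce. In the sketch below, simulated data stands in for the actual 840 monthly changes, which are not reproduced here; the mechanics are the same for the real series.

```python
import numpy as np

# Simulated stand-in for the 840 monthly percentage changes (the real
# series is not reproduced here); mean 0.6%, standard deviation 5.8%.
rng = np.random.default_rng(0)
monthly_changes = rng.normal(0.6, 5.8, 840)

# Deduct the mean to correct for the market's upward drift.
demeaned = monthly_changes - monthly_changes.mean()

print("mean after correction:", demeaned.mean())   # ~0, up to rounding error
print("share of months up:", (demeaned > 0).mean())
q1, q3 = np.percentile(demeaned, [25, 75])
print(f"quartiles: {q1:+.2f}%, {q3:+.2f}%")        # roughly symmetric if normal
```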
The random character of the monthly changes is also revealed by the small number of runs-of months in which the stock market moved in the same direction as in the preceding month. A movement in the same direction for two months at a time occurred only half the time; runs as long as five months occurred just 9% of the time.
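Tallying runs is equally mechanical. Here is a short sketch, where a run is taken to be a stretch of consecutive months whose changes share the same sign:

```python
import itertools

def run_lengths(changes):
    """Lengths of consecutive runs of same-direction monthly moves."""
    signs = [c > 0 for c in changes]
    return [len(list(group)) for _, group in itertools.groupby(signs)]

# Toy series: up, up, down, up, up, up, down -> runs of 2, 1, 3, 1 months
print(run_lengths([1.2, 0.3, -0.5, 2.0, 0.1, 0.4, -1.1]))  # [2, 1, 3, 1]
```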
The chart of monthly changes does have a remarkable resemblance to a normal curve. But note the small number of large changes at the edges of the chart. A normal curve would not have those untidy bulges.
Now look at the chart of 280 quarterly observations. This chart also resembles a normal curve. Nevertheless, the dispersion is wide and, once again, those nasty outliers pop up at the extremes. The 1930s had two quarters in which stock prices fell by more than a third-and two quarters in which stock prices rose by nearly 90%! Life has become more peaceful since those woolly days. The quarterly extremes since the end of the Second World War have been in the range of +25% to -25%.
The average quarterly change is +2.0%, but the standard deviation of 12.1% tells us that +2.0% is hardly typical of what we can expect from quarter to quarter. Forty-five percent of the quarterly changes were below the average of 2.0%, while 55% were above it.
The chart of 70 annual observations is the neatest of the three, but the scaling on the horizontal axis of the chart, which is four times the size of the scaling on the quarterly chart, bunches up a lot of large changes.
The differences in scales are not just a technical convenience to make the different time periods comparable to one another on these three charts. The scales tell an important story. An investor who bought and held a portfolio of stocks for 70 years would have come out just fine. An investor who had expected to make a 2% gain each and every three-month period would have been a fool. (Note that I am using the past tense here; we have no assurance that the past record of the stock market will define its future.)
So the stock-market record has produced some kind of resemblance to a random walk, at least on the basis of these 840 monthly observations, because data would not appear to be distributed in this manner around the mean if stock-price changes were not independent of one another-like throws of the dice. After correction for the upward drift, changes were about as likely to be downward as upward; sequential changes of more than a month or so at a time were rare; the volatility ratios across time came remarkably close to what theory stipulates they should have been.
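The "volatility ratios" here refer to the square-root-of-time rule: if period-to-period changes are independent, the standard deviation over t periods should be √t times the one-period standard deviation. A quick sketch, using the monthly standard deviation of 5.8% reported in the next paragraph:

```python
import math

# If monthly changes are independent, volatility should scale with the
# square root of time: sd over t months = monthly sd * sqrt(t).
monthly_sd = 5.8

print(f"implied quarterly sd: {monthly_sd * math.sqrt(3):.1f}%")   # ~10.0%
print(f"implied annual sd:    {monthly_sd * math.sqrt(12):.1f}%")  # ~20.1%
# Compare with the sample figures quoted earlier: 12.1% quarterly,
# 19.3% annual.
```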
Assuming that we can employ Jacob Bernoulli's constraint that the future will look like the past, we can use this information to calculate the risk that stock prices will move by some stated amount in any one month. The mean monthly price change in the S&P table was 0.6% with a standard deviation of 5.8%. If price changes are randomly distributed, there is a 68% chance that prices in any one month will change by no less than -5.2% or by no more than +6.4%. Suppose we want to know the probability that prices will decline in any one month. The answer works out to 45%-or a little less than half the time. But a decline of more than 10% in any one month has a probability of only 3.5%, which means that it is likely to happen only about once every thirty months; moves of 10% in either direction will show up about once in every fifteen months.
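These probabilities come straight off the normal curve. A sketch using scipy's normal distribution with the monthly mean and standard deviation quoted above:

```python
from scipy.stats import norm

mean, sd = 0.6, 5.8  # monthly % change, S&P 500, 1926-1995

p_decline = norm.cdf(0, loc=mean, scale=sd)     # any decline at all
p_big_drop = norm.cdf(-10, loc=mean, scale=sd)  # decline of more than 10%

print(f"P(decline in a month):    {p_decline:.1%}")   # ~45.9%
print(f"P(drop of more than 10%): {p_big_drop:.1%}")  # ~3.4%
print(f"i.e., roughly once every {1 / p_big_drop:.0f} months")
```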
As it happens, 33 of the 840 monthly observations, or about 4% of the total, were more than two standard deviations away from the monthly average of +0.6%-that is, worse than -11.0% or better than +12.2%. Although 33 superswings are fewer than we might expect in a perfectly random series of observations, 21 of them were on the downside; chance would put that number at 16 or 17. A market with a built-in long-term upward trend should have even fewer disasters than 16 or 17 over 840 months.
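The two-standard-deviation tally is a simple tail count. As in the earlier sketch, simulated data stands in for the actual monthly series:

```python
import numpy as np

rng = np.random.default_rng(1)
monthly_changes = rng.normal(0.6, 5.8, 840)  # stand-in for the real series

mean, sd = 0.6, 5.8
lo, hi = mean - 2 * sd, mean + 2 * sd        # -11.0% and +12.2%

beyond = (monthly_changes < lo) | (monthly_changes > hi)
print("beyond two sigma:", beyond.sum())     # normality implies ~4.6%, or ~38
print("of which downside:", (monthly_changes < lo).sum())  # ~half, by chance
```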
At the extremes, the market is not a random walk. At the extremes, the market is more likely to destroy fortunes than to create them. The stock market is a risky place.
Up to this point, our story has been pretty much about numbers. Mathematicians have held center stage as we studied the innovations of ancient Hindus, Arabs, and Greeks all the way up to Gauss and Laplace in the nineteenth century. Probability rather than uncertainty has been our main theme.
Now the scene is about to shift. Real life is not like Paccioli's game of balla, a sequence of independent or unrelated events. The stock market looks a lot like a random walk, but the resemblance is less than perfect. Averages are useful guides on some occasions but misleading on many others. On still other occasions numbers are no help at all and we are obliged to creep into the future guided only by guesses.
This does not mean that numbers are useless in real life. The trick is to develop a sense of when they are relevant and when they are not. So we now have a whole new set of questions to answer.
For instance, which defines the risk of being hit by a bomb, seven million people or one elephant? Which of the following averages should we use to define the stock market's normal performance: the average monthly price change of +0.6% from 1926 to 1995, the piddling average of only +0.1% a month from 1930 to 1940, or the enticing average of +1.0% a month from 1954 to 1964?
In other words, what do we mean by "normal"? How well does any particular average describe normal? How stable, how powerful, is an average as an indicator of behavior? When observations wander away from the average of the past, how likely are they to regress to that average in the future? And if they do regress, do they stop at the average or overshoot it?
What about those rare occasions when the stock market goes up five months in a row? Is it true that everything that goes up must come down? Doth pride always goeth before a fall? What is the likelihood that a company in trouble will bring its affairs into order? Will a manic personality swing over to depression any time soon, and vice versa? When is the drought going to end? Is prosperity just around the corner?
The answers to all these questions depend on the ability to distinguish between normal and abnormal. Much risk-taking rests on the opportunities that develop from deviations from normal. When analysts tell us that their favorite stock is "undervalued," they are saying that an investor can profit by buying the stock now and waiting for its value to return to normal. On the other hand, mental depressions or manic states sometimes last a lifetime. And the economy in 1932 refused to move itself around the corner, even though Mr. Hoover and his advisers were convinced that prodding by the government would only deter it from finding its way back all by itself.
Nobody actually discovered the concept of "normal" any more than anybody actually discovered the concept of "average." But Francis Galton, an amateur scientist in Victorian England, took the foundation that Gauss and his predecessors had created to support the concept of average-the normal distribution-and raised a new structure to help people distinguish between measurable risk and the kind of uncertainty that obliges us to guess what the future will bring.
Galton was not a scientist in search of immutable truths. He was a practical man, enthusiastic about science but still an amateur. Yet his innovations and achievements have had a lasting impact on both mathematics and hands-on decision-making in the everyday world.
Francis Galton (1822-1911) was a social snob who never worked to earn a living, except for a brief stint in a hospital during his early twenties.1 Yet he was one of the most charming and likable of the many characters mentioned in this account. He was Charles Darwin's first cousin, an occasional inventor, and an avid explorer of parts of Africa where whites had never been seen. He made a seminal contribution to the theory of risk management, but he made that contribution in stubborn pursuit of an evil concept.
Measurement was Galton's hobby-or, rather, obsession. "Wherever you can, count," he would say.2 He took note of the size of heads, noses, arms, legs, heights, and weights, of the color of eyes, of the sterility of heiresses, of the number of times people fidgeted as they listened to lectures, and of the degree of color change on the faces of spectators at the Derby as they watched the horses run. He classified the degree of attractiveness of girls he passed on the street, pricking a hole in a left-pocket card when a girl was comely and pricking a right-pocket card when she was plain. In his "Beauty Map" of Britain, London girls scored highest; Aberdeen girls scored lowest. He examined 10,000 judges' sentences and observed that most of them occurred at regular intervals of 3, 6, 9, 12, 15, 18, and 24 years, while none appeared at 17 and only a few at 11 or 13. At a cattle exhibition, he tabulated the guesses of 800 visitors as to the weight of an ox and found that the "average vox populi was correct to within one percent of the real value."3
Galton's Anthropometric Laboratory, which he established in 1884, measured and kept track of the range and character of every possible measurement of the human body, including even fingerprints. Fingerprints fascinated Galton because, unlike every other part of the body, their configuration never changes as a person grows older. In 1893, he published a 200-page book on the subject that soon led to the widespread use of fingerprinting by police.
Galton's compulsion to measure was evident even on a trip to Africa in 1849 to hunt big game in what is now Namibia. When he arrived at a village of Hottentots, he discovered "figures that would drive the females of our land desperate-figures that could afford to scoff at Crinoline."4 One woman in particular caught his attention.5 As a scientific man, he reported, he was "exceedingly anxious to obtain accurate measurements of her shape." Unable to speak Hottentot and uncertain how to undertake this urgent piece of research, he still managed to achieve his goal:
Of a sudden my eye fell upon my sextant; the bright thought struck me, and I took a series of observations upon her figure in every direction.... [T]his being done, I boldly pulled out my measuring tape, and measured the distance from where I was to the place where she stood, and having thus obtained both base and angles, I worked out the results by trigonometry and logarithms.
Galton was the personification of the Victorian Englishman who strode the earth as though it were his private preserve. On another occasion during his hunting trip to Africa, he grew worried that the local chieftain might attack his camp. Clad in his red hunting coat, cap, and jackboots, he mounted an ox, charged up to the largest hut in the village, and forced the ox's head into the hut. The camp was never attacked.
At another village, he committed a social gaffe by refusing to take part in a ritual in which the host gargles and then spits the liquid into the face of his guest. And when King Nangoro presented him with Princess Chapange for an evening of pleasure, Galton was aghast when she arrived for the occasion "raddled with ochre and butter." "I was dressed in my one well-preserved suit of white linen, so I had her ejected with scant ceremony."
King Nangoro found it hard to believe that there were places in the world inhabited entirely by people with white skins. To him, Galton and his friends were rare migratory animals or some kind of anomaly. One of Galton's companions had to undress repeatedly before the king to prove that he was white all over.
Galton's curiosity was insatiable. When a traveling circus came through Cambridge while he was studying there, he walked straight into the lion's cage, only the fourth person to have done so in that circus's history. He kept himself from falling asleep during his favorite studying hours of 10 p.m. to 2 a.m. with his "Gumption-Reviver machine," a gadget he had invented that kept his head wet with cold water. Later in life, he invented a device for reading under water; he nearly drowned on one occasion when he submerged himself in his bath while enjoying a good book.
As we shall see shortly, Galton's fascination with measurement and his talent for innovation had loathsome consequences. Still, he must be credited with a remarkable contribution to statistics and to risk management. As with Cardano, his insistence on testing his ideas through experimentation led to new statistical theory even though a search for new theory was not his primary objective.
Galton moves us into the world of everyday life, where people breathe, sweat, copulate, and ponder their future. We are now far removed from both the gaming tables and the stars, the means chosen by earlier mathematicians to get their theories right. Galton took the theories as he found them and went on to discover what made them tick.
Although Galton never alludes to Jacob Bernoulli, his work reflects Bernoulli's insistence that the study of probability is an essential tool for the analysis of disease, mental acuteness, and physical agility. And he follows in the footsteps of Graunt and Price, whose main interest was the organization of human society rather than the science of nature. What Galton and these other innovators learned along the way led ultimately to the emergence of today's complex instruments for the control and measurement of risk in both business and finance.
Galton grew up in an environment of affluence and lively intellectual activity. His grandfather, Erasmus Darwin, was among the most famous physicians of his time and a man with many interests beyond medicine. He invented a ferry driven by machinery instead of pulled by animals and a lavatory that flushed, experimented with windmills and steam engines, and wrote The Loves of the Plants, 2,000 lines of poetry describing in scientific detail the reproductive processes of many different plants. In 1796, when he was 65 years old, Erasmus published a two-volume work called Zoonomia, or the Theory of Generations. Although the book went through three editions in seven years, it failed to impress the scientific community because it was rich in theory but poor in facts. Nevertheless, Zoonomia bears a striking resemblance to The Origin of Species, published 63 years later by Erasmus's more famous grandson, Charles Darwin.