by George Dyson
Von Neumann was convinced that Lewis Fry Richardson (whose work he described to Strauss as “remarkable and bold”) had been on the right path, and he believed that understanding the weather would eventually convey more power, for good or evil, than understanding how to build bombs. In the Institute’s proposal to the navy, drafted in collaboration with Carl-Gustaf Rossby, he estimated that once the new computer was operating, “a completely calculated prediction for the entire northern hemisphere should take about 2 hours per day of prediction.” In a personal letter to Strauss, in which he also expressed “serious misgivings as to the wisdom of making [Oppenheimer] the Director of the Institute,” he added that the Meteorology Project “would also be the first step toward weather control—but I would prefer not to go into that at this time.” Even “the most constructive schemes for climate control would have to be based on insights and techniques that would also lend themselves to forms of climatic warfare as yet unimagined,” he later warned.26
Once the navy contract was secured, von Neumann hosted, with Rossby’s assistance, a conference on meteorology held at the Institute on August 29–30, 1946. The final topic of discussion was how to resurrect Richardson’s effort, now that enough numerical horsepower to do so was in the works. “It was felt that the numerical attack should be repeated immediately,” the meeting summary reports, “since even the existing mechanical computing facilities have capacities considerably in excess of those that were available to Richardson.”27
More than a dozen meteorologists were invited to take up residence at the Institute, but there was no place to put them, and it was reported on July 15, 1946, that there were still eleven meteorologists for whom “no living quarters have been obtained.”28 Partly because of the lack of housing, and partly because of the lack of a working computer, the project was scaled back, and in the end no more than a handful of meteorologists were ever at the Institute at one time. Von Neumann’s first contribution was to show that the existing methods of integrating the hydrodynamical equations “are unstable under those conditions of spatial and temporal resolution which are essentially characteristic of the problem of meteorological prediction,” this being where Richardson had gone wrong. “I have developed one method which can be shown to be stable and which appears to be suitable for numerical procedure if electronic equipment is available,” he announced in his second progress report to the Office of Naval Research.29
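Von Neumann's point about schemes being "unstable under those conditions of spatial and temporal resolution" can be made concrete with a toy problem. The sketch below is illustrative only, not his actual method for the meteorological equations: it integrates the one-dimensional advection equation with two textbook finite-difference schemes. The forward-time/centered-space scheme is unstable for this equation at any time step, while the upwind scheme stays bounded when the Courant number cΔt/Δx is at most 1.

```python
import numpy as np

def step_ftcs(u, courant):
    """Forward-time/centered-space step, periodic boundary (unstable)."""
    return u - 0.5 * courant * (np.roll(u, -1) - np.roll(u, 1))

def step_upwind(u, courant):
    """Upwind step, periodic boundary; stable for 0 <= courant <= 1."""
    return u - courant * (u - np.roll(u, 1))

n, steps, courant = 64, 200, 0.8
x = 2.0 * np.pi * np.arange(n) / n
# Large-scale wave plus a small short-wavelength perturbation
u0 = np.sin(x) + 0.01 * np.sin(16 * x)
u_ftcs, u_up = u0.copy(), u0.copy()
for _ in range(steps):
    u_ftcs = step_ftcs(u_ftcs, courant)
    u_up = step_upwind(u_up, courant)

print("FTCS amplitude after 200 steps:  ", np.abs(u_ftcs).max())  # grows explosively
print("Upwind amplitude after 200 steps:", np.abs(u_up).max())    # stays below ~1.01
```

The short-wavelength perturbation is amplified at every FTCS step, which is essentially what destroyed Richardson's hand calculation; the upwind scheme damps it instead.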
“The atmosphere,” he explained in the next progress report, “is composed of a multitude of small mass-elements, whose behavior is so interrelated that none can be dissociated, even in effect, from all the rest.” The problem was how to translate the analog computation being performed by the atmosphere into a digital computer and speed it up. “A closed system of differential equations, ordinary or partial, linear or non-linear, may be regarded as a set of instructions for constructing its solution from known boundary and initial values,” he continued. “Until now, however, the time required to carry out those ‘instructions’ has been prohibitive.”30 This was now going to change.
The reaction of most meteorologists toward computer-assisted forecasting paralleled that of the Institute mathematicians toward computer-assisted mathematics: skepticism that a machine could improve upon what they were doing with brains alone. As Thompson explained, they “were against it, not for any objective reasons but because they really wanted to believe that forecasting should be an art.”31 According to Charney, the 1946 conference “failed to grip the imagination of the leading dynamical meteorologists who were invited, and few worth-while suggestions were proffered. However, my own imagination, which had already been stirred by Zworykin’s article, was completely captured. I made haste to join the project on my return from Europe in 1948.”32
Jule Gregory Charney, born on New Year’s Day 1917 in San Francisco, had been misdiagnosed with a heart ailment as a child, which left him with an unusual enthusiasm for life. His parents, Ely and Stella, had emigrated from Russia to New York, found work in the garment industry, and moved west to California in 1914. After an interlude in San Francisco, they moved to East Central Los Angeles in 1922 and to Hollywood in 1927, where Jule’s mother obtained enough work from the film studios to get the family through the Depression intact. Both parents were active socialists, and the household was a hotbed of political discussion and union affairs. Jule taught himself calculus while still in high school, before enrolling in UCLA in 1934 and graduating in 1938.
With the war approaching, Charney, who was scraping by as a teaching assistant in mathematics and physics, had to decide whether to pursue meteorology, which interested him, or aeronautics, which promised to be more useful for the war. He went to see aeronautical pioneer Theodore von Kármán at Caltech, who advised meteorology, explaining that aeronautics had matured to where future progress would be made through engineering, not mathematics, whereas meteorology was ripe for a mathematical approach. Charney never looked back. Jacob Bjerknes had recently come to UCLA from Norway to launch a training program for meteorologists, and Charney joined the new department as a teaching assistant in July of 1941 at sixty-five dollars a month.
Charney had a remarkable ability to condense the entire atmosphere, from planetary to molecular scale, into equations that captured what was important and discarded what was not. “Voltaire’s giant, Micromegas, who stood in the same relation to the atmosphere as we do to a rotating dishpan, would describe the atmosphere as a highly turbulent, heterogeneous fluid subject to strong thermal influences and moving over a rough, rotating surface,” he would later write. “He would discern a mean zonal circulation with westerly surface winds increasing with height in middle latitudes in both hemispheres and easterly surface winds near the Equator and poles.” Closer examination would reveal perturbations related to the uneven distribution of the continents and oceans, and superimposed on these quasi-permanent features “he would find a whole collection of migratory vortices, varying in scale from thousands of kilometers to centimeters and less, but with the bulk of the energy in the thousand-kilometer class.”33
By the end of the war Charney was in the middle of his PhD thesis, “The Dynamics of Long Waves in a Baroclinic Westerly Current,” completed in 1946. Newly married, he and Elinor Charney (née Frye) then left Los Angeles for Chicago and his first postdoctoral appointment, with Rossby, who invited him to the Princeton conference in August, where he met von Neumann, learned of his ambitions, and sensed that there was a bit too much mathematics, and too little meteorology, at the IAS. He and Elinor sailed for Bergen and Oslo in the spring of 1947, where he worked among the Norwegians until returning to Princeton in the early spring of 1948.
Charney had arrived at the right place at the right time. The computer was undergoing initial testing, and the first problems were being coded in anticipation of there being a machine on which they could run. A rotating contingent of Norwegian meteorologists, led by Arnt Eliassen and Ragnar Fjørtoft, joined the group. Charney became the liaison between the hands-on experience of the Norwegian forecasters and the mathematical world of von Neumann. “Although I had a fairly clear idea of what I wanted to do in the physical sense, I had only the vaguest notion of how to do it mathematically,” Charney explains.34 Von Neumann’s skills were exactly the reverse. Charney also attracted and nurtured a number of extraordinary American meteorologists, notably Joseph Smagorinsky and Norman Phillips, who played a leading role in the realization of numerical weather prediction over the next ten years.
Smagorinsky was still a graduate student after the war when Charney came to give a talk at the Weather Bureau headquarters at Twenty-fourth and M streets in Washington, D.C., at a time when numerical forecasting was almost entirely alien to the way the Weather Bureau worked. “During the war, when I was a student, a cadet at MIT, I had been told by one of the eminent professors there, Bernhard Haurwitz, that numerical forecasting can’t be done,” says Smagorinsky. “And the reason given was not a very good one. But it was easier to say that it can’t be done than it can be. And I carried this notion of impossibility in my mind.” After Charney’s talk, Smagorinsky asked the only question that showed any real thinking about the problem, and Charney invited him to join the new computing group.
“The primary reason for Richardson’s failure,” Charney noted in the first progress report he prepared for the Office of Naval Research, “may be attributed to his attempt to do too much too soon.”35 It was the meteorological community as a whole who solved Richardson’s first problem: gathering sufficient data to establish initial conditions—it soon being recognized that within days the notion of “boundary conditions” disintegrated and it was necessary to have hemispheric knowledge, the boundary between Northern and Southern hemispheres being the only one that held up over time. It was von Neumann, Goldstine, and Bigelow who solved Richardson’s second problem: supplying enough computing power to do the job. And it was Charney who did the most to solve Richardson’s third problem: formulating equations whose solutions did not quickly become more unstable than the weather itself. The key was to filter out noise.
“The atmosphere is a musical instrument on which one can play many tunes,” Charney explained to Thompson in February of 1947. “High notes are sound waves, low notes are long inertial waves, and nature is a musician more of the Beethoven than the Chopin type. He much prefers the low notes and only occasionally plays arpeggios in the treble and then only with a light hand. The oceans and the continents are the elephants in Saint-Saëns’ animal suite.”36
Thompson was listening, and reported to the Office of Naval Research that “the hydrodynamic equations cover the entire spectrum of events, sonic waves, gravity waves, slow inertial waves, et cetera, and it might simplify matters considerably if those equations were somehow informed that we are interested in only certain kinds of atmospheric behavior—i.e., the propagation of large-scale disturbances.”37 With Charney’s help, numerical filters were soon constructed, and incorporated into codes that, with many hundreds of hours of hand-computing by Margaret Smagorinsky, Norma Gilbarg, and Ellen-Kristine Eliassen, were taken for trial runs. “The system that they were going to use on the big computer, we were doing manually,” says Margaret Smagorinsky. “It was a very tedious job. The three of us worked in a very small room, and we worked hard. It was a small room with three people and three Monroe calculating machines.”38
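Charney's filtered equations removed the fast sound and gravity waves analytically, in the derivation itself, not by signal processing. Still, the idea of keeping the slow "low notes" and discarding the fast "high notes" can be illustrated with a simple spectral low-pass filter on a synthetic signal (the frequencies here are arbitrary, chosen only for the demonstration):

```python
import numpy as np

t = np.linspace(0.0, 1.0, 512, endpoint=False)
slow = np.sin(2 * np.pi * 3 * t)           # large-scale, slowly varying disturbance
fast = 0.3 * np.sin(2 * np.pi * 80 * t)    # fast oscillation: the "high notes"
signal = slow + fast

spectrum = np.fft.rfft(signal)
freqs = np.fft.rfftfreq(t.size, d=t[1] - t[0])
spectrum[freqs > 10] = 0.0                 # zero out everything above 10 cycles
filtered = np.fft.irfft(spectrum, n=t.size)

# The filtered signal recovers the slow component almost exactly
print("max deviation from slow component:", np.abs(filtered - slow).max())
```

In the filtered meteorological equations the same separation was achieved by approximation (quasi-geostrophic balance) rather than by transforms, so that the fast solutions simply could not arise.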
With the delays in the completion of the computer—and the priority given to the hydrogen bomb problems—it was decided to run a full-scale trial calculation on the ENIAC instead. In March of 1950, Charney was joined by George Platzman, Ragnar Fjørtoft, John Freeman, and Joseph Smagorinsky on an expedition to Aberdeen. They were guided by Klári von Neumann, who helped code their problem and initiated them into the ways of the ENIAC and its peripheral card-processing machines.
“The enactment of a vision foretold by L. F. Richardson 50 years before … began at 12 p.m. Sunday, March 5, 1950, and continued 24 hours a day for 33 days and nights, with only brief interruptions,” Platzman reported.39 “We have completed a 12-hour forecast,” he noted, thirteen days later, in his journal for March 18. “At the end of four weeks we had made two different 24-hour forecasts,” Charney reported on April 10. “The first … was not remarkable for accuracy, though it had some good points.… The second … turned out to be surprisingly good. Even the turning of the wind over Western Europe and the extension of the trough, which Ragnar thought to be a baroclinic phenomenon, was correctly forecast.”40 Over the course of the next week they made two more twenty-four-hour forecasts, for January 31 and February 14, 1949.
Because of the limited internal storage, “it devolved upon punch cards to serve as the large-capacity read/write memory, and this mandated an intimate coupling of punch-card operations with ENIAC operations, ingeniously contrived by von Neumann.” Each step of the calculation required sixteen successive operations: six for the internal ENIAC arithmetic, and ten for the external punch-card operations to process the results and prepare for the next step. “In the course of the four 24-hour forecasts about 100,000 standard I.B.M. punch cards were produced and 1,000,000 multiplications and divisions were performed,” Charney, von Neumann, and Fjørtoft reported. Once the bugs were worked out, “the computation time for a 24-hour forecast was about 24 hours, that is, we were just able to keep pace with the weather.”41
Charney and his colleagues returned to Princeton in triumph. “It mattered little that the twenty-four-hour prediction was twenty-four hours in the making,” he explained. “That was purely a technological problem. Two years later we were able to make the same prediction on our own machine in five minutes.”42 Emboldened by their success, they developed a series of increasingly detailed models of the atmosphere over the Northern Hemisphere by day, while livening up the atmosphere at the housing project by night.
“Oh, we loved him,” says Thelma Estrin, of Charney. “He was warm and friendly and loved parties and was always the last one to go home.” The engineers and the meteorologists, living together in the close-knit Mineville barracks, drew the other academic visitors into their fold. “All the meteorologists were great fun, hard drinkers,” says Hungarian topologist Raoul Bott, who arrived, with a degree in engineering, as a protégé of von Neumann’s in 1949. “We had tremendous wild parties,” he remembers. “It was a high point in my life.”
Bott singles out one evening, just after the first ENIAC expedition, when the poet Dylan Thomas was in town. “And at about ten thirty in the evening, or maybe eleven, we were having a great party in one of those shacks there, and I thought, ‘Well, wouldn’t it be great to bring Dylan Thomas here now?’ So I called the hotel—I was a brash young man—and got Dylan Thomas, and he had been in bed. And he said, ‘Oh, I’d love to be woken up, by all means,’ you know, and he was ready for partying. And so I drove there—we had this 1935 convertible Buick—with my wife, she of course all excited, and right away when he got in the car I could see that there would be a little bit of a problem, because obviously that was going to be his girl for the night.”43
As they refined their models, Charney’s group needed a benchmark by which to gauge their predictions. For his test case, Richardson had used the otherwise uneventful morning of May 20, 1910. Charney’s group chose Thanksgiving 1950, when a severe storm struck the central and eastern United States. The weather system, whose development was missed by the forecasts available at the time, caused three hundred deaths, unprecedented property damage, and even blew part of the roof off the Palmer Physical Laboratory at Princeton University. It was the perfect storm.
“Because of the spontaneity and intensity of its development this storm was selected as an ideal test case for the prediction of cyclogenesis,” Charney explains. Despite the unpredictability of turbulence, he believed that “the inception and development of cyclones are determinate and predictable events.” Although cyclogenesis might appear random, “the initial perturbation will have a preferred location in space and time, and its amplitude, though it may be small initially, will be entirely determined by the basic flow. It is like an automobile being pushed slowly but inexorably over a cliff.”44
“The storm of November 25–27 was first noted on the surface weather map of 1230 GMT, November 24 as a small Low developing over North Carolina and western Virginia,” began the summary in the November 1950 Monthly Weather Review.45 Over the next forty-eight hours the disturbance grew to become the worst storm ever recorded over the United States. Coburn Creek, West Virginia, received sixty-two inches of snow. Records of minus 1 degree Fahrenheit were set in Louisville, Kentucky, and Nashville, Tennessee, and thirty inches of snow fell in Pittsburgh, bringing the steel industry to a halt.
“The two-and-a-half-dimensional model did not catch the cyclogenesis, [although] there was some vague indication of something going on,” Charney later reported. “And so we went to a three-level model, that is, a two-and-two-thirds dimensional model, and we did catch the cyclogenesis. It wasn’t terribly accurate, but there was no question that [we did]. And I always thought that this was a terribly important thing.… I wanted the world to know about that!”46
The successful 24-hour prediction, following a 16-by-16 grid of 300-kilometer cells through 48 half-hour time steps, required 48 minutes of computing time. According to Charney, “during this time the machine performed approximately 750,000 multiplications and divisions, 10,000,000 additions and subtractions, and executed 30,000,000 distinct orders.”47 While trying to simulate weather inside the computer, the meteorologists were plagued by the weather outside the computer. The “York” refrigeration units continued to become overloaded in the sultry Princeton heat, and during thunderstorms the Williams tube memory often failed. On one very hot day in May there was trouble with the IBM card equipment, and the machine log records: “IBM machine putting a tar-like substance on cards.” The next log entry explains: “Tar is tar from roof.”48
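A little back-of-envelope arithmetic, dividing Charney's reported operation counts by the stated running times, shows what these figures imply about machine throughput, and how far the IAS machine had come from the card-bound ENIAC runs of 1950 (about 1,000,000 multiplications and divisions across four forecasts, each taking roughly 24 hours):

```python
# Rough throughput implied by the reported figures; rounded to whole operations/second.
ias_seconds = 48 * 60                                           # 48-minute IAS run
print("IAS mult/div per second:", round(750_000 / ias_seconds))      # 260
print("IAS add/sub per second: ", round(10_000_000 / ias_seconds))   # 3472
print("IAS orders per second:  ", round(30_000_000 / ias_seconds))   # 10417

eniac_seconds = 24 * 3600                                       # ~24 hours per forecast
print("ENIAC mult/div per second:", round(250_000 / eniac_seconds))  # 3
```

The ENIAC rate, on the order of three multiplications a second, reflects the punch-card traffic that dominated each step, which is why Charney could dismiss the twenty-four-hour turnaround as "purely a technological problem."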
Charney’s plan to develop a hierarchy of models, incrementally more complete, now picked up steam. On August 5, 1952, von Neumann chaired a meeting at the Institute “to explore the possibilities of routine preparation of numerical forecasts by the Weather Bureau and the Air Force and Navy meteorological services.” In September of 1953 the Weather Bureau, air force, and navy agreed to establish a Joint Numerical Weather Prediction Unit, and in January of 1954 a Technical Advisory Group chaired by von Neumann recommended the use of an IBM 701, the rental of which had been budgeted at $175,000 to $300,000 per year.49 IBM delivered the computer early in 1955, and the first operational forecast was made on April 18.