Absolute Zero and the Conquest of Cold

by Tom Shachtman


  Thomson was busy—not only with his own work but also with pursuing a woman who would later become his wife—and did not attend the christening, or for some months offer any cogent work suggestions to Joule, though he did write a bit disparagingly to brother James of Joule's request that he stand godfather to the child. A more sympathetic friend of Joule's attended the ceremony as Thomson's surrogate.

  Joule's restless mind further refined the apparatus, in the direction of smaller and smaller flow passages, until he was satisfied with a nozzle that fit the definition of a "porous plug." That did the trick. Testing many gases, Joule and Thomson found that air, carbon dioxide, oxygen, and nitrogen became colder during the expansion but that hydrogen became hotter. Experimenting further and graphing the results, they discovered a set of "inversion temperatures." At and below these temperatures, precooled gases were certain to cool further when expanded. What shortly became known as "Joule-Thomson expansion"—the expansion of pressurized, precooled gases through a porous plug into a lower-pressure vessel, producing a significant decrease in temperature—became the basis for many subsequent efforts in refrigeration, even those used today. We will encounter it as a key concept in the last stages of the drive toward absolute zero.
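  In the notation of modern textbooks (a gloss, not anything Joule and Thomson themselves wrote), the effect is summarized by the Joule-Thomson coefficient, the change of temperature with pressure during an expansion at constant enthalpy $H$:

  \[ \mu_{\mathrm{JT}} = \left( \frac{\partial T}{\partial P} \right)_{H} \]

  A gas cools on passing through the plug when $\mu_{\mathrm{JT}} > 0$, which holds only below its inversion temperature; hydrogen's inversion temperature lies near 200 K, well below room temperature, which is why it warmed rather than cooled in their tests.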

  In the early 1860s, Thomson and Joule's new elucidation of what could produce cold came to influence the theory of heat. Rudolf Clausius found in Thomson's paper on thermoelectricity an important clue relating to an attribute of matter that deeply intrigued him. For years Clausius had been wondering what could explain or measure the apparently universal tendency toward dissipation. Thomson demonstrated in his article on thermoelectricity that materials possessed some internal energy and postulated that it was somehow used for molecular cohesion. Clausius had touched on a similar concept in his 1850 paper, but it was only after Thomson had produced a sort of experimental verification of internal energy that Clausius pounced on the idea as though it were the Rosetta Stone that could explain what had previously been right before his eyes but had been incomprehensible.

  There were not two types of transformation of energy, Clausius wrote in 1865, there were three. Along with mechanical energy being transformed into heat, and heat being transferred from a hotter body to a colder body, there was the transformation that took place when the constituent molecules of a material were rearranged. From this notion, and from the accepted fact that the change from a solid to a liquid, and from a liquid to a gas, involved work or heat, he derived the concept of disgregation, the degree of dispersion of the molecules of a body. The disgregation of a solid was low, that of a liquid higher, and that of a gas higher still.

  Clausius argued that when a gas was expanded but no work was performed, a transformational change in its energy condition still took place—an increase in its disgregation. To explain this further, Clausius introduced the term entropy, a measure of the unavailable energy in a closed system, or a measure of the bias in nature toward dissipation. The greater the disgregation, the greater the entropy.
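  Clausius's definition, in essentially the notation still used today, relates a change in entropy $S$ to the heat $\delta Q$ a body absorbs reversibly at absolute temperature $T$:

  \[ dS = \frac{\delta Q_{\mathrm{rev}}}{T} \]

  The bias toward dissipation then takes the compact form $\Delta S \geq 0$ for any isolated system, with equality only for perfectly reversible changes.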

  Building on the work of dozens of investigators over forty years, Clausius finally concluded that the "fundamental laws of the universe which correspond to the two fundamental theorems of the mechanical theory of heat" were "1) The energy of the universe is constant; 2) The entropy of the universe tends to a maximum."

  This ultimate, concise, eloquent expression of the forms of energy eviscerated what historian of thermodynamics Donald Cardwell called the "balanced, symmetrical, self-perpetuating universe" of Boyle and Newton, substituting a glimpse of something wholly modern, stripped of theological benevolence, and thoroughly disquieting: "a universe tending inexorably to doom, to the atrophy of a 'heat death,' in which no energy at all will be available although none will have been destroyed; and the complementary condition is that the entropy of the universe will be at its maximum."

  In other words, everything will settle into a state with a single uniform but non-zero temperature. Within a half century of Clausius's pronouncement, the concept of entropy would provide his intellectual heir, Walther Nernst, with the key to refining that understanding in a way that would allow twentieth-century experimenters to reach to within a tantalizing two-billionths of a degree of absolute zero.

  7. Of Explosions and Mysterious Mists

  IN 1865, THE YEAR OF Rudolf Clausius's seminal paper, his former student Carl Linde began working for a locomotive manufacturer and at the same time helped found the Munich Polytechnische Schule, the first of its kind in Bavaria. Linde later recalled that he had been expected to follow in his father's footsteps and become a minister, but an early infatuation led him to study the power of machines rather than the power of God. Learning physics from Clausius before becoming an engineer, Linde never lost his respect for theory. In 1870 a contest sponsored by the Mineral Oil Society of Halle caught his eye: the challenge was to design a system to maintain 25 tons of paraffin for as long as a year at a temperature of −5°C, achieved through artificial means.

  Linde addressed the problem as a student of Clausius. He read all he could about the several extant refrigeration systems, including the Carré absorption machinery, which then dominated the field, having found success in the United States as well as in France. Then he subjected the systems to thermodynamic analysis. The one designed by the Geneva-based chemist Raoul-Pierre Pictet was the most efficient, a vapor-compression system that used sulfur dioxide as the cooling medium; it functioned at much lower pressures than its competitors, but the sulfur dioxide sometimes made contact with water and could transform into the very corrosive sulfuric acid, which ate away the metal of the machinery. Linde found that the other systems were based on principles that did not take advantage of what thermodynamics taught about the conservation of energy. So he designed a thermodynamically sound system of his own, without sulfur dioxide, along the lines of a Carnot cycle achieved through vapor compression.
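  To give a sense of the benchmark such an analysis supplies (a modern gloss, not a calculation from the text): the Carnot limit on a refrigerator's coefficient of performance, the heat removed per unit of work supplied, is

  \[ \mathrm{COP}_{\mathrm{Carnot}} = \frac{T_{\mathrm{cold}}}{T_{\mathrm{hot}} - T_{\mathrm{cold}}} \]

  with temperatures in kelvins. Holding the contest's −5°C (268 K) against 20°C (293 K) surroundings gives an ideal ratio of 268/25, roughly 10.7; every real machine falls short of this bound, and thermodynamic analysis of the kind Linde performed measures how far short.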

  "Mechanical Effects of Extracting Heat at Low Temperatures," his article detailing all this, appeared in a new and relatively obscure Bavarian trade journal, where it was noted by the director of the largest Austrian brewery company, who commissioned Linde to design a refrigeration system for a new brewery. Linde-designed refrigerators were so much better than the Carré- and Pictet-designed machines that within a few years his units had replaced the older ones, first in breweries and then in other industrial processes that required cooling, until there were more than a thousand Linde machines at work in factories all over Europe.

  Artificially produced refrigeration has been the least noted of the three technological breakthroughs of great significance to the growth of cities that came to the fore between the 1860s and the 1880s. More emphasis has been given to the role played by the elevator and by the varying means of communication, first the telegraph and later the telephone. The elevator permitted buildings to be erected higher than the half-dozen stories a worker or resident could comfortably climb; telegraphs and telephones enabled companies to locate managerial and sales headquarters at a distance from the ultimate consumers of goods and services. Refrigeration had equal impact, allowing the establishment of larger populations farther than ever from the sources of their food supplies. These innovations helped consolidate the results of the Industrial Revolution, and after their introduction, the populations of major cities doubled each quarter century, first in the United States—where the technologies took hold earlier than they did in older countries—and then elsewhere in the world.

  A spate of fantastic literature also began to appear at this time; in books such as Jules Verne's Paris in the Twentieth Century, set in 1960, indoor climate control was mentioned, though its wonders were not fully explored. From the mid-nineteenth century on, most visions of technologically rich futures included predictions of control over indoor and sometimes outdoor temperature.

  In addition to flocking to cities for jobs, Americans also became urbanites in the latter part of the nineteenth century because there seemed to be fewer hospitable open spaces into which an exploding population could expand. Large areas of the United States were too hot during many months of the year to sustain colonies of human beings; these included the Southwest and parts of the Southeast, with their tropical and semitropical climates, deserts and swamps. Looked at in retrospect, the principal limitation on people settling in those areas was the lack of air conditioning and home refrigeration.

  In the second half of the nineteenth century, the use of cold in the home became an index of civilization. In New York, 45 percent of the population kept provisions in natural-ice home refrigerators. It was said in this period that if all the natural-ice storage facilities along the Hudson River in New York State were grouped together, they would account for 7 miles of its length. Consumption of ice in New York rose steadily from the 100,000-tons-per-year level of 1860 toward a million tons annually in 1880. But while the per capita use of ice in large American cities climbed to two-thirds of a ton annually, in smaller cities it remained lower, a quarter of a ton per person per year.

  When New York apple growers felt competitively squeezed by western growers who shipped their products in by refrigerated railroad car, they hired experts to improve the quality of their own apples. A specialist was hired to help prevent blue mold, a disease affecting oranges, so that California's oranges would be more appealing to New York consumers than oranges from Central and South America. Believing there were not enough good clams to eat on the West Coast, the city fathers of San Francisco ordered a refrigerator carload of eastern bivalves to plant in San Francisco Bay, founding a new industry there. Commenting in 1869 on the first refrigerated railroad-car shipment of strawberries from Chicago to New York, Scientific American predicted, "We shall expect to see grapes raised in California and brought over the Pacific Railroad for sale in New York this season."

  The desire for refrigeration continued to grow, almost exponentially, but the perils associated with using sulfuric acid, ammonia, ether, and other chemicals in vapor compression and absorption systems remained a constraint on greater use of artificial ice, as did the high costs of manufacturing ice compared with the low costs of what had become a superbly efficient natural-ice industry. Artificial refrigeration finally began to surpass natural-ice refrigeration in the American West and Midwest in the mid-1870s. In the space of a few years, as a result of the introduction of refrigeration, hog production grew 86 percent, and the annual export of American beef (in ice-refrigerated ships) to the British Isles rose from 109,500 pounds to 72 million pounds. Simultaneously, the number of refrigerated railroad cars in the United States skyrocketed from a few thousand to more than 120,000.

  Growth of the American railroads and of refrigeration went hand in hand; moreover, the ability conveyed by refrigeration to store food and to transport slaughtered meat in a relatively fresh state led to huge, socially significant increases in the food supply, and to changes in the American social and geographical landscape. "Slaughter of livestock for sale as fresh meat had remained essentially a local industry until a practical refrigerator car was invented," Oscar Anderson's study of the spread of refrigeration in the United States reported. And because refrigeration permitted processing to go on year-round, hog farmers no longer had to sell hogs only at the end of the summer, the traditional moment for sale—and the moment when the market was glutted with harvest-fattened hogs—but could sell them whenever they reached their best weight.

  In Great Britain, the Bell family of Glasgow, who wanted to replace the natural-ice storage rooms on trans-Atlantic ships with artificially refrigerated rooms that could make their own ice, sought advice from another Glaswegian, Lord Kelvin, who assisted the engineer J. Coleman in designing what became the Bell-Coleman compressed-air machine, which the Bells used to aid in the transport of meat to the British Isles from as far away as Australia. Because of refrigeration, every region of the world able to produce meat, vegetables, or fruit could now be used as a source for food to sustain people in cities even half a world away. Oranges in winter were no longer a luxury affordable only by kings.

  Refrigeration in combination with railroads helped cause the wealth of the United States to begin to flow west, raising the per capita income of workers in the food-packing and transshipment centers of Chicago and Kansas City at the expense of workers in Boston, New York, and Philadelphia. Refrigeration enabled midwestern dairy farmers, whose cost of land was low, to undercut the prices charged for butter and cheese by the dairy farmers of the Northeast. Refrigeration made it possible for St. Louis and Omaha packers to ship dressed beef, mutton, or lamb to market at a lower price per pound than it cost to ship live animals, and when the railroad magnates tried to coerce the packers to pay the same rate for dressed meat as for live animals, the packers built their own refrigerated railcars and forced a compromise.

  The enormous jump in demand for meat, accelerated by refrigerated storage and transport, spurred ranchers and the federal government to take over millions of acres in the American West for use in raising cattle. This action brought on the last phase of the centuries-long push by European colonizers to rid America of its native tribes, by forcing to near extinction the buffalo and the Native American tribes whose lives centered on the buffalo. The conventional view of American history is that it was the "iron horse" that finally killed off the "red man"; but one could with as much justification say that it was the refrigerator.

  Cold of the temperature of ice—cold adequate for most tasks of preserving food and medicines, making beer, transporting crops, preventing hospital rooms from overheating—could be produced by ordinary refrigeration. But scientific explorers wanted to journey far beyond the shoreline of the country of the cold into a temperature region more than a hundred degrees below the freezing point of water. This was an arena beyond the sensory equipment of warm-blooded human beings, a region so cold that skin and nerves could not even register the intensity of its cold; the only way to measure its grade of cold was through thermometers. To conquer this region scientists would require a more powerful technology. They found it in the liquefaction of gases.

  This was a rediscovery, for liquefaction had begun with van Marum and ammonia in 1787, and significant leaps forward had been taken in 1823, when Faraday had liquefied chlorine, and in the early 1830s by Thilorier, who actually went beyond liquefaction to create solid dry ice from carbon dioxide.

  In 1838, for an audience at the Royal Institution, Faraday demonstrated the remarkably low temperature of −110°C, achieved by use of the "Thilorier mixture" of dry ice (carbonic acid snow) and ether. He might have immediately gone further with liquefaction, using the Thilorier mixture, had he not suffered a mental collapse that friends attributed to the exhaustion of having done enough work in a single year to fill four scientific papers. Modern historians believe Faraday's illness may have been mercury poisoning, a then-unknown malady. Whatever the cause, bad health kept Faraday out of the laboratory until 1845; but as soon as he recovered, the possibilities for achieving lower temperatures by means of the Thilorier mixture induced him to return to liquefaction experiments. So enabled was Faraday by the Thilorier mixture that despite having otherwise primitive equipment—a hand pump to compress the gases, and a laboratory widely regarded as the worst then available to a serious experimenter in London—in a few months in 1845 he liquefied all the known gases with the exception of six that he could not change, which were dubbed the "permanent" gases: oxygen, nitrogen, hydrogen, nitric oxide, methane, and carbon monoxide.

  The "permanent" gases were a significant scientific problem worthy of a strong attack by a scientist of Faraday's brilliance, but he seems to have decided in 1845 that he had exhausted the limits of his new tool, and he went no further in liquefaction, instead returning his attention to electricity and magnetism. Less brilliant researchers on the Continent took up the challenge. Recognizing that the weight of seawater produced high pressures, a French physicist named Aimé first compressed nitrogen and oxygen by ordinary means into metal cylinders, then lowered the cylinders into the ocean, t
o a depth of more than a mile. He recorded pressures of up to 200 atmospheres—about 3,000 pounds per square inch—but no liquefaction of the gases occurred. Johannes Natterer of Vienna, whom one historian calls an "otherwise undistinguished medical man," thought the problem of liquefaction basically simple: if Boyle's law held, all he needed to do was raise the amount of pressure on the gas, which should decrease its volume to the point of liquefaction. So he kept beefing up his apparatus until it was able to exert as much as 50,000 pounds of pressure on nitrogen gas. But even under such pressure, the gas would not liquefy.
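  Natterer's reasoning, put as an equation (a standard gloss, not in the text): Boyle's law says that at a fixed temperature

  \[ PV = \text{constant} \]

  so doubling the pressure should halve the volume, and enough pressure should crowd the molecules together until they condense. Real gases follow the law only approximately, and, as Andrews's work described below would establish, above a certain critical temperature no pressure whatever will produce a liquid, which is why Natterer's brute force failed.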

  Two abler researchers now addressed the problem, each from a different direction. One of the most astute scientists of the time, the Russian Dmitri Ivanovich Mendeleyev, compiler of the periodic table of atomic weights, started from the liquid state, trying to determine precisely at what temperature any liquefied gas could be again induced to become a vapor. That was a logical approach, but it did not prove fruitful. The opposite approach—to determine the conditions required to make a gas become a liquid—was adopted by a Scottish physician living in Belfast, Thomas Andrews.

  The eldest son of a Belfast linen merchant, Andrews had a bent for chemistry, and at age seventeen, having exhausted the facilities for chemical study in Glasgow, he searched through several capitals in Europe for a laboratory to work in before making contact in Paris with a young chemist in the process of becoming a distinguished teacher, Jean-Baptiste Dumas. Andrews returned to Ireland in the late 1830s to study medicine and to teach chemistry. He practiced medicine for only a short time, then became absorbed in a series of experiments on the heat produced or consumed during chemical reactions. By 1849 he was the vice president of the new Queen's College in Belfast, its first professor of chemistry, and a Fellow of the Royal Society. He labored for five years on inconclusive experiments to determine the composition and density of ozone gas. But in the early 1860s these led him to his most important work, exploring in a systematic way what no one had adequately charted, the murky region in which gases and liquids transmuted into one another.

 
