World War II was a turning point for all four cold-based industries. In 1940 the U.S. Army wanted a portable oxygen separator for use in field hospitals; when Union Carbide refused to sell its equipment to the government, the military turned to William Giauque of Berkeley, whose ingenuity had already produced the demagnetization process for reaching close to absolute zero, and who now advanced the utility of liquefied gases by making a new, mobile oxygen separator to provide oxygen for aviators and hospitals.
Days after the Japanese attacked Pearl Harbor, in December 1941, meatpackers in Chicago boned, quick-froze, and shipped to the Pacific a million pounds of beef, chicken, and pork. This demonstration intensified the military's determination to feed frozen foods to personnel in faraway theaters of combat. The sheer quantity of food required by the military created a greater need to avoid wasting crops or livestock. Moreover, because so much fresh food went to the military, there was higher civilian demand for frozen fruits, vegetables, meats, and fish. A last and unexpected spur to the frozen-foods industry was also due to the war: since many automobile showrooms stood empty because they lacked product, their owners sought other uses for their space, and a significant number of the showrooms became frozen-food distribution centers.
Enlarged military bases in areas that had previously been sparsely populated, such as the deserts of the Southwest and the near-tropical areas of the Southeast, started to attract civilian populations to serve them, increasing the demand for air conditioning. In the dry regions, the technique of "evaporative cooling" became fashionable, partly because it was cheap and easy to operate: a fan blew air through dampened pads into the home, and as the dry air absorbed the moisture, the evaporation cooled the rooms. The availability of evaporative cooling made cities such as Phoenix more habitable, and their populations grew rapidly. Humid areas such as Florida required more expensive air-cooling techniques; despite the cost, new migrants to the Southeast gravitated to developments featuring homes with central air conditioning already installed, as life in the South began to seem unimaginable without artificially cooled air.
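The physics behind the technique can be put in one line. In the standard textbook idealization (not a calculation anyone in wartime Phoenix needed to make), the sensible heat surrendered by the air stream equals the latent heat absorbed by the evaporating water:

\[ \dot{m}_{\mathrm{air}}\, c_p \,(T_{\mathrm{in}} - T_{\mathrm{out}}) = \dot{m}_{w}\, h_{fg} \]

The air can therefore be cooled only until it saturates, at its wet-bulb temperature, and the drier the incoming air, the lower that limit lies. In humid climates the limit is reached almost at once, which is why Florida had to pay for vapor-compression cooling instead.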
The explosive growth of America's suburbs in the postwar era brought with it spectacular increases in the use of frozen foods, refrigeration, air conditioning, and liquefied gases. A spacious, air-conditioned supermarket selling a wide variety of frozen foods helped attract people to the suburbs; the annual consumption of frozen foods leapt toward 50 pounds per person, with two hundred new products introduced each year. Although only one out of every eight homes in the United States had air conditioning in the 1960s, four out of ten in the Sunbelt region had it. Nearly every American home featured a refrigerator, and many newer homes, more than one. Greater use of the cold became inextricably associated with America's advancing standard of living—an index of material comfort that expresses the degree to which people control their environment. And as the American standard of living rose, refrigeration, air conditioning, frozen foods, and other products made with cold technology no longer were luxuries but were judged necessities of modern life.
Meanwhile, what had once taken Onnes years, cumbersome equipment, and considerable expense to produce—a few liters of liquefied helium—by the 1950s could be done routinely in almost any laboratory, after the invention by Sam Collins of MIT of a liquefier no bulkier than a home appliance. Shortly, the technique was extended to commercial manufacture of liquid helium. New or improved uses for many cold-liquefied gases emerged: the employment of liquid nitrogen to store blood and semen, the use of liquid hydrogen and liquid oxygen in the rockets of space-exploration programs, liquefied-gas coolants and scrubbers for nuclear reactors, liquid-helium traps for interstellar particles. Artificial insemination of dairy herds became more widespread as the ability to store semen advanced. Miniature Linde-Hampson systems with Joule-Thomson expansion nozzles made possible more portable liquid-cooled infrared sensors; these systems were utilized mostly by the military—for "smart" missiles, projectiles, and night-vision systems—but they were also useful in detecting fires and, in medicine, for detecting diseased tissue. Surgical technique was advanced by the introduction of cryosurgery; a probe carrying liquid nitrogen into the brain or the prostate could do what a metallic knife could not: first tentatively cool a section of tissue, giving the surgeon the latitude to evaluate the probable results of surgery before deciding whether to destroy the target area permanently.
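The principle those miniature Linde-Hampson coolers exploit can be stated compactly. A gas forced through an expansion nozzle at constant enthalpy changes temperature at a rate given by the Joule-Thomson coefficient,

\[ \mu_{JT} = \left( \frac{\partial T}{\partial P} \right)_{H} \]

and expansion cools the gas only where \( \mu_{JT} > 0 \), below the gas's inversion temperature. Nitrogen cools on expansion from room temperature, which is what makes a nozzle-tipped sensor cooler practical; hydrogen and helium must first be precooled below their inversion points before the nozzle does them any good. (This is the textbook statement of the effect, not a description of any particular military device.)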
As space shots reached beyond Earth's atmosphere, boosted into orbit by combinations of hydrogen and oxygen fuel stored in ultracold liquid form in the immense rockets, scientists received the first experimental verification that the temperature of interstellar space was within a few degrees Kelvin of absolute zero. Shortly, liquid oxygen and liquid hydrogen provided the fuel to send men to the moon and return them to Earth, and cold-control apparatus permitted them to stay alive on the journey.
In the universe described by Newtonian physics, nothing smaller than a cannonball shot at a solid wall with great force could push through that wall; the force and mass of the cannonball overcame the energy of the atoms within the wall, which was otherwise great enough to repel whatever attempted to penetrate it. In the universe described by quantum physics, subatomic particles can also sometimes pass such a barrier, by a process known as tunneling, in which the particles do not overcome the energy of the atoms in their way but instead create a path between the atoms of the wall. In the wake of the revelation of the B-C-S theory in 1957, two scientists half a world apart made discoveries about subatomic tunneling related to superconductivity. In Tokyo, a graduate student in physics employed by the Sony Corporation, Leo Esaki, described tunneling effects in semiconductors at low temperatures, and made what are now called Esaki diodes. In Schenectady, New York, the Norwegian-born graduate student Ivar Giaever was working for General Electric and taking a course at Rensselaer Polytech when he realized that the tunneling of electrons might be used to measure the energy-band gap long ago identified by Fritz London as existing at the Fermi surface in a superconductor. On April 22, 1960, he made a metal sandwich, a layer of insulation between two thin plates of metal; when the outside layers were in the normal state, electrons in a current could tunnel through the insulation, but when one of the layers was in the superconducting state, the tunneling current was suppressed until the voltage exceeded a threshold set by the superconducting energy gap. The effect was measurable, but it had not yet been explained.
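The contrast can be made quantitative. In the standard WKB approximation, a particle of mass m and energy E that meets a barrier of height V_0 and thickness L passes through with probability roughly

\[ T \sim e^{-2\kappa L}, \qquad \kappa = \frac{\sqrt{2m(V_0 - E)}}{\hbar} \]

Because the exponent grows with the square root of the mass and with the thickness of the barrier, tunneling is hopeless for a cannonball and routine for an electron crossing a few nanometers of insulation.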
A third graduate student, Brian Josephson, at Cambridge University in 1962, drew on the insights of Esaki and Giaever and on Cooper's work with electron pairs—the linchpin of the B-C-S theory—to predict that Cooper pairs could tunnel through the metal sandwich, even when both outer layers were superconductors. Moreover, this tunneling current proved to be extremely sensitive to external voltages and currents. If the voltage across the layers suddenly fell below a certain level, or if a magnetic field disturbed the superconductivity, the resulting change was measurable. Josephson's teacher at Cambridge, Philip Anderson, took this idea back to Bell Labs, where he and associates were able to prove it experimentally. Several companies began production of "Josephson junctions" to record or measure minute changes in electrical voltage and magnetic fields. These junctions became the heart of new gadgets known as SQUIDs (superconducting quantum interference devices). SQUIDs are used in voltmeters for low-temperature experiments, including those on space satellites; in magnetometers sensitive enough to pick up the magnetic field of a passing submarine, a human brain, or a heart, or even of a single neuron; and in making speedy logic elements and memory cells in computers. An experimental Josephson junction computer was constructed in the 1980s, with its working parts immersed in ultracold liquids, a machine 100 times faster than the usual computers. In a measure of the importance scientists attached to their ideas, Esaki, Giaever, and Josephson were jointly awarded the 1973 Nobel Prize.
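The sensitivity of such devices follows from the two Josephson relations, given here in their textbook form:

\[ I_s = I_c \sin\varphi, \qquad V = \frac{\hbar}{2e}\,\frac{d\varphi}{dt} \]

where \( \varphi \) is the difference in phase between the Cooper-pair wavefunctions on either side of the junction and \( I_c \) is the junction's critical current. In a SQUID loop the supercurrent runs through one full oscillation for every flux quantum \( \Phi_0 = h/2e \approx 2.07 \times 10^{-15} \) weber that threads the loop, which is why a SQUID can resolve magnetic fields as faint as those of a beating heart or a single neuron.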
Josephson junction innovations did not garner much public attention, which went instead to the yearly, breathtaking improvements in microchips and semiconductors. Nor did the public realize that as electronic technologies advanced, their need for cooling became greater. Supersonic-aircraft flight speeds had brought to the fore the problem of cooling the electronic gadgetry necessary to operate such aircraft, both to reduce the size of the devices and to improve their reliability. Phalanxes of engineers began devoting themselves to a new specialty, the technology of removing heat from all sorts of electronic equipment. Back in 1947, the first chips contained one component each; by the 1970s, a chip could hold close to 100,000 components, including transistors, diodes, resistors, and capacitors, and there was a need to cool such chips while they were operating and also during their manufacture, often to cryogenic levels. According to a textbook, the "dramatic" increase in miniaturization mandated that "thermal considerations ... must be introduced at the earliest stages of design." The more complex the electronic gadget—the computer, the television set, the mobile telephone—the more likely it was to contain parts manufactured at temperatures hundreds of degrees below freezing. Cooling electronic machinery during manufacture and performance became the fifth large cold-based industry of the modern era.
The need for cooling renewed interest in thermoelectricity, the effect discovered by Peltier in 1834 and more fully explored by Kelvin in 1854, in which cold can be generated by electric currents flowing across two conductors made of different materials. Twentieth-century research revealed that the thermoelectric powers of semiconductors are much greater than those of metals and that thermoelectric capacities can be expanded at low temperatures (80 to 160 K) by the application of a magnetic field. New semiconducting materials made from compounds never known before the last few decades are now used to create very small refrigerating devices for computer components and other sensitive electronic circuitry, including miniature lasers.
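The standard yardstick for such materials is the dimensionless thermoelectric figure of merit,

\[ ZT = \frac{S^{2}\,\sigma\,T}{\kappa} \]

where S is the Seebeck coefficient, \( \sigma \) the electrical conductivity, \( \kappa \) the thermal conductivity, and T the absolute temperature. Semiconductors beat metals on this yardstick because they pair a large Seebeck coefficient with a favorable ratio of electrical to thermal conductivity. (The formula is the conventional one from the thermoelectric literature, supplied here for orientation rather than taken from this chapter's sources.)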
The ultimate point in miniaturization and cooling may have been reached by the Baykov Institute of Metallurgy in Moscow and the Odessa Institute of Refrigeration, which in the 1990s created thermoelectric coolers on single crystals made from solid solutions of bismuth and antimony compounds. Single-crystal coolers are being used experimentally for infrared detectors, light-emitting diodes and lasers, and devices for night viewing, astronomical observations, ground- and space-based optical communication, missile guidance, and target illumination.
In January 1962 Lev Landau was involved in a car crash in Russia that broke nearly every bone in his body and put him into a coma for fifty-seven days; perhaps spurred by his brush with death, the Nobel committee awarded him the Nobel Prize in physics in December 1962 for his decades-old work on the theory of condensed matter and superfluidity. Soviet scientists were chagrined that Landau would be honored without the prize being jointly awarded to Pyotr Kapitsa, who had done the basic experimental work on superfluidity, but they hoped Kapitsa's turn would come. It took another sixteen years, until 1978, when Kapitsa shared the Nobel Prize with the two American physicists who had discovered the cosmic microwave background radiation. Landau never regained his health or returned to his laboratory work before his death in 1968. By then, interest in superfluidity had surged again, especially as magnetic cooling enabled investigators to lower temperatures into the "millikelvin" range of a few thousandths of a degree above absolute zero.
In the 1970s, demagnetization was alternated with other cooling techniques, in a process that deliberately mimicked the operation of the ideal engine Carnot had imagined, to bring down temperatures and to produce a continuing series of revelations. Physicists had long assumed that superfluidity might be a widespread phenomenon, not simply a property of one form of liquid helium at low temperatures. These guesses received some verification from the work done in 1971 by three physicists at Cornell University, Robert C. Richardson, David M. Lee, and graduate student Douglas Osheroff. They discovered that helium-3, a rare form of the element that theorists had thought capable of becoming a superfluid but that had long proved elusive, could indeed be made into a superfluid at a temperature of two-thousandths of a degree above absolute zero. Their results were so unexpected that one prestigious journal initially rejected their article about the discovery. Superfluid helium-3, they found, was "anisotropic"—like a crystal, it appeared to have different values for properties depending on which axis was being measured. The new superfluid could act like a quantum microscope, permitting the direct observation of the effects caused by interactions among atoms. Richardson, Lee, and Osheroff would win the 1996 Nobel Prize for their work on creating a superfluid from helium-3.
Part of the reason they won the prize was that shortly after their discovery, using the Cornell trio's data, astronomers came to believe that the transition of helium-3 into a superfluid was analogous to the formation of the vast structures in space called cosmic strings, in the microseconds after the "big bang," and that superfluidity as a state or attribute of matter might be present in rotating neutron stars, thousands of light-years distant from Earth. This realization was one instance of a grand coming-together of branches of science that had once seemed separate and unrelated. Connections were found that bolstered the links between the study of the behavior of matter at ultra-low temperatures, the study of subatomic particles, and the study of the origins of the universe. The ability to control the ultracold was the key to all three. Supersensitive devices based on cold technologies had become capable of measuring the entire electromagnetic spectrum, of registering images of the radiant heat of celestial objects in the infrared, millimeter-wave, and microwave range, as well as images of gravity and magnetic emanations. These could be used to identify various relics from the early days of the universe, leftovers from the era of the big bang. Among the dozen types of emissions scientists tried to find in the sky were fractional electric charges such as quarks; one theory holds that the universe began as a sea of quarks, some of which could have survived the big bang. Other emissions include WIMPs (weakly interacting massive particles) and "stochastic" or random gravitational radiation, which might give clues to what happened during an early "phase transition" of the universe. Helium-3 was also found in the residues from volcanic eruptions, sometimes encapsulated or trapped between carbon atoms, and identified as remnants from the time of the formation of the earth.
In professional conferences with such titles as "Inner Space/Outer Space," physicists explored the efforts to record the big-bang relic particles of deep space and the efforts being made to understand subatomic particles in earthbound laboratories. Since World War II, physicists had relied for investigation of subatomic particles on accelerators, synchrotrons, and eventually, colliders that raised the particles' speed to near the velocity of light. The objective was to let the particles smash into obstacles or into each other, so they would disintegrate into interesting pieces.
One way to make faster accelerators was to use more magnetic power to accelerate the beams. Superconducting wires seemed to be the answer. Although it was known that the application of a magnetic field to most superconducting materials could make those materials lose their superconductivity, there were other superconducting materials that could successfully be made into wires and coils, wrapped around themselves—with no metallic core—to provide higher magnetic force. Such superconducting magnets were used to increase the sensitivity and effectiveness of magnetic resonance imaging (MRI) devices, employed in medicine to detect diseases such as cancer in soft tissues that x-rays could not reveal. They also became critical components in masers, microwave precursors of lasers that were used for communications and are still used to detect remote astronomical events. In high-energy physics, superconducting magnets were incorporated into the design of new accelerators, to raise the speed of the beams of subatomic particles being hurled at one another. Important discoveries about subatomic particles were made with these new accelerators, such as the Tevatron at the Fermi National Accelerator Laboratory in Illinois. Superconducting magnet colliders showed such promise that governments in the United States and in Europe agreed to build multibillion-dollar "superconducting supercolliders" (SSCs); one expert calculated that to raise particles to the same energy level as the American SSC would make possible, an accelerator constructed in the same way as the 2-mile-long one at Stanford University would need to be 100,000 light-years long.
In the early 1990s, just after the first several miles of the 54-mile tunnel near Waxahachie, Texas, that would hold the American SSC had been completed, and after a successful test of the first of the six accelerator stages, in which hydrogen ions were smashed, congressional budget cutters shut down the SSC, arguing that $2 billion had already been spent with little to show for it. Following the cancellation of the SSC, several hundred scientists and engineers left the United States to join the European project; with this shift, the center of research on particle physics relocated to CERN, near Geneva, site of the Large Hadron Collider.
In 1975 physicists suggested a new way to study atoms, not by accelerating them but by simultaneously slowing and cooling them. In 1985 Steven Chu of Bell Labs ingeniously used lasers fired from six directions to make what he called an "optical molasses," a "trap" to intercept and slow incoming atoms to speeds measured in inches per second. The trap confined a few thousand atoms to one spot, at a temperature of 240-millionths of a degree above absolute zero. Chu's concept was extended by an even better magnetic-trapping device perfected in the following years by William D. Phillips of the U.S. National Institute of Standards and Technology, then explained theoretically by Phillips, Chu, and Claude Cohen-Tannoudji of the École Normale Supérieure in Paris, who cooled the atoms even further, to within one-millionth of a degree above absolute zero. Chu, Phillips, and Cohen-Tannoudji were awarded the 1997 Nobel Prize for their achievement.
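The figure of 240-millionths of a degree was not accidental; it matches what laser-cooling theory calls the Doppler limit,

\[ T_D = \frac{\hbar \Gamma}{2 k_B} \]

where \( \Gamma \) is the natural linewidth of the atomic transition being driven. For the sodium atoms in Chu's molasses, \( \Gamma/2\pi \approx 10 \) megahertz, giving \( T_D \approx 240\ \mu\mathrm{K} \). The later work honored along with Chu's was startling in part because Phillips's measurements, and the theory of Cohen-Tannoudji and others, showed that atoms could be cooled below this supposed floor.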