In the 1850s, however, William Thomson, a British physicist, noticed something odd about Charles’ law: the specter of zero. Lower the temperature of a gas and the balloon holding it gets smaller and smaller. Keep lowering the temperature at a steady pace and the balloon keeps shrinking at a constant rate, but it cannot go on shrinking forever. Charles’ law says that, at some point, the balloon must shrink to zero volume, and zero is the smallest possible volume; a gas that reaches this point takes up no space at all. (It certainly can’t take up negative space.) If the volume of a gas is related to its temperature, a minimum volume means that there is a minimum temperature. A gas cannot keep getting colder and colder indefinitely; when you can’t shrink the balloon any further, you can’t lower the temperature any further. This is absolute zero. It is the lowest temperature possible, a little more than 273 degrees Celsius below the freezing point of water.
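In modern notation (a sketch; the symbols V, V0, and t are the conventional ones, not the author’s), Charles’ law makes the extrapolation explicit:

```latex
% Charles' law at constant pressure: volume shrinks linearly
% as the Celsius temperature drops.
V = V_0 \left( 1 + \frac{t}{273.15} \right), \qquad t \text{ in } ^{\circ}\mathrm{C}
% The volume reaches zero, the smallest it can be, at
% t = -273.15 degrees Celsius: absolute zero.
V = 0 \quad \text{when} \quad t = -273.15\,^{\circ}\mathrm{C}
```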
Thomson is better known as Lord Kelvin, and it is for Kelvin that the universal temperature scale is named. In the centigrade scale, zero degrees is the freezing point of water. In the Kelvin scale, zero degrees is absolute zero.
Absolute zero is the state where a container of gas has been drained of all of its energy. This is, in actuality, an unattainable goal. You can never cool an object to absolute zero. You can get very close; thanks to laser cooling, physicists can chill atoms to a few millionths of a degree above the ultimate coldness. However, everything in the universe is conspiring to stop you from actually reaching absolute zero. This is because any object that has energy is bouncing around—and radiating light. For instance, people are made up of molecules of water and a few organic contaminants. All of these atoms are wiggling about in space; the higher the temperature, the faster the atoms wiggle. These wiggling atoms bump into one another, getting their neighbors to wiggle as well.
Say you are trying to cool a banana to absolute zero. To get rid of all of the energy in the banana, you’ve got to stop its atoms from moving around; you have to put it in a box and cool it down. However, the box the banana is in is made of atoms, too. The box’s atoms are wiggling around, and they will bump the banana’s atoms and set them in motion again. Even if you get the banana to float in a perfect vacuum in the center of the box, you can’t stop the wiggling entirely, because dancing particles give off light. Light is constantly coming off of the box and striking the banana, getting the banana’s molecules to move again.
All of the atoms that make up a pair of tweezers, a refrigerator coil, and a tub of liquid nitrogen are moving and radiating, so the banana is constantly absorbing energy from the wiggles and radiation of the box it is in, from the tweezers you use to manipulate the banana, and from the refrigerator coil you use to cool it down. You cannot shield the banana from the box or the tweezers or the coil; the shield, too, is wiggling and radiating. Every object is influenced by the environment it’s in, so it’s impossible to cool anything in the universe—a banana, an ice cube, a dollop of liquid helium—to absolute zero. It is an unbreakable barrier.
Absolute zero was a discovery that had a very different flavor from Newton’s laws. Newton’s equations gave physicists power. They could predict the orbits of the planets and the motion of objects with great accuracy. On the other hand, Kelvin’s discovery of absolute zero told physicists what they couldn’t do. They couldn’t ever reach absolute zero. This barrier was disappointing news to the physics world, but it was the beginning of a new branch of physics: thermodynamics.
Thermodynamics is the study of the way heat and energy behave. Like Kelvin’s discovery of absolute zero, the laws of thermodynamics erected impenetrable barriers that no scientist can ever cross, no matter how hard he or she tries. For instance, thermodynamics tells you that it is impossible to build a perpetual-motion machine. Avid inventors tend to swamp physics departments and science magazines with blueprints for incredible machines—machines that eternally generate power without any source of energy. The laws of thermodynamics state flatly that no such machine can exist; it is impossible even to get a machine to run without wasting energy, frittering some of its power into the universe as heat. (Thermodynamics is worse than a casino: you can’t win, no matter how much you work at it. You can’t even break even.)
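The casino quip compresses the first two laws of thermodynamics. Stated in the standard textbook form (the symbols here are the conventional ones, not the author’s):

```latex
% First law ("you can't win"): energy is conserved, so a
% machine cannot put out more work than the heat it takes in.
\Delta U = Q - W
% Second law ("you can't break even"): the entropy of an
% isolated system never decreases, so some energy is always
% degraded into waste heat.
\Delta S \geq 0
```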
From thermodynamics came the discipline of statistical mechanics. By looking at the collective motion of groups of atoms, physicists could predict the way matter behaves. For instance, the statistical description of a gas explains Charles’ law. As you raise the temperature of a gas, the average molecule moves faster and smashes harder into the walls of its container. The gas pushes harder on the walls: the pressure goes up (and if the container is a balloon, which keeps the pressure roughly constant, the balloon expands instead, just as Charles’ law demands). Statistical mechanics—the theory of wiggles—explained some of the basic properties of matter, and it even seemed to explain the nature of light itself.
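In symbols (a sketch in the usual kinetic-theory notation, none of it from the text), the link between wiggle speed and pressure looks like this:

```latex
% Ideal gas law: pressure, volume, and absolute temperature
% for N molecules (k_B is Boltzmann's constant).
P V = N k_B T
% Temperature measures the average wiggle: the mean kinetic
% energy of a molecule grows linearly with T, so hotter
% molecules strike the walls harder.
\left\langle \tfrac{1}{2} m v^{2} \right\rangle = \tfrac{3}{2} k_B T
```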
The nature of light was a problem that had consumed scientists for centuries. Isaac Newton believed that light was composed of little particles that flowed from every bright object. Over time, though, scientists came to believe that light was not in fact a particle, but a wave. In 1801 the British scientist Thomas Young discovered that light interferes with itself, apparently putting the matter to rest once and for all.
Interference happens with all sorts of waves. When you drop a stone into a pond, you create circular ripples in the water—waves. The water bobs up and down, and crests and troughs spread outward in a circular pattern. If you drop two stones at the same time, the ripples interfere with one another. You can see this more clearly if you dip two oscillating pistons into a tub of water. When a crest from one piston runs into a trough from the other, the two cancel out; if you look carefully at the pattern of ripples, you can see lines of still, wave-free water (Figure 45).
The same thing is true of light. If light shines through two small slits, there are areas that are dark—wave-free (Figure 46). (You can see a related effect at home. Hold your fingers together; you should have tiny gaps where some light can get through. Gaze through one of those gaps at a lightbulb and you’ll see faint dark lines, especially near the top and bottom of the gap. These lines, too, are due to the wavelike nature of light.) Waves interfere in this way; particles do not. Thus, the phenomenon of interference seemed to settle the question of light’s nature once and for all. Physicists concluded that light was not a particle, but a wave of electric and magnetic fields.
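The dark lines follow a simple geometric rule; in the usual textbook notation (slit spacing d, wavelength λ, viewing angle θ, all conventional symbols rather than the author’s):

```latex
% Dark fringes appear where the waves from the two slits
% arrive half a wavelength out of step, so a crest meets a
% trough and the light cancels out.
d \sin\theta = \left( m + \tfrac{1}{2} \right) \lambda,
\qquad m = 0, 1, 2, \ldots
```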
This was the state of the art in the mid-1800s, and it seemed to mesh perfectly with the laws of statistical mechanics. Statistical mechanics tells you how the molecules of matter wiggle; the wave theory of light implied that these molecular wiggles somehow cause ripples of radiation—light waves. Better yet, the hotter the object is, the faster its molecules move; at the same time, the hotter the object, the more energetic the ripples of light it sends out. This works out perfectly. With light, the faster the wave bobs up and down—the higher its frequency—the more energy it has. (Also, the higher its frequency, the smaller its wavelength: the distance between two wave crests.) Indeed, one of the most important thermodynamic laws—the so-called Stefan-Boltzmann equation—seems to tie the wiggles of molecules to the wiggles of light. It relates the temperature of an object to the total amount of light energy it radiates. This was the biggest victory for statistical mechanics and the wave theory of light. (The equation states that the radiated energy is proportional to the temperature raised to the fourth power. It not only tells how much radiation an object gives off, but also how hot an object gets when irradiated with a given amount of energy. This is the law that physicists used—along with a passage in the book of Isaiah—to determine that heaven is more than 500 degrees Kelvin.)
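The text gives only the proportionality; written out with the standard constant (the symbol σ and its value are the conventional ones), the Stefan-Boltzmann law reads:

```latex
% Stefan-Boltzmann law: the power radiated per unit surface
% area grows as the fourth power of the absolute temperature.
j = \sigma T^{4},
\qquad \sigma \approx 5.67 \times 10^{-8}\ \mathrm{W\,m^{-2}\,K^{-4}}
```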
Figure 45: Interference pattern in water
Figure 46: Light interference. If you turn the book sideways and look along the page, you can see the interference patterns on the page.
Unfortunately, the victory would not last for long. At the turn of the century, two British physicists tried to use the wiggle theory to solve a simple problem. It was a fairly straightforward calculation: how much light does an empty cavity radiate? Applying the basic equations of statistical mechanics (which tells how the molecules wiggle) and the equations that describe the way electric and magnetic fields interact (which tells how light wiggles), they came up with an equation that describes what wavelengths of light a cavity radiates at any given temperature.
The so-called Rayleigh-Jeans law, named after the physicists Lord Rayleigh and Sir James Jeans, worked fairly well. It did a good job of predicting the amount of large-wavelength, low-energy light that comes off a hot object. At high energies, though, the equation faltered. The Rayleigh-Jeans law predicted that an object gives off more and more light at smaller and smaller wavelengths (and thus higher and higher energies); as the wavelength approaches zero, the predicted output becomes infinite. According to the Rayleigh-Jeans equation, every object is constantly radiating an infinite amount of energy, no matter what its temperature is; even an ice cube would be radiating enough ultraviolet rays, x rays, and gamma rays to vaporize everything around it. This was the “ultraviolet catastrophe.” Zero wavelength equals infinite energy; zero and infinity conspired to break a nice, neat system of laws. Solving this paradox quickly became the leading puzzle in physics.
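The blowup can be read directly off the formula. In its standard form (conventional symbols, not quoted in the text), the Rayleigh-Jeans law says the intensity radiated per unit wavelength is:

```latex
% Rayleigh-Jeans law: radiated intensity per unit wavelength.
% The lambda^4 in the denominator is the problem: as the
% wavelength approaches zero, the prediction goes to infinity.
B_{\lambda}(T) = \frac{2 c k_B T}{\lambda^{4}}
\ \longrightarrow\ \infty
\quad \text{as} \quad \lambda \to 0
```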
Rayleigh and Jeans had done nothing wrong. They used equations that physicists thought were valid, manipulated them in an accepted way, and came out with an answer that didn’t reflect the way the world works. Ice cubes don’t wipe out civilizations with bursts of gamma rays, though following the then-accepted rules of physics led inexorably to that conclusion. One of the laws of physics had to be wrong. But which one?
The Quantum Zero: Infinite Energy
To physicists, vacuum has all particles and forces latent in it. It’s a far richer substance than the philosopher’s nothing.
—SIR MARTIN REES
The ultraviolet catastrophe led to the quantum revolution. Quantum mechanics got rid of the zero in the classical theory of light—removing the infinite energy that supposedly came from every bit of matter in the universe. However, this was not much of a victory. A zero in quantum mechanics means that the entire universe—including the vacuum—is filled with an infinite amount of energy: the zero-point energy. This, in turn, leads to the most bizarre zero in the universe: the phantom force of nothing.
In 1900, German experimenters tried to shed some light on the ultraviolet catastrophe. By making careful measurements of how much radiation came off objects at various temperatures, they showed that the Rayleigh-Jeans formula was, indeed, failing to predict the true amount of light that comes from objects. A young physicist named Max Planck looked at the new data and within hours came up with a new equation that replaced the Rayleigh-Jeans formula. Not only did Planck’s formula explain the new measurements, it solved the ultraviolet catastrophe. The Planck formula did not zoom off to infinity as the wavelength decreased; instead of having the energy get bigger and bigger as the wavelength went down, it got smaller and smaller again (Figure 47). Unfortunately, though Planck’s formula was correct, its repercussions were more troubling than the ultraviolet catastrophe it solved.
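In its standard modern form (conventional symbols; the book does not spell the formula out), Planck’s law replaces the Rayleigh-Jeans expression with:

```latex
% Planck's law: the exponential in the denominator grows much
% faster than lambda^(-5) as the wavelength shrinks, so the
% predicted intensity peaks and then falls back toward zero
% instead of diverging.
B_{\lambda}(T) = \frac{2 h c^{2}}{\lambda^{5}}
\cdot \frac{1}{e^{\,h c / (\lambda k_B T)} - 1}
```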
Figure 47: Rayleigh-Jeans goes off to infinity, but Planck stays finite.
The problem arose because the ordinary assumptions of statistical mechanics—the laws of physics—did not lead Planck to his formula. The laws of physics had to change to accommodate the Planck formula. Planck later described what he did as an “act of desperation”; nothing less than desperation would compel a physicist to make such a seemingly nonsensical change in the laws of physics. According to Planck, molecules are forbidden to move in most ways; they vibrate only with certain acceptable energies, called quanta. It is impossible for molecules to have energies in between these acceptable values.
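In modern notation (a sketch; Planck originally framed this for the vibrating oscillators in a cavity’s walls), the quantum hypothesis says a vibration of frequency ν can carry only certain energies:

```latex
% Planck's quantum hypothesis: energy comes in whole-number
% multiples of a basic packet h*nu; values in between are
% forbidden.
E = n h \nu, \qquad n = 0, 1, 2, \ldots,
\qquad h \approx 6.63 \times 10^{-34}\ \mathrm{J\,s}
```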
This might not seem like such a strange assumption, but it is not the way the world seems to work. Nature doesn’t move in jumps. It would seem silly to have five-foot-tall people and six-foot-tall people but nothing in between. It would be ridiculous if cars drove at 30 miles an hour and 40 miles an hour, but never at 33 or 38 miles an hour. However, a quantum car would behave in exactly this way. You might be driving along at 30 miles an hour, but when you step on the gas, all of a sudden you would instantly—pop!—be driving 40 miles an hour. Nothing in between is allowed, so to get from 30 to 40 miles an hour, you have to make a quantum leap. In the same way, quantum people could not grow very easily; they would hover at four feet for a number of years, and then, in a fraction of a second—pop!—they would be five feet tall. The quantum hypothesis violates everything our everyday experience tells us.
Even though it doesn’t agree with the way nature seems to work, Planck’s strange hypothesis—that molecular vibrations were quantized—led to the correct formula for the frequencies of light that come off an object. Even though physicists quickly realized that Planck’s equation was right, they did not accept the quantum hypothesis. It was too bizarre to accept.
An unlikely candidate would turn the quantum hypothesis from a peculiarity into an accepted fact. Albert Einstein, a twenty-six-year-old patent clerk, showed the physics world that nature worked in quanta rather than in smooth increments. He would later become the chief opponent of the theory he helped create.
Einstein didn’t seem like a revolutionary. When Max Planck was turning the physics world on its head, Albert Einstein was scrambling for a job. Out of money, he took a temporary position at the Swiss patent office, a far cry from the assistantship at a university that he wanted. By 1904 he was married, had a newborn son, and was laboring in the patent office—hardly the path to greatness. However, in March 1905, he wrote a paper that would eventually earn him the Nobel Prize. This paper, which explained the photoelectric effect, brought quantum mechanics into the mainstream. Once quantum mechanics was accepted, so, too, would be the mysterious powers of zero.
The photoelectric effect was discovered in 1887, when the German physicist Heinrich Hertz noticed that a beam of ultraviolet light could cause a metal plate to spark: electrons quite simply pop out of the metal when light shines on it. This phenomenon, causing sparks with a beam of light, was very puzzling to classical physicists. Ultraviolet light is light with a lot of energy, so scientists naturally concluded that it took quite a bit of energy to kick an electron out of an atom. But according to the wave theory of light, there is another way to get high-energy light: make it brighter. A very bright blue light, for instance, might carry as much energy as a dim ultraviolet beam; therefore, a bright blue light should be able to kick electrons out of atoms, just as a dim ultraviolet beam can.
This simply is not the case, as experiments quickly showed. Even a dim beam of ultraviolet (high-frequency) light causes electrons to get knocked out of the metal. However, if you lower the frequency just a little bit below a critical threshold—making the light a wee bit too red—the sparking stops all of a sudden. No matter how bright the beam is, if the light is the wrong color, all the electrons in the metal stay put; none of them can escape. That’s not the sort of thing a light wave would do.
Einstein solved this quandary—the puzzle of the photoelectric effect—but his solution was even more revolutionary than Planck’s hypothesis. While Planck proposed that molecules’ vibrations were quantized, Einstein proposed that light came in little packets of energy called photons. This idea conflicted with the accepted physics of light, because it meant that light was not a wave.
On the other hand, if light energy is bundled into little packets, then the photoelectric effect is easy to explain. The light is acting like little bullets that get shot into the metal. When a bullet hits an electron, it gives it a nudge. If the bullet has enough energy—if its frequency is high enough—then it knocks the electron free. On the other hand, if a light particle doesn’t have enough energy to nudge the electron out, then the electron stays put; the photon skitters away instead.
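The bullet picture has a simple arithmetic behind it. In the standard notation (the work function φ, the minimum energy needed to free an electron, is the conventional symbol, not the author’s):

```latex
% Each photon carries an energy fixed by its frequency alone;
% brightness changes the number of photons, not their energy.
E_{\mathrm{photon}} = h \nu
% An electron escapes only if one photon's energy exceeds the
% metal's work function phi; the excess becomes kinetic energy.
KE_{\max} = h \nu - \phi \qquad (\text{sparking only if } h\nu > \phi)
```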
Einstein’s idea explained the photoelectric effect brilliantly. Light is quantized into photons, directly contradicting the wave theory of light that had not been questioned for more than a century. Indeed, it turns out that light has both a wave nature and a particle nature. Though light acts like a particle sometimes, it acts like a wave at other times. In truth, light is neither particle nor wave, but a strange combination of the two. It’s a hard concept to grasp. However, this idea is at the heart of the quantum theory.
According to quantum theory, everything—light, electrons, protons, small dogs—has both wavelike and particle-like properties. But if objects are particles and waves at the same time, what on earth could they be? Mathematicians know how to describe them: they are wave functions, solutions to a differential equation called the Schrödinger equation. Unfortunately, this mathematical description has no intuitive meaning; it is all but impossible to visualize what these wave functions are.* Worse yet, as physicists discovered the intricacies of quantum mechanics, stranger and stranger things began to appear. Perhaps the weirdest of all is caused by a zero in the equations of quantum mechanics: the zero-point energy.
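For reference, the equation in question, in its standard time-dependent form (ψ is the wave function and Ĥ the Hamiltonian; these are the conventional symbols, not anything quoted in the text):

```latex
% The Schrodinger equation: the wave function psi evolves in
% time under the Hamiltonian operator H, which encodes the
% system's energy.
i \hbar \frac{\partial \psi}{\partial t} = \hat{H} \psi
```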
This strange force is woven into the mathematical equations of the quantum universe. In the mid-1920s a German physicist, Werner Heisenberg, saw that these equations had a shocking consequence: uncertainty. The force of nothing is caused by the Heisenberg uncertainty principle.
The concept of uncertainty pertains to scientists’ ability to describe the properties of a particle. For instance, if we want to find a particular particle, we need to determine the particle’s position and velocity—where it is and how fast it is going. Heisenberg’s uncertainty principle tells us that we can’t do even this simple act. No matter how hard we try, we cannot measure a particle’s position and its velocity with perfect accuracy at the same time. This is because the very act of measuring destroys some of the information we are trying to gather.
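In symbols (the standard form, with momentum p = mv standing in for the velocity the text mentions):

```latex
% Heisenberg uncertainty principle: the spread in position and
% the spread in momentum cannot both be made arbitrarily
% small; their product has a floor set by Planck's constant.
\Delta x \, \Delta p \geq \frac{\hbar}{2}
```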