Despite its austere beauty in a vacuum, light gets corrupted when it interacts with some elements. Sodium can slow light down to 38 mph, almost twenty times slower than sound. Praseodymium can even catch light, hold on to it for a few seconds like a baseball, then toss it in a different direction.
Lasers manipulate light in subtler ways. Remember that electrons are like elevators: they never rise from level 1 to level 3.5 or drop from level 5 to level 1.8. Electrons jump only between whole-number levels. When excited electrons crash back down, they jettison excess energy as light, and because electron movement is so constrained, so too is the color of the light produced. It’s monochromatic—at least in theory. In practice, electrons in different atoms are simultaneously dropping from level 3 to 1, or 4 to 2, or whatever—and every different drop produces a different color. Plus, different atoms emit light at different times. To our eyes, this light looks uniform, but on a photon level, it’s uncoordinated and jumbled.
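For readers who like numbers, here is a minimal sketch of that energy-to-color bookkeeping, in Python, using the standard relation that a photon's wavelength is h·c divided by the energy of the drop. The electron-volt gap values below are invented purely for illustration; they are not taken from the book.

```python
# Each electron "drop" releases a photon whose color is fixed by the energy gap:
# E = h * f, so wavelength = h * c / E.  The gaps below are made up to show
# that a bigger drop means a shorter (bluer) wavelength.
h = 6.626e-34   # Planck's constant, J*s
c = 3.0e8       # speed of light, m/s
eV = 1.602e-19  # one electron volt in joules

for drop_eV in (1.2, 1.9, 2.5):                     # hypothetical energy gaps
    wavelength_nm = h * c / (drop_eV * eV) * 1e9
    print(f"{drop_eV} eV drop -> photon at about {wavelength_nm:.0f} nm")
```

A bigger drop means a shorter wavelength, which is why a jumble of different drops gives jumbled, multicolored light, while one fixed drop gives a single pure color.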
Lasers circumvent that timing problem by limiting what floors the elevator stops at (as do their cousins, masers, which work the same way but produce non-visible light). The most powerful, most impressive lasers today—capable of generating beams that, for a fraction of a second, produce more power than the whole United States—use crystals of yttrium spiked with neodymium. Inside the laser, a strobe light curls around the neodymium-yttrium crystal and flashes incredibly quickly at incredibly high intensities. This infusion of light excites the electrons in the neodymium and makes them jump way, way higher than normal. To keep with our elevator bit, they might rocket up to the tenth floor. Suffering vertigo, they immediately ride back down to the safety of, say, the second floor. Unlike normal crashes, though, the electrons are so disturbed that they have a breakdown and don’t release their excess energy as light; they shake and release it as heat. Also, relieved at being on the safe second floor, they get off the elevator, dawdle, and don’t bother hurrying down to the ground floor.
In fact, before they can hurry down, the strobe light flashes again. This sends more of the neodymium’s electrons flying up to the tenth floor and crashing back down. When this happens repeatedly, the second floor gets crowded; when there are more electrons on the second floor than the first, the laser has achieved “population inversion.” At this point, if any dawdling electrons do jump to the ground floor, they disturb their already skittish and crowded neighbors and knock them over the balcony, which in turn knocks others down. And notice the simple beauty of this: when the neodymium electrons drop this time, they’re all dropping from two to one at the same time, so they all produce the same color of light. This coherence is the key to a laser. The rest of the laser apparatus cleans up light rays and hones the beams by bouncing them back and forth between two mirrors. But at that point, the neodymium-yttrium crystal has done its work to produce coherent, concentrated light, beams so powerful they can induce thermonuclear fusion, yet so focused they can sculpt a cornea without frying the rest of the eye.
Based on that technological description, lasers may seem more engineering challenges than scientific marvels, yet lasers—and masers, which historically came first—encountered real scientific prejudice when they were developed in the 1950s. Charles Townes remembers that even after he’d built the first working maser, senior scientists would look at him wearily and say, Sorry, Charles, that’s impossible. And these weren’t hacks—small-minded naysayers who lacked the imagination to see the Next Big Thing. Both John von Neumann, who helped design the basic architecture of modern computers (and modern nuclear bombs), and Niels Bohr, who did as much to explain quantum mechanics as anyone, dismissed Townes’s maser to his face as simply “not possible.”
Bohr and von Neumann blew it for a simple reason: they forgot about the duality of light. More specifically, the famous uncertainty principle of quantum mechanics led them astray. Because Werner Heisenberg’s uncertainty principle is so easy to misunderstand—but once understood is a powerful tool for making new forms of matter—the next section will unpack this little riddle about the universe.
If nothing tickles physicists like the dual nature of light, nothing makes them wince like hearing someone expound on the uncertainty principle in cases where it doesn’t apply. Despite what you may have heard, it has (almost*) nothing to do with observers changing things by the mere act of observing. All the principle says, in its entirety, is this:

ΔxΔp ≥ h/4π

That’s it.
Now, if you translate quantum mechanics into English (always risky), the equation says that the uncertainty in something’s position (Δx) times the uncertainty in its speed and direction (its momentum, Δp) always exceeds or is equal to the number “h divided by four times pi.” (The h stands for Planck’s constant, which is such a small number, about 100 trillion trillion times smaller than one, that the uncertainty principle applies only to tiny, tiny things such as electrons or photons.) In other words, if you know a particle’s position very well, you cannot know its momentum well at all, and vice versa.
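To see why h’s smallness matters, here is a minimal back-of-the-envelope sketch in Python (my own illustrative numbers, not the book’s) that plugs two cases into ΔxΔp ≥ h/4π: an electron pinned down to about a nanometer, and a baseball pinned down to a millimeter.

```python
import math

h = 6.626e-34  # Planck's constant, J*s

def min_velocity_uncertainty(delta_x_m, mass_kg):
    """Smallest velocity uncertainty allowed by dx*dp >= h/(4*pi), with dp = m*dv."""
    delta_p = h / (4 * math.pi * delta_x_m)
    return delta_p / mass_kg

# An electron confined to roughly an atom's width (~1 nanometer):
print(min_velocity_uncertainty(1e-9, 9.11e-31))   # ~5.8e4 m/s -- enormous

# A 145-gram baseball located to within a millimeter:
print(min_velocity_uncertainty(1e-3, 0.145))      # ~3.6e-31 m/s -- utterly negligible
```

Pin down an electron to roughly an atom’s width and its velocity becomes uncertain by tens of kilometers per second; pin down a baseball to a millimeter and the leftover uncertainty is far too small ever to measure. That is the sense in which the principle applies only to tiny, tiny things.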
Note that these uncertainties aren’t uncertainties about measuring things, as if you had a bad ruler; they’re uncertainties built into nature itself. Remember how light has a dual nature, part wave, part particle? In dismissing the laser, Bohr and von Neumann got stuck on the ways light acts like particles, or photons. To their ears, lasers sounded so precise and focused that the uncertainty in the photons’ positions would be nil. That meant the uncertainty in the momentum had to be large, which meant the photons could be flying off at any energy or in any direction, which seemed to contradict the idea of a tightly focused beam.
They forgot that light behaves like waves, too, and that the rules for waves are different. For one, how can you tell where a wave is? By its nature, it spreads out—a built-in source of uncertainty. And unlike particles, waves can swallow and combine with other waves. Two rocks thrown into a pond will kick up the highest crests in the area between them, which receives energy from smaller waves on both sides.
In the laser’s case, there aren’t two but trillions of trillions of “rocks” (i.e., electrons) kicking up waves of light, which all mix together. The key point is that the uncertainty principle doesn’t apply to sets of particles, only to individual particles. Within a beam, a set of light particles, it’s impossible to say where any one photon is located. And with such a high uncertainty about each photon’s position inside the beam, you can channel its energy and direction very, very precisely and make a laser. This loophole is difficult to exploit but is enormously powerful once you’ve got your fingers inside it—which is why Time magazine honored Townes by naming him one of its “Men of the Year” (along with Pauling and Segrè) in 1960, and why Townes won a Nobel Prize in 1964 for his maser work.
In fact, scientists soon realized that much more fit inside the loophole than photons. Just as light beams have a dual particle/wave nature, the farther you burrow down and parse electrons and protons and other supposed hard particles, the fuzzier they seem. Matter, at its deepest, most enigmatic quantum level, is indeterminate and wavelike. And because, deep down, the uncertainty principle is a mathematical statement about the limits of drawing boundaries around waves, those particles fall under the aegis of uncertainty, too.
Now again, this works only on minute scales, scales where h, Planck’s constant, a number 100 trillion trillion times smaller than one, isn’t considered small. What embarrasses physicists is when people extrapolate up and out to human beings and claim that ΔxΔp ≥ h/4π really “proves” you cannot observe something in the everyday world without changing it—or, for the heuristically daring, that objectivity itself is a scam and that scientists fool themselves into thinking they “know” anything. In truth, there’s really only one case where uncertainty on the nanoscale affects anything on our macroscale: that outlandish state of matter—Bose-Einstein condensate (BEC)—promised earlier in this chapter.
This story starts in the early 1920s when Satyendra Nath Bose, a chubby, bespectacled Indian physicist, made an error while working through some quantum mechanics equations during a lecture. It was a sloppy, undergraduate boner, but it intrigued Bose. Unaware of his mistake at first, he’d worked everything out, only to find that the “wrong” answers produced by his mistake agreed very well with experiments on the properties of photons—much better than the “correct” theory.*
So as physicists have done throughout history, Bose decided to pretend that his error was the truth, admit that he didn’t know why, and write a paper. His seeming mistake, plus his obscurity as an Indian, led every established scientific journal in Europe to reject it. Undaunted, Bose sent his paper directly to Albert Einstein. Einstein studied it closely and determined that Bose’s answer was clever—it basically said that certain particles, like photons, could collapse on top of each other until they were indistinguishable. Einstein cleaned the paper up a little, translated it into German, and then expanded Bose’s work into another, separate paper that covered not just photons but whole atoms. Using his celebrity pull, Einstein had both papers published jointly.
In them, Einstein included a few lines pointing out that if atoms got cold enough—billions of times colder than even superconductors—they would condense into a new state of matter. However, producing atoms that cold lay so far beyond the technology of the day that not even far-thinking Einstein could imagine it ever being achieved. He considered his condensate a frivolous curiosity. Amazingly, scientists got a glimpse of Bose-Einstein matter a decade later, in a type of superfluid helium where small pockets of atoms bound themselves together. The Cooper pairs of electrons in superconductors also behave like the BEC in a way. But this binding together in superfluids and superconductors was limited, and not at all like the state Einstein envisioned—his was a cold, sparse mist. Regardless, the helium and BCS people never pursued Einstein’s conjecture, and nothing more happened with the BEC until 1995, when two clever scientists at the University of Colorado conjured some up with a gas of rubidium atoms.
Fittingly, one technical achievement that made real BEC possible was the laser—which was based on ideas first espoused by Bose about photons. That may seem backward, since lasers usually heat things up. But lasers can cool atoms, too, if wielded properly. On a fundamental, nanoscopic level, temperature just measures the average speed of particles. Hot molecules are furious little clashing fists, and cold molecules drag along. So the key to cooling something down is slowing its particles down. In laser cooling, scientists cross a few beams, Ghostbusters-like, and create a trap of “optical molasses.” When the rubidium atoms in the gas hurtled through the molasses, the lasers pinged them with low-intensity photons. The rubidium atoms were bigger and more powerful, so this was like shooting a machine gun at a screaming asteroid. Size disparities notwithstanding, shooting an asteroid with enough bullets will eventually halt it, and that’s exactly what happened to the rubidium atoms. After absorbing photons from all sides, they slowed, and slowed, and slowed some more, and their temperature dropped to just 1/10,000 of a degree above absolute zero.
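To put rough numbers on “slowing atoms down,” here is a small Python sketch (a back-of-the-envelope illustration, not the Colorado team’s actual data) using the standard kinetic-theory relation between temperature and root-mean-square speed.

```python
import math

k_B = 1.381e-23          # Boltzmann constant, J/K
m_rb = 87 * 1.66e-27     # approximate mass of a rubidium-87 atom, kg

def rms_speed(temp_kelvin):
    """Root-mean-square thermal speed from (1/2)m<v^2> = (3/2)k_B*T."""
    return math.sqrt(3 * k_B * temp_kelvin / m_rb)

print(rms_speed(300))      # room temperature: roughly 290 m/s
print(rms_speed(1e-4))     # 1/10,000 of a degree above absolute zero: ~0.17 m/s
```

At room temperature a rubidium atom buzzes along at hundreds of meters per second; at 1/10,000 of a degree above absolute zero it drifts at well under walking pace.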
Still, even that temperature is far too sweltering for the BEC (you can grasp now why Einstein was so pessimistic). So the Colorado duo, Eric Cornell and Carl Wieman, incorporated a second phase of cooling in which a magnet repeatedly sucked off the “hottest” remaining atoms in the rubidium gas. This is basically a sophisticated way of blowing on a spoonful of soup—cooling something down by pushing away warmer atoms. With the energetic atoms gone, the overall temperature kept sinking. By doing this slowly and whisking away only the few hottest atoms each time, the scientists plunged the temperature to a billionth of a degree (0.000000001) above absolute zero. At this point, finally, the sample of two thousand rubidium atoms collapsed into the Bose-Einstein condensate, the coldest, gooeyest, and most fragile mass the universe has ever known.
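The soup-blowing trick is known as evaporative cooling, and its logic is easy to caricature in a few lines. The toy simulation below is only an illustration of the principle (made-up energy units, not Cornell and Wieman’s actual procedure): throw out the hottest few atoms, let the rest settle to a new, lower average, and repeat.

```python
import random

# Toy gas: each number is one atom's kinetic energy (arbitrary units).
atoms = [random.expovariate(1.0) for _ in range(2000)]

def temperature(gas):
    """In this toy model, temperature is just the average kinetic energy."""
    return sum(gas) / len(gas)

for step in range(10):
    atoms.sort()
    atoms = atoms[:-40]                      # whisk away the 40 hottest atoms
    mean = temperature(atoms)
    # crude "re-thermalization": redraw every energy around the new, lower mean
    atoms = [random.expovariate(1.0 / mean) for _ in atoms]
    print(f"step {step}: {len(atoms)} atoms left, temperature ~ {mean:.3f}")
```

Each pass skims off only the most energetic atoms, so the average keeps sinking even though most of the gas stays put.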
But to say “two thousand rubidium atoms” obscures what’s so special about the BEC. There weren’t two thousand rubidium atoms as much as one giant marshmallow of a rubidium atom. It was a singularity, and the explanation for why relates back to the uncertainty principle. Again, temperature just measures the average speed of atoms. If the atoms’ temperature dips below a billionth of a degree, that’s not much speed at all—meaning the uncertainty about that speed is absurdly low. It’s basically zero. And because of the wavelike nature of atoms on that level, the uncertainty about their position must be quite large.
So large that, as the two scientists relentlessly cooled the rubidium atoms and squeezed them together, the atoms began to swell, distend, overlap, and finally disappear into each other. This left behind one large ghostly “atom” that, in theory (if it weren’t so fragile), might be capacious enough to see under a microscope. That’s why we can say that in this case, unlike anywhere else, the uncertainty principle has swooped upward and affected something (almost) human-sized. It took less than $100,000 worth of equipment to create this new state of matter, and the BEC held together for only ten seconds before combusting. But it held on long enough to earn Cornell and Wieman the 2001 Nobel Prize.*
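The swelling has a standard quantitative face: the thermal de Broglie wavelength, roughly the size of the fuzzy region each atom’s position occupies. The sketch below (my own illustrative numbers, using the textbook formula λ = h / √(2π·m·k_B·T)) shows that fuzz ballooning as the temperature falls.

```python
import math

h   = 6.626e-34          # Planck's constant, J*s
k_B = 1.381e-23          # Boltzmann constant, J/K
m   = 87 * 1.66e-27      # approximate mass of a rubidium-87 atom, kg

def de_broglie_wavelength(temp_kelvin):
    """Thermal de Broglie wavelength: how spread out an atom's position is at temperature T."""
    return h / math.sqrt(2 * math.pi * m * k_B * temp_kelvin)

print(de_broglie_wavelength(300))    # room temperature: ~1e-11 m, smaller than the atom itself
print(de_broglie_wavelength(1e-9))   # a billionth of a kelvin: ~6e-6 m, thousands of atom-widths
```

Once each atom’s wavelength grows bigger than the spacing between atoms in the trap, the atoms can no longer be told apart, which is the overlap described above.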
As technology keeps improving, scientists have gotten better and better at inducing matter to form the BEC. It’s not like anyone’s taking orders yet, but scientists might soon be able to build “matter lasers” that shoot out ultra-focused beams of atoms thousands of times more powerful than light lasers, or construct “supersolid” ice cubes that can flow through each other without losing their solidity. In our sci-fi future, such things could prove every bit as amazing as light lasers and superfluids have in our own pretty remarkable age.
17
Spheres of Splendor: The Science of Bubbles
Not every breakthrough in periodic-table science has to delve into exotic and intricate states of matter like the BEC. Everyday liquids, solids, and gases still yield secrets now and then, if fortune and the scientific muses collude in the right way. According to legend, as a matter of fact, one of the most important pieces of scientific equipment in history was invented not only over a glass of beer but by a glass of beer.
Donald Glaser—a lowly, thirsty, twenty-five-year-old junior faculty member who frequented bars near the University of Michigan—was staring one night at the bubbles streaming through his lager, and he naturally started thinking particle physics. At the time, 1952, scientists were using knowledge from the Manhattan Project and nuclear science to conjure up exotic and fragile species of particles such as kaons, muons, and pions, ghostly brothers of familiar protons, neutrons, and electrons. Particle physicists suspected, even hoped, that those particles would overthrow the periodic table as the fundamental map of matter, since they’d be able to peer even deeper into subatomic caves.
But to progress further, they needed a better way to “see” those infinitesimal particles and track how they behaved. Over his beer, Glaser—who had short, wavy hair, glasses, and a high forehead—decided bubbles were the answer. Bubbles in liquids form around imperfections or incongruities. Microscopic scratches in a champagne glass are one place they form; dissolved pockets of carbon dioxide in beer are another. As a physicist, Glaser knew that bubbles are especially prone to form as liquids heat up and approach their boiling point (think of a pan of water on the stove). In fact, if you hold a liquid just below its boiling point, it will burst into bubbles if anything agitates it.
This was a good start but still basic physics. What made Glaser stand out were the next mental steps he took. Those rare kaons, muons, and pions appear only when an atom’s nucleus, its dense core, is splintered. In 1952, a device called a cloud chamber existed, in which a “gun” shot ultra-fast atomic torpedoes at cold gas atoms. Muons and kaons and so on sometimes appeared in the chamber after direct strikes, and the gas condensed into liquid drops along the particles’ tracks. But substituting a liquid for the gas made more sense, Glaser thought. Liquids are thousands of times denser than gases, so aiming the atomic gun at, say, liquid hydrogen would cause far more collisions. Plus, if liquid hydrogen was held a shade below its boiling point, even a little kick of energy from a ghostly particle would lather up the hydrogen like Glaser’s beer. Glaser also suspected he could photograph the bubble trails and then measure how different particles left different trails or spirals, depending on their size and charge…. By the time he swallowed the final bubble in his own glass, the story goes, Glaser had the whole thing worked out.
It’s a story of serendipity that scientists have long wanted to believe. But sadly, like most legends, it’s not entirely accurate. Glaser did invent the bubble chamber, but through careful experimentation in a lab, not on a pub napkin. Happily, though, the truth is even stranger than the legend. Glaser designed his bubble chamber to work as explained above, but with one modification.
[Figure: Depending on their size and charge, different subatomic particles make different swirls and spirals as they blast through a bubble chamber. The tracks are actually finely spaced bubbles in a frigid bath of liquid hydrogen. (Courtesy of CERN)]
For Lord knows what reason—perhaps lingering undergraduate fascination—this young man decided beer, not hydrogen, was the best liquid to shoot the atomic gun at. He really thought that beer would lead to an epochal breakthrough in subatomic science. You can almost imagine him smuggling Budweiser into the lab at night, perhaps splitting a six-pack between science and his stomach as he filled thimble-sized beakers with America’s finest, heated them almost to boiling, and bombarded them to produce the most exotic particles then known to physics.