The Disappearing Spoon: And Other True Tales of Madness, Love, and the History of the World from the Periodic Table of the Elements
The pain is all the more acute because the kilogram is the last base unit bound to human strictures. A platinum rod in Paris defined 1.000000… meter through much of the twentieth century, until scientists redefined the unit in 1960 as 1,650,763.73 wavelengths of the orange-red light emitted by a krypton-86 atom. This distance is virtually identical to the length of the old rod, but it made the rod obsolete, since that many wavelengths of krypton light would stretch the same distance in any vacuum anywhere. (That’s an e-mailable definition.) Since then, measurement scientists (metrologists) have re-redefined the meter (about three feet) as the distance any light travels in a vacuum in 1/299,792,458 of a second.
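The arithmetic behind both post-rod definitions fits in a few lines. A minimal sketch using the figures quoted above (an illustration, not anything metrologists actually run):

```python
# Sketch: checking that the two post-rod meter definitions agree,
# using the numbers quoted in the text.
C = 299_792_458                      # speed of light in m/s (exact by definition)
meter_from_light = C * (1 / C)       # distance light covers in 1/299,792,458 s: 1 m
KR86_WAVELENGTHS = 1_650_763.73      # 1960 definition: Kr-86 wavelengths per meter
kr86_wavelength_nm = 1e9 / KR86_WAVELENGTHS  # implied wavelength, about 605.78 nm
```

The implied wavelength of roughly 605.78 nanometers sits in the orange-red part of the spectrum, which is why that many wavelengths laid end to end reproduce the old rod's length.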
Similarly, the official definition of one second used to be about 1/31,556,952 of one trip around the sun (the number of seconds in 365.2425 days). But a few pesky facts made that an inconvenient standard. The length of a year—not just a calendar year, but an astronomical year counted in seconds—varies with every trip because of the sloshing of ocean tides, which drag on and slow earth’s rotation. To correct for this, metrologists slip in a “leap second” about every third year, usually when no one’s paying attention, at midnight on December 31. But leap seconds are an ugly, ad hoc solution. And rather than tie a supposedly universal unit of time to the transit of an unremarkable rock around a forgettable star, the U.S. standards bureau has developed cesium-based atomic clocks.
Atomic clocks run on the same leaping and crashing of excited electrons we’ve discussed before. But atomic clocks also exploit a subtler movement, the electrons’ “fine structure.” If the normal jump of an electron resembles a singer jumping an octave from G to G, fine structure resembles a jump from G to G-flat or G-sharp. Fine structure effects are most noticeable in magnetic fields, and they’re caused by things you can safely ignore unless you find yourself in a dense, high-level physics course—such as the magnetic interactions between electrons and protons or corrections due to Einstein’s relativity. The upshot is that after those fine adjustments,* each electron jumps either slightly lower (G-flat) or slightly higher (G-sharp) than expected.
The electron “decides” which jump to make based on its intrinsic spin, so one electron never hits the sharp and the flat on successive leaps. It hits one or the other every time. Inside atomic clocks, which look like tall, skinny pneumatic tubes, a magnet purges all the cesium atoms whose outer electrons jump to one level, call it G-flat. That leaves only atoms with G-sharp electrons, which are gathered into a chamber and excited by an intense microwave. This causes cesium electrons to pop (i.e., jump and crash) and emit photons of light. Each cycle of jumping up and down is elastic and always takes the same (extremely short) amount of time, so the atomic clock can measure time simply by counting photons. Really, whether you purge the G-flat or the G-sharp atoms doesn’t matter, but you have to purge one of them, because the jumps to the two levels take slightly different amounts of time, and at the scales metrologists work with, such imprecision is unacceptable.
Cesium proved convenient as the mainspring for atomic clocks because it has one electron exposed in its outermost shell, with no nearby electrons to muffle it. Cesium’s heavy, lumbering atoms are fat targets for the maser that strums them as well. Still, even in plodding cesium, the outer electron is a quick bugger. Instead of a few dozen or few thousand times per second, it performs 9,192,631,770 back-and-forths every one-Mississippi. Scientists picked that ungainly number instead of cutting themselves off at 9,192,631,769 or letting things drag on until 9,192,631,771 because it matched their best guess for a second back in 1955, when they built the first cesium clock. Regardless, 9,192,631,770 is now fixed. It became the first base-unit definition to achieve universal e-mailability, and it even helped liberate the meter from its platinum rod after 1960.
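Put as bare arithmetic, with the numbers from the text (a quick sketch, nothing official):

```python
# Sketch: the cesium second and the Gregorian year, using figures from the text.
CS_HZ = 9_192_631_770                   # Cs-133 hyperfine cycles per second (exact)
period_ns = 1e9 / CS_HZ                 # one back-and-forth lasts ~0.109 nanoseconds
year_seconds = 365.2425 * 86_400        # seconds in a Gregorian year: 31,556,952
cycles_per_year = CS_HZ * year_seconds  # cycles a cesium clock counts in a year
```

Each of those 9,192,631,770 oscillations lasts barely a tenth of a nanosecond, which is what gives the cesium standard its precision.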
Scientists adopted the cesium standard as the world’s official measurement of time in the 1960s, replacing the astronomical second, and while the cesium standard has profited science by ensuring precision and accuracy worldwide, humanity has undeniably lost something. Since before even the ancient Egyptians and Babylonians, human beings used the stars and seasons to track time and record their most important moments. Cesium severed that link with the heavens, effaced it just as surely as urban streetlamps blot out constellations. However fine an element, cesium lacks the mythic feeling of the moon or sun. Besides, even the argument for switching to cesium—its universality, since cesium electrons should vibrate at the same frequency in every pocket of the universe—may no longer be a safe bet.
* * *
If anything runs deeper than a mathematician’s love of variables, it’s a scientist’s love of constants. The charge of the electron, the strength of gravity, the speed of light—no matter the experiment, no matter the circumstances, those parameters never vary. If they did, scientists would have to chuck the precision that separates “hard” sciences from social sciences like economics, where whims and sheer human idiocy make universal laws impossible.
Even more seductive to scientists, because more abstract and universal, are fundamental constants. Obviously, the numerical value of a particle’s size or speed would change if we arbitrarily decided that meters should be longer or if the kilogram suddenly shrank (ahem). Fundamental constants, however, don’t depend on measurement. Like π, they’re pure, fixed numbers, and also like π, they pop up in all sorts of contexts that seem tantalizingly explainable but that have so far resisted all explanation.
The best-known dimensionless constant is the fine structure constant, which is related to the fine splitting of electrons. In short, it controls how tightly negative electrons are bound to the positive nucleus. It also determines the strength of some nuclear processes. In fact, if the fine structure constant—which I’ll refer to as alpha, because that’s what scientists call it—if alpha had been slightly smaller right after the big bang, nuclear fusion in stars would never have gotten hot enough to fuse carbon. Conversely, if alpha had grown slightly larger, carbon atoms would all have disintegrated aeons ago, long before finding their way into us. That alpha avoided this atomic Scylla and Charybdis makes scientists thankful, naturally, but also very antsy, because they cannot explain how it succeeded. Even a good, inveterate atheist like physicist Richard Feynman once said of the fine structure constant, “All good theoretical physicists put this number up on their wall and worry about it…. It’s one of the greatest damn mysteries of physics: a magic number that comes to us with no understanding by man. You might say the ‘hand of God’ wrote that number, and we don’t know how He pushed His pencil.”
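For the curious, alpha can be computed from other physical constants. The sketch below uses standard CODATA values, which come from physics references rather than the text:

```python
import math

# Fine structure constant from CODATA 2018 values (an illustration, not a derivation).
E = 1.602176634e-19       # elementary charge, coulombs (exact in SI since 2019)
HBAR = 1.054571817e-34    # reduced Planck constant, joule-seconds
C = 299_792_458           # speed of light, m/s (exact)
EPS0 = 8.8541878128e-12   # vacuum permittivity, farads per meter

alpha = E**2 / (4 * math.pi * EPS0 * HBAR * C)  # dimensionless: every unit cancels
inverse_alpha = 1 / alpha                       # about 137.036
```

Because the units all cancel, alpha comes out the same no matter whether you measure in meters, feet, or cubits, which is exactly what makes it "fundamental."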
Historically, that didn’t stop people from trying to decipher this scientific mene, mene, tekel, upharsin. English astronomer Arthur Eddington, who during a solar eclipse in 1919 provided the first experimental proof of Einstein’s relativity, grew fascinated with alpha. Eddington had a penchant, and it must be said a talent, for numerology,* and in the early 1900s, after alpha was measured to be around 1/136, Eddington began concocting “proofs” that alpha equaled exactly 1/136, partly because he found a mathematical link between 136 and 666. (One colleague derisively suggested rewriting the book of Revelation to take this “finding” into account.) Later measurements showed that alpha was closer to 1/137, but Eddington just tossed a 1 into his formula somewhere and continued on as if his sand castle hadn’t crumbled (earning him the immortal nickname Sir Arthur Adding-One). A friend who later ran across Eddington in a cloakroom in Stockholm was chagrined to see that he insisted on hanging his hat on peg 137.
Today alpha equals 1/137.0359 or so. Regardless, its value makes the periodic table possible. It allows atoms to exist and also allows them to react with sufficient vigor to form compounds, since electrons neither roam too freely from their nuclei nor cling too closely. This just-right balance has led many scientists to conclude that the universe couldn’t have hit upon its fine structure constant by accident. Theologians, being more explicit, say alpha proves that a creator has “programmed” the universe to produce both molecules and, possibly, life. That’s why it was such a big deal in 1976 when a Soviet (now American) scientist named Alexander Shlyakhter scrutinized a bizarre site in Africa called Oklo and declared that alpha, a fundamental and invariant constant of the universe, was getting bigger.
Oklo is a galactic marvel: the only natural nuclear fission reactor known to exist. It stirred to life some 1.7 billion years ago, and when French miners unearthed the dormant site in 1972, it caused a scientific roar. Some scientists argued that Oklo couldn’t have happened, while some fringe groups pounced on Oklo as “evidence” for outlandish pet theories such as long-lost African civilizations and crash landings by nuclear-powered alien star cruisers. Actually, as nuclear scientists determined, Oklo was powered by nothing but uranium, water, and blue-green algae (i.e., pond scum). Really. Algae in a river near Oklo produced excess oxygen after undergoing photosynthesis. The oxygen made the water so acidic that as it trickled underground through loose soil, it dissolved the uranium from the bedrock. All uranium back then had a higher concentration of the bomb-ready uranium-235 isotope—about 3 percent, compared to 0.7 percent today. So the water was volatile already, and when underground algae filtered the water, the uranium was concentrated in one spot, achieving a critical mass.
Though necessary, a critical mass wasn’t sufficient. In general, for a chain reaction to occur, uranium nuclei must not only be struck by neutrons, they must absorb them. When pure uranium fissions, its atoms shoot out “fast” neutrons that bounce off neighbors like stones skipped across water. Those are basically duds, wasted neutrons. Oklo uranium went nuclear only because the river water slowed the neutrons down enough for neighboring nuclei to snag them. Without the water, the reaction never would have begun.
But there’s more. Fission also produces heat, obviously. And the reason there’s not a big crater in Africa today is that when the uranium got hot, it boiled the water away. With no water, the neutrons became too fast to absorb, and the process ground to a halt. Only when the uranium cooled down did water trickle back in—which slowed the neutrons and restarted the reactor. It was a nuclear Old Faithful, self-regulating, and it consumed 13,000 pounds of uranium over 150,000 years at sixteen sites around Oklo, in on/off cycles of 150 minutes.
How did scientists piece that tale together 1.7 billion years later? With elements. Elements are mixed thoroughly in the earth’s crust, so the ratios of different isotopes should be the same everywhere. At Oklo, the uranium-235 concentration was 0.003 to 0.3 percent less than normal—a huge difference. But what determined that Oklo was a natural nuke and not the remnants of a smuggling operation for rogue terrorists was the overabundance of useless elements such as neodymium. Neodymium mostly comes in three even-numbered flavors, 142, 144, and 146. Uranium fission reactors produce odd-numbered neodymium at higher rates than normal. In fact, when scientists analyzed the neodymium concentrations at Oklo and subtracted out the natural neodymium, they found that Oklo’s nuclear “signature” matched that of a modern, man-made fission reactor. Amazing.
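The subtraction trick works because natural neodymium contains neodymium-142 while fission produces essentially none of it (it is shielded by stable cerium-142), so the Nd-142 in a sample reveals how much natural contamination to remove. A toy version, using approximate real natural abundances but rough, purely illustrative fission yields and a synthetic "measured" sample:

```python
# Toy Oklo-style analysis: subtract natural neodymium from a sample to expose
# the fission-born component. NATURAL holds approximate real isotopic abundances;
# FISSION holds rough illustrative relative yields, and "measured" is synthetic.
NATURAL = {142: 0.272, 143: 0.122, 144: 0.238, 145: 0.083,
           146: 0.172, 148: 0.057, 150: 0.056}
FISSION = {142: 0.0, 143: 0.35, 144: 0.33, 145: 0.23,
           146: 0.18, 148: 0.10, 150: 0.04}

# Build a synthetic sample: 30% ordinary crustal Nd, 70% fission-born Nd.
measured = {a: 0.3 * NATURAL[a] + 0.7 * FISSION[a] for a in NATURAL}

# Recover the fission signature: Nd-142 can only be natural, so its level
# fixes the natural share, which is then subtracted isotope by isotope.
natural_share = measured[142] / NATURAL[142]
fission_part = {a: measured[a] - natural_share * NATURAL[a] for a in NATURAL}
```

After the subtraction, what remains is the reactor's "signature," which at Oklo matched modern man-made fission waste.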
Still, if neodymium matched, other elements didn’t. When Shlyakhter compared the Oklo nuclear waste to modern waste in 1976, he found that too little of some types of samarium had formed. By itself, that’s not so thrilling. But again, nuclear processes are reproducible to a stunning degree; elements such as samarium don’t just fail to form. So samarium’s deviation hinted to Shlyakhter that something had been off back then. Taking a hell of a leap, he calculated that if only the fine structure constant had been just a fraction smaller when Oklo went nuclear, the discrepancies would have been easy to explain. In this, he resembled the Indian physicist Bose, who didn’t claim to know why his “wrong” equations about photons explained so much; he only knew they did. The problem was, alpha is a fundamental constant. It can’t vary, not according to physics. Worse for some, if alpha varied, probably no one (or, rather, no One) had “tuned” alpha to produce life after all.
With so much at stake, many scientists since 1976 have reinterpreted and challenged the alpha-Oklo link. The changes they’re measuring are so small and the geological record so piecemeal after 1.7 billion years, it seems unlikely anyone will ever prove anything definitive about alpha from Oklo data. But again, never underestimate the value of throwing an idea out there. Shlyakhter’s samarium work whetted the appetite of dozens of ambitious physicists who wanted to knock off old theories, and the study of changing constants is now an active field. One boost to these scientists was the realization that even if alpha has changed very little since “only” 1.7 billion years ago, it might have shifted rapidly during the first billion years of the universe, a time of primordial chaos. As a matter of fact, after investigating brilliant, distant objects called quasars and the interstellar dust clouds their light passes through, some Australian astronomers* claim they’ve detected the first real evidence of inconstants.
Quasars are supermassive black holes that tear apart and cannibalize nearby stars and gas, violence that releases gobs and gobs of light energy. Of course, when astronomers collect that light, they’re not looking at events in real time, but events that took place long, long ago, since light takes time to cross the universe. What the Australians did was examine how huge storms of interstellar space dust affected the passage of ancient quasar light. When light passes through a dust cloud, vaporized elements in the cloud absorb it. But unlike something opaque, which absorbs all light, the elements in the cloud absorb light at specific frequencies. Moreover, similar to atomic clocks, elements absorb light not of one narrow color but of two very finely split colors.
The Australians had little luck with some elements in the dust clouds; it turns out those elements would hardly notice if alpha vacillated every day. So they expanded their search to elements such as chromium, which proved highly sensitive to alpha: the smaller alpha was in the past, the redder the light that chromium absorbed and the narrower the spaces between its G-flat and G-sharp levels. By analyzing the gap that chromium and other elements produced billions of years ago near the quasar and comparing it with atoms in the lab today, scientists can judge whether alpha has changed in the meantime. And though, like all scientists—especially ones proposing something controversial—the Australians hedge and couch their findings in scientific language about such and such only “being consistent with the hypothesis” of this and that, they do think their ultrafine measurements indicate that alpha changed by up to 0.001 percent over ten billion years.
Now, honestly, that may seem a ridiculous amount to squabble over, like Bill Gates fighting for pennies on the sidewalk. But the magnitude is less important than the possibility of a fundamental constant changing.* Many scientists dispute the results from Australia, but if those results hold up—or if any of the other scientists working on variable constants find proof positive—scientists would have to rethink the big bang, because the only laws of the universe they know would not quite have held from the beginning.* A variable alpha would overthrow Einsteinian physics in the same way Einstein deposed Newton and Newton deposed medieval Scholastic physics. And as the next section shows, a drifting alpha might also revolutionize how scientists explore the cosmos for signs of life.
We’ve already met Enrico Fermi in rather poor circumstances—he died of beryllium poisoning after some brash experiments and won a Nobel Prize for discovering transuranic elements he didn’t discover. But it isn’t right just to leave you with a negative impression of this dynamo. Scientists loved Fermi universally and without reserve. He’s the namesake of element one hundred, fermium, and he’s regarded as the last great dual-purpose, theoretical-cum-experimental scientist, someone equally likely to have grease from laboratory machines on his hands as chalk from the blackboard. He had a devilishly quick mind as well. During scientific meetings with colleagues, they sometimes needed to run to their offices to look up arcane equations to resolve some point; often as not when they returned, Fermi, unable to wait, had derived the entire equation from scratch and had the answer they needed. Once, he asked junior colleagues to figure out how many millimeters thick the dust could get on the famously dirty windows in his lab before the dust avalanched under its own weight and sloughed onto the floor. History doesn’t record the answer, only the impish* question.
Not even Fermi, however, could wrap his head around one hauntingly simple question. As noted earlier, many philosophers marvel that the universe seems fine-tuned to produce life because certain fundamental constants have a “perfect” value. Moreover, scientists have long believed—in the same spirit they believe a second shouldn’t be based on our planet’s orbit—that earth is not cosmically special. Given that ordinariness, as well as the immense numbers of stars and planets, and the aeons that have passed since the big bang (and leaving aside any sticky religious issues), the universe should rightfully be swarming with life. Yet not only have we never met alien creatures, we’ve never even gotten a hello. As Fermi brooded on those contradictory facts over lunch one day, he cried out to his colleagues, as if he expected an answer, “Then where is everybody?”
His colleagues burst out laughing at what’s now known as “Fermi’s paradox.” But other scientists took Fermi seriously, and they really believed they could get at an answer. The best-known attempt came in 1961, when astrophysicist Frank Drake laid out what’s now known as the Drake Equation. Like the uncertainty principle, the Drake Equation has had a layer of interpretation laid over it that obscures what it really says. In short, it’s a series of guesses: about how many stars exist in the galaxy, what fraction of those have earthlike planets, what fraction of those planets have intelligent life, what fraction of those life forms would want to make contact, and so on. Drake originally calculated* that ten sociable civilizations existed in our galaxy. But again, that was just an informed guess, which led many scientists to renounce it as flatulent philosophizing. How on earth, for instance, can we psychoanalyze aliens and figure out what percent want to chat?
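The equation itself is nothing more than multiplication. A minimal sketch, with placeholder guesses chosen to land near the N ≈ 10 mentioned above (they are illustrative, not Drake's exact 1961 inputs):

```python
# Sketch of the Drake Equation: a chain of multiplied guesses.
def drake(R_star, f_p, n_e, f_l, f_i, f_c, L):
    """N = star-formation rate x fraction of stars with planets
    x habitable planets per system x fraction developing life
    x fraction developing intelligence x fraction that signal
    x average lifetime of a signaling civilization."""
    return R_star * f_p * n_e * f_l * f_i * f_c * L

# Illustrative placeholder guesses (not Drake's published values):
N = drake(R_star=1, f_p=0.5, n_e=2, f_l=1, f_i=0.1, f_c=0.1, L=1000)
```

The structure makes the criticism easy to see: the first few factors are astronomy, but the last few are sheer psychology, and any of them can swing the answer by orders of magnitude.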