The quartz crystal’s remarkable ability to expand and contract in “equal time” was first exploited by radio engineers in the 1920s, who used it to lock radio transmissions to consistent frequencies. In 1928, W. A. Marrison of Bell Labs built the first clock that kept time from the regular vibrations of a quartz crystal. Quartz clocks lost or gained only a thousandth of a second per day, and were far less vulnerable to atmospheric changes in temperature or humidity, not to mention movement, than pendulum clocks. Once again, the accuracy with which we measured time had increased by several orders of magnitude.
For the first few decades after Marrison’s invention, quartz clocks became the de facto timekeeping devices for scientific or industrial use; standard U.S. time was kept by quartz clocks starting in the 1930s. But by the 1970s, the technology had gotten cheap enough for a mass market, with the emergence of the first quartz-based wristwatches. Today, just about every consumer appliance that has a clock on it—microwaves, alarm clocks, wristwatches, automobile clocks—runs on the equal time of quartz piezoelectricity. That transformation was predictable enough. Someone invents a better clock, and the first iterations are too expensive for consumer use. But eventually the price falls, and the new clock enters mainstream life. No surprise there. Once again, the surprise comes from somewhere else, from some other field that wouldn’t initially seem to be all that dependent on time. New ways of measuring create new possibilities for making. With quartz time, that new possibility was computation.
A microprocessor is an extraordinary technological achievement on many levels, but few are as essential as this: computer chips are masters of time discipline. Think of the coordination needs of the industrial factory: thousands of short, repetitive tasks performed in proper sequence by hundreds of individuals. A microprocessor requires the same kind of time discipline, only the units being coordinated are bits of information instead of the hands and bodies of millworkers. (When Charles Babbage designed the first programmable computer in the middle of the Victorian Age, he called the CPU “the mill” for a reason.) And instead of thousands of operations per minute, the microprocessor is executing billions of calculations per second, while shuffling information in and out of other microchips on the circuit board. Those operations are all coordinated by a master clock, now almost without exception made of quartz. (This is why tinkering with your computer to make it go faster than it was engineered to run is called “overclocking.”) A modern computer is the assemblage of many different technologies and modes of knowledge: the symbolic logic of programming languages, the electrical engineering of the circuit board, the visual language of interface design. But without the microsecond accuracy of a quartz clock, modern computers would be useless.
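To make that time discipline concrete, here is a minimal sketch in Python of how a digital device turns a quartz crystal’s steady vibrations into a usable timebase. The 32,768 Hz figure is an assumption borrowed from common wristwatch crystals, not a number from the text: the oscillator’s pulses are simply counted, and every 2^15 of them the counter rolls over into one second of “equal time.”

```python
# A minimal sketch (not from the text): turning a quartz oscillator's
# steady pulses into seconds. The 32,768 Hz rate is an assumption,
# borrowed from common wristwatch crystals.
CRYSTAL_HZ = 32_768  # 2**15 pulses per second

def seconds_from_pulses(pulse_count):
    """Count whole seconds of 'equal time' from raw crystal pulses."""
    return pulse_count // CRYSTAL_HZ

# One minute of uninterrupted oscillation yields exactly 60 ticks.
print(seconds_from_pulses(60 * CRYSTAL_HZ))  # 60
```

Real computers derive their gigahertz clocks from a quartz reference through far more elaborate circuitry, but the underlying idea is the same: count a steady oscillation and you have a clock.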
The accuracy of the quartz clock made its pendulum predecessors seem hopelessly erratic. But it had a similar effect on the ultimate timekeepers: the earth and the sun. Once we started measuring days with quartz clocks, we discovered that the length of the day was not as reliable as we had thought. Days shortened or lengthened in semi-chaotic ways thanks to the drag of the tides on the surface of the planet, wind blowing over mountain ranges, or the inner motion of the earth’s molten core. If we really wanted to keep exact time, we couldn’t rely on the earth’s rotation. We needed a better timepiece. Quartz let us “see” that the seemingly equal times of a solar day weren’t nearly as equal as we had assumed. It was, in a way, the deathblow to the pre-Copernican universe. Not only was the earth not the center of the universe, but its rotation wasn’t even consistent enough to define a day accurately. A block of vibrating sand could do the job much better.
—
KEEPING PROPER TIME IS ULTIMATELY all about finding—or making—things that oscillate in consistent rhythms: the sun rising in the sky, the moon waxing and waning, the altar lamp, the quartz crystal. The discovery of the atom in the early days of the twentieth century—led by scientists such as Niels Bohr and Werner Heisenberg—set in motion a series of spectacular and deadly innovations in energy and weaponry: nuclear power plants, hydrogen bombs. But the new science of the atom also revealed a less celebrated, but equally significant, discovery: the most consistent oscillator known to man. Studying the behavior of electrons orbiting within a cesium atom, Bohr noticed that they moved with an astonishing regularity. Untroubled by the chaotic drag of mountain ranges or tides, the electrons tapped out a rhythm that was several orders of magnitude more reliable than the earth’s rotation.
The first atomic clocks were built in the mid-1950s, and immediately set a new standard of accuracy: we were now capable of measuring nanoseconds, a thousand times more accurate than the microseconds of quartz. That leap forward was what ultimately enabled the General Conference on Weights and Measures in 1967 to declare that it was time to reinvent time. In the new era, the master time for the planet would be measured in atomic seconds: “the duration of 9,192,631,770 periods of the radiation corresponding to the transition between the two hyperfine levels of the ground state of the caesium 133 atom.” A day was no longer the time it took the earth to complete one rotation. A day became 86,400 atomic seconds, ticked off on 270 synchronized atomic clocks around the world.
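The arithmetic of the new definition is easy to make explicit. A small illustration in Python, using only the figures quoted above:

```python
# The 1967 definition, stated in numbers quoted in the text.
CESIUM_PERIODS_PER_SECOND = 9_192_631_770
SECONDS_PER_DAY = 86_400

# An atomic day, counted in cesium oscillations rather than earth rotations.
print(SECONDS_PER_DAY * CESIUM_PERIODS_PER_SECOND)  # 794,243,384,928,000 periods
```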
The old timekeepers didn’t die off completely, though. Modern atomic clocks actually tick off the seconds using a quartz mechanism, relying on the cesium atom and its electrons to correct any random aberrations in the quartz timekeeping. And the world’s atomic clocks are periodically adjusted to the chaotic drift of the earth’s rotation, adding or dropping a leap second so that the atomic and solar rhythms don’t get too far out of sync. The multiple scientific fields of time discipline—astronomy, electromechanics, subatomic physics—are all embedded within the master clock.
The rise of the nanosecond might seem like an arcane shift, interesting only to the sort of person who attends a conference on weights and measures. And yet everyday life has been radically transformed by the rise of atomic time. Global air travel, telephone networks, financial markets—all rely on the nanosecond accuracy of the atomic clock. (Rid the world of these modern clocks, and the much vilified practice of high-frequency trading would disappear in a nanosecond.) Every time you glance down at your smartphone to check your location, you are unwittingly consulting a network of twenty-four satellites carrying atomic clocks in medium earth orbit above you. Those satellites are sending out the most elemental of signals, again and again, in perpetuity: the time is 11:48:25.084738 … the time is 11:48:25.084739 … When your phone tries to figure out its location, it pulls down time stamps from at least three of these satellites (four, in practice, since the extra measurement cancels out the error in the phone’s own far less precise clock), each reporting a slightly different time thanks to the duration it takes the signal to travel from satellite to the GPS receiver in your hand. A satellite reporting a later time is closer than one reporting an earlier time. Since the satellites have perfectly predictable locations, the phone can calculate its exact position by triangulating among the different time stamps. Like the naval navigators of the eighteenth century, GPS determines your location by comparing clocks. This is in fact one of the recurring stories of the history of the clock: each new advance in timekeeping enables a corresponding advance in our mastery of geography—from ships, to railroads, to air traffic, to GPS. It’s an idea that Einstein would have appreciated: measuring time turns out to be key to measuring space.
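Since the passage describes the mechanics in words, here is a minimal sketch of the distance-from-time-stamps idea, a toy illustration rather than how any real GPS receiver works: it flattens the problem to two dimensions, uses made-up satellite coordinates, and assumes the receiver’s clock is already perfect (the correction the fourth satellite supplies in practice). Each delay becomes a distance at the speed of light, and the intersection of the resulting circles is the receiver’s position (strictly trilateration rather than triangulation). The function name and all the numbers below are hypothetical.

```python
import math

C = 299_792_458.0  # speed of light, meters per second

def locate_from_time_stamps(satellites, delays):
    """Toy 2-D trilateration: recover a receiver's position from three
    satellite positions and the travel times of their time-stamped signals.
    Assumes the receiver's own clock is perfect (real GPS uses a fourth
    satellite to solve for that error) and ignores relativity entirely."""
    # A later reported time means a shorter delay, hence a closer satellite.
    d1, d2, d3 = (C * t for t in delays)
    (x1, y1), (x2, y2), (x3, y3) = satellites

    # Subtracting the first range equation from the other two leaves a
    # pair of linear equations in the unknown position (x, y).
    a1, b1 = 2 * (x2 - x1), 2 * (y2 - y1)
    c1 = d1**2 - d2**2 + x2**2 - x1**2 + y2**2 - y1**2
    a2, b2 = 2 * (x3 - x1), 2 * (y3 - y1)
    c2 = d1**2 - d3**2 + x3**2 - x1**2 + y3**2 - y1**2

    # Solve the 2x2 system with Cramer's rule.
    det = a1 * b2 - a2 * b1
    return (c1 * b2 - c2 * b1) / det, (a1 * c2 - a2 * c1) / det

# Hypothetical numbers: a receiver at (3000 m, 4000 m) and three satellites.
receiver = (3000.0, 4000.0)
satellites = [(0.0, 20_000.0), (15_000.0, 18_000.0), (-12_000.0, 22_000.0)]
delays = [math.dist(sat, receiver) / C for sat in satellites]
print(locate_from_time_stamps(satellites, delays))  # ~ (3000.0, 4000.0)
```

The point of the toy is only the clock comparison at its heart: the position falls out of nothing more than differences in reported time.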
Professor Charles H. Townes, executive of the physics department at Columbia University, is shown with the “atomic clock” in the university’s physics department. Date released: January 25, 1955.
The next time you glance down at your phone to check what time it is or where you are, the way you might have glanced at a watch or a map just two decades ago, think about the immense, layered network of human ingenuity that has been put in place to make that gesture possible. Embedded in your ability to tell the time is the understanding of how electrons circulate within cesium atoms; the knowledge of how to send microwave signals from satellites and how to measure the exact speed with which they travel; the ability to position satellites in reliable orbits above the earth, and of course the actual rocket science needed to get them off the ground; the ability to trigger steady vibrations in a block of silicon dioxide—not to mention all the advances in computation and microelectronics and network science necessary to process and represent that information on your phone. You don’t need to know any of these things to tell the time now, but that’s the way progress works: the more we build up these vast repositories of scientific and technological understanding, the more we conceal them. Your mind is silently assisted by all that knowledge each time you check your phone to see what time it is, but the knowledge itself is hidden from view. That is a great convenience, of course, but it can obscure just how far we’ve come since Galileo’s altar-lamp daydreams in the Duomo of Pisa.
—
AT FIRST GLANCE, THE STORY of time’s measurement would seem to be all about acceleration, dividing up the day into smaller and smaller increments so that we can move things faster: bodies, dollars, bits. But time in the atomic age has also moved in the exact opposite direction: slowing things down, not speeding them up; measuring in eons, not microseconds. In the 1890s, while working on her doctoral thesis in Paris, Marie Curie proposed for the first time that radiation was not some kind of chemical reaction between molecules, but something intrinsic to the atom—a discovery so critical to the development of physics, in fact, that she would become the first woman ever to win a Nobel Prize. Her research quickly drew the attention of her husband, Pierre Curie, who abandoned his own research into crystals to focus on radiation. Together they discovered that radioactive elements decayed at constant rates. The half-life of carbon 14, for instance, is 5,730 years. Leave some carbon 14 lying around for five thousand years or so, and you’ll find that half of it is gone.
Once again, science had discovered a new source of “equal time”—only this clock wasn’t ticking out the microseconds of quartz oscillations, or the nanoseconds of cesium electrons. Radiocarbon decay was ticking on the scale of centuries or millennia. Pierre Curie had surmised that the decay rate of certain elements might be used as a “clock” to determine the age of rocks. But the technique, now popularly known as carbon dating, wasn’t perfected until the late 1940s. Most clocks focus on measuring the present: What time is it right now? But radiocarbon clocks are all about the past. Different elements decay at wildly different rates, which means that they are like clocks running at different time scales. Carbon 14 “ticks” every five thousand years, but potassium 40 “ticks” every 1.3 billion years. That makes radiocarbon dating an ideal clock for the deep time of human history, while potassium 40 measures geologic time, the history of the planet itself. Radiometric dating has been critical in determining the age of the earth itself, establishing the most convincing scientific evidence that the biblical story of the earth being six thousand years old is just that: a story, not fact. We have immense knowledge about the prehistoric migrations of humans across the globe in large part thanks to carbon dating. In a sense, the “equal time” of radioactive decay has turned prehistoric time into history. When Homo sapiens first crossed the Bering Land Bridge into the Americas more than ten thousand years ago, there were no historians capable of writing down a narrative account of their journey. Yet their story was nonetheless captured by the carbon in their bones and the charcoal deposits they left behind at campsites. It was a story written in the language of atomic physics. But we couldn’t read that story without a new kind of clock. Without radiometric dating, “the deep time” of human migrations or geologic change would be like a history book where all the pages have been randomly shuffled: teeming with facts but lacking chronology and causation. Knowing what time it was turned that raw data into meaning.
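The “ticking” here is just exponential decay, and the arithmetic of reading such a clock fits in a few lines. A minimal sketch in Python, using only the half-lives quoted above; the 25 percent charcoal sample is a made-up example, not a figure from the text:

```python
import math

C14_HALF_LIFE = 5_730      # years, carbon 14 (as given in the text)
K40_HALF_LIFE = 1.3e9      # years, potassium 40 (as given in the text)

def remaining_fraction(years, half_life):
    """Fraction of the original isotope left after a given span of decay."""
    return 0.5 ** (years / half_life)

def age_from_fraction(fraction, half_life):
    """Read the clock backward: how long the sample has been decaying."""
    return half_life * math.log2(1.0 / fraction)

# Leave carbon 14 lying around for one half-life and half of it is gone.
print(remaining_fraction(5_730, C14_HALF_LIFE))   # 0.5

# A hypothetical charcoal sample retaining 25% of its carbon 14 has been
# "ticking" for about two half-lives, roughly 11,460 years.
print(age_from_fraction(0.25, C14_HALF_LIFE))     # ~11460.0

# Swap in potassium 40 and the same clock reads geologic time instead.
print(age_from_fraction(0.5, K40_HALF_LIFE))      # 1.3 billion years
```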
—
HIGH IN THE SOUTHERN SNAKE RANGE in eastern Nevada, a grove of bristlecone pines grows in the dry, alkaline soil. The pines are small trees for conifers, rarely more than thirty feet high, gnarled by the constant winds rolling across the high desert. We know from carbon dating (and tree rings) that some of them are more than five thousand years old—the oldest living things on the planet.
At some point, several years from now, a clock will be buried in the soil beneath those pines, a clock designed to measure time on the scale of civilizations, not seconds. It will be—as its primary designer, the computer scientist Danny Hillis, puts it—“a clock that ticks once a year. The century hand advances once every 100 years, and the cuckoo comes out on the millennium.” It is being engineered to keep time for at least ten thousand years, roughly the length of human civilization to date. It is an exercise in a different kind of time discipline: the discipline of avoiding short-term thinking, of forcing yourself to think about our actions and their consequences on the scale of centuries and millennia. Borrowing a wonderful phrase from the musician and artist Brian Eno, the device is called “the Clock of the Long Now.”
The Clock of the Long Now
The organization behind this device, the Long Now Foundation—cofounded by Hillis, Eno, Stewart Brand, and a few other visionaries—aims to build a number of ten-thousand-year clocks. (The first one is being constructed for a mountainside location in West Texas.) Why go to such extravagant lengths to build a clock that might tick only once in your lifetime? Because new modes of measuring force us to think about the world in a new light. Just as the microseconds of quartz and cesium opened up new ideas that transformed everyday life in countless ways, the slow time of the Long Now clock helps us think in new ways about the future. As Long Now board member Kevin Kelly puts it:
If you have a Clock ticking for 10,000 years what kinds of generational-scale questions and projects will it suggest? If a Clock can keep going for ten millennia, shouldn’t we make sure our civilization does as well? If the Clock keeps going after we are personally long dead, why not attempt other projects that require future generations to finish? The larger question is, as virologist Jonas Salk once asked, “Are we being good ancestors?”
This is the strange paradox of time in the atomic age: we live in ever shorter increments, guided by clocks that tick invisibly with immaculate precision; we have short attention spans and have surrendered our natural rhythms to the abstract grid of clock time. And yet simultaneously, we have the capacity to imagine and record histories that are thousands or millions of years old, to trace chains of cause and effect that span dozens of generations. We can wonder what time it is and glance down at our phone and get an answer that is accurate to the split-second, but we can also appreciate that the answer was, in a sense, five hundred years in the making: from Galileo’s altar lamp to Niels Bohr’s cesium, from the chronometer to Sputnik. Compared to an ordinary human being from Galileo’s age, our time horizons have expanded in both directions: from the microsecond to the millennium.
Which measure of time will win out in the end: our narrow focus on the short term, or our gift for the long now? Will we be high-frequency traders or good ancestors? For that question, only time will tell.
6. Light
Imagine some alien civilization viewing Earth from across the galaxies, looking for signs of intelligent life. For millions of years, there would be almost nothing to report: the daily flux of weather moving across the planet, the creep of glaciers spreading and retreating every hundred thousand years or so, the incremental drift of continents. But starting about a century ago, a momentous change would suddenly be visible: at night, the planet’s surface would glow with the streetlights of cities, first in the United States and Europe, then spreading steadily across the globe, growing in intensity. Viewed from space, the emergence of artificial lighting would arguably have been the single most significant change in the planet’s history since the Chicxulub asteroid collided with Earth sixty-five million years ago, coating the planet in a cloud of superheated ash and dust. From space, all the transformations that marked the rise of human civilization would be an afterthought: opposable thumbs, written language, the printing press—all these would pale beside the brilliance of Homo lumens.
Viewed from the surface of the earth, of course, the invention of artificial light had more rivals in terms of visible innovations, but its arrival marked a threshold point in human society. Today’s night sky shines six thousand times brighter than it did just 150 years ago. Artificial light has transformed the way we work and sleep, helped create global networks of communication, and may soon enable radical breakthroughs in energy production. The lightbulb is so bound up in the popular sense of innovation that it has become a metaphor for new ideas themselves: the “lightbulb” moment has replaced Archimedes’s eureka as the expression most likely to be invoked to celebrate a sudden conceptual leap.
One of the odd things about artificial light is how stagnant it was as a technology for centuries. This is particularly striking given that artificial light arrived via the very first technology, when humans first mastered controlled fire more than a hundred thousand years ago. The Babylonians and Romans developed oil-based lamps, but that technology virtually disappeared during the (appropriately named) Dark Ages. For almost two thousand years, all the way to the dawn of the industrial age, the candle was the reigning solution for indoor lighting. Candles made from beeswax were highly prized but too expensive for anyone but the clergy or the aristocracy. Most people made do with tallow candles, which burned animal fat to produce a tolerable flicker, accompanied by a foul odor and thick smoke.
As our nursery rhymes remind us, candle-making was a popular vocation during this period. Parisian tax rolls from 1292 listed seventy-two “chandlers,” as they were called, doing business in the city. But most ordinary households made their own tallow candles, an arduous process that could go on for days: heating up containers of animal fat, and dipping wicks into them. In a diary entry from 1743, the president of Harvard noted that he had produced seventy-eight pounds of tallow candles in two days of work, a quantity that he managed to burn through two months later.