by Bill Bryson
Josiah Willard Gibbs, the brilliant but retiring New England academic whose masterwork on chemical reactions has been called “the Principia of thermodynamics.” (credit 8.1)
Undaunted—well, perhaps mildly daunted—Planck turned to other matters. We shall turn to these ourselves in a moment, but first we must make a slight (but relevant!) detour to Cleveland, Ohio, and an institution then known as the Case School of Applied Science. There, in the 1880s, a physicist of early middle years named Albert Michelson, assisted by his friend the chemist Edward Morley, embarked on a series of experiments that produced curious and disturbing results that would have great ramifications for much of what followed.
What Michelson and Morley did, without actually intending to, was undermine a longstanding belief in something called the luminiferous ether, a stable, invisible, weightless, frictionless and unfortunately wholly imaginary medium that was thought to permeate the universe. Conceived by Descartes, embraced by Newton, and venerated by nearly everyone ever since, the ether held a position of absolute centrality in nineteenth-century physics as a way of explaining how light travelled across the emptiness of space. It was especially needed in the 1800s because light and electromagnetism were now seen as waves, which is to say types of vibrations. Vibrations must occur in something; hence the need for, and lasting devotion to, an ether. As late as 1909, the great British physicist J. J. Thomson was insisting: “The ether is not a fantastic creation of the speculative philosopher; it is as essential to us as the air we breathe”—this more than four years after it was pretty incontestably established that it didn’t exist. People, in short, were really attached to the ether.
The dour and often luckless Max Planck, whose early work on thermodynamics ended disappointingly in 1891 when he found that his discoveries had already been made. He would have greater success later with quantum theory. (credit 8.2)
If you needed to illustrate the idea of nineteenth-century America as a land of opportunity, you could hardly improve on the life of Albert Michelson. Born in 1852 on the German-Polish border to a family of poor Jewish merchants, he came to the United States with his family as an infant and grew up in a mining camp in California’s gold rush country where his father ran a dry goods business. Too poor to pay for college, he travelled to Washington, DC, and took to loitering by the front door of the White House so that he could fall in beside Ulysses S. Grant when the President emerged for his daily constitutional. (It was clearly a more innocent age.) In the course of these walks, Michelson so ingratiated himself with the President that Grant agreed to secure for him a free place at the US Naval Academy. It was there that Michelson learned his physics.
Left: Edward Morley (credit 8.3) Right: Albert Michelson (credit 8.4) Their careful measurements of the speed of light in 1887 disproved the existence of the “luminiferous ether,” an invisible medium that was thought to permeate the universe. The belief in an ether had been central to physics since the time of Newton.
Ten years later, by now a professor at the Case School in Cleveland, Michelson became interested in trying to measure something called the ether drift—a kind of headwind produced by moving objects as they ploughed through space. One of the predictions of Newtonian physics was that the speed of light as it pushed through the ether should vary with respect to an observer depending on whether the observer was moving towards the source of light or away from it, but no-one had figured out a way to measure this. It occurred to Michelson that if you took careful measurements, with a very precise instrument, at opposite seasons, and compared light’s travel time between the two, you would have your answer.
Michelson talked Alexander Graham Bell, newly enriched inventor of the telephone, into providing the funds to build an ingenious and sensitive instrument of Michelson’s own devising called an interferometer, which could measure the velocity of light with great precision. Then, assisted by the genial but shadowy Morley, Michelson embarked on years of fastidious measurements. The work was delicate and exhausting, and had to be suspended for a time to permit Michelson a brief but comprehensive nervous breakdown, but by 1887 they had their results. They were not at all what the two scientists had expected to find.
As Caltech astrophysicist Kip S. Thorne has written: “The speed of light turned out to be the same in all directions and at all seasons.” It was the first hint in two hundred years—in exactly two hundred years, in fact—that Newton’s laws might not apply all the time everywhere. The Michelson-Morley outcome became, in the words of William H. Cropper, “probably the most famous negative result in the history of physics.” Michelson was awarded a Nobel Prize in physics for the work—the first American so honoured—but not for twenty years. Meanwhile, the Michelson-Morley experiments would hover unpleasantly, like a musty odour, in the background of scientific thought.
Remarkably, and despite his findings, when the twentieth century dawned Michelson counted himself among those who believed that the work of science was nearly at an end, with “only a few turrets and pinnacles to be added, a few roof bosses to be carved,” in the words of a writer in Nature.
In fact, of course, the world was about to enter a century of science where many people wouldn’t understand anything and none would understand everything. Scientists would soon find themselves adrift in a bewildering realm of particles and antiparticles, where things pop in and out of existence in spans of time that make nanoseconds look plodding and uneventful, where everything is strange. Science was moving from a world of macrophysics, where objects could be seen and held and measured, to one of microphysics, where events transpire with inconceivable swiftness on scales of magnitude far below the limits of imagining. We were about to enter the quantum age, and the first person to push on the door was the so-far unfortunate Max Planck.
In 1900, now a theoretical physicist at the University of Berlin, and at the somewhat advanced age of forty-two, Planck unveiled a new “quantum theory,” which posited that energy is not a continuous thing like flowing water but comes in individualized packets, which he called quanta. This was a novel concept, and a good one. In the short term it would help to provide a solution to the puzzle of the Michelson-Morley experiments in that it demonstrated that light needn’t be a wave after all. In the longer term it would lay the foundation for the whole of modern physics. It was, at all events, the first clue that the world was about to change.
But the landmark event—the dawn of a new age—came in 1905 when there appeared in the German physics journal Annalen der Physik a series of papers by a young Swiss bureaucrat who had no university affiliation, no access to a laboratory and the regular use of no library greater than that of the national patent office in Bern, where he was employed as a technical examiner third class. (An application to be promoted to technical examiner second class had recently been rejected.)
His name was Albert Einstein, and in that one eventful year he submitted to Annalen der Physik five papers, of which three, according to C. P. Snow, “were among the greatest in the history of physics”—one examining the photoelectric effect by means of Planck’s new quantum theory, one on the behaviour of small particles in suspension (what is known as Brownian motion), and one outlining a Special Theory of Relativity.
The first won its author a Nobel Prize and explained the nature of light (and also helped to make television possible, among other things). The second provided proof that atoms do indeed exist—a fact that had, surprisingly, been in some dispute. The third merely changed the world.
Einstein was born in Ulm, in southern Germany, in 1879, but grew up in Munich. Little in his early life suggested the greatness to come. Famously, he didn’t learn to speak until he was three. In the 1890s, his father’s electrical business failing, the family moved to Milan, but Albert, by now a teenager, went to Switzerland to continue his education—though he failed his college entrance exams on the first try. In 1896 he gave up his German citizenship to avoid military conscription and entered the Zurich Polytechnic Institute on a four-year course designed to churn out high-school science teachers. He was a bright but not outstanding student.
In 1900 he graduated and within a few months was beginning to contribute papers to Annalen der Physik. His very first paper, on the physics of fluids in drinking straws (of all things), appeared in the same issue as Planck’s quantum theory. From 1902 to 1904 he produced a series of papers on statistical mechanics, only to discover that the quietly productive J. Willard Gibbs in Connecticut had done that work as well, in his Elementary Principles of Statistical Mechanics of 1901.
Albert had fallen in love with a fellow student, a Hungarian named Mileva Maric. In 1901 they had a child out of wedlock, a daughter, who was discreetly put up for adoption. Einstein never saw his child. Two years later, he and Maric were married. In between these events, in 1902, Einstein took a job with the Swiss patent office, where he stayed for the next seven years. He enjoyed the work: it was challenging enough to engage his mind, but not so challenging as to distract him from his physics. This was the background against which he produced the Special Theory of Relativity in 1905.
Albert Einstein’s famous equation E = mc², written in his own hand in a manuscript of 1912, seven years after he first devised it. The manuscript came up for auction at Sotheby’s in New York in 1996, but was withdrawn when the bidding stalled at $3.3 million. (The owner reportedly had hoped to get over $4 million.) It was later sold privately to a museum in Israel. (credit 8.5)
“On the Electrodynamics of Moving Bodies” is one of the most extraordinary scientific papers ever published, as much for how it was presented as for what it said. It had no footnotes or citations, contained almost no mathematics, made no mention of any work that had influenced or preceded it, and acknowledged the help of just one individual, a colleague at the patent office named Michele Besso. It was, wrote C. P. Snow, as if Einstein “had reached the conclusions by pure thought, unaided, without listening to the opinions of others. To a surprisingly large extent, that is precisely what he had done.”
His famous equation, E = mc², did not appear with the paper, but came in a brief supplement that followed a few months later. As you will recall from schooldays, E in the equation stands for energy, m for mass and c² for the speed of light squared.
In simplest terms, what the equation says is that mass and energy have an equivalence. They are two forms of the same thing: energy is liberated matter; matter is energy waiting to happen. Since c² (the speed of light times itself) is a truly enormous number, what the equation is saying is that there is a huge amount—a really huge amount—of energy bound up in every material thing.
You may not feel outstandingly robust, but if you are an average-sized adult you will contain within your modest frame no less than 7 × 10¹⁸ joules of potential energy—enough to explode with the force of thirty very large hydrogen bombs, assuming you knew how to liberate it and really wished to make a point. Everything has this kind of energy trapped within it. We’re just not very good at getting it out. Even a uranium bomb—the most energetic thing we have produced yet—releases less than 1 per cent of the energy it could release if only we were more cunning.
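For readers inclined to check the arithmetic, the figure follows straight from E = mc². A minimal sketch in Python; the 78-kilogram body mass is my illustrative assumption, since the text says only “an average-sized adult”:

```python
# Rest-mass energy of an average adult, via E = mc^2.
# The 78 kg mass is an illustrative assumption, not a figure from the text.
c = 299_792_458          # speed of light in a vacuum, m/s
m = 78                   # assumed mass of an average-sized adult, kg
E = m * c ** 2           # energy equivalent, in joules
print(f"{E:.1e} J")      # on the order of 7 x 10^18 joules
```

With any plausible adult mass the answer lands in the neighbourhood of 7 × 10¹⁸ joules, which is the number quoted above.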
Among much else, Einstein’s theory explained how radiation worked: how a lump of uranium could throw out constant streams of high-level energy without melting away like an ice cube. (It could do it by converting mass to energy extremely efficiently à la E = mc².) It explained how stars could burn for billions of years without racing through their fuel. (Ditto.) At a stroke, in a simple formula, Einstein endowed geologists and astronomers with the luxury of billions of years. Above all, the special theory showed that the speed of light was constant and supreme. Nothing could overtake it. It brought light (no pun intended exactly) to the very heart of our understanding of the nature of the universe. Not incidentally, it also solved the problem of the luminiferous ether by making it clear that it didn’t exist. Einstein gave us a universe that didn’t need it.
Einstein and his wife, Mileva, photographed in Bern during his annus mirabilis of 1905. In that year, Einstein published three papers that revolutionized physics. In the same year he was turned down for two teaching jobs. (credit 8.6)
Physicists as a rule are not over-attentive to the pronouncements of Swiss patent-office clerks and so, despite the abundance of useful tidings they offered, Einstein’s papers attracted little notice. Having just solved several of the deepest mysteries of the universe, Einstein applied for a job as a university lecturer and was rejected, and then for one as a high-school teacher and was rejected there as well. So he went back to his job as an examiner third class—but of course he kept thinking. He hadn’t even come close to finishing yet.
When the poet Paul Valéry once asked Einstein if he kept a notebook to record his ideas, Einstein looked at him with mild but genuine surprise. “Oh, that’s not necessary,” he replied. “It’s so seldom I have one.” I need hardly point out that when he did get one it tended to be good. Einstein’s next idea was one of the greatest that anyone has ever had—indeed, the very greatest, according to Boorse, Motz and Weaver in their thoughtful history of atomic science. “As the creation of a single mind,” they write, “it is undoubtedly the highest intellectual achievement of humanity,” which is of course as good as a compliment can get.
In 1907, or so it has sometimes been written, Albert Einstein saw a workman fall off a roof and began to think about gravity. Alas, like many good stories this one appears to be apocryphal. According to Einstein himself, he was simply sitting in a chair when the problem of gravity occurred to him.
Actually, what occurred to Einstein was something more like the beginning of a solution to the problem of gravity, since it had been evident to him from the outset that one thing missing from the special theory was gravity. What was “special” about the special theory was that it dealt with things moving in an essentially unimpeded state. But what happened when a thing in motion—light, above all—encountered an obstacle such as gravity? It was a question that would occupy his thoughts for most of the next decade and led to the publication in early 1917 of a paper entitled “Cosmological Considerations on the General Theory of Relativity.” The Special Theory of Relativity of 1905 was a profound and important piece of work, of course; but, as C. P. Snow once observed, if Einstein hadn’t thought of it when he did someone else would have, probably within five years; it was an idea waiting to happen. But the General Theory was something else altogether. “Without it,” wrote Snow in 1979, “it is likely that we should still be waiting for the theory today.”
With his pipe, genially self-effacing manner and electrified hair, Einstein was too splendid a figure to remain permanently obscure and in 1919, the war over, the world suddenly discovered him. Almost at once his theories of relativity developed a reputation for being impossible for an ordinary person to grasp. Matters were not helped, as David Bodanis points out in his superb book E = mc², when the New York Times decided to do a story, and—for reasons that can never fail to excite wonder—sent the paper’s golfing correspondent, one Henry Crouch, to conduct the interview.
Crouch was hopelessly out of his depth, and got nearly everything wrong. Among the more lasting errors in his report was the assertion that Einstein had found a publisher daring enough to publish a book that only twelve men “in all the world could comprehend.” There was no such book, no such publisher, no such circle of learned men, but the notion stuck anyway. Soon the number of people who could grasp relativity had been reduced even further in the popular imagination—and the scientific establishment, it must be said, did little to disturb the myth.
When a journalist asked the British astronomer Sir Arthur Eddington if it was true that he was one of only three people in the world who could understand Einstein’s relativity theories, Eddington considered deeply for a moment and replied: “I am trying to think who the third person is.” In fact, the problem with relativity wasn’t that it involved a lot of differential equations, Lorentz transformations and other complicated mathematics (though it did—even Einstein needed help with some of it), but that it was just so thoroughly non-intuitive.
In essence what relativity says is that space and time are not absolute, but relative both to the observer and to the thing being observed, and the faster one moves the more pronounced these effects become. We can never accelerate ourselves to the speed of light, and the harder we try (and the faster we go) the more distorted we will become, relative to an outside observer.
Almost at once, popularizers of science tried to come up with ways to make these concepts accessible to a general audience. One of the more successful attempts—commercially at least—was The ABC of Relativity by the mathematician and philosopher Bertrand Russell. In it, Russell employed an image that has been used many times since. He asked the reader to envision a train 100 yards long moving at 60 per cent of the speed of light. To someone standing on a platform watching it pass, the train would appear to be only 80 yards long and everything on it would be similarly compressed. If we could hear the passengers on the train speak, their voices would sound slurred and sluggish, like a record played at too slow a speed, and their movements would appear similarly ponderous. Even the clocks on the train would seem to be running at only four-fifths of their normal speed.
However—and here’s the thing—people on the train would have no sense of these distortions. To them, everything on the train would seem quite normal. It would be us on the platform who looked weirdly compressed and slowed down. It is all to do, you see, with your position relative to the moving object.
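Russell’s figures are not arbitrary: they all fall out of a single quantity, the Lorentz factor γ = 1/√(1 − v²/c²). A minimal sketch in Python (the function name is my own):

```python
import math

def lorentz_gamma(beta):
    """Lorentz factor for a speed given as a fraction of the speed of light."""
    return 1 / math.sqrt(1 - beta ** 2)

gamma = lorentz_gamma(0.6)   # Russell's train, at 60 per cent of light speed
print(100 / gamma)           # contracted length of the 100-yard train, ~80 yards
print(1 / gamma)             # apparent clock rate, ~0.8, i.e. four-fifths speed
```

At 60 per cent of light speed the factor works out to 1.25, which is where both the 80-yard train and the four-fifths clock rate in Russell’s illustration come from.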
This effect actually happens every time you move. Fly across the United States and you will step from the plane a quinzillionth of a second, or something, younger than those you left behind. Even in walking across the room you will very slightly alter your own experience of time and space. It has been calculated that a baseball thrown at 160 kilometres an hour will pick up 0.000000000002 grams of mass on its way to home plate. So the effects of relativity are real and have been measured. The problem is that such changes are much too small to make the tiniest detectable difference to us. But for other things in the universe—light, gravity, the universe itself—these are matters of consequence.
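The baseball figure can be checked the same way, if one assumes a regulation ball of roughly 145 grams (the mass is my assumption; the text gives only the speed):

```python
import math

# Relativistic mass increase of a pitched baseball.
# The 0.145 kg ball mass is an assumption; the text quotes only 160 km/h.
c = 299_792_458                   # speed of light, m/s
v = 160 / 3.6                     # 160 km/h converted to m/s (~44.4 m/s)
m = 0.145                         # assumed baseball mass, kg
gamma = 1 / math.sqrt(1 - (v / c) ** 2)
gain_g = m * (gamma - 1) * 1000   # mass increase in grams
print(f"{gain_g:.1e} g")          # roughly 2e-12 g, i.e. 0.000000000002 grams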