A Short History of Nearly Everything
Of course the sagging mattress analogy can take us only so far because it doesn't incorporate the effect of time. But then our brains can take us only so far because it is so nearly impossible to envision a dimension comprising three parts space to one part time, all interwoven like the threads in a plaid fabric. At all events, I think we can agree that this was an awfully big thought for a young man staring out the window of a patent office in the capital of Switzerland.
Among much else, Einstein's general theory of relativity suggested that the universe must be either expanding or contracting. But Einstein was not a cosmologist, and he accepted the prevailing wisdom that the universe was fixed and eternal. More or less reflexively, he dropped into his equations something called the cosmological constant, which arbitrarily counterbalanced the effects of gravity, serving as a kind of mathematical pause button. Books on the history of science always forgive Einstein this lapse, but it was actually a fairly appalling piece of science and he knew it. He called it "the biggest blunder of my life."
Coincidentally, at about the time that Einstein was affixing a cosmological constant to his theory, at the Lowell Observatory in Arizona, an astronomer with the cheerily intergalactic name of Vesto Slipher (who was in fact from Indiana) was taking spectrographic readings of distant stars and discovering that they appeared to be moving away from us. The universe wasn't static. The stars Slipher looked at showed unmistakable signs of a Doppler shift--the same mechanism behind that distinctive stretched-out yee-yummm sound cars make as they flash past on a racetrack. The phenomenon also applies to light, and in the case of receding galaxies it is known as a red shift (because light moving away from us shifts toward the red end of the spectrum; approaching light shifts to blue).
Slipher was the first to notice this effect with light and to realize its potential importance for understanding the motions of the cosmos. Unfortunately no one much noticed him. The Lowell Observatory, as you will recall, was a bit of an oddity thanks to Percival Lowell's obsession with Martian canals, which in the 1910s made it, in every sense, an outpost of astronomical endeavor. Slipher was unaware of Einstein's theory of relativity, and the world was equally unaware of Slipher. So his finding had no impact.
Glory instead would pass to a large mass of ego named Edwin Hubble. Hubble was born in 1889, ten years after Einstein, in a small Missouri town on the edge of the Ozarks and grew up there and in Wheaton, Illinois, a suburb of Chicago. His father was a successful insurance executive, so life was always comfortable, and Edwin enjoyed a wealth of physical endowments, too. He was a strong and gifted athlete, charming, smart, and immensely good-looking--"handsome almost to a fault," in the description of William H. Cropper, "an Adonis" in the words of another admirer. According to his own accounts, he also managed to fit into his life more or less constant acts of valor--rescuing drowning swimmers, leading frightened men to safety across the battlefields of France, embarrassing world-champion boxers with knockdown punches in exhibition bouts. It all seemed too good to be true. It was. For all his gifts, Hubble was also an inveterate liar.
This was more than a little odd, for Hubble's life was filled from an early age with a level of distinction that was at times almost ludicrously golden. At a single high school track meet in 1906, he won the pole vault, shot put, discus, hammer throw, standing high jump, and running high jump, and was on the winning mile-relay team--that is seven first places in one meet--and came in third in the broad jump. In the same year, he set a state record for the high jump in Illinois.
As a scholar he was equally proficient, and had no trouble gaining admission to study physics and astronomy at the University of Chicago (where, coincidentally, the head of the department was now Albert Michelson). There he was selected to be one of the first Rhodes scholars at Oxford. Three years of English life evidently turned his head, for he returned to Wheaton in 1913 wearing an Inverness cape, smoking a pipe, and talking with a peculiarly orotund accent--not quite British but not quite not--that would remain with him for life. Though he later claimed to have passed most of the second decade of the century practicing law in Kentucky, in fact he worked as a high school teacher and basketball coach in New Albany, Indiana, before belatedly attaining his doctorate and passing briefly through the Army. (He arrived in France one month before the Armistice and almost certainly never heard a shot fired in anger.)
In 1919, now aged thirty, he moved to California and took up a position at the Mount Wilson Observatory near Los Angeles. Swiftly, and more than a little unexpectedly, he became the most outstanding astronomer of the twentieth century.
It is worth pausing for a moment to consider just how little was known of the cosmos at this time. Astronomers today believe there are perhaps 140 billion galaxies in the visible universe. That's a huge number, much bigger than merely saying it would lead you to suppose. If galaxies were frozen peas, it would be enough to fill a large auditorium--the old Boston Garden, say, or the Royal Albert Hall. (An astrophysicist named Bruce Gregory has actually computed this.) In 1919, when Hubble first put his head to the eyepiece, the number of these galaxies that were known to us was exactly one: the Milky Way. Everything else was thought to be either part of the Milky Way itself or one of many distant, peripheral puffs of gas. Hubble quickly demonstrated how wrong that belief was.
Over the next decade, Hubble tackled two of the most fundamental questions of the universe: how old is it, and how big? To answer both it is necessary to know two things--how far away certain galaxies are and how fast they are flying away from us (what is known as their recessional velocity). The red shift gives the speed at which galaxies are retiring, but doesn't tell us how far away they are to begin with. For that you need what are known as "standard candles"--stars whose brightness can be reliably calculated and used as benchmarks to measure the brightness (and hence relative distance) of other stars.
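For readers who like to see the arithmetic, the way a red shift becomes a speed can be sketched in a few lines of Python. This is a modern, simplified illustration (the low-speed Doppler approximation), not a reconstruction of Slipher's or Hubble's actual working, and the wavelengths below are hypothetical numbers chosen purely for the example.

```python
# Low-speed Doppler approximation: v ~ c * z, where z is the
# fractional stretch in wavelength of a known spectral line.
C_KM_S = 299_792.458  # speed of light in km/s

def recession_speed(observed_nm, emitted_nm):
    """Return the implied recessional velocity in km/s."""
    z = (observed_nm - emitted_nm) / emitted_nm  # the red shift
    return C_KM_S * z

# Hypothetical example: a line emitted at 656.3 nm but observed
# at 660.0 nm implies a recession of roughly 1,700 km/s.
print(round(recession_speed(660.0, 656.3)))
```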
Hubble's luck was to come along soon after an ingenious woman named Henrietta Swan Leavitt had figured out a way to do so. Leavitt worked at the Harvard College Observatory as a computer, as they were known. Computers spent their lives studying photographic plates of stars and making computations--hence the name. It was little more than drudgery by another name, but it was as close as women could get to real astronomy at Harvard--or indeed pretty much anywhere--in those days. The system, however unfair, did have certain unexpected benefits: it meant that half the finest minds available were directed to work that would otherwise have attracted little reflective attention, and it ensured that women ended up with an appreciation of the fine structure of the cosmos that often eluded their male counterparts.
One Harvard computer, Annie Jump Cannon, used her repetitive acquaintance with the stars to devise a system of stellar classifications so practical that it is still in use today. Leavitt's contribution was even more profound. She noticed that a type of star known as a Cepheid variable (after the constellation Cepheus, where it first was identified) pulsated with a regular rhythm--a kind of stellar heartbeat. Cepheids are quite rare, but at least one of them is well known to most of us. Polaris, the Pole Star, is a Cepheid.
We now know that Cepheids throb as they do because they are elderly stars that have moved past their "main sequence phase," in the parlance of astronomers, and become red giants. The chemistry of red giants is a little weighty for our purposes here (it requires an appreciation for the properties of singly ionized helium atoms, among quite a lot else), but put simply it means that they burn their remaining fuel in a way that produces a very rhythmic, very reliable brightening and dimming. Leavitt's genius was to realize that by comparing the relative magnitudes of Cepheids at different points in the sky you could work out where they were in relation to each other. They could be used as "standard candles"--a term she coined and still in universal use. The method provided only relative distances, not absolute distances, but even so it was the first time that anyone had come up with a usable way to measure the large-scale universe.
(Just to put these insights into perspective, it is perhaps worth noting that at the time Leavitt and Cannon were inferring fundamental properties of the cosmos from dim smudges on photographic plates, the Harvard astronomer William H. Pickering, who could of course peer into a first-class telescope as often as he wanted, was developing his seminal theory that dark patches on the Moon were caused by swarms of seasonally migrating insects.)
Combining Leavitt's cosmic yardstick with Vesto Slipher's handy red shifts, Edwin Hubble now began to measure selected points in space with a fresh eye. In 1923 he showed that a puff of distant gossamer in the Andromeda constellation known as M31 wasn't a gas cloud at all but a blaze of stars, a galaxy in its own right, a hundred thousand light-years across and at least nine hundred thousand light-years away. The universe was vaster--vastly vaster--than anyone had ever supposed. In 1924 he produced a landmark paper, "Cepheids in Spiral Nebulae" (nebulae, from the Latin for "clouds," was his word for galaxies), showing that the universe consisted not just of the Milky Way but of lots of independent galaxies--"island universes"--many of them bigger than the Milky Way and much more distant.
This finding alone would have ensured Hubble's reputation, but he now turned to the question of working out just how much vaster the universe was, and made an even more striking discovery. Hubble began to measure the spectra of distant galaxies--the business that Slipher had begun in Arizona. Using Mount Wilson's new hundred-inch Hooker telescope and some clever inferences, he worked out that all the galaxies in the sky (except for our own local cluster) are moving away from us. Moreover, their speed and distance were neatly proportional: the further away the galaxy, the faster it was moving.
This was truly startling. The universe was expanding, swiftly and evenly in all directions. It didn't take a huge amount of imagination to read backwards from this and realize that it must therefore have started from some central point. Far from being the stable, fixed, eternal void that everyone had always assumed, this was a universe that had a beginning. It might therefore also have an end.
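The proportionality Hubble found--now written v = H₀ × d--can be sketched in a couple of lines. The constant used below (about 70 km/s per megaparsec) is the modern value, assumed here purely for illustration; Hubble's own early estimate was several times larger.

```python
# Hubble's law: recessional velocity is proportional to distance.
H0 = 70.0  # km/s per megaparsec -- modern value, assumed for illustration

def recessional_velocity(distance_mpc):
    """Speed (km/s) at which a galaxy at the given distance recedes."""
    return H0 * distance_mpc

def distance_from_velocity(speed_km_s):
    """Invert the relation: distance (Mpc) implied by a measured speed."""
    return speed_km_s / H0

print(recessional_velocity(10.0))      # -> 700.0 km/s
print(distance_from_velocity(7000.0))  # -> 100.0 Mpc
```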
The wonder, as Stephen Hawking has noted, is that no one had hit on the idea of the expanding universe before. A static universe, as should have been obvious to Newton and every thinking astronomer since, would collapse in upon itself. There was also the problem that if stars had been burning indefinitely in a static universe they'd have made the whole intolerably hot--certainly much too hot for the likes of us. An expanding universe resolved much of this at a stroke.
Hubble was a much better observer than a thinker and didn't immediately appreciate the full implications of what he had found. Partly this was because he was woefully ignorant of Einstein's General Theory of Relativity. This was quite remarkable because, for one thing, Einstein and his theory were world famous by now. Moreover, in 1929 Albert Michelson--now in his twilight years but still one of the world's most alert and esteemed scientists--accepted a position at Mount Wilson to measure the velocity of light with his trusty interferometer, and must surely have at least mentioned to him the applicability of Einstein's theory to his own findings.
At all events, Hubble failed to make theoretical hay when the chance was there. Instead, it was left to a Belgian priest-scholar (with a Ph.D. from MIT) named Georges Lemaître to bring together the two strands in his own "fireworks theory," which suggested that the universe began as a geometrical point, a "primeval atom," which burst into glory and had been moving apart ever since. It was an idea that very neatly anticipated the modern conception of the Big Bang but was so far ahead of its time that Lemaître seldom gets more than the sentence or two that we have given him here. The world would need additional decades, and the inadvertent discovery of cosmic background radiation by Penzias and Wilson at their hissing antenna in New Jersey, before the Big Bang would begin to move from interesting idea to established theory.
Neither Hubble nor Einstein would be much of a part of that big story. Though no one would have guessed it at the time, both men had done about as much as they were ever going to do.
In 1936 Hubble produced a popular book called The Realm of the Nebulae, which explained in flattering style his own considerable achievements. Here at last he showed that he had acquainted himself with Einstein's theory--up to a point anyway: he gave it four pages out of about two hundred.
Hubble died of a heart attack in 1953. One last small oddity awaited him. For reasons cloaked in mystery, his wife declined to have a funeral and never revealed what she did with his body. Half a century later the whereabouts of the century's greatest astronomer remain unknown. For a memorial you must look to the sky and the Hubble Space Telescope, launched in 1990 and named in his honor.
9 THE MIGHTY ATOM
WHILE EINSTEIN AND Hubble were productively unraveling the large-scale structure of the cosmos, others were struggling to understand something closer to hand but in its way just as remote: the tiny and ever-mysterious atom.
The great Caltech physicist Richard Feynman once observed that if you had to reduce scientific history to one important statement it would be "All things are made of atoms." They are everywhere and they constitute every thing. Look around you. It is all atoms. Not just the solid things like walls and tables and sofas, but the air in between. And they are there in numbers that you really cannot conceive.
The basic working arrangement of atoms is the molecule (from the Latin for "little mass"). A molecule is simply two or more atoms working together in a more or less stable arrangement: add two atoms of hydrogen to one of oxygen and you have a molecule of water. Chemists tend to think in terms of molecules rather than elements in much the way that writers tend to think in terms of words and not letters, so it is molecules they count, and these are numerous to say the least. At sea level, at a temperature of 32 degrees Fahrenheit, one cubic centimeter of air (that is, a space about the size of a sugar cube) will contain 45 billion billion molecules. And they are in every single cubic centimeter you see around you. Think how many cubic centimeters there are in the world outside your window--how many sugar cubes it would take to fill that view. Then think how many it would take to build a universe. Atoms, in short, are very abundant.
They are also fantastically durable. Because they are so long lived, atoms really get around. Every atom you possess has almost certainly passed through several stars and been part of millions of organisms on its way to becoming you. We are each so atomically numerous and so vigorously recycled at death that a significant number of our atoms--up to a billion for each of us, it has been suggested--probably once belonged to Shakespeare. A billion more each came from Buddha and Genghis Khan and Beethoven, and any other historical figure you care to name. (The personages have to be historical, apparently, as it takes the atoms some decades to become thoroughly redistributed; however much you may wish it, you are not yet one with Elvis Presley.)
So we are all reincarnations--though short-lived ones. When we die our atoms will disassemble and move off to find new uses elsewhere--as part of a leaf or other human being or drop of dew. Atoms, however, go on practically forever. Nobody actually knows how long an atom can survive, but according to Martin Rees it is probably about 10^35 years--a number so big that even I am happy to express it in notation.
Above all, atoms are tiny--very tiny indeed. Half a million of them lined up shoulder to shoulder could hide behind a human hair. On such a scale an individual atom is essentially impossible to imagine, but we can of course try.
Start with a millimeter, which is a line this long: -. Now imagine that line divided into a thousand equal widths. Each of those widths is a micron. This is the scale of microorganisms. A typical paramecium, for instance, is about two microns wide, 0.002 millimeters, which is really very small. If you wanted to see with your naked eye a paramecium swimming in a drop of water, you would have to enlarge the drop until it was some forty feet across. However, if you wanted to see the atoms in the same drop, you would have to make the drop fifteen miles across.
Atoms, in other words, exist on a scale of minuteness of another order altogether. To get down to the scale of atoms, you would need to take each one of those micron slices and shave it into ten thousand finer widths. That's the scale of an atom: one ten-millionth of a millimeter. It is a degree of slenderness way beyond the capacity of our imaginations, but you can get some idea of the proportions if you bear in mind that one atom is to the width of a millimeter line as the thickness of a sheet of paper is to the height of the Empire State Building.
It is of course the abundance and extreme durability of atoms that makes them so useful, and the tininess that makes them so hard to detect and understand. The realization that atoms are these three things--small, numerous, practically indestructible--and that all things are made from them first occurred not to Antoine-Laurent Lavoisier, as you might expect, or even to Henry Cavendish or Humphry Davy, but rather to a spare and lightly educated English Quaker named John Dalton, whom we first encountered in the chapter on chemistry.
Dalton was born in 1766 on the edge of the Lake District near Cockermouth to a family of poor but devout Quaker weavers. (Four years later the poet William Wordsworth would also join the world at Cockermouth.) He was an exceptionally bright student--so very bright indeed that at the improbably youthful age of twelve he was put in charge of the local Quaker school. This perhaps says as much about the school as about Dalton's precocity, but perhaps not: we know from his diaries that at about this time he was reading Newton's Principia in the original Latin and other works of a similarly challenging nature. At fifteen, still schoolmastering, he took a job in the nearby town of Kendal, and a decade after that he moved to Manchester, scarcely stirring from there for the remaining fifty years of his life. In Manchester he became something of an intellectual whirlwind, producing books and papers on subjects ranging from meteorology to grammar. Color blindness, a condition from which he suffered, was for a long time called Daltonism because of his studies. But it was a plump book called A New System of Chemical Philosophy , published in 1808, that established his reputation.