
A Short History of Nearly Everything: Special Illustrated Edition


by Bill Bryson


  On 15 January 1934 the journal Physical Review published a very concise abstract of a presentation given by Zwicky and Baade the previous month at Stanford University. Despite its extreme brevity—one paragraph of twenty-four lines—the abstract contained an enormous amount of new science: it provided the first reference to supernovae and to neutron stars; convincingly explained their method of formation; correctly calculated the scale of their explosiveness; and, as a kind of concluding bonus, connected supernova explosions to the production of a mysterious new phenomenon called cosmic rays, which had recently been found swarming through the universe. These ideas were revolutionary, to say the least. The existence of neutron stars wouldn’t be confirmed for thirty-four years. The cosmic rays notion, though considered plausible, hasn’t been verified yet. Altogether, the abstract was, in the words of Caltech astrophysicist Kip S. Thorne, “one of the most prescient documents in the history of physics and astronomy.”

  The brilliant but volatile astrophysicist Fritz Zwicky, whose paper on supernovae revealed revolutionary scientific ideas, yet whose later theories on black holes and dark matter would largely be dismissed by colleagues who considered him an “irritating buffoon.” (credit 3.4)

  Interestingly, Zwicky had almost no understanding of why any of this would happen. According to Thorne, “he did not understand the laws of physics well enough to be able to substantiate his ideas.” Zwicky’s talent was for big ideas. Others—Baade mostly—were left to do the mathematical sweeping up.

  Zwicky was also the first to recognize that there wasn’t nearly enough visible mass in the universe to hold galaxies together, and that there must be some other gravitational influence—what we now call dark matter. One thing he failed to see was that if a neutron star shrank enough it would become so dense that even light couldn’t escape its immense gravitational pull. You would have a black hole. Unfortunately, Zwicky was held in such disdain by most of his colleagues that his ideas attracted almost no notice. When, five years later, the great Robert Oppenheimer turned his attention to neutron stars in a landmark paper, he made not a single reference to any of Zwicky’s work, even though Zwicky had been working for years on the same problem in an office just down the corridor. Zwicky’s deductions concerning dark matter wouldn’t attract serious attention for nearly four decades. We can only assume that he did a lot of push-ups in this period.

  Surprisingly little of the universe is visible to us when we incline our heads to the sky. Only about six thousand stars are visible to the naked eye from Earth, and only about two thousand can be seen from any one spot. With binoculars the number of stars you can see from a single location rises to about fifty thousand, and with a small 2-inch telescope it leaps to three hundred thousand. With a 16-inch telescope, such as Evans uses, you begin to count not in stars but in galaxies. From his deck, Evans supposes he can see between fifty thousand and one hundred thousand galaxies, each containing tens of billions of stars. These are of course respectable numbers, but even with so much to take in, supernovae are extremely rare. A star can burn for billions of years, but it dies just once and quickly, and only a few dying stars explode. Most expire quietly, like a camp fire at dawn. In a typical galaxy, consisting of a hundred billion stars, a supernova will occur on average once every two or three hundred years. Looking for a supernova, therefore, was a little like standing on the observation platform of the Empire State Building with a telescope and searching windows around Manhattan in the hope of finding, let us say, someone lighting a twenty-first birthday cake.
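  For the numerically curious, that rarity can be put in rough figures. What follows is a back-of-envelope sketch in Python, using only the approximate numbers quoted above (one supernova per galaxy every 250 years or so, and the lower end of Evans's estimate of fifty thousand galaxies); nothing in it is measured data.

```python
# Back-of-envelope arithmetic for the rarity described above. All inputs
# are the approximate figures quoted in the text, not measurements.

SN_INTERVAL_YEARS = 250      # one supernova per galaxy every ~200-300 years
GALAXIES = 50_000            # lower end of Evans's estimate of his sample

p_per_galaxy_per_year = 1 / SN_INTERVAL_YEARS
p_per_galaxy_per_night = p_per_galaxy_per_year / 365

# Any single galaxy checked on a given night is almost certainly quiet:
print(f"chance per galaxy per night: 1 in {1 / p_per_galaxy_per_night:,.0f}")
# -> roughly 1 in 91,000

# Even if all fifty thousand galaxies could be watched continuously,
# the whole sample would yield only a couple of hundred events a year:
print(f"expected events in the sample: ~{GALAXIES * p_per_galaxy_per_year:.0f}/year")
```

  Checked a few hundred galaxies at a time, one night at a time, those odds are what make Evans's tally so remarkable.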

  So when a hopeful and softly spoken minister got in touch to ask if they had any usable field charts for hunting supernovae, the astronomical community thought he was out of his mind. At the time Evans had a 10-inch telescope—a very respectable size for amateur star-gazing, but hardly the sort of thing with which to do serious cosmology—and he was proposing to find one of the universe’s rarer phenomena. In the whole of astronomical history before Evans started looking in 1980, fewer than sixty supernovae had been found. (At the time I visited him, in August 2001, he had just recorded his thirty-fourth visual discovery; a thirty-fifth followed three months later, and a thirty-sixth in early 2003.)

  The Reverend Robert Evans with the 16-inch telescope he uses to spot supernovae from the sun-deck of his home in New South Wales, Australia. The world’s most successful individual hunter of supernovae, Evans has recorded three dozen sightings. (credit 3.5)

  Evans, however, had certain advantages. Most observers, like most people generally, are in the northern hemisphere, so he had a lot of sky largely to himself, especially at first. He also had speed and an uncanny memory. Large telescopes are cumbersome things, and much of their operational time is consumed in being manoeuvred into position. Evans could swing his little 16-inch telescope around like a tail-gunner in a dogfight, spending no more than a couple of seconds on any particular point in the sky. In consequence, he could observe perhaps four hundred galaxies in an evening, while a large professional telescope would be lucky to do fifty or sixty.


  Looking for supernovae is mostly a matter of not finding them. From 1980 to 1996 he averaged two discoveries a year—not a huge payoff for hundreds of nights of peering and peering. Once he found three in fifteen days, but another time he went three years without finding any at all.

  “There is actually a certain value in not finding anything,” he said. “It helps cosmologists to work out the rate at which galaxies are evolving. It’s one of those rare areas where the absence of evidence is evidence.”

  On a table beside the telescope were stacks of photos and papers relevant to his pursuits, and he showed me some of them now. If you have ever looked through popular astronomical publications, and at some time you must have, you will know that they are generally full of richly luminous colour photos of distant nebulae and the like—fairy-lit clouds of celestial light of the most delicate and moving splendour. Evans’s working images are nothing like that. They are just blurry black-and-white photos with little points of haloed brightness. One he showed me depicted a swarm of stars in which lurked a trifling flare that I had to put close to my face to discern. This, Evans told me, was a star in a constellation called Fornax from a galaxy known to astronomy as NGC1365. (NGC stands for New General Catalogue, where these things are recorded. Once it was a heavy book on someone’s desk in Dublin; today, needless to say, it’s a database.) For sixty million years, the light from this star’s spectacular demise travelled unceasingly through space until one night in August 2001 it arrived at Earth in the form of a puff of radiance, the tiniest brightening, in the night sky. It was, of course, Robert Evans on his eucalypt-scented hillside who spotted it.

  “There’s something satisfying, I think,” Evans said, “about the idea of light travelling for millions of years through space and just at the right moment as it reaches Earth someone looks at the right bit of sky and sees it. It just seems right that an event of that magnitude should be witnessed.”

  Supernovae do much more than simply impart a sense of wonder. They come in several types (one of them discovered by Evans), and of these, one in particular, known as the type Ia supernova, is important to astronomy because these supernovae always explode in the same way, with the same critical mass. For this reason they can be used as “standard candles”—benchmarks by which to measure the brightness (and hence relative distance) of other stars, and thus to measure the expansion rate of the universe.
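  The standard-candle idea rests on the inverse-square law: if every type Ia peaks at roughly the same intrinsic luminosity, then the faintness you measure fixes the distance. The Python sketch below is a minimal illustration of that logic only; the values are invented, and it is not anything the astronomers mentioned here actually ran.

```python
import math

# Standard-candle logic in miniature. If a type Ia supernova always peaks
# at about the same intrinsic luminosity L, then the flux F we measure
# gives its distance d through the inverse-square law: F = L / (4*pi*d^2).

def distance_from_flux(luminosity: float, flux: float) -> float:
    """Distance to a source of known luminosity, from its measured flux."""
    return math.sqrt(luminosity / (4 * math.pi * flux))

L = 1.0                              # arbitrary units; it cancels in ratios
d_near = distance_from_flux(L, 1.0)  # some reference supernova
d_far = distance_from_flux(L, 0.01)  # an identical one, 100x fainter

print(d_far / d_near)  # -> 10.0: a hundred times fainter, ten times farther
```

  Comparing distances obtained this way with how fast the host galaxies recede is, in outline, how the expansion rate gets measured.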

  In 1987 Saul Perlmutter at the Lawrence Berkeley Laboratory in California, needing more Ia supernovae than visual sightings were providing, set out to find a more systematic method of searching for them. Perlmutter devised a nifty system using sophisticated computers and charge-coupled devices—in essence, really good digital cameras. It automated supernova hunting. Telescopes could now take thousands of pictures and let a computer detect the tell-tale bright spots that marked a supernova explosion. In five years, with the new technique, Perlmutter and his colleagues at Berkeley found forty-two supernovae. Now even amateurs are finding supernovae with charge-coupled devices. “With CCDs you can aim a telescope at the sky and go watch television,” Evans said with a touch of dismay. “It took all the romance out of it.”
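  At its simplest, letting a computer "detect the tell-tale bright spots" means image subtraction: take a new picture of a galaxy, subtract an older reference picture, and flag anything that has brightened far beyond the noise. The sketch below is a toy version in Python with NumPy, run on synthetic images; a real pipeline would first align, calibrate and blur-match the frames.

```python
import numpy as np

# Toy difference imaging: reference frame vs. new frame of the same galaxy.
rng = np.random.default_rng(0)
reference = rng.normal(100.0, 5.0, size=(64, 64))    # old image: sky + noise
new_image = reference + rng.normal(0.0, 5.0, size=(64, 64))
new_image[30, 40] += 80.0                            # a new point of light

difference = new_image - reference
threshold = 5 * difference.std()                     # crude 5-sigma cut
candidates = np.argwhere(difference > threshold)

print(candidates)  # -> [[30 40]]: the tell-tale brightening
```

  Run over thousands of frames a night, that loop is what replaced the kind of memory feat Evans performs at the eyepiece.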

  I asked him if he was tempted to adopt the new technology. “Oh, no,” he said, “I enjoy my way too much. Besides”—he gave a nod at the photo of his latest supernova and smiled—“I can still beat them sometimes.”

  The question that naturally occurs is: what would it be like if a star exploded nearby? Our nearest stellar neighbour, as we have seen, is Alpha Centauri, 4.3 light years away. I had imagined that if there were an explosion there we would have 4.3 years to watch the light of this magnificent event spreading across the sky, as if tipped from a giant can. What would it be like if we had four years and four months to watch an inescapable doom advancing towards us, knowing that when it finally arrived it would blow the skin right off our bones? Would people still go to work? Would farmers plant crops? Would anyone deliver them to the shops?

  North American rock art, dating from about a thousand years ago, is thought to record one of the great astronomical events of historical times—the supernova explosion that created the Crab Nebula in July 1054. It was one of only a handful of supernova events near enough to Earth to be seen without a telescope. (credit 3.7)

  Weeks later, back in the town in New Hampshire where I then lived, I put these questions to John Thorstensen, an astronomer at Dartmouth College. “Oh no,” he said, laughing. “The news of such an event travels out at the speed of light, but so does the destructiveness, so you’d learn about it and die from it in the same instant. But don’t worry, because it’s not going to happen.”

  For the blast of a supernova explosion to kill you, he explained, you would have to be “ridiculously close”—probably within ten light years or so. “The danger would be various types of radiation—cosmic rays and so on.” These would produce fabulous auroras, shimmering curtains of spooky light that would fill the whole sky. This would not be a good thing. Anything potent enough to put on such a show could well destroy the ozone layer and disrupt the magnetosphere, the protective zones high above the Earth that normally shield us from ultraviolet rays and other cosmic assaults. Without them anyone unfortunate enough to step into sunlight would pretty quickly take on the appearance of, let us say, an overcooked pizza.


  The gaseous swirl of the Crab Nebula, all that is left of a giant star, about ten times the mass of our own Sun, that blew apart in 1054. Though 6,500 light years from Earth, the explosion was clearly visible in daylight for over three weeks and at night for almost two years. It was dubbed the Crab Nebula in the nineteenth century because of the shape it appeared to present when viewed through early telescopes. (credit 3.8)

  The reason we can be reasonably confident that such an event won’t happen in our corner of the galaxy, Thorstensen said, is that it takes a particular kind of star to make a supernova in the first place. A candidate star must be ten to twenty times as massive as our own Sun, and “we don’t have anything of the requisite size that’s that close. The universe is a mercifully big place.” The nearest likely candidate, he added, is Betelgeuse, whose various sputterings have for years suggested that something interestingly unstable is going on there. But Betelgeuse is five hundred light years away—a safe distance.

  Only half a dozen times in recorded history have supernovae been close enough to be visible to the naked eye. One was a blast in 1054 that created the Crab Nebula. Another, in 1604, made a star bright enough to be seen during the day for over three weeks. The most recent was in 1987, when a supernova flared in a zone of the cosmos known as the Large Magellanic Cloud, but that was only barely visible and only in the southern hemisphere—and it was a comfortably safe 169,000 light years away.

  Supernovae are significant to us in one other decidedly central way. Without them we wouldn’t be here. You will recall the cosmological conundrum with which we ended the first chapter—that the Big Bang created lots of light gases but no heavy elements. Those came later, but for a very long time nobody could figure out how they came later. The problem was that you needed something really hot—hotter even than the middle of the hottest stars—to forge carbon and iron and the other elements without which we would be distressingly immaterial. Supernovae provided the explanation, and it was an English cosmologist almost as singular in manner as Fritz Zwicky who worked it out.

  He was a Yorkshireman named Fred Hoyle. Hoyle, who died in 2001, was described in an obituary in Nature as a “cosmologist and controversialist,” and both of those he most certainly was. He was, the same obituary noted, “embroiled in controversy for most of his life” and “put his name to much rubbish.” He claimed, for instance, and without evidence, that the Natural History Museum’s treasured fossil of an archaeopteryx was a forgery along the lines of the Piltdown hoax, causing much exasperation to the museum’s palaeontologists, who had to spend days fielding phone calls from journalists all over the world. He also believed that the Earth was seeded from space not only with life but also with many of its diseases, such as influenza and bubonic plague, and suggested at one point that humans evolved projecting noses with the nostrils underneath as a way of keeping cosmic pathogens from falling into them.

  A view of the night sky and the constellation Orion, recognizable by the row of three stars of roughly equal brightness at centre-right, which collectively are known as Orion’s belt. Directly above the belt is Betelgeuse, a red supergiant with an actual luminosity 13,000 times that of our own Sun. Such massive stars are comparatively short-lived and Betelgeuse is destined to become a supernova one day, but at a distance of over 500 light years it is probably no threat to life on Earth. (credit 3.9)

  It was he who coined the term Big Bang, in a moment of facetiousness, for a radio broadcast in 1949. He pointed out that nothing in our understanding of physics could account for why everything, gathered to a point, would suddenly and dramatically begin to expand. Hoyle favoured a steady-state theory in which the universe was constantly expanding and continually creating new matter as it went. Hoyle also realized that if stars imploded they would liberate huge amounts of heat—100 million degrees or more, enough to begin to generate the heavier elements in a process known as nucleosynthesis. In 1957, working with others, Hoyle showed how the heavier elements were formed in supernova explosions. For this work, W. A. Fowler, one of his collaborators, received a Nobel Prize. Hoyle, shamefully, did not.

  The English cosmologist Fred Hoyle, who coined the term “Big Bang” and showed how supernova explosions could have generated the necessary heat to create the heavy elements that led to the formation of rocky planets and, eventually, us. (credit 3.10)

  According to Hoyle’s theory, an exploding star would generate enough heat to create all the new elements and spray them into the cosmos where they would form gaseous clouds—the interstellar medium, as it is known—that could eventually coalesce into new solar systems. With the new theories it became possible at last to construct plausible scenarios for how we got here. What we now think we know is this:

  About 4.6 billion years ago, a great swirl of gas and dust some 24 billion kilometres across accumulated in space where we are now and began to aggregate. Virtually all of it—99.9 per cent of the mass of the solar system—went to make the Sun. Out of the floating material that was left over, two microscopic grains floated close enough together to be joined by electrostatic forces. This was the moment of conception for our planet. All over the inchoate solar system, the same was happening. Colliding dust grains formed larger and larger clumps. Eventually the clumps grew large enough to be called planetesimals. As these endlessly bumped and collided, they fractured or split or recombined in endless random permutations, but in every encounter there was a winner, and some of the winners grew big enough to dominate the orbit around which they travelled.

  It all happened remarkably quickly. To grow from a tiny cluster of grains to a baby planet some hundreds of kilometres across is thought to have taken only a few tens of thousands of years. In just 200 million years, possibly less, the Earth was essentially formed, though still molten and subject to constant bombardment from all the debris that remained floating about.

  At this point, about 4.4 billion years ago, an object the size of Mars crashed into the Earth, blowing out enough material to form a companion sphere, the Moon. Within weeks, it is thought, the flung material had reassembled itself into a single clump, and within a year it had formed into the spherical rock that companions us yet. Most of the lunar material, it is thought, came from the Earth’s crust, not its core, which is why the Moon has so little iron while we have a lot. The theory, incidentally, is almost always presented as a recent one, but in fact it was first proposed in the 1940s by Reginald Daly of Harvard. The only recent thing about it is people paying any attention to it.

 
