
A Short History of Nearly Everything


by Bill Bryson


  There, in a short chapter of just five pages (out of the book's more than nine hundred), people of learning first encountered atoms in something approaching their modern conception. Dalton's simple insight was that at the root of all matter are exceedingly tiny, irreducible particles. "We might as well attempt to introduce a new planet into the solar system or annihilate one already in existence, as to create or destroy a particle of hydrogen," he wrote.

  Neither the idea of atoms nor the term itself was exactly new. Both had been developed by the ancient Greeks. Dalton's contribution was to consider the relative sizes and characters of these atoms and how they fit together. He knew, for instance, that hydrogen was the lightest element, so he gave it an atomic weight of one. He believed also that water consisted of seven parts of oxygen to one of hydrogen, and so he gave oxygen an atomic weight of seven. By such means was he able to arrive at the relative weights of the known elements. He wasn't always terribly accurate--oxygen's atomic weight is actually sixteen, not seven--but the principle was sound and formed the basis for all of modern chemistry and much of the rest of modern science.
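Dalton's method amounts to simple division: assume a formula for a compound, measure the mass ratio of its elements, and divide. The little sketch below is purely illustrative (the function name is mine, not from any source); it shows how the assumed formula HO gave Dalton seven, while the true formula H2O gives the modern sixteen.

```python
# Illustrative sketch of Dalton's reasoning. Given a measured mass ratio
# (other element : hydrogen) and an assumed number of that element's atoms
# per hydrogen atom, the relative atomic weight follows by division, with
# hydrogen fixed at 1.

def relative_weight(mass_ratio: float, atoms_per_hydrogen: float) -> float:
    return mass_ratio / atoms_per_hydrogen

# Dalton assumed water was one oxygen per hydrogen (HO) and used a
# mass ratio of about 7 to 1, so oxygen came out at 7:
print(relative_weight(mass_ratio=7.0, atoms_per_hydrogen=1.0))   # 7.0

# With the true formula H2O (half an oxygen atom per hydrogen) and the
# true mass ratio of about 8 to 1, oxygen comes out at 16:
print(relative_weight(mass_ratio=8.0, atoms_per_hydrogen=0.5))   # 16.0
```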

  The work made Dalton famous--albeit in a low-key, English Quaker sort of way. In 1826, the French chemist P. J. Pelletier traveled to Manchester to meet the atomic hero. Pelletier expected to find him attached to some grand institution, so he was astounded to discover him teaching elementary arithmetic to boys in a small school on a back street. According to the scientific historian E. J. Holmyard, a confused Pelletier, upon beholding the great man, stammered:

  "Est-ce que j'ai l'honneur de m'adresser à Monsieur Dalton?" ("Have I the honour of addressing Monsieur Dalton?") for he could hardly believe his eyes that this was the chemist of European fame, teaching a boy his first four rules. "Yes," said the matter-of-fact Quaker. "Wilt thou sit down whilst I put this lad right about his arithmetic?"

  Although Dalton tried to avoid all honors, he was elected to the Royal Society against his wishes, showered with medals, and given a handsome government pension. When he died in 1844, forty thousand people viewed the coffin, and the funeral cortege stretched for two miles. His entry in the Dictionary of National Biography is one of the longest, rivaled in length only by those of Darwin and Lyell among nineteenth-century men of science.

  For a century after Dalton made his proposal, it remained entirely hypothetical, and a few eminent scientists--notably the Viennese physicist Ernst Mach, for whom the Mach number, the ratio of an object's speed to the speed of sound, is named--doubted the existence of atoms at all. "Atoms cannot be perceived by the senses . . . they are things of thought," he wrote. The existence of atoms was so doubtfully held in the German-speaking world in particular that it was said to have played a part in the suicide of the great theoretical physicist, and atomic enthusiast, Ludwig Boltzmann in 1906.

  It was Einstein who provided the first incontrovertible evidence of atoms' existence with his paper on Brownian motion in 1905, but this attracted little attention and in any case Einstein was soon to become consumed with his work on general relativity. So the first real hero of the atomic age, if not the first personage on the scene, was Ernest Rutherford.

  Rutherford was born in 1871 in the "back blocks" of New Zealand to parents who had emigrated from Scotland to raise a little flax and a lot of children (to paraphrase Steven Weinberg). Growing up in a remote part of a remote country, he was about as far from the mainstream of science as it was possible to be, but in 1895 he won a scholarship that took him to the Cavendish Laboratory at Cambridge University, which was about to become the hottest place in the world to do physics.

  Physicists are notoriously scornful of scientists from other fields. When the wife of the great Austrian physicist Wolfgang Pauli left him for a chemist, he was staggered with disbelief. "Had she taken a bullfighter I would have understood," he remarked in wonder to a friend. "But a chemist . . ."

  It was a feeling Rutherford would have understood. "All science is either physics or stamp collecting," he once said, in a line that has been used many times since. There is a certain engaging irony therefore that when he won the Nobel Prize in 1908, it was in chemistry, not physics.

  Rutherford was a lucky man--lucky to be a genius, but even luckier to live at a time when physics and chemistry were so exciting and so compatible (his own sentiments notwithstanding). Never again would they quite so comfortably overlap.

  For all his success, Rutherford was not an especially brilliant man and was actually pretty terrible at mathematics. Often during lectures he would get so lost in his own equations that he would give up halfway through and tell the students to work it out for themselves. According to his longtime colleague James Chadwick, discoverer of the neutron, he wasn't even particularly clever at experimentation. He was simply tenacious and open-minded. For brilliance he substituted shrewdness and a kind of daring. His mind, in the words of one biographer, was "always operating out towards the frontiers, as far as he could see, and that was a great deal further than most other men." Confronted with an intractable problem, he was prepared to work at it harder and longer than most people and to be more receptive to unorthodox explanations. His greatest breakthrough came because he was prepared to spend immensely tedious hours sitting at a screen counting alpha particle scintillations, as they were known--the sort of work that would normally have been farmed out. He was one of the first to see--possibly the very first--that the power inherent in the atom could, if harnessed, make bombs powerful enough to "make this old world vanish in smoke."

  Physically he was big and booming, with a voice that made the timid shrink. Once when told that Rutherford was about to make a radio broadcast across the Atlantic, a colleague drily asked: "Why use radio?" He also had a huge amount of good-natured confidence. When someone remarked to him that he seemed always to be at the crest of a wave, he responded, "Well, after all, I made the wave, didn't I?" C. P. Snow recalled how once in a Cambridge tailor's he overheard Rutherford remark: "Every day I grow in girth. And in mentality."

  But both girth and fame were far ahead of him in 1895 when he fetched up at the Cavendish. It was a singularly eventful period in science. In the year of his arrival in Cambridge, Wilhelm Roentgen discovered X rays at the University of Würzburg in Germany, and the next year Henri Becquerel discovered radioactivity. And the Cavendish itself was about to embark on a long period of greatness. In 1897, J. J. Thomson and colleagues would discover the electron there, in 1911 C. T. R. Wilson would produce the first particle detector there (as we shall see), and in 1932 James Chadwick would discover the neutron there. Further still in the future, James Watson and Francis Crick would discover the structure of DNA at the Cavendish in 1953.

  In the beginning Rutherford worked on radio waves, and with some distinction--he managed to transmit a crisp signal more than a mile, a very reasonable achievement for the time--but gave it up when he was persuaded by a senior colleague that radio had little future. On the whole, however, Rutherford didn't thrive at the Cavendish. After three years there, feeling he was going nowhere, he took a post at McGill University in Montreal, and there he began his long and steady rise to greatness. By the time he received his Nobel Prize (for "investigations into the disintegration of the elements, and the chemistry of radioactive substances," according to the official citation) he had moved on to Manchester University, and it was there, in fact, that he would do his most important work in determining the structure and nature of the atom.

  By the early twentieth century it was known that atoms were made of parts--Thomson's discovery of the electron had established that--but it wasn't known how many parts there were or how they fit together or what shape they took. Some physicists thought that atoms might be cube shaped, because cubes can be packed together so neatly without any wasted space. The more general view, however, was that an atom was more like a currant bun or a plum pudding: a dense, solid object that carried a positive charge but that was studded with negatively charged electrons, like the currants in a currant bun.

  In 1910, Rutherford (assisted by his student Hans Geiger, who would later invent the radiation detector that bears his name) fired ionized helium atoms, or alpha particles, at a sheet of gold foil. To Rutherford's astonishment, some of the particles bounced back. It was as if, he said, he had fired a fifteen-inch shell at a sheet of paper and it rebounded into his lap. This was just not supposed to happen. After considerable reflection he realized there could be only one possible explanation: the particles that bounced back were striking something small and dense at the heart of the atom, while the other particles sailed through unimpeded. An atom, Rutherford realized, was mostly empty space, with a very dense nucleus at the center. This was a most gratifying discovery, but it presented one immediate problem. By all the laws of conventional physics, atoms shouldn't therefore exist.

  Let us pause for a moment and consider the structure of the atom as we know it now. Every atom is made from three kinds of elementary particles: protons, which have a positive electrical charge; electrons, which have a negative electrical charge; and neutrons, which have no charge. Protons and neutrons are packed into the nucleus, while electrons spin around outside. The number of protons is what gives an atom its chemical identity. An atom with one proton is an atom of hydrogen, one with two protons is helium, with three protons is lithium, and so on up the scale. Each time you add a proton you get a new element. (Because the number of protons in an atom is always balanced by an equal number of electrons, you will sometimes see it written that it is the number of electrons that defines an element; it comes to the same thing. The way it was explained to me is that protons give an atom its identity, electrons its personality.)

  Neutrons don't influence an atom's identity, but they do add to its mass. The number of neutrons is generally about the same as the number of protons, but they can vary up and down slightly. Add a neutron or two and you get an isotope. The terms you hear in reference to dating techniques in archeology refer to isotopes--carbon-14, for instance, which is an atom of carbon with six protons and eight neutrons (the fourteen being the sum of the two).
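The isotope arithmetic in the paragraph above is simple enough to state as code. This tiny sketch (the function name is mine, not a standard one) just adds the two counts:

```python
# The number in an isotope's name is its mass number: protons plus
# neutrons. Function name is illustrative only.

def mass_number(protons: int, neutrons: int) -> int:
    return protons + neutrons

# Carbon-14: the six protons make it carbon; the eight neutrons
# bring the mass number to fourteen.
print(mass_number(protons=6, neutrons=8))   # 14

# Ordinary carbon-12 has the same six protons but only six neutrons.
print(mass_number(protons=6, neutrons=6))   # 12
```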

  Neutrons and protons occupy the atom's nucleus. The nucleus of an atom is tiny--only one millionth of a billionth of the full volume of the atom--but fantastically dense, since it contains virtually all the atom's mass. As Cropper has put it, if an atom were expanded to the size of a cathedral, the nucleus would be only about the size of a fly--but a fly many thousands of times heavier than the cathedral. It was this spaciousness--this resounding, unexpected roominess--that had Rutherford scratching his head in 1910.
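Cropper's cathedral-and-fly comparison can be checked with back-of-the-envelope arithmetic. Only the one-millionth-of-a-billionth volume figure comes from the text; the hundred-metre cathedral is an assumed round number, and a fly is taken as roughly a millimetre long:

```python
# If the nucleus occupies about 1e-15 of the atom's volume, its diameter
# is the cube root of that fraction of the atom's diameter.

volume_fraction = 1e-15                       # one millionth of a billionth
linear_fraction = volume_fraction ** (1 / 3)  # lengths scale as cube roots

cathedral_m = 100.0                           # assumed cathedral length, ~100 m
nucleus_equiv_mm = cathedral_m * linear_fraction * 1000

print(round(nucleus_equiv_mm, 2))             # about 1 mm -- fly-sized
```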

  It is still a fairly astounding notion to consider that atoms are mostly empty space, and that the solidity we experience all around us is an illusion. When two objects come together in the real world--billiard balls are most often used for illustration--they don't actually strike each other. "Rather," as Timothy Ferris explains, "the negatively charged fields of the two balls repel each other . . . were it not for their electrical charges they could, like galaxies, pass right through each other unscathed." When you sit in a chair, you are not actually sitting there, but levitating above it at a height of one angstrom (a hundred millionth of a centimeter), your electrons and its electrons implacably opposed to any closer intimacy.

  The picture that nearly everybody has in mind of an atom is of an electron or two flying around a nucleus, like planets orbiting a sun. This image was created in 1904, based on little more than clever guesswork, by a Japanese physicist named Hantaro Nagaoka. It is completely wrong, but durable just the same. As Isaac Asimov liked to note, it inspired generations of science fiction writers to create stories of worlds within worlds, in which atoms become tiny inhabited solar systems or our solar system turns out to be merely a mote in some much larger scheme. Even now CERN, the European Organization for Nuclear Research, uses Nagaoka's image as a logo on its website. In fact, as physicists were soon to realize, electrons are not like orbiting planets at all, but more like the blades of a spinning fan, managing to fill every bit of space in their orbits simultaneously (but with the crucial difference that the blades of a fan only seem to be everywhere at once; electrons are).

  Needless to say, very little of this was understood in 1910 or for many years afterward. Rutherford's finding presented some large and immediate problems, not least that no electron should be able to orbit a nucleus without crashing. Conventional electrodynamic theory demanded that a flying electron should very quickly run out of energy--in only an instant or so--and spiral into the nucleus, with disastrous consequences for both. There was also the problem of how protons with their positive charges could bundle together inside the nucleus without blowing themselves and the rest of the atom apart. Clearly whatever was going on down there in the world of the very small was not governed by the laws that applied in the macro world where our expectations reside.

  As physicists began to delve into this subatomic realm, they realized that it wasn't merely different from anything we knew, but different from anything ever imagined. "Because atomic behavior is so unlike ordinary experience," Richard Feynman once observed, "it is very difficult to get used to and it appears peculiar and mysterious to everyone, both to the novice and to the experienced physicist." When Feynman made that comment, physicists had had half a century to adjust to the strangeness of atomic behavior. So think how it must have felt to Rutherford and his colleagues in the early 1910s when it was all brand new.

  One of the people working with Rutherford was a mild and affable young Dane named Niels Bohr. In 1913, while puzzling over the structure of the atom, Bohr had an idea so exciting that he postponed his honeymoon to write what became a landmark paper. Because physicists couldn't see anything so small as an atom, they had to try to work out its structure from how it behaved when they did things to it, as Rutherford had done by firing alpha particles at foil. Sometimes, not surprisingly, the results of these experiments were puzzling. One puzzle that had been around for a long time had to do with spectrum readings of the wavelengths of hydrogen. These produced patterns showing that hydrogen atoms emitted energy at certain wavelengths but not others. It was rather as if someone under surveillance kept turning up at particular locations but was never observed traveling between them. No one could understand why this should be.

  It was while puzzling over this problem that Bohr was struck by a solution and dashed off his famous paper. Called "On the Constitution of Atoms and Molecules," the paper explained how electrons could keep from falling into the nucleus by suggesting that they could occupy only certain well-defined orbits. According to the new theory, an electron moving between orbits would disappear from one and reappear instantaneously in another without visiting the space between. This idea--the famous "quantum leap"--is of course utterly strange, but it was too good not to be true. It not only kept electrons from spiraling catastrophically into the nucleus; it also explained hydrogen's bewildering wavelengths. The electrons only appeared in certain orbits because they only existed in certain orbits. It was a dazzling insight, and it won Bohr the 1922 Nobel Prize in physics, the year after Einstein received his.

  Meanwhile the tireless Rutherford, now back at Cambridge as J. J. Thomson's successor as head of the Cavendish Laboratory, came up with a model that explained why the nuclei didn't blow up. He saw that they must be offset by some type of neutralizing particles, which he called neutrons. The idea was simple and appealing, but not easy to prove. Rutherford's associate, James Chadwick, devoted eleven intensive years to hunting for neutrons before finally succeeding in 1932. He, too, was awarded a Nobel Prize in physics, in 1935. As Boorse and his colleagues point out in their history of the subject, the delay in discovery was probably a very good thing as mastery of the neutron was essential to the development of the atomic bomb. (Because neutrons have no charge, they aren't repelled by the electrical fields at the heart of an atom and thus could be fired like tiny torpedoes into an atomic nucleus, setting off the destructive process known as fission.) Had the neutron been isolated in the 1920s, they note, it is "very likely the atomic bomb would have been developed first in Europe, undoubtedly by the Germans."

  As it was, the Europeans had their hands full trying to understand the strange behavior of the electron. The principal problem they faced was that the electron sometimes behaved like a particle and sometimes like a wave. This impossible duality drove physicists nearly mad. For the next decade all across Europe they furiously thought and scribbled and offered competing hypotheses. In France, Prince Louis-Victor de Broglie, the scion of a ducal family, found that certain anomalies in the behavior of electrons disappeared when one regarded them as waves. The observation excited the attention of the Austrian Erwin Schrödinger, who made some deft refinements and devised a handy system called wave mechanics. At almost the same time the German physicist Werner Heisenberg came up with a competing theory called matrix mechanics. This was so mathematically complex that hardly anyone really understood it, including Heisenberg himself ("I do not even know what a matrix is," Heisenberg despaired to a friend at one point), but it did seem to solve certain problems that Schrödinger's waves failed to explain.

  The upshot is that physics had two theories, based on conflicting premises, that produced the same results. It was an impossible situation.

  Finally, in 1926, Heisenberg came up with a celebrated compromise, producing a new discipline that came to be known as quantum mechanics. At the heart of it was Heisenberg's Uncertainty Principle, which states that the electron is a particle but a particle that can be described in terms of waves. The uncertainty around which the theory is built is that we can know an electron's momentum as it moves through space or we can know its position at a given instant, but we cannot know both precisely. Any attempt to measure one will unavoidably disturb the other. This isn't a matter of simply needing more precise instruments; it is an immutable property of the universe.

 
