Coming of Age in the Milky Way


by Timothy Ferris


  PART THREE

  CREATION

  O landless void, O skyless void,

  O nebulous, purposeless space,

  Eternal and timeless,

  Become the world, extend!

  —Tahitian creation tale

  What really interests me is whether God had any choice in the creation of the world.

  —Einstein

  15

  THE QUANTUM AND ITS DISCONTENTS

  What is the path? There is no path.

  —Niels Bohr, quoting Goethe

  Progress in physics has always moved from the intuitive toward the abstract.

  —Max Born

  The act of exploration alters the perspective of the explorer; Odysseus and Marco Polo and Columbus returned home as changed men. So it has been with scientific investigation of the extremities of scale, from the grand sweep of cosmological space down to the cramped and frantic world of the subatomic particles: These journeys changed us, challenging many of the scientific and philosophical conceptions we had most cherished. Some had to be discarded, like baggage left behind on a trek across a desert. Others were altered and repaired almost beyond recognition, like the veteran mountaineer’s hand-hammered pitons or the old seaman’s knife with its twine-encrusted handle and bone-thin blade. Exploration of the realm of the galaxies extended the reach of human vision to some 10²⁶ times the human scale, and brought about the revolution we identify with relativity, which revealed that the Newtonian world view was but a parochialism in a wider universe where space is curved and time becomes pliant. Exploration of the subatomic realm carried us far into the realm of the small, to some 10⁻¹⁵ of the human scale, and it, too, wrought a revolution. This was quantum physics, and all that it touched it transformed.

  Quantum theory was born in 1900, when Max Planck realized that he could account for what was called the black-body curve—the spectrum of energy generated by a perfectly radiating object—only if he abandoned the classical assumption that energy is emitted continuously and replaced it with the unprecedented hypothesis that energy comes in discrete units. Planck called these units quanta, after the Greco-Latin word for “how much” (as in quantity), and he defined them in terms of the quantum of action, symbolized by the letter h. Planck was no revolutionary—at age forty-two he was an old man by the standards of mathematical science, and a pillar of nineteenth-century German high culture to boot—but he readily appreciated that the quantum principle would shatter much of the classical physics to which he had devoted his career. “The greater [its] difficulties,” he wrote, “… the more significant, it finally will show itself to be for the broadening and deepening of our whole knowledge in physics.”1 His words proved prophetic: Constantly changing and developing, altering its coloration as unpredictably as a reflection in a soap bubble, quantum physics soon expanded into virtually every area of physics, and Planck’s h came to be regarded as a fundamental constant of nature, on a par with Einstein’s c, the velocity of light.
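
  In modern notation (a standard textbook statement rather than Ferris’s own words), Planck’s hypothesis is that radiation of frequency ν can be emitted only in whole quanta of energy:

  E = h\nu, \qquad h \approx 6.626 \times 10^{-34}\ \text{joule-seconds}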

  The quantum principle was very strange—it was, as Gamow remarked, as if one could drink a pint of beer or no beer at all, but were barred by a law of nature from drinking any quantity of beer between zero and one pint—and it got stranger as it evolved. The decisive break with classical physics came in 1927, when the young German physicist Werner Heisenberg arrived at the indeterminacy principle. Heisenberg found that one can learn either the exact position of a given particle or its exact trajectory, but not both. If, for instance, we watch a proton fly through a cloud chamber, we can by recording its track discern the direction in which it is moving, but in the process of plowing through the water vapor in the chamber the proton will have slowed down, robbing us of information about just where it was at any given instant. Alternatively, we can irradiate the proton—take a flash photograph of it, so to speak—and thus determine its exact location at a given instant, but the light or other radiation we employ to take the photograph will knock the proton off its appointed rounds, depriving us of precise knowledge of where it would have gone had we left it alone. We are, therefore, limited in our knowledge of the subatomic world: We can extract only partial answers, the nature of which is decided to some extent by the questions we choose to ask. When Heisenberg calculated the inescapable minimum amount of uncertainty that limits our understanding of events on the small scale, he found that it is defined by nothing other than h, Planck’s quantum of action.
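
  Heisenberg’s result is conventionally written as an inequality; in its standard modern form (using the reduced constant ħ = h/2π, which the text does not introduce), the uncertainty Δx in a particle’s position and the uncertainty Δp in its momentum obey

  \Delta x \,\Delta p \;\geq\; \frac{\hbar}{2}, \qquad \hbar = \frac{h}{2\pi}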

  The Scale of the Known Universe

  Radius (meters)   Characteristic Objects
  10²⁶              Observable universe
  10²⁴              Superclusters of galaxies
  10²³              Clusters of galaxies
  10²²              Groups of galaxies (e.g., the Local Group)
  10²¹              Milky Way galaxy
  10¹⁸              Giant nebulae, molecular clouds
  10¹²              Solar system
  10¹¹              Outer atmospheres of red giant stars
  10⁹               Sun
  10⁸               Giant planets (e.g., Jupiter)
  10⁷               Dwarf stars, Earthlike planets
  10⁵               Asteroids, comet nuclei
  10⁴               Neutron stars
  1                 Human beings
  10⁻²              DNA molecules (long axis)
  10⁻⁵              Living cells
  10⁻⁹              DNA molecules (short axis)
  10⁻¹⁰             Atoms
  10⁻¹⁴             Nuclei of heavy atoms
  10⁻¹⁵             Protons, neutrons
  10⁻³⁵             Planck length: Quantum of space; radius of “dimensionless” particles in string theory

  Quantum indeterminacy does not depend upon the design of the experimental apparatus employed to investigate the subatomic world. It is, so far as anyone can tell, an absolute limitation, one that the most wizardly scholars of an advanced extraterrestrial civilization would share with the humblest string-and-sealing-wax physicist on Earth. In classical atomic physics it had been assumed that one could, in principle, measure the precise locations and trajectories of billions of particles—protons, say—and from the resulting data make exact predictions about where the protons would be at some time in the future. Heisenberg showed that this assumption was false—that we can never know everything about the behavior of even one particle, much less myriads of them, and, therefore, can never make predictions about the future that will be completely accurate in every detail. This marked a fundamental change in the world view of physics. It revealed that not only matter and energy but knowledge itself is quantized.

  The more closely physicists examined the subatomic world, the larger indeterminacy loomed. When a photon strikes an atom, boosting an electron into a higher orbit, the electron moves from the lower to the upper orbit instantaneously, without having traversed the intervening space. The orbital radii themselves are quantized, and the electron simply ceases to exist at one point, simultaneously appearing at another. This is the famously confounding “quantum leap,” and it is no mere philosophical poser; unless it is taken seriously, the behavior of atoms cannot be predicted accurately. Similarly, we saw earlier, it is by virtue of quantum indeterminacy that protons can leap the Coulomb barrier, permitting nuclear fusion to occur at a sufficiently robust rate to keep the stars shining.
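
  The claim about the Coulomb barrier can be made quantitative with the standard “Gamow factor” of quantum mechanics (a textbook result, summarized here as a sketch rather than quoted from the text): the probability that two nuclei of charges Z₁e and Z₂e, approaching with relative velocity v, tunnel through their mutual electrical repulsion falls off exponentially,

  P \;\propto\; \exp\!\left(-\frac{2\pi Z_1 Z_2 e^{2}}{\hbar v}\right) \quad \text{(Gaussian units)}

  a probability that is small but not zero, and zero is what classical physics would have predicted.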

  Those who find such considerations nonsensical are in good company; as Niels Bohr remarked, when one of his students at Copenhagen complained that quantum mechanics made him giddy, “If anybody says he can think about quantum problems without getting giddy, that only shows he has not understood the first thing about them.”2 The reason, however, is simply that we human beings, having grown up in the macroscopic world, tend to think of things in terms of macroscopic similes—subatomic particles are like buckshot, light waves are like waves in the ocean, atoms are like little solar systems, and so forth—and these similes break down on the microscopic scale.

  Our mental pictures are drawn from our visual perceptions of the world around us. But the world as perceived by the eye is itself exposed as an illusion when scrutinized on the microscopic scale. A bar of gold, though it looks solid, is composed almost entirely of empty space: The nucleus of each of its atoms is so small that if one atom were enlarged a million billion times, until its outer electron shell was as big as greater Los Angeles, its nucleus would still be only about the size of a compact car parked downtown. (The electron shells would be zones of insubstantial heat lightning, each a mile or so thick, separated by many miles of space.) Nor, to return to the old classical metaphor, does a cue ball strike a billiard ball. Rather, the negatively charged fields of the two balls repel each other; on the subatomic scale, the billiard balls are as spacious as galaxies, and were it not for their like electrical charges they could, like galaxies, pass right through each other unscathed.
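
  The arithmetic behind the simile can be checked against the scale table above (order-of-magnitude values only):

  r_{\text{atom}} \sim 10^{-10}\ \text{m}, \qquad r_{\text{nucleus}} \sim 10^{-15}\ \text{m}

  \text{magnified } 10^{15} \text{ times:} \quad r_{\text{atom}} \to 10^{5}\ \text{m} \approx 100\ \text{km}, \qquad r_{\text{nucleus}} \to \text{a meter or so}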

  The quantum revolution has been painful, but we can thank it for having delivered us from several of the illusions that afflicted the classical world view.

  One such was the delusion of apartness—the assumption that man is separate from nature and that acts of observation can, therefore, be conducted with complete objectivity. Traditionally, scientists were free to think of themselves as passive observers, sealed off by a pane of laboratory glass or a telescope’s lens from the outer world they examined. But on the microscopic level, every act of observation is disruptive—countless photons of starlight die upon the eye, protons smash into accelerator targets—and the manner in which we choose to make the observation (to “collapse the wave function,” as the physicists say) influences the results of the interaction. Subatomic particles sometimes resemble particles, sometimes waves, depending upon how we examine them. They are not “really” one or the other—and, in any event, the two images are mathematically equivalent. Rather, they are participants in an act of observation, the nature of which influences the qualities they present to us. Quantum physics obliges us to take seriously what had previously been a more purely philosophical consideration: That we do not see things in themselves, but only aspects of things. What we see in an electron path in a bubble chamber is not an electron, and what we see in the sky are not stars, any more than a recording of Caruso’s voice is Caruso. By revealing that the observer plays a role in the observed, quantum physics did for physics what Darwin had done in the life sciences: It tore down walls, reuniting mind with the wider universe.

  Likewise with the dilemma of strict causation. Classical physics was deterministic: If A, then B; the bullet fired at the window shatters the glass. On the quantum scale this is only probably true: Most of the particles in the bullet encounter those of the glass, but some go elsewhere, and the trajectory of any one of them can be predicted only by invoking the statistics of probabilities. Einstein was deeply troubled by this aspect of the new physics. “God does not play dice,” he said, and he argued that the indeterminacy principle, though useful in practice, does not represent the fundamental relationship between mind and nature. As he wrote to his friend and colleague Max Born:

  I find the idea quite intolerable that an electron exposed to radiation should choose of its own free will, not only its moment to jump off, but also its direction. In that case, I would rather be a cobbler, or even an employee in a gaming-house, than a physicist.3

  Einstein presented Bohr with a series of thought experiments aimed at disproving the theory of quantum indeterminacy. He was then near the peak of his powers, and his ideas were often startling in their originality and ingenuity, but Bohr and his students found flaws in them all. Nothing in nature, then or now, indicates that the universe is built upon a strictly deterministic underpinning, and no philosopher has been able to prove that we need to believe in hidden, deterministic mechanisms—“hidden variables”—that produce no observable results.

  Defeated in battle if unbowed in the greater campaign, Einstein took refuge in the long view: “Quantum mechanics is certainly imposing,” he told Born, “but an inner voice tells me that it is not yet the real thing. The theory says a lot, but does not really bring us any closer to the secret of the ‘Old One.’ I, at any rate, am convinced that He is not playing at dice.”4 Ultimately, Einstein insisted, he would be proved right: “I am quite convinced that someone will eventually come up with a theory whose objects, connected by laws, are not probabilities but considered facts.”5

  That might conceivably be so, but it is not clear why we should wish it to be so. Strict causation, for all its classical pedigree, was ultimately a monstrous doctrine. Consider its stark formulation by the French mathematician Pierre-Simon de Laplace:

  An intelligence knowing, at a given instant of time, all forces acting in nature, as well as the momentary position of all things of which the universe consists, would be able to comprehend the motions of the largest bodies of the world and those of the lightest atoms in one single formula, provided his intellect were sufficiently powerful to subject all data to analysis; to him nothing would be uncertain, both past and future would be present in his eyes.6

  What was there here worth clinging to, against the hard evidence of quantum physics? The invocation of an all-knowing intelligence, which could only be that of God? The depiction of men as machines, deprived of free will? The pretense that every occurrence, from the radioactive decay of a barium atom to the Battle of Hastings, was fated to occur just when and how it did, in a universe devoid of originality and surprise? We are free (or fated) to answer, none of the above, and to recoil from Laplace’s deterministic vision with a revulsion just as deep as Einstein’s over Bohr’s interpretation of Heisenberg. Quantum indeterminacy may have nothing to do with human will, but as a matter of philosophical taste there are good reasons to celebrate the return of chance to the fundamental affairs of the world.

  And, of course, the test of a scientific theory has to do less with whether one finds it philosophically palatable than with whether it works. Quantum physics works. It depicts the world as an assembly of animated fields, and the field equations are often too abstract to seem familiar, but they tell the story of the subatomic world more accurately than do the homier metaphors to which the prior intellectual history of our species had accustomed us.

  Not that the quantum precept escaped without its share of growing pains. Far from it: In charting the unfamiliar terrain of the small, its practitioners embroiled themselves in misconceptions and perplexities that made the astronomers’ earlier bewilderments over spiral nebulae and the age of the sun look inviting by comparison. Quantum numbers and airy abstractions were hurled at tough problems with both hands, until microphysics in its darkest days was justly and scathingly compared to Ptolemaic cosmology, with its wheels within wheels and its abandonment of all but the most abstract claim to model the real world. There were too many particles, so many that physicists eventually were obliged to consult a booklet, the Particle Properties Data Handbook, just to keep track of them. “If I could remember the names of all these particles I would have been a botanist,”7 fumed Enrico Fermi, and the physicist Martinus Veltman later mused that “as the number of particles increases all we are doing is increasing our ignorance.”8 There were, for some decades, too many theories as well, and many inconsistencies among them. A few physicists became so frustrated that they quit science altogether.

  Yet it is turmoil and confusion and not calm assurance that mark the growth of the mind, and when the dust began to clear, quantum physics emerged as not only a vital and rapidly developing field of science, but as one of the greatest intellectual achievements in the history of human thought. Though by no means complete, it was now able to make accurate predictions about an imposing array of phenomena, from optics and computer design to the shining of the stars, and to do so in terms of theoretical structures that could already be seen to possess a beauty and scope worthy of the universe they sought to describe.

  The patchwork of theories that came to constitute quantum physics by the final quarter of the century was known collectively as the standard model. Viewed by its lights, the world is composed of two general categories of particles—those of fractional spin (½), called fermions, after Enrico Fermi, and those of integer spin (0, 1, or 2), called bosons, after Satyendra Nath Bose, who, with Einstein, developed the statistical laws that govern their behavior.*

  Fermions comprise matter. They obey what is called the Pauli exclusion principle, enunciated by the Austrian physicist Wolfgang Pauli in 1925, which establishes that no two fermions can occupy a given quantum state at the same time. It is owing to this characteristic of fermions that only a limited number of electrons can occupy each shell in an atom, and that there is an upper limit to the number of protons and neutrons that can be assembled to form a stable atomic nucleus. Protons, neutrons, and electrons are all fermions.
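
  A standard corollary from atomic physics (not stated in the text) makes the shell limit concrete: counting the electron’s two possible spin states, the nth shell can hold at most 2n² electrons,

  N_{\max} = 2n^{2}: \qquad n = 1 \to 2, \quad n = 2 \to 8, \quad n = 3 \to 18, \ \ldots

  the counting pattern that underlies the rows of the periodic table.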

  Bosons convey force. To hazard a hyperbolic image, one might think of the fermions as akin to ice skaters who are busy tossing medicine balls back and forth; the medicine balls are bosons, and the change in the trajectory of each skater that occurs when they throw or catch the balls betrays, in Newtonian language, the presence of a force.* Bosons do not obey the exclusion principle, and consequently several different forces can operate in the same place at the same time: The atoms in this book, for instance, are simultaneously subject both to the electrical attraction among their protons and electrons and to the gravitational force of the earth.

  There are four known fundamental forces (or classes of interactions, in quantum terminology)—gravitation, electromagnetism, and the strong and weak nuclear forces. Each plays a distinct role. Gravitation, the universal attraction of all particles of matter for one another, holds each star and planet together, and retains planets in their orbits around stars and stars in their orbits in galaxies. Electromagnetism, the attraction of particles with opposite electrical or magnetic charge for one another, produces light and all other forms of electromagnetic radiation, including the long-wavelength radiation called radio waves and the short-wavelength radiation called X rays and gamma rays. Electromagnetism also bundles atoms together as molecules, making it responsible for the structure of matter as we know it. The strong nuclear force binds protons and neutrons (known as nucleons) together in the nuclei of atoms, and binds the elementary particles called quarks together to form each nucleon. The weak nuclear force mediates the process of radioactive decay, the source of energy emitted by the chunks of radium studied by Rutherford and the Curies.
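
  One conventional illustration of how unequal the forces are in strength (a textbook figure, not drawn from the text): for a proton and an electron, the gravitational attraction is weaker than the electrical attraction by a factor of roughly 10⁻⁴⁰,

  \frac{F_{\text{grav}}}{F_{\text{elec}}} \;=\; \frac{G m_p m_e}{e^{2}/4\pi\varepsilon_0} \;\approx\; 4 \times 10^{-40}

  which is why gravitation governs stars and galaxies, where bulk matter is electrically neutral, yet plays no practical role inside the atom.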

 
