The Elegant Universe: Superstrings, Hidden Dimensions, and the Quest for the Ultimate Theory


by Brian Greene


  Although we have described this in the case of electrons, similar experiments lead to the conclusion that all matter has a wave-like character. But how does this jibe with our real-world experience of matter as being solid and sturdy, and in no way wave-like? Well, de Broglie set down a formula for the wavelength of matter waves, and it shows that the wavelength is proportional to Planck's constant h. (More precisely, the wavelength is given by h divided by the material body's momentum.) Since h is so small, the resulting wavelengths are similarly minuscule compared with everyday scales. This is why the wave-like character of matter becomes directly apparent only upon careful microscopic investigation. Just as the large value of c, the speed of light, obscures much of the true nature of space and time, the smallness of h obscures the wave-like aspects of matter in the day-to-day world.
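
  To make these scales concrete, here is a minimal sketch in Python of de Broglie's formula, wavelength = h/momentum. The particular speeds chosen for the electron and the baseball are illustrative assumptions, not values from the text:

```python
# de Broglie's relation: wavelength = h / momentum
h = 6.626e-34  # Planck's constant, in joule-seconds

def de_broglie_wavelength(mass_kg, speed_m_per_s):
    """Wavelength (in meters) of the matter wave for a body of given mass and speed."""
    return h / (mass_kg * speed_m_per_s)

# An electron (mass 9.109e-31 kg) at an assumed 1e6 m/s: about 7.3e-10 m,
# roughly the size of an atom -- hence visible to careful microscopic probes.
print(de_broglie_wavelength(9.109e-31, 1e6))

# A baseball (about 0.145 kg) at an assumed 40 m/s: about 1.1e-34 m,
# unimaginably small compared with any everyday scale -- hence never noticed.
print(de_broglie_wavelength(0.145, 40))
```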

  Waves of What?

  The interference phenomenon found by Davisson and Germer made the wave-like nature of electrons tangibly evident. But waves of what? One early suggestion made by Austrian physicist Erwin Schrödinger was that the waves were "smeared-out" electrons. This captured some of the "feeling" of an electron wave, but it was too rough. When you smear something out, part of it is here and part of it is there. However, one never encounters half of an electron or a third of an electron or any other fraction, for that matter. This makes it hard to grasp what a smeared electron actually is. As an alternative, in 1926 German physicist Max Born sharply refined Schrödinger's interpretation of an electron wave, and it is his interpretation—amplified by Bohr and his colleagues—that is still with us today. Born's suggestion is one of the strangest features of quantum theory, but is supported nonetheless by an enormous amount of experimental data. He asserted that an electron wave must be interpreted from the standpoint of probability. Places where the magnitude (a bit more precisely, the square of the magnitude) of the wave is large are places where the electron is more likely to be found; places where the magnitude is small are places where the electron is less likely to be found. An example is illustrated in Figure 4.9.
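
  Born's rule is simple to state in code. In the minimal sketch below, the complex wave values assigned to three locations are invented purely for illustration; the point is that each probability comes from the square of the magnitude of the wave at that spot:

```python
# Born's rule: the probability of finding the electron at a location is
# proportional to the square of the magnitude of the wave there.
# These complex wave values are invented for illustration only.
psi = {"A": 0.6 + 0.2j, "B": 0.1 - 0.3j, "C": 0.05 + 0.0j}

total = sum(abs(value) ** 2 for value in psi.values())
for location, value in psi.items():
    probability = abs(value) ** 2 / total
    print(f"P(electron found at {location}) = {probability:.3f}")
```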

  This is truly a peculiar idea. What business does probability have in the formulation of fundamental physics? We are accustomed to probability showing up in horse races, in coin tosses, and at the roulette table, but in those cases it merely reflects our incomplete knowledge. If we knew precisely the speed of the roulette wheel, the weight and hardness of the white marble, the location and speed of the marble when it drops to the wheel, the exact specifications of the material constituting the cubicles, and so on, and if we made use of sufficiently powerful computers to carry out our calculations, we would, according to classical physics, be able to predict with certainty where the marble would settle. Gambling casinos rely on your inability to ascertain all of this information and to do the necessary calculations prior to placing your bet. But we see that probability as encountered at the roulette table does not reflect anything particularly fundamental about how the world works. Quantum mechanics, on the contrary, injects the concept of probability into the universe at a far deeper level. According to Born and more than half a century of subsequent experiments, the wave nature of matter implies that matter itself must be described fundamentally in a probabilistic manner. For macroscopic objects like a coffee cup or the roulette wheel, de Broglie's rule shows that the wave-like character is virtually unnoticeable and for most ordinary purposes the associated quantum-mechanical probability can be completely ignored. But at a microscopic level we learn that the best we can ever do is say that an electron has a particular probability of being found at any given location.

  The probabilistic interpretation has the virtue that if an electron wave does what other waves can do—for instance, slam into some obstacle and develop all sorts of distinct ripples—it does not mean that the electron itself has shattered into separate pieces. Rather, it means that there are now a number of locations where the electron might be found with a non-negligible probability. In practice this means that if a particular experiment involving an electron is repeated over and over again in an absolutely identical manner, the same answer for, say, the measured position of an electron will not be found over and over again. Rather, the subsequent repeats of the experiment will yield a variety of different results with the property that the number of times the electron is found at any given location is governed by the shape of the electron's probability wave. If the probability wave (more precisely, the square of the probability wave) is twice as large at location A as at location B, then the theory predicts that in a sequence of many repeats of the experiment the electron will be found at location A twice as often as at location B. Exact outcomes of experiments cannot be predicted; the best we can do is predict the probability that any given outcome may occur.
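
  This frequency reading of the probability wave is easy to simulate. The toy Monte Carlo below assumes a wave whose squared magnitude at location A is twice that at location B, "runs the experiment" a hundred thousand times, and tallies where the electron turns up:

```python
import random

# Assumed squared magnitudes of the probability wave (illustrative numbers):
# |psi(A)|^2 is twice |psi(B)|^2, so A should turn up twice as often as B.
squared_magnitudes = {"A": 2.0, "B": 1.0}

counts = {"A": 0, "B": 0}
trials = 100_000
for _ in range(trials):
    # Each pass is an identically prepared experiment; the individual outcome
    # is random, but its statistics are fixed by the probability wave.
    outcome = random.choices(
        list(squared_magnitudes), weights=list(squared_magnitudes.values())
    )[0]
    counts[outcome] += 1

for location, n in counts.items():
    print(f"{location}: {n} hits ({n / trials:.3f} of trials)")  # ~0.667 vs ~0.333
```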

  Even so, as long as we can determine mathematically the precise form of probability waves, their probabilistic predictions can be tested by repeating a given experiment numerous times, thereby experimentally measuring the likelihood of getting one particular result or another. Just a few months after de Broglie's suggestion, Schrödinger took the decisive step toward this end by determining an equation that governs the shape and the evolution of probability waves, or, as they came to be known, wave functions. It was not long before Schrödinger's equation and the probabilistic interpretation were being used to make wonderfully accurate predictions. By 1927, therefore, classical innocence had been lost. Gone were the days of a clockwork universe whose individual constituents were set in motion at some moment in the past and obediently fulfilled their inescapable, uniquely determined destiny. According to quantum mechanics, the universe evolves according to a rigorous and precise mathematical formalism, but this framework determines only the probability that any particular future will happen—not which future actually ensues.
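
  For reference, the equation Schrödinger found can be written in a single line. In modern notation, for a wave function ψ(x, t) describing a particle of mass m moving in a potential V(x) (writing ħ for h/2π), it reads:

$$ i\hbar\,\frac{\partial \psi(x,t)}{\partial t} \;=\; -\frac{\hbar^{2}}{2m}\,\frac{\partial^{2} \psi(x,t)}{\partial x^{2}} \;+\; V(x)\,\psi(x,t) $$

  Solve this equation for ψ and, following Born, square its magnitude: the result is the probability profile whose predictions the repeated experiments test.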

  Many find this conclusion troubling or even downright unacceptable. Einstein was one. In one of physics' most time-honored utterances, Einstein admonished the quantum stalwarts that "God does not play dice with the Universe." He felt that probability was turning up in fundamental physics because of a subtle version of the reason it turns up at the roulette wheel: some basic incompleteness in our understanding. The universe, in Einstein's view, had no room for a future whose precise form involves an element of chance. Physics should predict how the universe evolves, not merely the likelihood that any particular evolution might occur. But experiment after experiment—some of the most convincing ones carried out after his death—has convincingly confirmed that Einstein was wrong. As the British theoretical physicist Stephen Hawking has said, on this point "Einstein was confused, not the quantum theory."6

  Nevertheless, the debate about what quantum mechanics really means continues unabated. Everyone agrees on how to use the equations of quantum theory to make accurate predictions. But there is no consensus on what it really means to have probability waves, nor on how a particle "chooses" which of its many possible futures to follow, nor even on whether it really does choose or instead splits off like a branching tributary to live out all possible futures in an ever-expanding arena of parallel universes. These interpretational issues are worthy of a book-length discussion in their own right, and, in fact, there are many excellent books that espouse one or another way of thinking about quantum theory. But what appears certain is that no matter how you interpret quantum mechanics, it undeniably shows that the universe is founded on principles that, from the standpoint of our day-to-day experiences, are bizarre.

  The meta-lesson of both relativity and quantum mechanics is that when we deeply probe the fundamental workings of the universe we may come upon aspects that are vastly different from our expectations. The boldness of asking deep questions may require unforeseen flexibility if we are to accept the answers.

  Feynman's Perspective

  Richard Feynman was one of the greatest theoretical physicists since Einstein. He fully accepted the probabilistic core of quantum mechanics, but in the years following World War II he offered a powerful new way of thinking about the theory. From the standpoint of numerical predictions, Feynman's perspective agrees exactly with all that went before. But its formulation is quite different. Let's describe it in the context of the electron two-slit experiment.

  The troubling thing about Figure 4.8 is that we envision each electron as passing through either the left slit or the right slit and therefore we expect the union of Figures 4.4 and 4.5, as in Figure 4.6, to represent the resulting data accurately. An electron that passes through the right slit should not care that there also happens to be a left slit, and vice versa. But somehow it does. The interference pattern generated requires an overlapping and an intermingling of something sensitive to both slits, even if we fire electrons one by one. Schrödinger, de Broglie, and Born explained this phenomenon by associating a probability wave to each electron. Like the water waves in Figure 4.7, the electron's probability wave "sees" both slits and is subject to the same kind of interference from intermingling. Places where the probability wave is augmented by the intermingling, like the places of significant jostling in Figure 4.7, are locations where the electron is likely to be found; places where the probability wave is diminished by the intermingling, like the places of minimal or no jostling in Figure 4.7, are locations where the electron is unlikely or never to be found. Electrons hit the phosphorescent screen one by one, distributed according to this probability profile, and thereby build up an interference pattern like that in Figure 4.8.
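
  The arithmetic behind this intermingling is nothing more than adding the two waves before squaring. The sketch below uses toy numbers (the wavelength, slit separation, and screen geometry are all assumed for illustration) and compares the coherent sum |ψ₁ + ψ₂|², which produces fringes, with the incoherent sum |ψ₁|² + |ψ₂|², the fringe-free pattern of Figure 4.6 that, as the following paragraphs explain, reappears once the slit each electron used has been determined:

```python
import cmath

wavelength = 1.0         # all geometry below is in assumed, illustrative units
k = 2 * cmath.pi / wavelength
slit_separation = 5.0    # distance between the two slits
screen_distance = 100.0  # distance from the slits to the screen

def amplitude(slit_y, screen_y):
    """Probability-wave amplitude for reaching screen_y by way of a slit at slit_y."""
    path_length = (screen_distance ** 2 + (screen_y - slit_y) ** 2) ** 0.5
    return cmath.exp(1j * k * path_length) / path_length  # phase set by path length

for y in range(0, 31, 5):
    a1 = amplitude(+slit_separation / 2, y)
    a2 = amplitude(-slit_separation / 2, y)
    fringes = abs(a1 + a2) ** 2               # waves intermingle: interference
    no_fringes = abs(a1) ** 2 + abs(a2) ** 2  # which-slit known: no interference
    print(f"y={y:2d}   with interference: {fringes:.2e}   without: {no_fringes:.2e}")
```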

  Feynman took a different tack. He challenged the basic classical assumption that each electron either goes through the left slit or the right slit. You might think this to be such a basic property of how things work that challenging it is fatuous. After all, can't you look in the region between the slits and the phosphorescent screen to determine through which slit each electron passes? You can. But now you have changed the experiment. To see the electron you must do something to it—for instance, you can shine light on it, that is, bounce photons off it. Now, on everyday scales photons act as negligible little probes that bounce off trees, paintings, and people with essentially no effect on the state of motion of these comparatively large material bodies. But electrons are little wisps of matter. Regardless of how gingerly you carry out your determination of the slit through which it passed, photons that bounce off the electron necessarily affect its subsequent motion. And this change in motion changes the results of our experiment. If you disturb the experiment just enough to determine the slit through which each electron passes, experiments show that the results change from that of Figure 4.8 and become like that of Figure 4.6! The quantum world ensures that once it has been established that each electron has gone through either the left slit or the right slit, the interference between the two slits disappears.

  And so Feynman was justified in leveling his challenge since—although our experience in the world seems to require that each electron pass through one or the other of the slits—by the late 1920s physicists realized that any attempt to verify this seemingly basic quality of reality ruins the experiment.

  Feynman proclaimed that each electron that makes it through to the phosphorescent screen actually goes through both slits. It sounds crazy, but hang on: Things get even more wild. Feynman argued that in traveling from the source to a given point on the phosphorescent screen each individual electron actually traverses every possible trajectory simultaneously; a few of the trajectories are illustrated in Figure 4.10. It goes in a nice orderly way through the left slit. It simultaneously also goes in a nice orderly way through the right slit. It heads toward the left slit, but suddenly changes course and heads through the right. It meanders back and forth, finally passing through the left slit. It goes on a long journey to the Andromeda galaxy before turning back and passing through the left slit on its way to the screen. And on and on it goes—the electron, according to Feynman, simultaneously "sniffs" out every possible path connecting its starting location with its final destination.

  Feynman showed that he could assign a number to each of these paths in such a way that their combined average yields exactly the same result for the probability calculated using the wave-function approach. And so from Feynman's perspective no probability wave needs to be associated with the electron. Instead, we have to imagine something equally if not more bizarre. The probability that the electron—always viewed as a particle through and through—arrives at any chosen point on the screen is built up from the combined effect of every possible way of getting there. This is known as Feynman's "sum-over-paths" approach to quantum mechanics.7
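
  A caricature of the sum-over-paths rule can be put into code. The standard prescription (which the text only alludes to) assigns every path a complex number of unit size whose phase angle is the path's classical action S divided by ħ; the sketch below assumes a free particle and, to keep things finite, sums over just a one-parameter family of bent paths:

```python
import cmath

# Natural units chosen purely for illustration.
hbar = 1.0
m = 1.0
T, X = 1.0, 1.0  # assumed travel time and endpoint

def action(y_mid):
    """Classical action of a path that bends through height y_mid at time T/2."""
    v1 = (y_mid - 0.0) / (T / 2)  # speed on the first straight leg
    v2 = (X - y_mid) / (T / 2)    # speed on the second straight leg
    return 0.5 * m * (v1 ** 2 + v2 ** 2) * (T / 2)

# Feynman's rule: add exp(i * S / hbar) over every path, then square the magnitude
# of the total to get the relative probability of arriving at the endpoint.
midpoints = [i * 0.01 for i in range(-300, 301)]  # bends from -3.0 to +3.0
total = sum(cmath.exp(1j * action(y) / hbar) for y in midpoints)
print("combined amplitude:", total)
print("relative probability:", abs(total) ** 2)
```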

  At this point your classical upbringing is balking: How can one electron simultaneously take different paths—and no less than an infinite number of them? This seems like a defensible objection, but quantum mechanics—the physics of our world—requires that you hold such pedestrian complaints in abeyance. The results of calculations using Feynman's approach agree with those of the wave-function method, which agree with experiments. You must allow nature to dictate what is and what is not sensible. As Feynman once wrote, "[Quantum mechanics] describes nature as absurd from the point of view of common sense. And it fully agrees with experiment. So I hope you can accept nature as She is—absurd."8

  But no matter how absurd nature is when examined on microscopic scales, things must conspire so that we recover the familiar prosaic happenings of the world experienced on everyday scales. To this end, Feynman showed that if you examine the motion of large objects—like baseballs, airplanes, or planets, all large in comparison with subatomic particles—his rule for assigning numbers to each path ensures that all paths but one cancel each other out when their contributions are combined. In effect, only one of the infinity of paths matters as far as the motion of the object is concerned. And this trajectory is precisely the one emerging from Newton's laws of motion. This is why in the everyday world it seems to us that objects—like a ball tossed in the air—follow a single, unique, and predictable trajectory from their origin to their destination. But for microscopic objects, Feynman's rule for assigning numbers to paths shows that many different paths can and often do contribute to an object's motion. In the double-slit experiment, for example, some of these paths pass through different slits, giving rise to the interference pattern observed. In the microscopic realm we therefore cannot assert that an electron passes through only one slit or the other. The interference pattern and Feynman's alternative formulation of quantum mechanics emphatically attest to the contrary.
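
  That cancellation can be watched directly in the toy sum above. Rerun it with ever larger masses, and the phases of paths that bend far from the straight line spin around so rapidly that they wipe each other out, while paths hugging the classical trajectory keep reinforcing one another. A sketch of the comparison, under the same illustrative assumptions as before:

```python
import cmath

def window_sum(m, center, hbar=1.0, T=1.0, X=1.0):
    """Combined contribution of 21 nearby bent paths whose midpoints cluster at `center`."""
    def action(y_mid):
        v1 = y_mid / (T / 2)
        v2 = (X - y_mid) / (T / 2)
        return 0.5 * m * (v1 ** 2 + v2 ** 2) * (T / 2)
    return abs(sum(cmath.exp(1j * action(center + i * 0.01) / hbar)
                   for i in range(-10, 11)))

# The classical (Newtonian) trajectory is the straight line through y_mid = X/2 = 0.5.
for mass in (1.0, 100.0):
    near = window_sum(mass, center=0.5)  # paths near the classical trajectory
    far = window_sum(mass, center=2.0)   # paths bending far away from it
    print(f"m={mass:5.0f}   near classical path: {near:5.2f}   far from it: {far:5.2f}")
```

  For the small mass the two windows contribute comparably—many paths matter, the hallmark of the quantum realm—while for the large mass the faraway paths all but cancel, leaving the Newtonian trajectory in command.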

  Just as we may find that varying interpretations of a book or a film can be more or less helpful in aiding our understanding of different aspects of the work, the same is true of the different approaches to quantum mechanics. Although their predictions always agree completely, the wave function approach and Feynman's sum-over-paths approach give us different ways of thinking about what's going on. As we shall see later on, for some applications, one or the other approach can provide an invaluable explanatory framework.

  Quantum Weirdness

  By now you should have some sense of the dramatically new way that the universe works according to quantum mechanics. If you have not as yet fallen victim to Bohr's dizziness dictum, the quantum weirdness we now discuss should at least make you feel a bit lightheaded.

  Even more so than with the theories of relativity, it is hard to embrace quantum mechanics viscerally—to think like a miniature person born and raised in the microscopic realm. There is, though, one aspect of the theory that can act as a guidepost for your intuition, as it is the hallmark feature that fundamentally differentiates quantum from classical reasoning. It is the uncertainty principle, discovered by the German physicist Werner Heisenberg in 1927.

  This principle grows out of an objection that may have occurred to you earlier. We noted that the act of determining the slit through which each electron passes (its position) necessarily disturbs its subsequent motion (its velocity). But just as we can assure ourselves of someone's presence either by gently touching them or by giving them an overzealous slap on the back, why can't we determine the electron's position with an "ever gentler" light source in order to have an ever decreasing impact on its motion? From the standpoint of nineteenth-century physics we can. By using an ever dimmer lamp (and an ever more sensitive light detector) we can have a vanishingly small impact on the electron's motion. But quantum mechanics itself illuminates a flaw in this reasoning. As we turn down the intensity of the light source we now know that we are decreasing the number of photons it emits. Once we get down to emitting individual photons we cannot dim the light any further without actually turning it off. There is a fundamental quantum-mechanical limit to the "gentleness" of our probe. And hence, there is always a minimal disruption that we cause to the electron's velocity through our measurement of its position.

  Well, that's almost correct. Planck's law tells us that the energy of a single photon is proportional to its frequency (inversely proportional to its wavelength). By using light of lower and lower frequency (larger and larger wavelength) we can therefore produce ever gentler individual photons. But here's the catch. When we bounce a wave off an object, the information we receive is only enough to determine the object's position to within a margin of error equal to the wave's wavelength. To get an intuitive feel for this important fact, imagine trying to pinpoint the location of a large, slightly submerged rock by the way it affects passing ocean waves. As the waves approach the rock, they form a nice orderly train of one up-and-down wave cycle followed by another. After passing by the rock, the individual wave cycles are distorted—the telltale sign of the submerged rock's presence. But like the finest set of tick marks on a ruler, the individual up-and-down wave cycles are the finest units making up the wave-train, and therefore by examining solely how they are disrupted we can determine the rock's location only to within a margin of error equal to the length of the wave cycles, that is, the wave's wavelength. In the case of light, the constituent photons are, roughly speaking, the individual wave cycles (with the height of the wave cycles being determined by the number of photons); a photon, therefore, can be used to pinpoint an object's location only to within a precision of one wavelength.
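
  Putting these two facts together yields the tradeoff in one line of arithmetic (this heuristic estimate, standard in textbook treatments, is an addition to the text). A probing photon of wavelength λ pins down the electron's position only to within Δx ≈ λ, while the kick it delivers is of order the photon's momentum, Δp ≈ h/λ, so

$$ \Delta x \,\Delta p \;\approx\; \lambda \cdot \frac{h}{\lambda} \;=\; h $$

  Gentler light (larger λ) softens the kick but blurs the position by exactly the compensating amount: the product can never be pushed much below Planck's constant. Heisenberg's precise statement of the limit is Δx Δp ≥ ħ/2.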
