by Brian Greene
A fourth approach, developed by the Italian physicists Giancarlo Ghirardi, Alberto Rimini, and Tullio Weber, makes the bold move of modifying Schrödinger's equation in a clever way that results in hardly any effect on the evolution of wavefunctions for individual particles, but has a dramatic impact on quantum evolution when applied to "big" everyday objects. The proposed modification envisions that wavefunctions are inherently unstable; even without any meddling, these researchers suggest, sooner or later every wavefunction collapses, of its own accord, to a spiked shape. For an individual particle, Ghirardi, Rimini, and Weber postulate that wavefunction collapse happens spontaneously and randomly, kicking in, on average, only once every billion years or so. 10 This is so infrequent that it entails only the slightest change to the usual quantum mechanical description of individual particles, and that's good, since quantum mechanics describes the microworld with unprecedented accuracy. But for large objects such as experimenters and their equipment, which have billions and billions of particles, the odds are high that in a tiny fraction of any given second the posited spontaneous collapse will kick in for at least one constituent particle, causing its wavefunction to collapse. And, as argued by Ghirardi, Rimini, Weber, and others, the entangled nature of all the individual wavefunctions in a large object ensures that this collapse initiates a kind of quantum domino effect in which the wavefunctions of all the constituent particles collapse as well. As this happens in a brief fraction of a second, the proposed modification ensures that large objects are essentially always in one definite configuration: pointers on measuring equipment always point to one definite value; the moon is always at one definite location in the sky; brains inside experimenters always have one definite experience; cats are always either dead or alive.
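The arithmetic behind this collapse rate is easy to check. The sketch below is an illustration only: it assumes a per-particle collapse rate of once per billion years (as in the text) and roughly 10^27 particles for a kilogram-scale object, a conventional ballpark figure not stated in the passage.

```python
# Rough estimate of the GRW spontaneous-collapse timescale for a large object.
# Both numbers below are illustrative assumptions: a per-particle collapse
# rate of once per billion years, and ~1e27 particles in an everyday object.

SECONDS_PER_YEAR = 3.15e7
per_particle_rate = 1 / (1e9 * SECONDS_PER_YEAR)  # collapses per second, per particle
n_particles = 1e27                                # particles in a macroscopic object

# Spontaneous collapses are independent events, so the rates add.
object_rate = per_particle_rate * n_particles     # collapses per second, whole object
mean_wait = 1 / object_rate                       # expected wait for the first collapse

print(f"expected wait for first collapse: {mean_wait:.1e} seconds")
# a few times 10^-11 seconds: far less than the blink of an eye
```

The point of the estimate is that a mechanism utterly negligible for one particle becomes effectively instantaneous for 10^27 of them, which is exactly the "domino" behavior the text describes.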
Each of these approaches, as well as a number of others I won't discuss, has its supporters and detractors. The "wavefunction as knowledge" approach finesses the issue of wavefunction collapse by denying any reality for wavefunctions, turning them instead into mere descriptors of what we know. But why, a detractor asks, should fundamental physics be so closely tied to human awareness? If we were not here to observe the world, would wavefunctions never collapse, or, perhaps, would the very concept of a wavefunction not exist? Was the universe a vastly different place before human consciousness evolved on planet earth? What if, instead of human experimenters, mice or ants or amoebas or computers are the only observers? Is the change in their "knowledge" adequate to be associated with the collapse of a wavefunction? 11
By contrast, the Many Worlds interpretation avoids the whole matter of wavefunction collapse, since in this approach wavefunctions don't collapse. But the price to pay is an enormous proliferation of universes, something that many a detractor has found intolerably exorbitant. 12 Bohm's approach also avoids wavefunction collapse; but, its detractors claim, in granting independent reality to both particles and waves, the theory lacks economy. Moreover, the detractors correctly argue, in Bohm's formulation the wavefunction can exert faster-than-light influences on the particles it pushes. Supporters note that the former complaint is subjective at best, and the latter conforms to the nonlocality Bell proved unavoidable, so neither criticism is convincing. Nevertheless, perhaps unjustifiably, Bohm's approach has never caught on. 13 The Ghirardi-Rimini-Weber approach deals with wavefunction collapse directly, by changing the equations to incorporate a new spontaneous collapse mechanism. But, detractors point out, there is as yet not a shred of experimental evidence supporting the proposed modification to Schrödinger's equation.
Research seeking a solid and fully transparent connection between the formalism of quantum mechanics and the experience of everyday life will no doubt go on for some time to come, and it's hard to say which, if any, of the known approaches will ultimately achieve a majority consensus. Were physicists to be polled today, I don't think there would be an overwhelming favorite. Unfortunately, experimental input is of limited help. While the Ghirardi-Rimini-Weber proposal does make predictions that can, in certain situations, differ from standard stage one / stage two quantum mechanics, the deviations are too small to be tested with today's technology. The situation with the other three proposals is worse because they stymie experimental adjudication even more definitively. They agree fully with the standard approach, and so each yields the same predictions for things that can be observed and measured. They differ only regarding what happens backstage, as it were. They only differ, that is, regarding what quantum mechanics implies for the underlying nature of reality.
Even though the quantum measurement problem remains unsolved, during the last few decades a framework has been under development that, while still incomplete, has widespread support as a likely ingredient of any viable solution. It's called decoherence.
Decoherence and Quantum Reality
When you first encounter the probabilistic aspect of quantum mechanics, a natural reaction is to think that it is no more exotic than the probabilities that arise in coin tosses or roulette wheels. But when you learn about quantum interference, you realize that probability enters quantum mechanics in a far more fundamental way. In everyday examples, various outcomes—heads versus tails, red versus black, one lottery number versus another—are assigned probabilities with the understanding that one or another result will definitely happen and that each result is the end product of an independent, definite history. When a coin is tossed, sometimes the spinning motion is just right for the toss to come out heads and sometimes it's just right for the toss to come out tails. The 50-50 probability we assign to each outcome refers not just to the final result—heads or tails— but also to the histories that lead to each outcome. Half of the possible ways you can toss a coin result in heads, and half result in tails. The histories themselves, though, are totally separate, isolated alternatives. There is no sense in which different motions of the coin reinforce each other or cancel each other out. They're all independent.
But in quantum mechanics, things are different. The alternate paths an electron can follow from the two slits to the detector are not separate, isolated histories. The possible histories commingle to produce the observed outcome. Some paths reinforce each other, while others cancel each other out. Such quantum interference between the various possible histories is responsible for the pattern of light and dark bands on the detector screen. Thus, the telltale difference between the quantum and the classical notions of probability is that the former is subject to interference and the latter is not.
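The contrast between the two probability rules can be made concrete with a toy two-path calculation. In this sketch (an illustration, not a model of any real apparatus), each path contributes a complex amplitude; the classical rule adds the two probabilities, while the quantum rule adds the amplitudes first and squares afterward, producing the interference term:

```python
import cmath

def classical_prob(p1, p2):
    # Classical rule: separate, independent histories; probabilities add.
    return p1 + p2

def quantum_prob(phase):
    # Quantum rule: add the two path amplitudes, then square the magnitude.
    # Each path carries amplitude 1/sqrt(2); 'phase' is their relative phase,
    # which varies with position on the detector screen.
    amp = (cmath.exp(0j) + cmath.exp(1j * phase)) / (2 ** 0.5)
    return abs(amp) ** 2

for phase in [0.0, cmath.pi / 2, cmath.pi]:
    print(f"phase {phase:.2f}: classical {classical_prob(0.5, 0.5):.2f}, "
          f"quantum {quantum_prob(phase):.2f}")
```

The classical answer is 1 everywhere; the quantum answer swings from 2 (paths reinforcing, a bright band) down to 0 (paths canceling, a dark band) as the relative phase changes, which is the light-and-dark pattern on the screen.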
Decoherence is a widespread phenomenon that forms a bridge between the quantum physics of the small and the classical physics of the not-so-small by suppressing quantum interference—that is, by diminishing sharply the core difference between quantum and classical probabilities. The importance of decoherence was realized way back in the early days of quantum theory, but its modern incarnation dates from a seminal paper by the German physicist Dieter Zeh in 1970, 14 and has since been developed by many researchers, including Erich Joos, also from Germany, and Wojciech Zurek, of the Los Alamos National Laboratory in New Mexico.
Here's the idea. When Schrödinger's equation is applied in a simple situation such as single, isolated photons passing through a screen with two slits, it gives rise to the famous interference pattern. But there are two very special features of this laboratory example that are not characteristic of real-world happenings. First, the things we encounter in day-to-day life are larger and more complicated than a single photon. Second, the things we encounter in day-to-day life are not isolated: they interact with us and with the environment. The book now in your hands is subject to human contact and, more generally, is continually struck by photons and air molecules. Moreover, since the book itself is made of many molecules and atoms, these constantly jittering constituents are continually bouncing off each other as well. The same is true for pointers on measuring devices, for cats, for human brains, and for just about everything you encounter in daily life. On astrophysical scales, the earth, the moon, asteroids, and the other planets are continually bombarded by photons from the sun. Even a grain of dust floating in the darkness of outer space is subject to continual hits from low-energy microwave photons that have been streaming through space since a short time after the big bang. And so, to understand what quantum mechanics says about real-world happenings—as opposed to pristine laboratory experiments—we should apply Schrödinger's equation to these more complex, messier situations.
In essence, this is what Zeh emphasized, and his work, together with that of many others who have followed, has revealed something quite wonderful. Although photons and air molecules are too small to have any significant effect on the motion of a big object like this book or a cat, they are able to do something else. They continually "nudge" the big object's wavefunction, or, in physics-speak, they disturb its coherence: they blur its orderly sequence of crest followed by trough followed by crest. This is critical, because a wavefunction's orderliness is central to generating interference effects (see Figure 4.2). And so, much as adding tagging devices to the double-slit experiment blurs the resulting wavefunction and thereby washes out interference effects, the constant bombardment of objects by constituents of their environment also washes out the possibility of interference phenomena. In turn, once quantum interference is no longer possible, the probabilities inherent to quantum mechanics are, for all practical purposes, just like the probabilities inherent to coin tosses and roulette wheels. Once environmental decoherence blurs a wavefunction, the exotic nature of quantum probabilities melts into the more familiar probabilities of day-to-day life. 15 This suggests a resolution of the quantum measurement puzzle, one that, if realized, would be just about the best thing we could hope for. I'll describe it first in the most optimistic light, and then stress what still needs to be done.
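A minimal way to see how environmental "nudges" wash out interference is to average the two-path intensity over random phase kicks imparted by the environment. The sketch below is a toy model, not the full decoherence formalism; the Gaussian kick strength is an invented parameter standing in for the cumulative effect of photon and air-molecule collisions.

```python
import math
import random

random.seed(0)

def screen_intensity(phase, env_kick):
    # Two-path intensity |amp1 + amp2|^2 / 2, where the environment has
    # imparted an extra random phase 'env_kick' to one of the paths.
    re = math.cos(0.0) + math.cos(phase + env_kick)
    im = math.sin(0.0) + math.sin(phase + env_kick)
    return (re ** 2 + im ** 2) / 2

def averaged_intensity(phase, kick_strength, n=20000):
    # Average over many random environmental phase kicks (Gaussian width
    # 'kick_strength'). Strong kicks scramble the relative phase entirely.
    total = 0.0
    for _ in range(n):
        total += screen_intensity(phase, random.gauss(0.0, kick_strength))
    return total / n

# A dark fringe sits at relative phase pi. Isolated, it is truly dark;
# after strong environmental kicks, the average climbs back to the
# classical value of 1 and the fringe pattern is gone.
print(averaged_intensity(math.pi, 0.0))   # isolated: interference intact
print(averaged_intensity(math.pi, 10.0))  # decohered: near the classical value
```

The cross term that produced the bright and dark bands averages to zero once the environment randomizes the relative phase, while the "classical" 50-50 contribution survives untouched; this is the sense in which decoherence turns quantum probabilities into ordinary ones.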
If a wavefunction for an isolated electron shows that it has, say, a 50 percent chance of being here and a 50 percent chance of being there, we must interpret these probabilities using the full-fledged weirdness of quantum mechanics. Since both of the alternatives can reveal themselves by commingling and generating an interference pattern, we must think of them as equally real. In loose language, there's a sense in which the electron is at both locations. What happens now if we measure the electron's position with a nonisolated, everyday-sized laboratory instrument? Well, corresponding to the electron's ambiguous whereabouts, the pointer on the instrument has a 50 percent chance of pointing to this value and a 50 percent chance of pointing to that value. But because of decoherence, the pointer will not be in a ghostly mixture of pointing at both values; because of decoherence, we can interpret these probabilities in the usual, classical, everyday sense. Just as a coin has a 50 percent chance of landing heads and a 50 percent chance of landing tails, but lands either heads or tails, the pointer has a 50 percent chance of pointing to this value and a 50 percent chance of pointing to that value, but it will definitely point to one or the other.
Similar reasoning applies for all other complex, nonisolated objects. If a quantum calculation reveals that a cat, sitting in a closed box, has a 50 percent chance of being dead and a 50 percent chance of being alive— because there is a 50 percent chance that an electron will hit a booby-trap mechanism that subjects the cat to poison gas and a 50 percent chance that the electron misses the booby trap—decoherence suggests that the cat will not be in some absurd mixed state of being both dead and alive. Although decades of heated debate have been devoted to issues like What does it mean for a cat to be both dead and alive? How does the act of opening the box and observing the cat force it to choose a definite status, dead or alive?, decoherence suggests that long before you open the box, the environment has already completed billions of observations that, in almost no time at all, turned all mysterious quantum probabilities into their less mysterious classical counterparts. Long before you look at it, the environment has compelled the cat to take on one, single, definite condition. Decoherence forces much of the weirdness of quantum physics to "leak" from large objects since, bit by bit, the quantum weirdness is carried away by the innumerable impinging particles from the environment.
It's hard to imagine a more satisfying solution to the quantum measurement problem. By being more realistic and abandoning the simplifying assumption that ignores the environment—a simplification that was crucial to making progress during the early development of the field—we would find that quantum mechanics has a built-in solution. Human consciousness, human experimenters, and human observations would no longer play a special role since they (we!) would simply be elements of the environment, like air molecules and photons, which can interact with a given physical system. There would also no longer be a stage one / stage two split between the evolution of the objects and the experimenter who measures them. Everything—observed and observer—would be on an equal footing. Everything—observed and observer—would be subject to precisely the same quantum mechanical law as is set down in Schrödinger's equation. The act of measurement would no longer be special; it would merely be one specific example of contact with the environment.
Is that it? Does decoherence resolve the quantum measurement problem? Is decoherence responsible for wavefunctions' closing the door on all but one of the potential outcomes to which they can lead? Some think so. Researchers like Robert Griffiths, of Carnegie Mellon; Roland Omnès, of Orsay; the Nobel laureate Murray Gell-Mann, of the Santa Fe Institute; and Jim Hartle, of the University of California at Santa Barbara, have made great progress and claim that they have developed decoherence into a complete framework (called decoherent histories) that solves the measurement problem. Others, like myself, are intrigued but not yet fully convinced. You see, the power of decoherence is that it successfully removes the artificial barrier Bohr erected between large and small physical systems, making everything subject to the same quantum mechanical formulas. This is important progress and I think Bohr would have found it gratifying. Although the unresolved quantum measurement problem never diminished physicists' ability to reconcile theoretical calculations with experimental data, it did lead Bohr and his colleagues to articulate a quantum mechanical framework with some distinctly awkward features. Many found the framework's need for fuzzy words about wavefunction collapse, or for the imprecise notion of "large" systems belonging to the dominion of classical physics, unnerving. To a significant extent, by taking account of decoherence, researchers have rendered these vague ideas unnecessary.
However, a key issue that I skirted in the description above is that even though decoherence suppresses quantum interference and thereby coaxes weird quantum probabilities to be like their familiar classical counterparts, each of the potential outcomes embodied in a wavefunction still vies for realization. And so we are still left wondering how one outcome "wins" and where the many other possibilities "go" when that actually happens. When a coin is tossed, classical physics gives an answer to the analogous question. It says that if you examine the way the coin is set spinning with adequate precision, you can, in principle, predict whether it will land heads or tails. On closer inspection, then, precisely one outcome is determined by details you initially overlooked. The same cannot be said in quantum physics. Decoherence allows quantum probabilities to be interpreted much like classical ones, but does not provide any finer details that select one of the many possible outcomes to actually happen.
Much in the spirit of Bohr, some physicists believe that searching for such an explanation of how a single, definite outcome arises is misguided. These physicists argue that quantum mechanics, with its updating to include decoherence, is a sharply formulated theory whose predictions account for the behavior of laboratory measuring devices. And according to this view, that is the goal of science. To seek an explanation of what's really going on, to strive for an understanding of how a particular outcome came to be, to hunt for a level of reality beyond detector readings and computer printouts betrays an unreasonable intellectual greediness.
Many others, including me, have a different perspective. Explaining data is what science is about. But many physicists believe that science is also about embracing the theories data confirms and going further by using them to gain maximal insight into the nature of reality. I strongly suspect that there is much insight to be gained by pushing onward toward a complete solution of the measurement problem.
Thus, although there is wide agreement that environment-induced decoherence is a crucial part of the structure spanning the quantum-to-classical divide, and while many are hopeful that these considerations will one day coalesce into a complete and cogent connection between the two, far from everyone is convinced that the bridge has yet been fully built.
Quantum Mechanics and the Arrow of Time
So where do we stand on the measurement problem, and what does it mean for the arrow of time? Broadly speaking, there are two classes of proposals for linking common experience with quantum reality. In the first class (for example, wavefunction as knowledge; Many Worlds; decoherence), Schrödinger's equation is the be-all and end-all of the story; the proposals simply provide different ways of interpreting what the equation means for physical reality. In the second class (for example, Bohm; Ghirardi-Rimini-Weber), Schrödinger's equation must be supplemented with other equations (in Bohm's case, an equation that shows how a wavefunction pushes a particle around) or it must be modified (in the Ghirardi-Rimini-Weber case, to incorporate a new, explicit collapse mechanism). A key question for determining the impact on time's arrow is whether these proposals introduce a fundamental asymmetry between one direction in time and the other. Remember, Schrödinger's equation, just like those of Newton, Maxwell, and Einstein, treats forward and backward in time on a completely equal footing. It provides no arrow to temporal evolution. Do any of the proposals change this?