The Many Worlds of Hugh Everett III: Multiple Universes, Mutual Assured Destruction, and the Meltdown of a Nuclear Family

by Peter Byrne


  Zeh is widely credited with being the first physicist to analyze decoherence as a possible solution to the measurement problem.20 In Zeh’s scheme, each element in a superposition above a certain size or level of complexity “decouples” from the others as the physical process of entanglement with the environment destroys the interference terms encoded in their shared wave function. Each element of the superposition delinks from the composite wave function describing the erstwhile superposition and assumes its own, separate wave function. Zeh cited Everett’s relative states formulation as analogous to his decoupling theory, which also employs the concept of a universal wave function.
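
  In schematic terms (a standard textbook sketch, not Zeh’s own notation), the mechanism runs as follows. A system prepared in a superposition becomes entangled with an environment that starts out in some state $|E_0\rangle$:

  $$\Big(\sum_i c_i\,|s_i\rangle\Big)\otimes|E_0\rangle \;\longrightarrow\; \sum_i c_i\,|s_i\rangle\otimes|E_i\rangle.$$

  Once the environment states are effectively orthogonal, $\langle E_i|E_j\rangle\approx\delta_{ij}$, the reduced density matrix of the system alone,

  $$\rho_S=\mathrm{Tr}_E\,|\Psi\rangle\langle\Psi|\;\approx\;\sum_i |c_i|^2\,|s_i\rangle\langle s_i|,$$

  has lost its off-diagonal interference terms; each element $|s_i\rangle$ then evolves as if it carried its own separate wave function, which is Zeh’s “decoupling.”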

  Echoing Everett, Zeh claimed that the EPR paradox and the mystery of nonlocality are solved by his theory, as each measurement result happens in a different “world component,” and no superluminal contradictions occur. Zeh recalls that he did not know about Everett’s theory until shortly after he wrote his now-famous paper, “On the interpretation of measurement in quantum theory,” which was, in effect, “an argument for the Everett interpretation.”

  Decades later, after reviewing the mini-papers (from the basement) that Everett had submitted to Wheeler in 1956, Zeh commented that Everett’s “derivation” of the Born Rule from the quantum formalism was more of a plausibility argument than a precise derivation, but that it made sense overall as being the only reasonable choice for a probability measure if objective reality is represented by the universal wave function. He said the metaphor of the splitting amoeba shows that Everett viewed the branching as physically real. And he decided that, “Everett seems to come very close to decoherence. One may assume that Everett would some day have discovered the full implications of decoherence, but when he refers to the ‘environment,’ he seems to mean classical correlations, not microscopic entanglement.”21

  Everett did not realize the importance of solving the preferred basis problem, says Zeh, but this problem, according to many (though certainly not a majority of) physicists and philosophers who think about these matters, has since been resolved by decoherence theory, which treats the universe as completely quantum mechanical and models how classical worlds emerge (decohere) from within the universal superposition of all physically possible events.

  Unfortunately for Zeh, it turned out that associating one’s career with Everett’s theory was not a wise move at the time. Zeh’s (soon-to-be-classic) paper was turned down by several prominent journals; one editor called it “senseless.” And a Nobel Prize-winning heavyweight at the University of Heidelberg, J. Hans D. Jensen, sent a copy of Zeh’s paper to Rosenfeld, who replied acerbically, urging Jensen to make sure that “such a concentrate of wildest nonsense is not being distributed around the world with your blessing.” Jensen told Zeh to “let sleeping dogs lie,” and Zeh’s academic career was negatively impacted by his inability to stop kicking the dog.22

  In 1970, the paper was published in a new journal, Foundations of Physics, refereed by Wigner, Margenau, Bohm, de Broglie, and other dissenters from the Copenhagen interpretation. DeWitt and Zeh hooked up at the Varenna conference that same year and each championed Everett’s work in his own way. Zeh’s decoupling analysis evolved into the now widely accepted theory of how quantum systems transition into classical systems through decoherence—a process which is constantly taking place and does not require the causal agency of consciousness or the presence of an external observer. But until the early 1980s, decoherence theory was ignored or maligned by many mainstream physicists; Zeh calls that period “the dark ages.”23 Still, the maverick from Heidelberg blazed a trail soon widened by others, including Wojciech Zurek, Erich Joos, Murray Gell-Mann, Stephen Hawking, and James B. Hartle: all of whom credit Everett with inspiring them to think beyond the limitations on understanding imposed by the Copenhagen interpretation and the collapse postulate.

  In the late 1970s, Wojciech Zurek was a graduate student at the University of Texas. After hearing Wheeler and Deutsch talk about the preferred basis problem in the Everett interpretation, Zurek was inspired to tackle it. By 1981, he had mathematically modeled a method of predicting how the continuous information transfer caused by the entangling of objects with their environments automatically selects the preferred basis—the menu of possible choices that includes the classical world of our experience.24 This addressed the central problem with Everett’s formalism, which did not satisfactorily demonstrate how specific classical systems (branch histories) emerge out of superpositions of all physically possible events. The preferred basis problem25 is the measurement problem writ large: why does our world (or a collection of worlds like ours) emerge from an infinity of alternatives?
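
  A minimal sketch of Zurek’s criterion, in standard notation: the preferred “pointer” states $|p_i\rangle$ are the system states that the monitoring environment leaves undisturbed, which holds when the pointer observable commutes with the system–environment interaction Hamiltonian,

  $$\big[\,H_{\mathrm{int}},\,|p_i\rangle\langle p_i|\,\big]=0.$$

  Each pointer state then merely imprints a record on the environment, while superpositions of different pointer states decohere rapidly. For macroscopic objects the interaction depends chiefly on position, which is why the dynamically selected basis looks like the localized, classical world of our experience.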

  Zurek, who works at the Los Alamos National Laboratory, is concerned with local dynamics, with information transfers. He explains decoherence in terms of “quantum Darwinism.” Making an analogy to biological natural selection, Zurek shows how certain possible branches in the multiverse become more lasting and robust than others by repeatedly copying themselves into the quantum environment. Obviating the need for postulating wave function collapse, Zurek says that macroscopic observers perceive the classicality of these foliating branches without disturbing their smooth evolution through time as per the Schrödinger equation. As a metaphor, consider how differently positioned readers can access the information in this sentence by intercepting select copies of the text that proliferate throughout the photon environment at different angles. You cannot read this sentence from behind the book, but there are a vast number of angles from which you can view it and incorporate its information into your brain. Meanwhile, endless cascades of photons ricochet off the printed page in various directions, holding similar information and recording it (redundantly) in the environment.
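
  The analogy has a quantitative core, sketched here in standard information-theoretic notation. Partition the environment into fragments $F$ and ask how much a single fragment knows about the system $S$, as measured by the mutual information

  $$I(S\!:\!F)=H(S)+H(F)-H(S,F).$$

  In quantum Darwinism, $I(S\!:\!F)$ climbs to nearly the full classical information $H(S)$ about the pointer observable even for small fragments, so many disjoint fragments (many vantage points on the printed page) hold redundant copies of the same record.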

  Physicists still talk about how Zurek put decoherence on the map in October 1991 with his article “Decoherence and the Transition from Quantum to Classical” in Physics Today.26 He showed how the local environment dynamically “sucks information out” of the superposed quantum system with which it is correlating and entangling—leaving behind our classical world. The residual information is not destroyed—it is hidden from our sight, dispersed into other correlations between the system and the environment.

  This is heady stuff! Zurek says it “makes the branching analogy in Everett’s writings more literal.” Inspired by the role that entanglement plays in decoherence, and the relativity of Everett’s “relative states,” Zurek went on to derive a probability measure for quantum mechanics that does not require assuming the Born rule (as Everett seems to have done, if only unconsciously).27

  Zurek does not know if Everett’s worlds are real or not, but he credits him for breaking up an intellectual log jam in physics. The many worlds interpretation gave physicists “permission” to treat the environment as quantum mechanical by viewing the universe as a closed system describable by a non-collapsing universal wave function. Permission was granted to move beyond the limitations imposed on inquiry by Bohr and von Neumann. But “quantum Darwinism” contradicts neither Bohr nor von Neumann, says Zurek. Without collapsing the wave function, per se, decoherence, in effect, smoothly inserts a quantum-classical partition at the moment of measurement of a branching or non-branching quantum system. Keeping intact the linearity of the Schrödinger equation, it is compatible with both single world and many worlds interpretations, ruling out neither.28

  Decoherence theory does not resolve the most vexing question about the Everett worlds, as articulated by Zurek:

  One might regard [quantum] states as purely epistemic (as did Bohr) or attribute to them ‘existence.’ Technical results … suggest that truth lies somewhere between these two extremes. It is therefore not clear whether one is forced to attribute ‘reality’ to all branches of the universal state vector [i.e., universal wave function].29

  In the realm of interpretation, Zeh has long been opposed to Bohr’s ontology. In a letter to Wheeler in 1980, he complained:

  I expect the Copenhagen interpretation will sometime be called the greatest sophism in the history of science, but I would consider it a terrible injustice if – when someday a solution should be found – some people claim that this is of course what Bohr always meant, only because he was sufficiently vague.30

  Zeh focuses on the non-local aspect of a decoherent event as the effect of local entanglement propagating at light speed like a giant, relativistic “zipper.” Within the giant superposition of the universal wave function, Zeh views only the subjective consciousness of each observer as “real.” He takes what he calls a “many minds” approach (although the minds are virtual copies of brains, not some supra-physical consciousness or Mind). In his model, all branches, including the particular decohered branch of each conscious observer, can only be known as “a heuristic picture—not a physical process.”31 To some extent, this singles out the concept of consciousness as a special process—a burden which Everett did not impose upon his physical theory, which considers all the branches as equally real.32

  In sum, both Zurek and Zeh address the problem of describing how macroscopic systems decohere from coherent superpositions without changing the Schrödinger equation. They model how quantum systems behave as they entangle with their immediate surroundings and the remainder of the quantum universe. In decoherence theory, the universe—including measured systems, measuring apparatuses, and their respective environments—evolves according to the time-dependent Schrödinger equation. And a branching observer can be included in the universal wave function, unlike in the more restrictive Copenhagen interpretation, which defines the observer as classical and forever external to the observed microsystem.

  Convinced by experiments showing mesoscopic objects decohering from coherent superpositions, many theoretical physicists and philosophers accept the decoherence model as explaining the quantum to classical transition, without having to decide whether it is best interpreted in terms of an Everettian multiverse or a single universe.33 In this sense, decoherence theory subsumes the standard conception of measurement with its postulated discontinuity of wave function collapse. As cosmologist James B. Hartle observes, “In short, classical physics is an approximate emergent feature of the kind of entirely quantum universe that Everett talked about.”34

  Nous

  In November 1984, the philosophical journal Nous published a special issue on foundational questions in quantum mechanics. It was dominated by discussions about the meaning of entanglement and the many worlds interpretation. Echoing Wheeler’s fluctuations on the many worlds question, some philosophers were simultaneously attracted by the notion of a universal wave function as a solution to the measurement problem and repelled and disconcerted by its logical implications.

  Howard Stein of the University of Chicago found Everett’s branching worlds to be “a bizarre notion with no compensating gain,” because the many worlds theory was, in his opinion (as well as Everett’s), not falsifiable by experiment:

  On the other hand, the view that [wave function collapse] never occurs, and that all processes are governed by quantum mechanics, is one that deserves to remain in the field … It is worth remembering that great advances in physics have sometimes resulted from the discovery of effects, predicted by theories, that had long seemed unlikely to occur and that many physicists regarded as in principle impossible to detect…. The problem may not be ripe for solution.35

  Stein’s colleague, physicist Robert Geroch, upheld the core of Everett’s theory in a way that avoided using the concept of “splitting” observers or branching universes. He found in Everett a new way of conceptualizing probability in a totally quantum universe; he was attracted by its compatibility with relativity and its implications for a theory of quantum gravity.36 He proposed a kind of “one-world version of Everett.”37

  Philosopher Richard A. Healey of the University of California, Los Angeles raised the question of how an observer maintains a sense of identity while constantly dividing into beings with shared pasts and divergent futures. This question particularly bothers philosophers who are attracted by the non-collapse model, but are unwilling to let go of a notion of an indivisible personal identity. Some influential philosophers and physicists are not at all comfortable with the idea that “mind” is a purely physical phenomenon.38 Healey was not of this mentalist school, but his understanding of Everett reflected a common attitude among physicists and philosophers who wished to extirpate the collapse postulate via a universal wave function, but declined to accept the consequence that their bodies—and/or their immortal souls—split.

  Healey remarked that Wheeler had recently withdrawn his support of Everett, even though the relative states theory “fit naturally with Wheeler’s conception of superspace, considered as an approach to quantizing space-time as well as its contents.” In the end, Healey bemoaned the lack of consensus on what Everett was actually saying: “The interpretation needs interpreting.”39

  Complexity and information

  In 1989, the Santa Fe Institute in Santa Fe, New Mexico, hosted a remarkable conference, “Complexity, Entropy and the Physics of Information.” Zurek, Zeh, and Wheeler made presentations on the relation of information theory to quantum mechanics, as did a score of other prominent physicists. Everett was widely credited with bringing information theory to bear on quantum mechanics, and the debate over whether or not his branching universes are physically real, or simply a conceptual tool, continued.

  The conference issued a manifesto—“The specter of information is haunting science.” It affirmed the importance of understanding thermodynamics, the arrow of time, and the measurement problem as “transfers of information.” The usefulness of Everett’s universal wave function was acknowledged:

  The distinction between what is and what is known to be, so clear in classical physics, is blurred, and perhaps does not exist at all on a quantum level. For instance, energetically insignificant interactions of an object with its quantum environment suffice to destroy its quantum nature. It is as if the ‘watchful eye’ of the environment ‘monitoring’ the state of the quantum system forced it to behave in an effectively classical manner. Yet, even phenomena involving gravity, which happen on the most macroscopic of all the scales, bear the imprint of quantum mechanics.

  In fact, it was recently suggested that the whole Universe—including configurations of its gravitational field—may and should be described by means of quantum theory.

  Interpreting results of the calculations performed in such a ‘Wave function of the Universe’ is difficult, as the rules of thumb usually involved in discussions of experiments on atoms, photons, and electrons assume that the ‘measuring apparatus’ as well as the ‘observer’ are much larger than the quantum system. This is clearly not the case when the quantum system is the whole Universe.40

  Decoherence theory was one of the main themes of the conference—as was the role of algorithmic randomness, which defines the information content of an object based on the theory of computation rather than on probabilities.41 Importantly, Hartle and Murray Gell-Mann presented “Quantum Mechanics in the Light of Quantum Cosmology,” a theory of decoherence known as “consistent histories.”42 Hartle had been thinking about Everett since the late 1960s, when he derived a probability measure from the quantum formalism without postulating the Born rule or wave function collapse.43

  Hartle and Gell-Mann credited Everett with suggesting how to apply quantum mechanics to cosmology. They considered their “decohering sets of histories” theory as an “extension” of his work. Using Feynman path integrals, they painted a picture of the initial conditions of the universe when it was completely quantum mechanical. Their method treats the Everett “worlds” as “histories,” giving “definite meaning to Everett’s ‘branches.’”44 They assign probability weights to possible histories of the universe, and, importantly, include observers in the wave function.
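
  The technical centerpiece of this approach is the decoherence functional; the sketch below follows the standard Gell-Mann–Hartle notation. A coarse-grained history $\alpha$ is a time-ordered chain of projections, $C_\alpha=P_{\alpha_n}(t_n)\cdots P_{\alpha_1}(t_1)$, and

  $$D(\alpha',\alpha)=\mathrm{Tr}\big[\,C_{\alpha'}\,\rho\,C_\alpha^{\dagger}\,\big].$$

  When $D(\alpha',\alpha)\approx 0$ for distinct histories, the set decoheres, and the diagonal entries $p(\alpha)=D(\alpha,\alpha)$ obey the ordinary sum rules of probability; these are the “probability weights” assigned to the branching histories.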

  Hartle declines to state whether he considers the branching histories outside the one we experience to be physically real or purely computational. And he says that “predictions and tests of the theory are not affected by whether or not you take one view or the other.”45

  Everett, of course, settled for describing all of the branches as “equally real,” which, given that our branch is real, would mean that all of the branches are real.

  Conference participant Jonathan J. Halliwell of MIT later wrote an article for Scientific American, “Quantum Cosmology and the Creation of the Universe.” He explained that cosmologists owe Everett a debt for opening the door to a completely quantum universe. The magazine ran photographs of the most important figures in the history of quantum cosmology: Schrödinger, Gamow, Wheeler, DeWitt, Hawking and “Hugh Everett III, a student of Wheeler in the 1950s at Princeton [who] solved the observer-observed problem with his ‘many worlds’ interpretation.”46

  And so, even as his ashes sank into a Virginia landfill, Everett’s intellectual progeny were hard at work taking his theory to new levels.

  39 Everett goes to Oxford

  Cosmologists, even more than laboratory physicists, must find the usual interpretive rules of quantum mechanics a bit frustrating…. It would seem to me that the theory is exclusively concerned with ‘results of measurement’ and has nothing to say about anything else. When the ‘system’ in question is the whole world where is the ‘measurer’ to be found? Inside, rather than outside, presumably. What exactly qualifies some subsystems to play this role? Was the world wave function waiting to jump for thousands of millions of years until a single-celled living creature appeared? Or did it have to wait a little longer for some more highly qualified measurer – with a PhD? If the theory is to apply to anything but idealized laboratory operations, are we not obligated to admit that more or less ‘measurement-like’ processes are going on more or less all the time more or less everywhere?

 
