
Einstein's Unfinished Revolution


by Lee Smolin


  Even among the benevolent branches there would be disparities in health. Tomorrow a gamma ray will strike a strand of my DNA, and the consequences will include the splitting of us and our world into a bunch of decohering worlds. Some of my copies will develop cancer as a result; some won’t. There are versions of me in both sets; hence I care about both. The extreme version of this argument suggests that, far into the future, some very fortunate copies of me, who had the luck to dodge every bullet and survive every cancer, will still be alive.

  It seems to me that the Many Worlds Interpretation offers a profound challenge to our moral thinking because it erases the distinction between the possible and the actual. For me, the reason to strive to make a better world is that we can hope to make the actual future better than the possible futures we were dealt to begin with. If every eventuality we worked to eliminate, whether starvation, disease, or tyranny, was actual somewhere else in the wave function, then our efforts would not result in an overall improvement. Issues such as nuclear war and climate change are less urgent if there are multiple versions of Earth and the human race has more than one chance to get things right.

  The existence of all these copies of ourselves would then seem to me to present a moral and ethical quandary. If no matter what choices I make in life, there will be a version of me that will take the opposite choice, then why does it matter what I choose? There will be a branch in the multiverse for every option I might have chosen. There are branches in which I become as evil as Stalin and Hitler and there are branches where I am loved as a successor to Gandhi. I might as well be selfish and make the choices that benefit me. Irrespective of what I choose, the kind and generous choice will be made by an infinite number of copies living in an infinite number of other branches.

  This seems to me to be an ethical problem because simply believing in the existence of all these copies lessens my own sense of moral responsibility.

  A dear friend who works on Everettian quantum theory would insist that, nonetheless, this is a way the world might be. Our job is to figure out how the world is, and it is not up to us to impose our personal likes and dislikes. My reply is that, so long as there is no decisive argument to prefer Everett over other approaches, I am free to bet on another approach. They are free to do otherwise, but I choose to invest my time in developing cosmologies that inspire us to look for new particles, new phenomena, new physics, over the scholastic contemplation of the lives of copies of ourselves.

  And, I might add, given that I don’t believe it is likely that Everett or anything like it is going to turn out to be true, there is little danger of harm if a few brilliant philosophers choose to spend their efforts working out the consequences of a truly startling and subtle hypothesis. (Were the idea to come to influence the zeitgeist, that would be something else to worry about.) Even if it is a wrong idea, it is an idea that probably had to come up sooner or later, and they have the kind of analytically able, rigorously trained minds suited to the question, as mine is clearly not. Let us then hope they will finally resolve the question of whether or not a realist theory based on Rule 1 alone can make sense.

  * * *

  —

  THE DISTINGUISHED PARTICLE THEORIST Steven Weinberg recently weighed in on the failure of efforts to deduce probabilities from quantum mechanics.

  There is another thing that is unsatisfactory about the [Many Worlds] realist approach, beyond our parochial preferences [e.g., “not liking” the idea of having copies]. In this approach, the wave function of the multiverse evolves deterministically. We can still talk of probabilities as the fractions of the time that various possible results are found when measurements are performed many times in any one history; but the rules that govern what probabilities are observed would have to follow from the deterministic evolution of the whole multiverse. . . . Several attempts following the realist approach have come close to deducing rules like the [probability] Born rule that we know work well experimentally, but I think without final success.4

  * * *

  —

  THERE IS A LAST MORAL to draw from the story of Everettian quantum mechanics. Some of its proponents claim that Everettian quantum mechanics is quantum mechanics, and that all else is a modification of it. But that is simply not the case. Ordinary textbook quantum mechanics—by which I mean the theory that is taught in the standard textbooks (Dirac, Bohm, Baym, Shankar, Schiff, etc.), and therefore the theory in common use by real physicists—is based on Rule 1 and Rule 2. That theory simply does not have a realist interpretation.

  So realism, in any version, has a price. The question is only what price we have to pay to get a new theory that makes complete sense and describes nature correctly and completely.

  PART 3

  BEYOND THE QUANTUM

  TWELVE

  Alternatives to Revolution

  In the end we are driven to search for what we hope will turn out to be the correct ontology of the world. After all, it is the desire to understand what reality is like that burns deepest in the soul of any true physicist.

  —LUCIEN HARDY

  In the last few years the field of quantum foundations has enjoyed a lively revival. After eight decades in the shadows, it is finally possible to make a good career as a specialist in quantum foundations. That is all to the good; however, most of the progress, and most of the young people, have gone to the anti-realist side of the field. The aim of most of the new work has not been to modify or complete quantum theory, but only to give us a new way of speaking about it. To explain why, I need to review a bit of the history of the field of quantum foundations.

  Quantum mechanics did not spring up overnight. It was the result of a long gestation, which began in 1900 with Planck’s discovery that the energy carried by light comes in discrete packets, and culminated in the final form of quantum mechanics being established in 1927. There followed a period of debate among the founders, during which many of the quantum physicists were concerned with the foundations of the new theory. However, this period of free debate soon came to an end; despite the objections of Einstein, Schrödinger, and de Broglie, it closed with the triumph of the Copenhagen view.

  From the early 1930s through the mid-1990s, most physicists regarded the question of the meaning of quantum mechanics as settled. This long dark age was punctuated by the important works of Bohm, Bell, Everett, and a few others, but most of the community of physicists paid little attention to these works or to foundational questions in general. One can see this from the fact that the crucial papers by those authors had very few citations into the mid-1970s, when the experimental tests of Bell’s restriction began to be done. Even now, it is not uncommon to find very accomplished physicists who believe, incorrectly, that Bell proved all hidden variable theories must be wrong.* Until very recently, there were virtually no academic positions in physics departments for physicists focused on quantum foundations. The tiny community of specialists in quantum foundations either earned their tenure for other work, as Bell did, or, like Bohm, found places in out-of-the-way corners of the academic world. A few made careers in philosophy or mathematics, others by teaching in small undergraduate colleges.

  It was the promise of quantum computing that began, just before the turn of this century, to open doors to people who wanted to work on quantum foundations. The idea that quantum mechanics could be used to construct a new kind of computer was broached by Richard Feynman in a lecture1 in 1981. That talk, and other early anticipations of the idea, seemed to make little impression until David Deutsch, originally a specialist in quantum gravity who held a position at Oxford, proposed in 1985 an approach to quantum computation in the context of a paper on the foundations of mathematics and logic.2 In his paper, Deutsch introduced the idea of a universal quantum computer, analogous to a Turing machine. A few years later Peter Shor, a computer scientist working at AT&T’s Bell Laboratories, proved that a quantum computer could factor large numbers much faster than a regular computer. At that point people began to take notice, because one application of being able to factor large numbers is that many of the codes now in common use could be broken.

  Research groups began to spring up around the world, and they quickly filled with brilliant young researchers, many of whom had a dual research strategy in which they would attack the problems in quantum foundations while contributing to the development of quantum computing. As a result, a new language for quantum physics was invented that was based on information theory, which is a basic tool of computer science. This new language, called quantum information theory, is a hybrid of computer science and quantum physics and is well adapted to the challenges of building quantum computers. This has led to a powerful set of tools and concepts that have proved invaluable at sharpening our understanding of quantum physics. Nonetheless, quantum information theory is a purely operational approach that is most comfortable describing nature in the context of experiments, in which systems are prepared and then measured. Nature outside the laboratory hardly makes an appearance, and when it does, it is often, not surprisingly, analogized to a quantum computer.

  The current renaissance of the field of quantum foundations/quantum information is almost all for the good, not least because much of the theoretical work is anchored in real experiments. The drive toward quantum computation has led to many spin-offs which illuminate the foundational questions, such as quantum teleportation. This is a technology by means of which the quantum state of an atom can be transferred to a distant atom without being measured. If not quite up to science fiction’s transporters, this technology is here now and is already playing a role. For example, closely related techniques are used in quantum cryptography to make a new kind of code which is, in principle, unbreakable.

  These developments have also deepened our appreciation for how quantum theory is structured. For example, a new type of approach, initiated by Lucien Hardy, seeks the shortest and most elegant set of axioms from which the mathematical formalism of quantum mechanics may be derived. Of these axioms, there are several that are unremarkable and tell us things that are true for every theory; then there is one into which all the strangeness of the quantum world is packed.

  At the same time, there is little room, in a climate dominated by operational approaches, for old-fashioned realists in search of a completion of quantum theory that will explain what happens in individual events. Some of those realists are many-worlders, but there persists a small community of Bohmians. A handful develop theories of wave-function collapse. Those who try to push the search for reality beyond these established approaches are even fewer. Most of us in this class were originally specialists in other fields, some at the highest level of accomplishment, such as Stephen Adler and Gerard ’t Hooft. We fit imperfectly into the lively field quantum foundations has become, especially as our concerns and ambitions—and the theories we develop to realize them—cannot be expressed in the operational language whose mastery is the sign of a professional quantum informationist. Still, we persist in our search for a realist and complete picture of the quantum world.

  I believe that, as expressed by Lucien Hardy in the quote that opens this chapter, many physicists would prefer realism to operationalism, and would take an interest in the discovery of a realist approach to quantum theory that overcame the weaknesses of the existing approaches. If, during the present period, operational approaches dominate, this is partly due to the lack of a realist alternative which has the ring of truth.

  The rest of this book is about the future of realist approaches to quantum physics. But before we dismiss the non-realist approaches, let’s see if there is anything to be learned from the recent focus on them.

  One lesson I’ve learned is that there are many different ways to express how the quantum world differs from the classical world of Newtonian physics. If you are happy taking an anti-realist point of view, there is a range of options. You can adopt Bohr’s radical denial that science is anything other than an extension of the common language we use to tell each other about the results of experiments we do. You can embrace an approach called quantum Bayesianism, according to which the wave function is no more than a symbolic representation of our beliefs, and prediction is a fancy word for betting. Another option is to embrace a purely operational perspective, which allows one to speak only of processes delineated by and sandwiched between preparations and measurements.

  In all these the measurement problem is sidestepped, or rather, defined out of existence, because you cannot even pose the possibility that the quantum state describes the observers and their measuring instruments.

  Several of the new proposals have at their core the concept that the world is made of information. This can be summarized in John Wheeler’s slogan “it from bit,” modernized as “it from qubit,” where a qubit is a minimal unit of quantum information, i.e., a quantum binary choice, as in our story about pet preference. In practical terms, this program imagines that all physical quantities are reducible to a finite number of quantum yes/no questions, and also that evolution in time under Rule 1 can be understood as processing this quantum information as a quantum computer would. This means that the change in time can be expressed as the action of a sequence of logical operations applied to one or two qubits at a time.
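
  (A minimal sketch of my own, not from the text, assuming Python with the NumPy library: it shows evolution under Rule 1 on a pair of qubits written as a short sequence of one- and two-qubit logical operations, in the spirit of the "it from qubit" program. The particular gates and circuit are illustrative choices, nothing more.)

      import numpy as np

      # Two qubits; basis states ordered |00>, |01>, |10>, |11>.
      state = np.zeros(4, dtype=complex)
      state[0] = 1.0  # start in |00>

      # A one-qubit gate (Hadamard) and the one-qubit identity.
      H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)
      I = np.eye(2, dtype=complex)

      # A two-qubit gate (CNOT): flip the second qubit when the first is |1>.
      CNOT = np.array([[1, 0, 0, 0],
                       [0, 1, 0, 0],
                       [0, 0, 0, 1],
                       [0, 0, 1, 0]], dtype=complex)

      # Deterministic evolution as a sequence of operations applied
      # to one or two qubits at a time.
      state = np.kron(H, I) @ state  # Hadamard on the first qubit
      state = CNOT @ state           # then CNOT across the pair

      print(state)  # (|00> + |11>)/sqrt(2): the pair is now entangled

  Two elementary operations suffice here to entangle the pair, which is exactly the kind of resource this program treats as fundamental.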

  John Wheeler put it like this:

  It from bit symbolizes the idea that every item of the physical world has at bottom—at a very deep bottom, in most instances—an immaterial source and explanation; that what we call reality arises in the last analysis from the posing of yes-no questions and the registering of equipment-evoked responses; in short, that all things physical are information-theoretic in origin and this is a participatory universe.3

  The first time you hear this kind of view expressed, you may not be sure the speaker means it. But he does. Here is another, briefer quote: “Physics gives rise to observer-participancy; observer-participancy gives rise to information; information gives rise to physics.”4

  When Wheeler speaks of a participatory universe, he means that the universe is brought into existence by our observing or perceiving it. Yes, you might reply, but before we can perceive or observe anything we have to be brought into existence within and by the universe. Yes, says John. Both. Is there a problem?

  Does this yield any insight? Some systems with a finite number of possible outcomes can be represented this way, and doing so does illuminate the physics: for example, the importance of entanglement in quantum physics can be brought into the foreground. But other systems which have an infinite number of physical variables, such as the electromagnetic field, do not fit as easily within this program. Nonetheless, this quantum information approach to quantum foundations has had a good influence on diverse fields of physics, from hard-core solid-state physics to speculations on string theory and quantum black holes.

  However, we should be careful to distinguish several different ideas about the relationship between physics and information, some of which are useful but also trivially true; others of which are radical and would need, in my view, more justification than they’ve been given.

  Let’s start by defining information. One useful definition was given by Claude Shannon, who may be considered the founder of information theory. His definition was set in the framework of communication, and contemplates a channel which carries a message from a sender to a receiver. These, it is assumed, share a language, by means of which they give meaning to a sequence of symbols. The amount of information in the message is defined to be the number of answers to a set of yes/no questions that the receiver learns from the sender by understanding what the message says.

  Put this way, few physical systems are, or can be construed as, channels of information between senders and receivers who share a language. The universe as a whole is not such a channel of information. What is powerful about Shannon’s idea is that a measure of how much information is transmitted can be separated from the semantic content, i.e., from what the message means. The sender and receiver share a semantics that gives meaning to the message, but you don’t have to share that knowledge to measure the quantity of information carried. Still, without the shared semantics the message would not carry information. One way to see this is that to measure how much information a message carries, you need some information about the language, such as the relative frequencies with which different letters, words, or phrases occur in the linguistic community of those who speak that language. This information about context is not going to be coded into every message. If you don’t specify the language, the Shannon information is not defined. This means, in particular, that the message has to be in a language that the sender and receiver share. A pattern of irregular symbols carries no information. So, to the extent that Shannon’s measure of information depends on the language and other aspects of the context which are shared by the sender and receiver and not coded into the message, it is not purely a physical quantity.
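
  (Again a minimal sketch of my own, assuming Python: Shannon’s measure assigns an amount of information to a message only relative to the symbol frequencies the sender and receiver are presumed to share. The same string carries a different number of bits under a different assumed language.)

      import math

      def bits_per_symbol(message, freq):
          # Average information per symbol, in bits, of `message`,
          # relative to an assumed model `freq` of symbol frequencies.
          return sum(-math.log2(freq[ch]) for ch in message) / len(message)

      message = "abab"

      # Two different "languages" (frequency models) for the same message.
      uniform = {"a": 0.5, "b": 0.5}
      skewed = {"a": 0.9, "b": 0.1}

      print(bits_per_symbol(message, uniform))  # 1.0 bit per symbol
      print(bits_per_symbol(message, skewed))   # about 1.74 bits per symbol

  Nothing in the string itself fixes which model applies; that is the sense in which the quantity is contextual rather than purely physical.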

  One of the stubborn problems in the philosophy of language is to understand how speakers have intentions and convey meaning. That this is a hard problem does not mean that intentions and meanings are not part of the world. But they are aspects of the world that are dependent for their existence on the existence of minds. Shannon information is a measure of what goes on in this world of meanings and intentions. It is well defined even if we don’t have a good understanding of how meaning and intention fit into the natural world, but it is nonetheless a part of that world.

  Let me give an example to make this distinction clear. I hear drops of water falling intermittently from a leaky drainpipe after a summer rain. The pattern of the drips seems irregular, but it carries no message for me or anyone else. There is no sender, and I am no receiver; hence no information, in Shannon’s sense, is contained in the drips. On the other hand, someone could use Morse code to send me a message via a sequence of short and long pauses between drips. The patterns between the two cases would differ in a way that reflects the presence or absence of an intention to convey meaning. The intent matters: information in this sense requires beings with the intention of conveying meaning. For a realist, who wants to know what the world is beyond what people know or understand, this is not a useful idea to apply to the atomic world.*

 
