Analog SFF, July-August 2008


by Dell Magazine Authors


  * * * *

  A Little Physics...

  Starting at the beginning of cosmology: In the beginning, the word was God—but later it was Newton, and then Einstein.

  Newton's three laws of motion: 1—Every object in a state of uniform motion tends to remain in that state of motion unless an external force is applied to it. 2—The force on an object is proportional to its mass multiplied by its acceleration (F=ma). 3—For every action there is an equal and opposite reaction.
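
  For readers who like seeing the numbers, here is a minimal sketch of the second law in Python (the mass and acceleration are made up purely for illustration):

# Newton's second law, F = m*a: the force needed to accelerate
# a 2 kg mass at 3 m/s^2 (illustrative values only).
mass, acceleration = 2.0, 3.0   # kg, m/s^2
force = mass * acceleration     # F = ma
print(f"F = {force} N")         # 6.0 N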

  These “laws” allow us to calculate things—to actually do physics. The laws themselves require other assumptions: That the world is describable, and what has been called Newton's “Zeroth” law. 0—The world can be described with mathematics—particularly continuous, smooth functions.

  Newton also experimented with light. Although he discovered that white light sent through a prism gives a spectrum, he believed light was made up of small particles. And these particles passing through the “ether” (the fluid-like substance he thought filled the heavens) he believed stimulated that ether to give off waves. To this very day, physics has this schizophrenia: Is matter waves or particles? (And if both, what does that really mean?)

  In cases where gravity is strong and/or masses travel at close to light-speed, Einstein's equations superseded Newton's three laws. Gravity, for Einstein, was nothing more than “geometry,” a warping of space. And this led to another example of physics schizophrenia: “Things” versus geometry. For example, can gravitons be considered particles, or are they just convenient ways to talk about geometry (e.g. curvature)? John Wheeler actually had a (not entirely successful) theory, “Geometrodynamics,” which considered everything as geometry. And, building on Einstein's explaining gravity as geometry, Kaluza and Klein made a good, but ultimately unsuccessful, attempt to explain electromagnetism as geometry in a five-dimensional space-time. String theory owes much to Kaluza-Klein ideas.

  In relativity theory, Newton's zeroth law was still assumed. But, with the advent of quantum mechanics, that law is likely to fall. At some small scale, 10⁻³³ cm or thereabouts, space-time seems as if it must be granular. There are several reasons to think so: First, if string theory is “correct,” the basic element of space-time is a discrete string. But also, since vacuum fluctuations (see below) imply a fluctuation of mass, and mass in some sense is geometry, the geometry on the scale of the fluctuations wouldn't be continuous and would hence be granular. If we can't trust the mathematics of continuous functions, calculus (which Newton co-invented) goes away, as do equations in general. What do we have to replace them? Einstein, in his later years, began to question the zeroth law. In his Out of My Later Years essays, he wrote: “The spacetime continuum may be considered as contrary to nature in view of the molecular structure of everything which happens on a small scale.... Perhaps the success of the Heisenberg method points to a purely algebraic method of description of nature, that is to the elimination of continuous functions in physics. At the present time, however, such a program looks like an attempt to breathe in empty space.”
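
  That “10⁻³³ cm or thereabouts” is the Planck length, and it is easy to check. Here is a minimal Python sketch using standard values of the constants; the formula l_P = sqrt(ħG/c³) is the conventional definition, not something stated in the essay:

# Planck length: the scale at which space-time is suspected
# to become granular.
from math import sqrt

hbar = 1.054571817e-34   # reduced Planck constant, J*s
G    = 6.67430e-11       # Newton's constant, m^3 kg^-1 s^-2
c    = 2.99792458e8      # speed of light, m/s

l_planck = sqrt(hbar * G / c**3)
print(f"{l_planck:.2e} m = {l_planck * 100:.2e} cm")
# ~1.62e-35 m, i.e. ~1.6e-33 cm, the scale quoted above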

  Arguably, theoretical physics explains less these days than it did in Newton's time. And, if math is suspect, can we believe physics at all?

  * * * *

  ...And Some Cosmology

  Einstein's “field” equations of General Relativity are deceptively simple:

  G_μν = (8πG/c⁴) T_μν

  ...and remarkably elegant: The left side of the equation describes geometry and the right side matter. And at a philosophical level, they are simple. They explain the force of gravity (and only the gravitational force).

  G_μν, naturally enough, is called the Einstein tensor. It is a rather complex function of g_μν (the “metric” tensor), which essentially describes the distance between two neighboring points in space-time.

  The μ (mu) and ν (nu) subscripts range from zero through three and represent our four dimensions (e.g. t, x, y, z). The above then represents sixteen equations (4×4), but since G_μν is identically G_νμ (and similarly for T_μν), and because of the arbitrariness of the four space-time coordinates, there are, in fact, only six independent equations. The equations are analogous to Newton's second law equation F=ma (force equals mass times acceleration). At a given point, G_μν is a purely geometrical quantity describing curvature, and T_μν describes the mass distribution at that point. At a point where there is no mass (ignoring, for the moment, the quantum “mass” at any point in space-time), the equations reduce to:

  G_μν = 0

  Again, G_μν describes space-time curvature and is a descriptor of gravity. So, for Einstein, gravity is geometry.
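
  The counting above (sixteen equations reduced to six) is pure bookkeeping, and a few lines of Python make it explicit. This is just the arithmetic of the argument in the text, assuming the symmetry and coordinate-freedom counts given there:

# Independent components of the Einstein equations in 4 dimensions.
dims = 4
total       = dims * dims              # 16 components of G_mu_nu
symmetric   = dims * (dims + 1) // 2   # 10 survive G_mu_nu = G_nu_mu
independent = symmetric - dims         # minus 4 coordinate choices
print(total, symmetric, independent)   # 16 10 6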

  Einstein applied his equations to the universe as a whole and found they did not allow a static universe. This came as a shock, since the prevailing belief of the day was that the universe was static. Einstein found that he could alter his equations with, essentially, a fudge factor, as follows (the mass-free case):

  G_μν + Λ g_μν = 0

  Lambda (Λ), a small, pure number, is the so-called cosmological constant, a provider of a cosmic-scale universal repulsion (or attraction). Carefully tweaking that number could provide a static universe—or so Einstein thought. The Russian physicist Alexander Friedmann, however, showed that the solution would be at a point of unstable equilibrium—the equations would still say the universe was expanding or contracting. And with the Lambda term, the equations predict that the rate of expansion (or contraction) would continuously increase.
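
  Friedmann's instability is easy to see numerically. The sketch below (made-up units with G = c = 1, and a hypothetical dust density of 1 at the static radius) integrates the acceleration equation with Lambda tuned so that the scale factor a = 1 is exactly static; a nudge of a tenth of a percent then runs away, just as Friedmann said:

# Einstein's static universe as an unstable equilibrium.
from math import pi

def accel(a):
    # For dust plus Lambda, with Lambda chosen so a = 1 is static:
    # the Lambda term (+a) and the dust term (-1/a**2) balance at a = 1.
    return (4 * pi / 3) * (a - 1 / a**2)

a, a_dot, dt = 1.001, 0.0, 0.001   # start 0.1% above the static radius
for _ in range(3000):              # crude Euler integration
    a_dot += accel(a) * dt
    a     += a_dot * dt

print(f"a = {a:.1f}")   # far above 1: the tiny nudge has run away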

  But then the American astronomer Edwin Hubble observed that the universe actually was expanding. This was a big surprise—the notion that the universe was not eternal and unchanging. The belief in an unchanging universe was so ingrained that Fred Hoyle (along with Hermann Bondi and Tommy Gold) proposed the Steady State theory. It accepted the observed expansion, but preserved the idea of an unchanging universe by positing that a hydrogen atom occasionally pops into existence (just one per cubic meter per billion years). But then came strong evidence of the Big Bang, and Hoyle abandoned his theory—reluctantly. He said it was a great theory and he couldn't understand how God had overlooked it.

  Einstein, though, happily accepted the expansion of the universe and immediately expunged his “blunder,” the cosmological constant, from his equations.

  For years, most physicists agreed that Lambda was indeed a blunder and its value was exactly zero. That is what I was taught in grad school. But then ideas from quantum mechanics about the nature of a vacuum changed everything.

  * * * *

  Quantum Mechanics Fills the Vacuum

  The Uncertainty Principle says that you can't measure exactly how much energy there is at a point in space, at least not in a finite amount of time (and until it is measured, energy isn't even a well-defined quantity). But the principle says more: Even at a point in a complete vacuum, there must be energy (the so-called vacuum or zero-point energy). This energy, according to relativity, can be thought of as mass (E=mc²). And mass has gravity. The equations of general relativity, therefore, can be used to describe this evanescent sea of energy in which we're all immersed. But the gravity of this sea is exactly what is described by the cosmological constant.

  The cosmological constant, then, is just the manifestation of the universe's zero-point energy. But, oddly enough, the constant could still be zero (or even negative). That is because, due to the weirdness of quantum mechanics, the zero-point energy at a point can be negative—as if we live in a hot soup and someone throws in an ice cube.

  Something must carry the zero-point energy, and that something is normal elementary particles—well, not exactly normal particles, as they pop into existence, stay around a very short time, and pop out again. These virtual particles appear (in an important example) as particle/anti-particle pairs and an accompanying photon. Soon after they're created, they annihilate.

  One important division of particles is into bosons and fermions (or equivalently, integer-spin and half-integer-spin particles). Bosons (e.g. photons, gluons, gravitons, insofar as gravitons can be considered particles) contribute positive energy to the vacuum, whereas the fermion (e.g. electrons, muons, quarks) contribution is negative. Each of these virtual elementary particle types gives an enormous amount of energy to the vacuum, either positive or negative, and there is no known reason why the positive and negative energies should cancel or come anywhere near canceling. And yet, to be consistent with astronomical observations, the magnitude of Lambda, the cosmological constant (the measure of this energy), must be less than 10⁻¹²⁰ (that's a decimal point followed by lots and lots of zeros followed by a one).
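
  Where a number like 10⁻¹²⁰ comes from can be sketched with rough arithmetic: compare the observed vacuum (dark) energy density with the Planck energy density, the natural scale of quantum gravity. The observed density used below (~6×10⁻¹⁰ J/m³) is an approximate present-day value, not a figure from this article:

# Ratio of observed vacuum energy density to the Planck density.
from math import log10

hbar, G, c = 1.054571817e-34, 6.67430e-11, 2.99792458e8
rho_planck = c**7 / (hbar * G**2)   # Planck energy density, ~4.6e113 J/m^3
rho_obs    = 6e-10                  # observed dark-energy density, J/m^3

print(f"ratio ~ 10^{log10(rho_obs / rho_planck):.0f}")
# ~10^-123; the exact exponent depends on conventions, hence "10^-120"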

  Steven Weinberg wondered how big the cosmological constant could be and still produce a universe like ours—one that allows, for example, Analog readers. He found that if the value were just one or two orders of magnitude larger, the early universe would have been unable to produce stars or galaxies. (A similar problem in the early universe exists for a negative cosmological constant with a value a couple of orders of magnitude away from −10⁻¹²⁰.)

  The problem is: If the cosmological constant is not zero, but instead 10⁻¹²⁰ or thereabouts, it is exceptionally difficult to explain how nature managed to tune the value of the constant to just one of those minuscule values that would allow us to exist. A value of zero would solve the problem. There is a theoretical way the energy could be zero—supersymmetry. If every fermion (or boson) came with another particle, the same in every respect except that it was a boson (or fermion), then the energies would cancel. This was such an appealing solution that the particles (even though never found) were named. The particle corresponding to the electron was called a selectron, for a quark the particle is a squark, a gluon gives a gluino, and the neutrino corresponds to a sneutrino (great name, isn't it?).
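
  A toy calculation makes the supersymmetric cancellation concrete. The mode frequencies below are invented for illustration (with ħ = 1); the point is only that pairing every boson with an equal-frequency fermion partner makes the vacuum sum exactly zero:

# Zero-point energy: +w/2 per boson mode, -w/2 per fermion mode.
modes = [1.0, 2.5, 7.3, 11.0]   # hypothetical mode frequencies

boson_energy   = sum(+w / 2 for w in modes)
fermion_energy = sum(-w / 2 for w in modes)   # superpartners, same masses

print(boson_energy + fermion_energy)   # 0.0: exact cancellation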

  * * * *

  Our Universe Doesn't Cooperate

  Even though supersymmetry doesn't seem to apply to our universe, almost everyone nonetheless wanted a zero value for the cosmological constant. In fact, Joseph Polchinski confided that he'd quit physics if the cosmological constant turned out to be nonzero. It would strain credulity (even for us science-fiction types) that the value of the constant could have been so well tuned entirely by chance. One would have to invoke the anthropic principle.

  In 1998, two separate research groups, The Supernova Cosmology Project and The High-redshift Supernova Search Team, attempted to measure how fast the universe was slowing down—in other words, the change in the rate of expansion.

  They used observations of Type Ia supernovae in distant galaxies. This class of supernova arises when a white dwarf star has a companion. Over time, the dwarf captures gas emitted from the companion until the dwarf reaches a well-defined maximum mass, the Chandrasekhar limit. Then it collapses and goes boom—with the luminosity of four billion suns. The important points are that a Type Ia supernova has a signature and is, in effect, a “standard candle”; all of these supernovae have the same absolute luminosity. So by measuring how bright one of them appears, we can calculate how far away it is. And by measuring the redshift of the galaxy in which it is embedded, we can tell how fast the galaxy is moving away from us.
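
  As a back-of-the-envelope sketch, here is the standard-candle logic in Python. The peak absolute magnitude M ≈ -19.3 and the sample apparent magnitude are illustrative values, not numbers from the article; the relation m - M = 5·log10(d/10 pc) is the standard astronomical distance modulus:

# Distance to a Type Ia supernova from its apparent brightness.
M_IA = -19.3   # rough peak absolute magnitude of a Type Ia

def distance_parsecs(m_apparent):
    # invert the distance-modulus relation m - M = 5*log10(d / 10 pc)
    return 10 ** ((m_apparent - M_IA + 5) / 5)

d = distance_parsecs(24.0)   # a supernova seen at apparent magnitude 24
print(f"~{d:.1e} pc, i.e. ~{d * 3.26e-9:.0f} billion light-years")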

  After examining many of these supernovae at various distances from the solar system, the High-redshift Team reported that our universe was not slowing down. On the contrary, it was speeding up. This implied a non-zero cosmological constant, and its value was about 10⁻¹²⁰. Brian Schmidt, one of the team leaders, said his reaction to the result was “somewhere between amazement and horror.” A few months later, the other team came in with very similar results.

  Improbable as it is, there is (at least as of this writing) a non-zero cosmological constant. And that means there is a cosmic vacuum energy (which is likely the entire explanation of the “Dark Energy” in the universe). The question is: Why isn't it zero?

  * * * *

  Just How Improbable Is our Universe?

  The “Standard Model” of particle physics works pretty well at describing what makes up matter—of the non-dark variety (six flavors of quark [up, down, charm, strange, top, bottom] and six leptons [electron, tau, muon, and their corresponding neutrinos], and the force mediating particles [photons, W and Z bosons, gluons]). The model has about twenty-five free parameters (constants) specifying the masses of the particles and coupling constants, i.e. the strengths of the forces between them. These twenty-five or so constants are (as best we know) independent of each other; the value of one of the constants doesn't depend on the values of the others.

  These constants, for example, determine the likelihood of atoms, the stability of atomic orbits and nuclei. From the values, one can determine, among other things, the amount of hydrogen available after the Big Bang to form stars, the masses and lifetimes of those stars, the ability of galaxies to form, and even the size of the universe.

  Many of these constants seem “tuned.” If their numerical values were different by a very small amount, we wouldn't exist. But as we saw above, as scary as the tuning of these constants is, it is the cosmological constant, Lambda, which is the most tightly tuned.

  The theoretical physicist and cosmologist Lee Smolin, looking at the individual tunings of the constants, has estimated the probability of a universe that can support life to be, at best, one chance in 10²²⁹.

  There's yet another cause for some unease about the universe and our place in it. The universe is estimated to be about fourteen billion years old. The Earth seems to have been around for about four and a half billion years. So, as far as age is concerned, the Earth fits comfortably in the universe. Perhaps too comfortably. The universe arguably is not much older than it must be to have an Earth like ours.

  * * * *

  Which Way Out?

  Lee Smolin considers that there are four solutions to the problem, schemas if you will.

  1) God tuned the parameters for our benefit. This isn't his preferred answer (or mine either).

  2) There are a very large number of universes, in each of which the parameters are chosen randomly. The number of universes is so large that some of the universes would have the “right” values of the parameters—and we are in one of those universes. Neither is this Smolin's preferred solution. He writes, “To argue this way is not to reason, it is simply to give up looking for a rational explanation.”

  3) There is a “unique mathematically consistent theory of the whole universe.” And as such, we'd have no choice but to accept it. Smolin rejects this as well. “It strains credibility,” he writes, “to imagine that mathematical consistency could be the sole reason for the parameters to have the extraordinarily unlikely values that result in a world with stars and life.”

  On the other hand, one might hope that a theory would come along and explain the values of all the parameters in terms of just one of them. If the parameters were not independent, that would make the odds rise enormously.

  4) The parameters evolve in time—in the Darwinian sense. This is Smolin's belief. (I quote Lee Smolin often. He has a highly informed opinion and is perhaps the strongest advocate for this “schema four.”)

  * * * *

  A good number of very intelligent people have argued for schemas two, three, and four above. At the moment, there is nothing resembling a consensus among physicists. And in addition, the schemas are not mutually exclusive.

  Schema four is appealing (to me, at any rate) and deserves an article of its own. But at the moment, the idea hasn't, in my opinion, risen to the point of being a theory.

  Schema three also doesn't seem to have a theory attached to it. But it is in the tradition of physics to look along these lines. Macho physics, as George Efstathiou, Director of the Institute of Astronomy in Cambridge, dismissively describes it. He goes on to say, “A lot of theoretical physicists really dislike the idea of an anthropic principle. It's regarded as a bit of a cop-out; that if you can't actually calculate something from first principles, from the theory, that it's a cop-out to invoke the anthropic principle. So the macho physicists’ point of view is to say that there is a theory of everything and we will one day find this theory of everything, and the theory of everything will predict all of the physical constants of nature and it will predict exactly why the universe is the way it is today.”

  Schema two could also have been considered an idea rather than a theory. Among others, Martin Rees had years ago proposed multiple universes. But in the last few years, a viable theory, the landscape model, has grown up around the idea.

  Leonard Susskind coined the term “landscape” for the emerging theory of a populated “megaverse.” The theory itself grew from the work of many. If we were to draw up a “Society of Landscape Architects,” the charter members would certainly include Raphael Bousso of U California, Berkeley, Alan Guth of MIT, Andrei Linde of Stanford U, Joseph Polchinski of Kavli Institute, Leonard Susskind of Stanford U, and Alex Vilenkin of Tufts U.

  One should note that there are (at least) two many-universe theories. Hugh Everett, in 1957, proposed that at every quantum “decision,” the universe breaks into multiple copies, one for every possible outcome of the quantum decision. Those universes interact, giving rise to quantum interference. This is the “multi-world” interpretation of quantum mechanics (beloved of us SF writers). The other theory, “the megaverse,” is what we are here concerned with. Arguably, the multi-world theory might (in view of Polchinski's D-brane theory) really be a subset of the landscape model.

  Many would not consider the landscape a pretty or elegant theory. Indeed, I asked Steven Weinberg about it. He allowed that it wasn't elegant, but it might well be right. Not everyone agrees, though. One prominent string theorist I asked (I won't mention his name as it was an off-the-cuff comment) half-jokingly called it “California science.” And Smolin has written of it, “If an attempt to construct a unique theory of nature leads instead to 10⁵⁰⁰ theories, that approach has been reduced to absurdity,” strong language indeed considering the normal reserve of academic discourse. Still, the theory seems to be the only game in town at the moment. Perhaps a schema three or four theory will come along eventually. But the landscape approach at the very least, in my opinion, provides a way to look a creationist straight in the eye.

 
