
The Trouble With Physics: The Rise of String Theory, The Fall of a Science, and What Comes Next


by Lee Smolin


  Changes in these ratios would be measurable by changes in the frequencies of light emitted by atoms. Atoms emit light in a spectrum made up of many discrete frequencies, so there are many ratios defined by pairs of these frequencies. One can ask whether these ratios are different in the light from faraway stars and galaxies—that is, in light that is billions of years old.

  Experiments of this kind have failed to detect changes in the constants of nature within our galaxy or among nearby galaxies. On time scales of millions of years, then, the constants have not changed in any detectable way. But an ongoing experiment by a group in Australia has found changes in the ratios by looking at light from quasars—light that was emitted on the order of 10 billion years ago. The Australian scientists don’t study the atomic spectrum of the quasar itself; what they do is more clever than that. On its way from the quasar to us, the light travels through many galaxies. Each time it passes through a galaxy, some light is absorbed by atoms in that galaxy. Atoms absorb light at specific frequencies, but because of the Doppler effect, the frequency at which the light is absorbed is shifted toward the red end of the spectrum by an amount proportional to that galaxy’s distance from us. The result is that the spectrum of light from the quasar is decorated by a forest of lines, each corresponding to light being absorbed by a galaxy a particular distance from us. By studying ratios of the frequencies of these lines, we can look for changes in the fundamental constants over the time that the light has been traveling from the quasar. Because a change must show up as a frequency ratio and there are several fundamental constants, physicists have settled on the simplest ratio to study—the fine structure constant, which is made up of the constants that determine the properties of the atom. It is called alpha, and it is equal to the square of the electron’s charge divided by the product of the speed of light and Planck’s constant.
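
  For readers who want to see the number itself, here is a minimal check (a sketch of mine, not the book’s), evaluating alpha from standard SI values of the constants. In SI units the combination picks up a factor of 4πε₀ and uses the reduced Planck constant, but it is the same combination of constants described above.

```python
# Minimal numerical check of the fine-structure constant alpha (illustrative sketch).
# In SI units: alpha = e^2 / (4*pi*eps0*hbar*c); in Gaussian units this reduces to
# the e^2/(hbar*c) combination described in the text.
import math

e    = 1.602176634e-19    # electron charge, C
eps0 = 8.8541878128e-12   # vacuum permittivity, F/m
hbar = 1.054571817e-34    # reduced Planck constant, J*s
c    = 2.99792458e8       # speed of light, m/s

alpha = e**2 / (4 * math.pi * eps0 * hbar * c)
print(alpha, 1 / alpha)   # ~0.0072974, ~137.04
```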

  The Australians studied measurements of light from a sample of eighty quasars, using very precise spectra taken by the Keck telescope, in Hawaii. They deduced from their data that around 10 billion years ago, alpha was smaller by about 1 part in 10,000.8

  This is a small change, but if it holds up, it is a momentous discovery, the most important in decades. This would be the first time that a fundamental constant of nature had been seen to vary over time.

  Many of the astronomers I know are keeping an open mind. By all accounts, the data have been taken and analyzed with extreme care. No one has found an obvious flaw in the Australian team’s method or results, but the experiment itself is very delicate, the precisions involved are at the edge of what is possible, and we cannot rule out the possibility that some error has crept into the analysis. As of this writing, the situation is messy, as is typical for a new experimental technique. Other groups are attempting the same measurements, and the results are controversial.9

  Many theorists are skeptical of the indications of a variation in the fine structure constant. They worry that such a variation would be exceedingly unnatural, in that it would introduce to the theory of electrons, nuclei, and atoms a time scale many orders of magnitude removed from the scale of atomic physics. Of course, they could have said this about the scale of the cosmological constant. In fact, the scale at which the fine structure constant varies is not close to anything else that has been measured, except for the cosmological constant itself. So perhaps this is another mysterious phenomenon having to do with the scale R.

  Yet another manifestation of the scale R may be the mysterious neutrino masses. You can convert the length scale R to a mass scale, using just the fundamental constants of physics, and the result is the same order of magnitude as the differences between the masses of the various kinds of neutrinos. No one knows why neutrinos, the lightest particles there are, should have masses related to R, but there it is—another tantalizing hint.
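
  One way to see how a mass scale of roughly the right size can be built from R and the fundamental constants is sketched below. This is an illustrative reading, not a calculation spelled out in the book: it assumes R is of the order of the Hubble radius and that the relevant combination is the geometric mean of the Planck energy and the energy ħc/R.

```python
# Sketch: an energy scale built from R and the constants hbar, c, G (assumptions noted).
import math

hbar, c, G = 1.054571817e-34, 2.99792458e8, 6.67430e-11   # SI values
eV = 1.602176634e-19                                       # joules per eV
R  = 1.3e26                                                # m, roughly the Hubble radius (assumed)

E_R      = hbar * c / R / eV                 # ~1.5e-33 eV, the energy associated with R
E_planck = math.sqrt(hbar * c**5 / G) / eV   # ~1.2e28 eV, the Planck energy
print(math.sqrt(E_R * E_planck))             # ~4e-3 eV; measured neutrino mass splittings
                                             # are ~1e-2 to 5e-2 eV, the same rough ballpark
```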

  There could be a final experimental hint involving the scale R. By combining it with Newton’s gravitational constant, we can conclude that there may be effects that alter the gravitational force on the scale of millimeters. Currently a group at the University of Washington, led by Eric Adelberger, has been making ultraprecise measurements of the force of gravity between two objects that are millimeters apart. As of June 2006, all they are able to say publicly is that they see no evidence that Newton’s laws are wrong down to scales of 6/100 of a millimeter.
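
  Here is a rough sketch, along the same lines, of how combining R with Newton’s constant singles out a sub-millimeter length. Again this is an illustration under stated assumptions, not the book’s own derivation: G, ħ, and c fix the Planck length, and its geometric mean with R lands near the distances the Washington group probes.

```python
# Sketch: a sub-millimeter length built from R and Newton's constant (assumptions noted).
import math

hbar, c, G = 1.054571817e-34, 2.99792458e8, 6.67430e-11   # SI values
R = 1.3e26                                                 # m, roughly the Hubble radius (assumed)

l_planck = math.sqrt(hbar * G / c**3)        # ~1.6e-35 m, the Planck length
print(math.sqrt(l_planck * R) * 1e3)         # ~0.05 mm, near the 6/100 mm limit quoted above
```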

  If nothing else, our experiments should certainly test the fundamental principles of physics. There is a great tendency to think that these principles, once discovered, are eternal, yet history tells a different story. Almost every principle once proclaimed has been superseded. No matter how useful they are or how good an approximation they give to phenomena, sooner or later most principles fail, as experiment probes the natural world more accurately. Plato proclaimed that everything in the celestial sphere moves on circles. There were good reasons for this: Everything above the sphere of the moon was believed to be eternal and perfect, and no motion is more perfect than uniform motion in a circle. Ptolemy adopted this principle and enhanced it by constructing epicycles—circles moving on circles.

  The orbits of the planets are indeed very nearly circular, and the motion of the planets in their orbits is almost uniform. Somehow it is fitting that the least circular of the planetary orbits is that of unruly Mars—and its orbit is so close to circular that the deviations are at the limit of what can be deduced from the best possible naked-eye observations. In 1609, after nine years of painstaking work on the Martian orbit, Johannes Kepler realized that it must be an ellipse. That year, Galileo turned a telescope to the sky and began a new era of astronomy, in which it eventually became clear that Kepler was right. Circles were the most perfect shapes, but planetary orbits are not circular.

  When the ancients declared the circle the most perfect shape, they meant that it was the most symmetric: Each point on the orbit is the same as any other. The principles that are hardest to give up are those that appeal to our need for symmetry and elevate an observed symmetry to a necessity. Modern physics is based on a collection of symmetries, which are believed to enshrine the most basic principles. No less than the ancients, many modern theorists believe instinctively that the fundamental theory must be the most symmetric possible law. Should we trust this instinct, or should we listen to the lesson of history, which tells us that (as in the example of the planetary orbits) nature becomes less rather than more symmetric the closer we look?

  The symmetries most deeply embedded in contemporary theory are those that come from Einstein’s special and general theories of relativity. The most basic of these is the relativity of inertial frames. It is actually Galileo’s principle, and it has been a foundational idea of physics since the seventeenth century. It says that we cannot distinguish motion with a constant speed and direction from rest. It is this principle that is responsible for the fact that we don’t feel the motion of the earth, or our motion in an airplane cruising at constant speed in the sky. As long as there is no acceleration, you cannot feel your own motion. Another way to express this is that there is no preferred observer and no preferred frame of reference: As long as acceleration is absent, one observer is as good as another.

  What Einstein did in 1905 was to apply this principle to light. A consequence is that the speed of light must be considered a constant, independent of the motion of the light source or the observer. No matter how we are moving relative to each other, you and I will attribute exactly the same speed to a photon. This is the basis of Einstein’s special theory of relativity.

  Given the special theory of relativity, we can make many predictions about the physics of the elementary particles. Here is one concerning cosmic rays. These are a population of particles, believed to be mainly protons, that travel through the universe. They arrive at the top of Earth’s atmosphere, where they collide with atoms in the air, producing showers of other kinds of particles, which can be detected on the ground. No one knows the source of these cosmic rays, but the higher their energy, the rarer they are. They have been observed at energies more than 100 billion times the mass of the proton. To have this energy, protons must be moving very, very close to the speed of light—a speed limit that, according to special relativity, no particle is allowed to break.

  Cosmic rays are believed to come from distant galaxies; if so, they must travel millions and perhaps billions of light-years across the universe before arriving here. Back in 1966, two Soviet physicists, Georgiy Zatsepin and Vadim Kuzmin, and (independently) the Cornell University physicist Kenneth Greisen made a striking prediction about cosmic rays, using only the special theory of relativity.10 Their prediction, usually known as the GZK prediction, is worth describing, because it is just now being tested. It is the most extreme test of special relativity ever made. It is, in fact, the first test of special relativity approaching the Planck scale, the scale at which we might see the effects of a quantum theory of gravity.

  Good scientists take advantage of all the tools at their disposal. What Greisen, Zatsepin, and Kuzmin understood is that we have access to a laboratory vastly larger than any we could build on Earth—the universe itself. We can detect cosmic rays that arrive on Earth after traveling for billions of years over a substantial proportion of the universe. As they travel, very small effects—effects that would be too tiny to show up in earthly experiments—may amplify to the point where we can see them. If we use the universe as an experimental tool, we can see much deeper into the structure of nature than people ever imagined.

  The key point is that the space that cosmic rays are traveling through is not empty; it is filled with the cosmic microwave background radiation. Greisen and the Soviet scientists realized that protons of energy greater than a particular value would interact with the photons in the background radiation and that this interaction would create particles (most likely pions, or pi-mesons). This particle creation would take energy, and because energy is conserved, the high-energy protons would be slowed down. Thus, space is in effect opaque to the passage of any protons that carry more energy than that needed to make pions.

  Space therefore functions as a kind of filter. The protons making up cosmic rays can travel only if they have less energy than that required to make pions. If they have more, they make pions and slow down and keep doing so until they slow to the point at which they can no longer make pions. It is as if the universe had imposed a speed limit on protons. Greisen, Zatsepin, and Kuzmin predicted that no protons would arrive at Earth with more than the energy needed to make pions in this way. The energy at which they predicted that this pion creation would happen is about a billionth of the Planck energy (10¹⁹ GeV) and is called the GZK cutoff.
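
  For a sense of where that number comes from, here is a small sketch of the threshold kinematics for a proton hitting a cosmic-microwave-background photon head-on and producing a pion. The photon energy used below is a typical CMB value (my assumption); averaging over the real photon spectrum and over collision angles brings the effective cutoff down to the usually quoted few times 10¹⁹ eV.

```python
# Sketch of the GZK threshold: p + gamma(CMB) -> p + pi0 becomes possible when the
# invariant mass satisfies  m_p^2 + 2*E_p*E_gamma*(1 - cos(theta)) >= (m_p + m_pi)^2.
# For a head-on collision (cos(theta) = -1) this gives the threshold below.
m_p     = 0.938272    # proton mass, GeV
m_pi    = 0.134977    # neutral pion mass, GeV
E_gamma = 6.3e-13     # GeV, a typical CMB photon energy at 2.7 K (assumed value)

E_threshold = ((m_p + m_pi)**2 - m_p**2) / (4 * E_gamma)
print(E_threshold)    # ~1.1e11 GeV, i.e. ~1e20 eV
```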

  This is an enormous energy, closer to the Planck energy than anything else we know of. It is more than 10 million times the energy that will be made in the most sophisticated particle accelerators currently planned. The GZK prediction provides a stringent test of Einstein’s special theory of relativity. It probes the theory at a much higher energy, and for a velocity much closer to the speed of light, than any experiment done, or even feasible, on Earth. In 1966, when the GZK prediction was made, only cosmic rays with energies much lower than the predicted cutoff were being seen, but recently a few instruments have been built that can detect cosmic-ray particles at or even above the predicted cutoff. One such experiment, called AGASA (for Akeno Giant Air Shower Array), carried out in Japan, reported at least a dozen such extreme events. The energy involved in these events is greater than 3 × 10²⁰ electron volts—roughly the energy a pitcher can give to a fastball, but all carried by one proton.
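
  As a quick check of that comparison (a back-of-the-envelope sketch; the baseball mass is an assumed figure, not from the book):

```python
# Back-of-the-envelope check: 3e20 eV expressed as the kinetic energy of a thrown baseball.
import math

eV     = 1.602176634e-19   # joules per eV
E      = 3e20 * eV         # ~48 J, carried here by a single proton
m_ball = 0.145             # kg, a regulation baseball (assumed)

v = math.sqrt(2 * E / m_ball)
print(E, v)                # ~48 J, the energy of a baseball thrown at ~26 m/s
```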

  These events may be a signal that special relativity is breaking down at extreme energies. Sidney Coleman and Sheldon Glashow proposed in the late 1990s that a breakdown in special relativity could raise the energy required to make pions, thus raising the GZK cutoff energy and allowing protons of much higher energy to reach our detectors on Earth.11

  This is not the only possible explanation for the observation of these higher-energy cosmic-ray protons. It is possible that they originated close enough to Earth so that they haven’t had time to be slowed down by interaction with the cosmic microwave background. This can be checked by seeing if the protons in question arrived from any preferred place in the sky. There is so far no such evidence, but it remains a possibility.

  It is also possible that these extremely high-energy particles are not protons at all. They could be a so-far-unknown species of stable particle with a mass much higher than that of protons. If so, this, too, would be a major discovery.

  It is of course always possible that the experiments are wrong. The AGASA team report that their measurements of energy are accurate up to an uncertainty of about 25 percent, which is a big percentage of error, but still not enough to explain the existence of the highest-energy events they see. However, their estimate of their experiment’s degree of accuracy could be wrong.

  Luckily, an experiment now in progress will resolve the disagreement. This is the Auger cosmic-ray detector, now in operation on the pampas of western Argentina. If the Auger detectors confirm the Japanese observation and if the other possible explanations can be discounted, this would be the most momentous discovery of the last hundred years—the first breakdown of the basic theories comprising the twentieth century’s scientific revolution.

  What does it take to observe cosmic-ray particles of such an extreme energy? When a particle of this energy strikes the top of the atmosphere, it produces a shower of other particles that rain down over an area of many square kilometers. The Auger experiment consists of hundreds of detectors placed over 3,000 square kilometers of the Argentinean pampas. Several HiRes (for “high resolution”) light sensors at the site also scan the sky to catch the light produced by the particle shower. By combining the signals made by all these detectors, the Auger researchers can determine the energy of the original particle that struck the atmosphere, as well as the direction from which it came.

  As of this writing, the Auger Observatory is just releasing its first data. The good news is that the experiment is working well, but there is still not quite enough data to decide whether the cutoff predicted on the basis of special relativity is there or not. Still, it is reasonable to hope that after running for a few years, there will be enough data to settle the issue.

  Even if the Auger team announces that special relativity remains viable, this finding alone will be the most important in fundamental physics in at least twenty-five years—that is, since the failure to find proton decay (see chapter 4). The long dark era in which theory developed without the guidance of experiment will finally be over. But if Auger discovers that special relativity is not completely right, it will usher in a new era in fundamental physics. It’s worth taking some time to explore the implications of such a revolutionary finding and where it might lead.

  14

  Building on Einstein

  SUPPOSE THAT THE AUGER project or some other experiment shows that Einstein’s special theory of relativity breaks down. This would be bad news for string theory: It would mean that the first great experimental discovery of the twenty-first century was totally unanticipated by the most popular “theory of everything.” String theory assumes that special relativity is true, exactly as written down by Einstein a hundred years ago. Indeed, a major achievement of string theory was to make a theory of strings consistent with both quantum theory and special relativity. So string theory predicts that no matter how distant their sources are from one another, photons of different frequencies travel at the same speed. As we have seen, string theory does not make many predictions, but this is one; in fact, it’s the only prediction of string theory that can be tested by present technology.

  What would it mean for the predictions of special relativity to be falsified? There are two possibilities. One is that special relativity is wrong, but the other possibility leads to a deepening of it. On this distinction rides a tale of perhaps the most surprising new idea to have emerged in fundamental physics in the last decade.

  There are several experiments that could reveal a breakdown or modification of special relativity. The Auger experiment could do it, but so could our observations of gamma-ray bursts. These are enormous explosions that for a few seconds can produce as much light as that emitted by a whole galaxy. As the name implies, most of this light is radiated in gamma rays, which are a highly energetic form of photons. Signals from these explosions reach Earth on average about once a day. They were first detected in the late 1960s, by military satellites designed to look for illegal tests of nuclear weapons. Now they are observed by scientific satellites, whose purpose is to detect them.

  We do not know exactly what the sources of gamma-ray bursts are, although there are plausible theories. They may come from the collision of two neutron stars or of a neutron star and a black hole. Either pair would have orbited each other for billions of years, but such systems are unstable. As they radiate energy in gravitational waves, they spiral very slowly toward each other, until eventually they collide in the most violent and energetic events known.

  Einstein’s special theory of relativity tells us that all light travels at the same speed, no matter what its frequency. The gamma-ray bursts provide a laboratory to test this claim, because they give a very short burst of photons in a wide range of energies. Most important, they can take billions of years to reach us, and herein lies the heart of the experiment.

  Suppose Einstein is wrong, and photons of different energies travel at slightly different speeds. If two photons created in the same distant explosion were to arrive on Earth at different times, this would surely indicate a breakdown in the special theory of relativity.
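
  To get a feel for the sizes involved, here is a sketch of the arrival-time difference under one commonly studied, and here purely illustrative, assumption: that the photon speed shifts linearly with energy, suppressed by the Planck energy. None of the specific numbers below come from the book.

```python
# Sketch: arrival-time delay if photon speed depends linearly on energy,
# v(E) ~ c * (1 - E/E_QG). All parameter choices are illustrative assumptions.
E_QG     = 1.22e19       # GeV, taken here to be the Planck energy
E_photon = 10.0          # GeV, a high-energy gamma-ray photon
year     = 3.156e7       # seconds
travel   = 10e9 * year   # light-travel time from a source ~10 billion light-years away

delta_t = (E_photon / E_QG) * travel
print(delta_t)           # ~0.26 s: small, but measurable against a short burst
```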

  What would such a momentous discovery imply? This would first depend on the physical scale at which the breakdown occurred. One place where we do expect special relativity to crumble is at the Planck length. Recall from the preceding chapter that the Planck scale is about 10⁻²⁰ times the size of a proton. Quantum theory tells us that this scale represents a threshold below which the classical picture of spacetime disintegrates. Einstein’s special theory of relativity is part of that classical picture, so we might expect it to break down at just that point.
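
  That ratio can be checked directly. Here is a minimal sketch, using the standard expression for the Planck length and an approximate value for the proton’s size:

```python
# Check that the Planck length is ~1e-20 of the proton's size.
import math

hbar, c, G = 1.054571817e-34, 2.99792458e8, 6.67430e-11   # SI values
l_planck = math.sqrt(hbar * G / c**3)    # ~1.6e-35 m
r_proton = 0.84e-15                      # m, approximate proton charge radius
print(l_planck / r_proton)               # ~2e-20
```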

 
