
The Particle at the End of the Universe: How the Hunt for the Higgs Boson Leads Us to the Edge of a New World


by Sean Carroll


  The ruse worked. After the war, scientists were able to recover the gold by precipitating the atoms out of de Hevesy’s solution. Bohr delivered the metal back to the Royal Swedish Academy of Sciences in Stockholm, which was able to recast von Laue’s and Franck’s Nobel medals. De Hevesy himself, who fled to Sweden in 1943, won the Nobel Prize in Chemistry in 1944—not for discovering new techniques in hiding contraband, but for the use of isotopes in tracing chemical reactions.

  In case it wasn’t obvious, people take Nobel Prizes very seriously. At the end of the nineteenth century, chemist Alfred Nobel, the inventor of dynamite, established prizes in Physics, Chemistry, Physiology or Medicine, Literature, and Peace, which have been awarded each year since 1901. (The Economics prize, begun in 1968, is run by a different organization.) Nobel passed away in 1896, and the executors of his will were surprised to find that he had donated 94 percent of his considerable fortune to the establishment of the prizes.

  In the years since, the Nobel Prizes have become universally recognized as the pinnacle of scientific recognition. That isn’t quite the same as scientific “achievement”—the Nobels have quite specific criteria, and there are endless arguments about how well the prizes match up with the truly important scientific discoveries. Nobel’s original will aimed the prizes at “those who, during the preceding year, shall have conferred the greatest benefit on mankind,” and the Physics prize in particular “to the person who shall have made the most important ‘discovery’ or ‘invention’ within the field of physics.” To some extent these instructions are simply ignored; after a few early prizes were given to findings that later turned out to be in error, nobody pretends anymore that the prizes recognize work done in the preceding year. Crucially, making a “discovery” is not the same as being recognized as one of the world’s leading scientists. Some discoveries are made somewhat by accident, by people who later leave the field. And some scientists do fantastic work over the course of a lifetime, but don’t quite have a single world-changing discovery that rises to the level of a Nobel.

  There are other criteria that highly constrain the Nobel choices. Prizes are not awarded posthumously, although if a laureate passes away between the time when the decision is made and when it is announced, the prize is still given to them. Most important for physics, the prize is not given to more than three people in any one year. Unlike the Peace prize, for example, the Physics prize isn’t given to an organization or a collaboration; it is given to three or fewer individuals. That poses something of a challenge in the Big Science era.

  When it comes to theoretical contributions, it’s not enough to be smart, or even to be right; you have to be right, and your theory has to be confirmed by experiment. Stephen Hawking’s most important contribution to science is the realization that black holes give off radiation due to the effects of quantum mechanics. The large majority of physicists believe he is right, but at this point it’s a purely theoretical result; we haven’t observed any evaporating black holes, and we don’t have any promising way of doing so with current technologies. It’s quite possible that Hawking will never win the Nobel Prize, despite his incredibly impressive contributions.

  To outsiders, it can sometimes seem like the whole point of doing research is to win the Nobel Prize. That’s not the case; the Nobel captures important moments in science, but scientists themselves recognize there is a rich tapestry of progress that includes many contributions, great and small, which build on one another over the years. Still—let’s admit it—winning the Nobel is a big deal, and physicists certainly keep track of which discoveries might someday qualify.

  There is no question that discovering the Higgs boson is the kind of achievement that is worthy of the Nobel Prize. For that matter, inventing the theory that predicted the Higgs in the first place is undoubtedly prize-worthy. But that doesn’t necessarily imply that any prizes are actually going to be given. Who might win them? Ultimately it’s not prizes that matter, it’s the science; but we have a good excuse for looking at the fascinating history of the ideas behind the Higgs boson and how physicists set about searching for it. The goal of this chapter is not to provide a definitive history, nor to adjudicate who deserves what prize. Quite the opposite: By looking at how the ideas developed over time, it should become clear that the Higgs mechanism, like many great ideas in science, involved many crucial steps on the way to the final answer. Attempting to draw a bright line between three (or fewer) people who deserve a prize and the many others who don’t does great violence to the reality of the development, even if it does make for good news copy.

  In this chapter we’re going to try to get the history right, although such a brief account will necessarily be incomplete. For history, however, the details often matter. Therefore, compared with the rest of the book, this chapter will go a little bit more into technical details. Feel free to skip over it, if you don’t mind missing out on some fascinating physics and compelling human drama.

  Superconductivity

  In Chapter Eight we explored the deep connection between symmetries and forces of nature. If we have a “local” or “gauge” symmetry—one that operates independently at each point in space—it necessarily comes with a connection field, and connection fields give rise to forces. This is how gravity and electromagnetism both work, and in the 1950s, Yang and Mills suggested a way to extend the idea to other forces of nature. The problem, as Wolfgang Pauli forcefully pointed out, is that the underlying symmetry always comes associated with massless boson particles. That’s part of the power of symmetries: They imply stringent restrictions on the properties that particles can have. The symmetry underlying electromagnetism, for example, implies that electric charge is exactly conserved.

  But forces mediated by massless particles—as far as anyone knew at the time—stretch over infinite distances and should be very easy to detect. Gravity and electromagnetism are the obvious examples, while the nuclear forces seem very different. Today we recognize that the strong and weak interactions are also Yang-Mills-type forces, with the massless particles hidden from us for different reasons: In the strong force the gluons are massless but confined inside hadrons, while in the weak force the W and Z bosons become massive because of spontaneous symmetry breaking.

  Back in 1949, American physicist Julian Schwinger had put forward an argument that forces based on symmetries would always be carried by massless particles. He kept thinking about the problem, however, and in 1961, he realized that his argument was not airtight: There was a loophole that allowed for the gauge bosons to get a mass. He wasn’t quite sure how it might actually happen, but he wrote a paper that pointed out his previous mistake. Schwinger was famously elegant and precise in his personal style as well as his physics research. He stood in contrast with Richard Feynman, with whom he and Sin-Itiro Tomonaga shared the Nobel Prize in 1965. Feynman was known for his boisterously informal personality and deeply intuitive approach to physics, while Schwinger was unfailingly meticulous and proper. When he wrote a paper pointing out a flaw in a well-accepted piece of conventional wisdom, people took him seriously.

  The question remained: What could cause the force-carrying bosons to get a mass? The answer came from a slightly unexpected source: not particle physics but condensed matter physics, the study of materials and their properties. In particular, it came from insights borrowed from the theory of superconductors—materials with no resistance to electricity, such as those that power the giant magnets in the LHC.

  Electrical current is the flow of electrons through a medium. In an ordinary conductor, the electrons keep bumping into atoms and other electrons, providing resistance to the flow. Superconductors are materials in which, when the temperature is low enough, current can flow through unimpeded. The first good theory of superconductors was put forward by Soviet physicists Vitaly Ginzburg and Lev Landau in 1950. They suggested that a special kind of field permeates the superconductor, which acts to give a mass to the ordinarily massless photon. They weren’t necessarily thinking of a new fundamental field of nature, but a collective motion of electrons, atoms, and electromagnetic fields—much like a sound wave doesn’t come from vibrations of a fundamental field, but from the collective motion of atoms in the air bumping into one another.

  Although Landau and Ginzburg proposed that some kind of field was responsible for superconductivity, they didn’t specify what that field actually was. That step was carried out by American physicists John Bardeen, Leon Cooper, and Robert Schrieffer, who invented what’s called the “BCS theory” of superconductivity in 1957. The BCS theory is one of the milestones of twentieth-century physics, and certainly deserves a book of its own. (This isn’t that book.)

  BCS borrowed an idea of Cooper’s, that pairs of particles could team up at very low temperatures. It’s these “Cooper pairs” that make up the mysterious field suggested by Landau and Ginzburg. While a single electron would continually meet resistance by bumping into the atoms around it, a Cooper pair can combine in a clever way so that every nudge that pushes on one electron exerts an equal and opposite pull on the other one (and vice versa). As a result, the paired electrons glide through the superconductor unimpeded.

  This is directly related to the fact that photons are effectively massive inside the superconductor. When particles are massless, their energy is directly proportional to their momentum and can range from zero up to any number you can imagine. Massive particles, by contrast, come with a minimum energy they can possibly have: their rest energy, given by E = mc2. When moving electrons are jostled by atoms and other electrons in a material, their electric field gently shakes, which creates very low-energy photons you would hardly ever notice. It’s that continual emission of photons that lets the electrons lose energy and slow down, diluting the current. Because photons obtain a mass in the Landau-Ginzburg and BCS theories, there is a certain minimum energy required to make them. Electrons that don’t have enough energy can’t make any photons, and therefore can’t lose energy: The Cooper pairs flow through the material with zero resistance.
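The logic of that last step can be condensed into a back-of-the-envelope inequality (a schematic illustration, not from the original text):

```latex
% In empty space a photon's energy can be arbitrarily small:
E_\gamma = \hbar\omega, \qquad \omega \to 0 \ \text{allowed}
% Inside a superconductor the photon acquires an effective mass m_\gamma,
% so there is a minimum energy cost to emit one:
E_\gamma \ge m_\gamma c^2
% An electron whose available energy is below that gap cannot radiate:
E_{\text{available}} < m_\gamma c^2 \;\Longrightarrow\; \text{no photon emission, hence no resistance}
```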

  Electrons, of course, are fermions, not bosons. But when they come together to make Cooper pairs, the result forms a boson. We have defined bosons as force-carrying fields that can pile up, as opposed to fermions, which are matter fields that take up space. As we discuss in Appendix One, fields have a property called “spin” that also distinguishes bosons from fermions. All bosons have spins that are whole numbers: 0, 1, 2 . . . Fermions, meanwhile, have spins that are whole numbers plus one-half: 1/2, 3/2, 5/2 . . . The electron is a fermion with spin equal to 1/2. When particles get together, their spins can either add or subtract; so a pair of electrons can have either spin-0 or spin-1—just right for making bosons.
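For readers comfortable with the notation, the spin bookkeeping can be written compactly (an illustrative aside using standard quantum-mechanical conventions, not drawn from the text itself):

```latex
% Combining two spin-1/2 particles yields total spin 0 or 1:
\tfrac{1}{2} \otimes \tfrac{1}{2} = 0 \oplus 1
% e.g., the spin-0 ("singlet") combination of two electrons:
|0,0\rangle = \tfrac{1}{\sqrt{2}}\bigl(\,|{\uparrow\downarrow}\rangle - |{\downarrow\uparrow}\rangle\,\bigr)
% Either outcome is an integer, so a Cooper pair behaves as a boson.
```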

  This introduction is deeply unfair to the intricacies of the Landau-Ginzburg and BCS theories, which tell a rich story of many kinds of particles moving together in an intrinsically quantum-mechanical way. For our present purposes, the take-home message is straightforward: A bosonic field pervading space can give a mass to photons.

  Spontaneous symmetry breaking

  That last statement sounds pretty close to the Higgs idea. But a puzzle remained: How do we reconcile the idea that photons have mass inside a superconductor with the conviction that the underlying symmetry of electromagnetism forces the photon to be massless?

  This problem was tackled by a number of people, including American physicist Philip Anderson, Soviet physicist Nikolay Bogolyubov, and Japanese-American physicist Yoichiro Nambu. The key turned out to be that the symmetry was indeed there, but that it was hidden by a field that took on a nonzero value in the superconductor. According to the jargon that accompanies this phenomenon, we say the symmetry is “spontaneously broken”: The symmetry is there in the underlying equations, but the particular solution to those equations in which we are interested doesn’t look very symmetrical.

  Yoichiro Nambu, despite the fact that he won the Nobel Prize in 2008 and has garnered numerous other honors over the years, remains relatively unknown outside physics. That’s a shame, as his contributions are comparable to those of better-known colleagues. Not only was he one of the first to understand spontaneous symmetry breaking in particle physics, he was also the first to propose that quarks carry color, to suggest the existence of gluons, and to point out that certain particle properties could be explained by imagining that the particles were really tiny strings, thus launching string theory. Theoretical physicists admire Nambu’s accomplishments, but his inclination is to avoid the limelight.

  Nambu’s office was across the hall from mine while I was a faculty member at the University of Chicago. We didn’t interact much, but when we did he was unfailingly gracious and polite. My major encounter with him was one time when he knocked on my door, hoping that I could help him with the email system on the theory group computers, which tended to take time off at unpredictable intervals. I wasn’t much help, but he took it philosophically. Peter Freund, another theorist at Chicago, describes Nambu as a “magician”: “He suddenly pulls a whole array of rabbits out of his hat, and before you know it, the rabbits reassemble in an entirely novel formation and by God, they balance the impossible on their fluffy cottontails.” His highly developed sense of etiquette, however, failed him when he was briefly appointed as department chair: Reluctant to explicitly say no to any question, he would indicate disapproval by pausing before saying yes. This led to a certain amount of consternation among his colleagues, once they realized that their requests hadn’t actually been granted.

  After the BCS theory was proposed, Nambu began to study the phenomenon from the perspective of a particle physicist. He put his finger on the key role played by spontaneous symmetry breaking and began to wonder about its wider applicability. One of Nambu’s breakthroughs was to show (partly in collaboration with Italian physicist Giovanni Jona-Lasinio) how spontaneous symmetry breaking could happen even if you weren’t inside a superconductor. It could happen in empty space, in the presence of a field with a nonzero value—a clear precursor to the Higgs field. Interestingly, this theory also showed how a fermion field could start out massless but gain mass through the process of symmetry breaking.

  As brilliant as it was, Nambu’s suggestion of spontaneous symmetry breaking came with a price. While his models gave masses to fermions, they also predicted a new massless boson particle—exactly what particle physicists were trying to avoid, since they didn’t see any such particles created by the nuclear forces. These weren’t gauge bosons, since Nambu was considering the spontaneous breakdown of global symmetries rather than local ones; these were a new kind of massless particle. Soon thereafter, British physicist Jeffrey Goldstone argued that this wasn’t just an annoyance: Spontaneously breaking a global symmetry always gives rise to massless particles, now called “Nambu-Goldstone bosons.” Pakistani physicist Abdus Salam and American physicist Steven Weinberg then collaborated with Goldstone in promoting this argument to what seemed like an airtight proof, now called “Goldstone’s theorem.”

  One question that must be addressed by any theory of broken symmetry is, what is the field that breaks the symmetry? In a superconductor the role is played by the Cooper pairs, composite states of electrons. In the Nambu–Jona-Lasinio model, a similar effect happens with composite nucleons. Starting with Goldstone’s 1961 paper, however, physicists became comfortable with the idea of simply positing a set of new fundamental boson fields whose job it was to break symmetries by taking on a nonzero value in empty space. The kind of fields required are known as “scalar” fields, which is a way of saying they have no intrinsic spin. The gauge fields that carry forces, although they are also bosons, have spin-1, except for the graviton, which is spin-2.

  [Figure: What happens when you spontaneously break a global symmetry. Without symmetry breaking, there would be a certain number N of scalar bosons with equal masses. After the symmetry is broken, all but one of them become massless Nambu-Goldstone bosons; the remaining one is massive.]

  If the symmetry wasn’t broken, all the fields in Goldstone’s model would behave in exactly the same way, as massive scalar bosons, due to the requirements of the symmetry. When the symmetry is broken, the fields differentiate themselves. In the case of a global symmetry (a single transformation all throughout space), which is what Goldstone considered, one field remains massive, while the others become massless Nambu-Goldstone bosons—that’s Goldstone’s theorem.
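The standard textbook toy model of this phenomenon (not spelled out in the text, but consistent with it) is a complex scalar field rolling in a “Mexican hat” potential:

```latex
% A potential whose shape is symmetric but whose minima are not:
V(\phi) = -\mu^2 |\phi|^2 + \lambda |\phi|^4
% The minimum sits at a nonzero field value, breaking the phase symmetry:
|\phi|_{\min} = \sqrt{\mu^2 / (2\lambda)}
% Fluctuations in the radial direction cost energy (a massive mode);
% fluctuations along the circle of minima cost none:
% that flat direction is the massless Nambu-Goldstone mode.
```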

  Reconciliation

  This was bad news. It seemed as if, even if you followed BCS and Nambu to use spontaneous symmetry breaking as a way to give mass to the hypothetical Yang-Mills bosons that could carry the nuclear forces, the very technique you employed gave rise to another kind of massless boson that wasn’t seen in experiments.

  Fortunately, the resolution to this puzzle was known almost as soon as the puzzle arose. At least it was known to Phil Anderson at Bell Labs, and he tried his best to share it with the world. Anderson, who won the Nobel in 1977, is recognized as one of the world’s leading condensed-matter physicists. He has been a vocal champion for the intellectual status of condensed matter as a field; his celebrated 1972 article entitled “More Is Different” helped spread the word that studying the collective behavior of many particles was at least as interesting and fundamental as studying the underlying laws obeyed by the particles themselves. In contrast to the reticent Nambu, Anderson has always been willing to speak his mind, often in provocative ways. The subtitle of a collection of his essays is “Notes from a Thoughtful Curmudgeon,” and the biography on the back flap informs us that “at press time he was involved in several scientific controversies about high profile subjects, in which his point of view, though unpopular at the moment, is likely to prevail eventually.”

 
