
The Higgs Boson: Searching for the God Particle


by Scientific American Editors


  The protons will travel in nearly 3,000 bunches, spaced all around the 27-kilometer circumference of the collider. Each bunch of up to 100 billion protons will be the size of a needle, just a few centimeters long and squeezed down to 16 microns in diameter (about the same as the thinnest of human hairs) at the collision points. At four locations around the ring, these needles will pass through one another, producing more than 600 million particle collisions every second. The collisions, or events, as physicists call them, actually will occur between particles that make up the protons—quarks and gluons. The most cataclysmic of the smashups will release about a seventh of the energy available in the parent protons, or about 2 TeV. (For the same reason, the Tevatron falls short of exploring terascale physics by about a factor of five, despite the 1-TeV energy of its protons and antiprotons.)
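
  For readers who want to check these figures, here is a minimal back-of-envelope sketch in Python, using only the numbers quoted in this article plus the fact that the protons travel at essentially the speed of light; the bunch count and events-per-crossing value are the article's round figures, not official machine parameters.

```python
# Rough check of the LHC collision-rate and energy figures quoted in the text.
c = 3.0e8                  # speed of light in m/s; the protons move at essentially c
circumference_m = 27_000   # "27-kilometer circumference"
n_bunches = 2_800          # "nearly 3,000 bunches"
events_per_crossing = 20   # "as many as 20 events" per crossing (quoted later in the article)

revolution_freq = c / circumference_m           # about 11,000 laps per second
crossing_rate = revolution_freq * n_bunches     # bunch crossings per second at one point
collision_rate = crossing_rate * events_per_crossing

print(f"revolutions per second : {revolution_freq:,.0f}")
print(f"bunch crossings per sec: {crossing_rate:,.0f}")
print(f"collisions per second  : {collision_rate:,.0f}")   # roughly 600 million, as quoted

# Energy released in the hardest quark/gluon collisions:
beam_energy_tev = 7.0      # design energy of each proton beam
print(f"about 1/7 of the available energy: {2 * beam_energy_tev / 7:.1f} TeV")  # ~2 TeV
```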

  Four giant detectors—the largest would roughly half-fill the Notre Dame cathedral in Paris, and the heaviest contains more iron than the Eiffel Tower—will track and measure the thousands of particles spewed out by each collision occurring at their centers. Despite the detectors’ vast size, some elements of them must be positioned with a precision of 50 microns.

  The nearly 100 million channels of data streaming from each of the two largest detectors would fill 100,000 CDs every second, enough to produce a stack to the moon in six months. So instead of attempting to record it all, the experiments will have what are called trigger and data-acquisition systems, which act like vast spam filters, immediately discarding almost all the information and sending the data from only the most promising-looking 100 events each second to the LHC’s central computing system at CERN, the European laboratory for particle physics and the collider’s home, for archiving and later analysis.
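
  The scale of that filtering is easy to appreciate with one line of arithmetic; a minimal sketch using only the rates quoted in the article:

```python
# How aggressive is the trigger/data-acquisition "spam filter"?
collisions_per_second = 600e6    # "more than 600 million particle collisions every second"
events_kept_per_second = 100     # "the most promising-looking 100 events each second"
print(f"about 1 event kept out of every {collisions_per_second / events_kept_per_second:,.0f}")
# about 1 event kept out of every 6,000,000
```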

  A “farm” of a few thousand computers at CERN will turn the filtered raw data into more compact data sets organized for physicists to comb through. Their analyses will take place on a so-called grid network comprising tens of thousands of PCs at institutes around the world, all connected to a hub of a dozen major centers on three continents that are in turn linked to CERN by dedicated optical cables.

  Journey of a Thousand Steps

  In the coming months, all eyes will be on the accelerator. The final connections between adjacent magnets in the ring were made in early November, and as we go to press in mid-December one of the eight sectors has been cooled almost to the cryogenic temperature required for operation, and the cooling of a second has begun. One sector was cooled, powered up and then returned to room temperature earlier in 2007. After the operation of the sectors has been tested, first individually and then together as an integrated system, a beam of protons will be injected into one of the two beam pipes that carry them around the machine’s 27 kilometers.

  The series of smaller accelerators that supply the beam to the main LHC ring has already been checked out, bringing protons with an energy of 0.45 TeV “to the doorstep” of where they will be injected into the LHC. The first injection of the beam will be a critical step, and the LHC scientists will start with a low-intensity beam to reduce the risk of damaging LHC hardware. Only when they have carefully assessed how that “pilot” beam responds inside the LHC and have made fine corrections to the steering magnetic fields will they proceed to higher intensities. For the first running at the design energy of 7 TeV, only a single bunch of protons will circulate in each direction instead of the nearly 3,000 that constitute the ultimate goal.

  As the full commissioning of the accelerator proceeds in this measured step-by-step fashion, problems are sure to arise. The big unknown is how long the engineers and scientists will take to overcome each challenge. If a sector has to be brought back to room temperature for repairs, it will add months.

  The four experiments—ATLAS, ALICE, CMS and LHCb—also have a lengthy process of completion ahead of them, and they must be closed up before the beam commissioning begins. Some extremely fragile units are still being installed, such as the so-called vertex locator detector that was positioned in LHCb in mid-November. During my visit, as one who specialized in theoretical rather than experimental physics many years ago in graduate school, I was struck by the thick rivers of thousands of cables required to carry all the channels of data from the detectors—every cable individually labeled and needing to be painstakingly matched up to the correct socket and tested by present-day students.

  Although colliding beams are still months in the future, some of the students and postdocs already have their hands on real data, courtesy of cosmic rays sleeting down through the Franco-Swiss rock and passing through their detectors sporadically. Seeing how the detectors respond to these interlopers provides an important reality check that everything is working together correctly—from the voltage supplies to the detector elements themselves to the electronics of the readouts to the data-acquisition software that integrates the millions of individual signals into a coherent description of an “event.”

  All Together Now

  When everything is working together, including the beams colliding at the center of each detector, the task faced by the detectors and the data-processing systems will be Herculean. At the design luminosity, as many as 20 events will occur with each crossing of the needlelike bunches of protons. A mere 25 nanoseconds pass between one crossing and the next (some have larger gaps). Product particles sprayed out from the collisions of one crossing will still be moving through the outer layers of a detector when the next crossing is already taking place. Each individual element in the detector layers responds as a particle of the right kind passes through it. The millions of channels of data streaming away from the detector produce about a megabyte of data from each event: a petabyte, or a billion megabytes, of it every two seconds.
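
  Those numbers hang together under a simple estimate; here is a minimal sketch that ignores the gaps between bunch trains mentioned above, which is why it comes out somewhat higher than the petabyte-every-two-seconds figure:

```python
# Idealized raw data rate from one detector, ignoring gaps between bunch trains.
crossing_interval_s = 25e-9    # 25 nanoseconds between crossings
events_per_crossing = 20       # up to 20 collisions per crossing
bytes_per_event = 1e6          # about a megabyte of data per event

crossings_per_second = 1 / crossing_interval_s   # 40 million crossings per second
bytes_per_second = crossings_per_second * events_per_crossing * bytes_per_event
print(f"{bytes_per_second / 1e15:.1f} petabytes per second (idealized upper bound)")  # ~0.8
```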

  The trigger system that will reduce this flood of data to manageable proportions has multiple levels. The first level will receive and analyze data from only a subset of all the detector’s components, from which it can pick out promising events based on isolated factors such as whether an energetic muon was spotted flying out at a large angle from the beam axis. This so-called level-one triggering will be conducted by hundreds of dedicated computer boards—the logic embodied in the hardware. They will select 100,000 bunches of data per second for more careful analysis by the next stage, the higher-level trigger.
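
  In software terms, a level-one decision is a fast yes-or-no test on a small slice of coarse detector information (in the real experiments it is wired into dedicated hardware boards rather than written in a language like Python). The following is a purely illustrative sketch of the idea; the field names and threshold values are invented for this example, not taken from any experiment.

```python
# Hypothetical illustration of a level-one trigger decision: examine only a few
# coarse, quickly available quantities and make an immediate keep/discard choice.
# Field names and threshold values are invented for illustration.

def level_one_accept(crossing_summary: dict) -> bool:
    """Keep a bunch crossing if any simple signature looks promising."""
    # e.g. an energetic muon flying out at a large angle from the beam axis
    if (crossing_summary.get("max_muon_energy_gev", 0) > 20
            and crossing_summary.get("max_muon_angle_deg", 0) > 30):
        return True
    # or a large energy deposit in the calorimeters
    if crossing_summary.get("total_calorimeter_energy_gev", 0) > 100:
        return True
    return False

# The hardware version of this logic must cut tens of millions of crossings per
# second down to the 100,000 per second passed on to the higher-level trigger.
print(level_one_accept({"max_muon_energy_gev": 35, "max_muon_angle_deg": 60}))  # True
```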

  The higher-level trigger, in contrast, will receive data from all of the detector’s millions of channels. Its software will run on a farm of computers, and with an average of 10 microseconds elapsing between each bunch approved by the level-one trigger, it will have enough time to “reconstruct” each event. In other words, it will project tracks back to common points of origin and thereby form a coherent set of data—energies, momenta, trajectories, and so on—for the particles produced by each event.
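
  The 10-microsecond figure is simply the reciprocal of the level-one output rate, and it explains why the higher-level trigger needs a farm rather than a single machine: with many computers working in parallel, each event can be given far longer than 10 microseconds of processing. A small sketch of that arithmetic (the farm size below is illustrative only):

```python
# Time budget for the higher-level trigger.
level_one_output_rate = 100_000            # bunches per second passed by level one
avg_gap_s = 1 / level_one_output_rate      # 10 microseconds between accepted bunches
print(f"average gap between accepted bunches: {avg_gap_s * 1e6:.0f} microseconds")

farm_size = 1_000                          # illustrative number of computers in the farm
budget_s = avg_gap_s * farm_size           # wall-clock time each node can spend per event
print(f"per-event reconstruction budget with {farm_size} nodes: {budget_s * 1e3:.0f} ms")
```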

  The higher-level trigger passes about 100 events per second to the hub of the LHC’s global network of computing resources—the LHC Computing Grid. A grid system combines the processing power of a network of computing centers and makes it available to users who may log in to the grid from their home institutes.

  The LHC’s grid is organized into tiers. Tier 0 is at CERN itself and consists in large part of thousands of commercially bought computer processors, both PC-style boxes and, more recently, “blade” systems similar in dimensions to a pizza box but in stylish black, stacked in row after row of shelves. Computers are still being purchased and added to the system. Much like a home user, the people in charge look for the ever-moving sweet spot of most bang for the buck, avoiding the newest and most powerful models in favor of more economical options.

  The data passed to Tier 0 by the four LHC experiments’ data-acquisition systems will be archived on magnetic tape. That may sound old-fashioned and low-tech in this age of DVD-RAM disks and flash drives, but François Grey of the CERN Computing Center says it turns out to be the most cost-effective and secure approach.

  Tier 0 will distribute the data to the 12 Tier 1 centers, which are located at CERN itself and at 11 other major institutes around the world, including Fermilab and Brookhaven National Laboratory in the U.S., as well as centers in Europe, Asia and Canada. Thus, the unprocessed data will exist in two copies, one at CERN and one divided up around the world. Each of the Tier 1 centers will also host a complete set of the data in a compact form structured for physicists to carry out many of their analyses.

  The full LHC Computing Grid also has Tier 2 centers, which are smaller computing centers at universities and research institutes. Computers at these centers will supply distributed processing power to the entire grid for the data analyses.
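
  The tiered layout can be summed up schematically; in the sketch below only the overall structure and the named U.S. Tier 1 sites come from the article, and the remaining entries are illustrative placeholders.

```python
# Schematic summary of the LHC Computing Grid tiers described above.
lhc_computing_grid = {
    "Tier 0": {
        "sites": ["CERN"],
        "role": "receive ~100 events/s per experiment, archive raw data on tape, "
                "distribute copies to the Tier 1 centers",
    },
    "Tier 1": {
        "sites": ["CERN", "Fermilab", "Brookhaven", "plus 9 other major institutes"],
        "role": "hold a second, distributed copy of the raw data and compact data "
                "sets structured for physics analysis",
    },
    "Tier 2": {
        "sites": ["universities and research institutes worldwide"],
        "role": "supply distributed processing power to the whole grid for analyses",
    },
}

for tier, info in lhc_computing_grid.items():
    print(f"{tier}: {', '.join(info['sites'])} -- {info['role']}")
```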

  * * *

  Too Much Information

  With up to 20 collisions occurring at 25-nanosecond intervals at the center of each detector, the LHC produces more data than can be recorded. So-called trigger systems select the tiny fraction of the data that has promising features and discard the rest. A global network of computers called a grid provides thousands of researchers around the world with access to the stored data and the processing power to analyze it.

  Illustration by Slim Films

  * * *

  Rocky Road

  With all the novel technologies being prepared to come online, it is not surprising that the LHC has experienced some hiccups—and some more serious setbacks—along the way. Last March a magnet of the kind used to focus the proton beams just ahead of a collision point (called a quadrupole magnet) suffered a “serious failure” during a test of its ability to stand up against the kind of significant forces that could occur if, for instance, the magnet’s coils lost their superconductivity during operation of the beam (a mishap called quenching). Part of the supports of the magnet had collapsed under the pressure of the test, producing a loud bang like an explosion and releasing helium gas. (Incidentally, when workers or visiting journalists go into the tunnel, they carry small emergency breathing apparatuses as a safety precaution.)

  These magnets come in groups of three, to squeeze the beam first from side to side, then in the vertical direction, and finally again side to side, a sequence that brings the beam to a sharp focus. The LHC uses 24 of them, one triplet on each side of the four interaction points. At first the LHC scientists did not know if all 24 would need to be removed from the machine and brought aboveground for modification, a time-consuming procedure that could have added weeks to the schedule. The problem was a design flaw: the magnet designers (researchers at Fermilab) had failed to take account of all the kinds of forces the magnets had to withstand. CERN and Fermilab researchers worked feverishly, identifying the problem and coming up with a strategy to fix the undamaged magnets in the accelerator tunnel. (The triplet damaged in the test was moved aboveground for its repairs.)

  In June, CERN director general Robert Aymar announced that because of the magnet failure, along with an accumulation of minor problems, he had to postpone the scheduled start-up of the accelerator from November 2007 to spring of this year. The beam energy is to be ramped up faster to try to stay on schedule for “doing physics” by July.

  Although some workers on the detectors hinted to me that they were happy to have more time, the seemingly ever-receding start-up date is a concern because the longer the LHC takes to begin producing sizable quantities of data, the more opportunity the Tevatron has—it is still running—to scoop it. The Tevatron could find evidence of the Higgs boson or something equally exciting if nature has played a cruel trick and given the Higgs just enough mass for it to be showing up only now in Fermilab’s growing mountain of data.

  Holdups also can cause personal woes through the price individual students and scientists pay as they delay stages of their careers waiting for data.

  Another potentially serious problem came to light in September, when engineers discovered that sliding copper fingers inside the beam pipes known as plug-in modules had crumpled after a sector of the accelerator had been cooled to the cryogenic temperatures required for operation and then warmed back to room temperature.

  At first the extent of the problem was unknown. The full sector where the cooling test had been conducted has 366 plug-in modules, and opening up every one for inspection and possible repair would have been a hugely time-consuming job. Instead the team addressing the issue devised a scheme to insert a ball slightly smaller than a Ping-Pong ball into the beam pipe—just small enough to fit and be blown along the pipe with compressed air and large enough to be stopped at a deformed module. The sphere contained a radio transmitting at 40 megahertz—the same frequency at which bunches of protons will pass along the pipe when the accelerator is running at full capacity—so that its progress could be tracked by the beam sensors installed every 50 meters. To everyone’s relief, this procedure revealed that only six of the sector’s modules had malfunctioned, a manageable number to open up and repair.
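
  The choice of 40 megahertz was not arbitrary: it is the passage rate implied by the 25-nanosecond bunch spacing quoted earlier, which is presumably what the beam sensors are designed to pick up. A one-line check:

```python
# The 25-nanosecond bunch spacing corresponds to a 40 MHz passage rate, the same
# frequency as the ball's transmitter, so the existing beam sensors could track it.
bunch_spacing_s = 25e-9
print(f"{1 / bunch_spacing_s / 1e6:.0f} MHz")   # 40 MHz
```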

  When the last of the connections between accelerating magnets was made in November, completing the circle and clearing the way to start cooling down all the sectors, project leader Lyn Evans commented, “For a machine of this complexity, things are going remarkably smoothly, and we’re all looking forward to doing physics with the LHC next summer.”

  -Originally published: Scientific American 298(2), 39-45 (February 2008)

  The Coming Revolutions in Particle Physics

  by Chris Quigg

  When physicists are forced to give a single-word answer to the question of why we are building the Large Hadron Collider (LHC), we usually reply “Higgs.” The Higgs particle—the last remaining undiscovered piece of our current theory of matter—is the marquee attraction. But the full story is much more interesting. The new collider provides the greatest leap in capability of any instrument in the history of particle physics. We do not know what it will find, but the discoveries we make and the new puzzles we encounter are certain to change the face of particle physics and to echo through neighboring sciences.

  In this new world, we expect to learn what distinguishes two of the forces of nature—electromagnetism and the weak interactions—with broad implications for our conception of the everyday world. We will gain a new understanding of simple and profound questions: Why are there atoms? Why chemistry? What makes stable structures possible?

  The search for the Higgs particle is a pivotal step, but only the first step. Beyond it lie phenomena that may clarify why gravity is so much weaker than the other forces of nature and that could reveal the identity of the unknown dark matter that fills the universe. Even deeper lies the prospect of insights into the different forms of matter, the unity of outwardly distinct particle categories and the nature of spacetime. The questions in play all seem linked to one another and to the knot of problems that motivated the prediction of the Higgs particle to begin with. The LHC will help us refine these questions and will set us on the road to answering them.

  The Matter at Hand

  What physicists call the “Standard Model” of particle physics, to indicate that it is still a work in progress, can explain much about the known world. The main elements of the Standard Model fell into place during the heady days of the 1970s and 1980s, when waves of landmark experimental discoveries engaged emerging theoretical ideas in productive conversation. Many particle physicists look on the past 15 years as an era of consolidation in contrast to the ferment of earlier decades. Yet even as the Standard Model has gained ever more experimental support, a growing list of phenomena lies outside its purview, and new theoretical ideas have expanded our conception of what a richer and more comprehensive worldview might look like. Taken together, the continuing progress in experiment and theory points to a very lively decade ahead. Perhaps we will look back and see that a revolution had been brewing all along.

  Our current conception of matter comprises two main particle categories, quarks and leptons, together with three of the four known fundamental forces, electromagnetism and the strong and weak interactions. Gravity is, for the moment, left to the side. Quarks, which make up protons and neutrons, generate and feel all three forces. Leptons, the best known of which is the electron, are immune to the strong force. What distinguishes these two categories is a property akin to electric charge, called color. (This name is metaphorical; it has nothing to do with ordinary colors.) Quarks have color, and leptons do not.

  The guiding principle of the Standard Model is that its equations are symmetrical. Just as a sphere looks the same whatever your viewing angle is, the equations remain unchanged even when you change the perspective from which they are defined. Moreover, they remain unchanged even when the perspective shifts by different amounts at different points in space and time.

  Ensuring the symmetry of a geometric object places very tight constraints on its shape. A sphere with a bump no longer looks the same from every angle. Likewise, the symmetry of the equations places very tight constraints on them. These symmetries beget forces that are carried by special particles called bosons.

  In this way, the Standard Model inverts Louis Sullivan’s architectural dictum: instead of “form follows function,” function follows form. That is, the form of the theory, expressed in the symmetry of the equations that define it, dictates the function—the interactions among particles—that the theory describes. For instance, the strong nuclear force follows from the requirement that the equations describing quarks must be the same no matter how one chooses to define quark colors (and even if this convention is set independently at each point in space and time). The strong force is carried by eight particles known as gluons. The other two forces, electromagnetism and the weak nuclear force, fall under the rubric of the “electroweak” forces and are based on a different symmetry. The electroweak forces are carried by a quartet of particles: the photon, Z boson, W+ boson and W– boson.
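
  The counts of force carriers quoted here follow from standard bookkeeping about the symmetries: an SU(N) symmetry has N² − 1 independent “directions,” each carried by one boson, and a U(1) symmetry contributes one more. Here is a minimal sketch of that counting; the group assignments are standard Standard Model facts rather than something derived in this article.

```python
# Counting force carriers from the Standard Model's symmetry groups.
def su_generators(n: int) -> int:
    """An SU(N) symmetry has N**2 - 1 independent generators, each carried by a boson."""
    return n * n - 1

gluons = su_generators(3)            # color symmetry SU(3): 8 gluons
electroweak = su_generators(2) + 1   # SU(2) x U(1): W+, W-, plus mixtures giving the Z and photon
print(f"strong force carriers     : {gluons}")       # 8
print(f"electroweak force carriers: {electroweak}")  # 4 (photon, Z, W+, W-)
```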

 
