Many of the BSM avenues of research confront the twin problems of lack of unitarity and lack of renormalizability in the weak-interaction theory. If it turns out that there is no Higgs boson, the thinking goes, there must be other ways of solving these problems. The culprit in the weak-interaction theory is the massive charged W boson; without the Higgs boson and the Higgs mechanism, the W boson’s mass destroys the gauge invariance of the theory. We recall that this leaves the theory neither renormalizable nor unitary, unlike QED. Thus, without some radical new way of solving the weak-interaction problem, we are faced with a failed theory. In QED, because the photon is massless and the theory possesses the all-important gauge invariance, the photon’s interaction with electrons yields finite calculations of cross-sections, and there is no problem with the unitarity and probabilities of scattering amplitudes. In the weak interactions, however, the W boson has to be massive to produce the short range of the weak force, and this spoils the QED picture. Without the Higgs boson, we may lose both renormalizability and unitarity. The Higgs boson and its accompanying spontaneous symmetry-breaking mechanism rescued us from this W boson mass problem, and that solution has been entertained for more than 40 years. The whole superstructure of the theory could collapse if the Higgs boson and the spontaneous symmetry-breaking mechanism were removed.
A major part of the standard model of particle physics is quantum chromodynamics (QCD), which describes the strong interactions of quarks with gluons using the nonabelian group SU(3). Because the gluons are massless, the theory is fully gauge invariant and also renormalizable. Therefore, the weak interactions, involving the leptons, the W and Z bosons, and the photon, present a greater need for BSM physics than the strong interactions, which involve quarks and gluons. We must somehow succeed in making the electroweak theory finite and meaningful to reach the goal of a successful standard model of particle physics.
TECHNICOLOR
In solid-state physics, the explanation of how superconductivity works lies in pairs of electrons (Cooper pairs) binding and condensing together at very low temperatures in metals. In the standard model of particle physics as proposed by Weinberg and Salam, the Higgs boson is considered an elementary scalar particle, not a composite of other particles. But what if the Higgs is a composite of other particles, like the Cooper pair condensates in superconductivity? In 1976, Steven Weinberg,3 followed by Leonard Susskind in 1979,4 and Savas Dimopoulos and Susskind also in 1979,5 proposed another way of solving the weak-interaction problem of lack of renormalizability and unitarity. They called their solution “Technicolor.” What they proposed was that we replicate the quark theory, QCD, at a higher energy with new particles, called Technicolor particles because they would carry the quark characteristic called color. (Recall that hadrons are observed to be colorless because they are composite particles containing quarks, whose three colors combine, or cancel out, to form a white or colorless hadron.) According to this theory, the Higgs boson is a composite of these Technicolor particles. In the original models, these physicists were concerned only with predicting the masses of the W and Z bosons.
Some versions of Technicolor do not have a Higgs boson at all. Instead of proposing an elementary Higgs boson to explain electroweak phenomena, the Technicolor models “hide” the electroweak symmetry and generate the W and Z boson masses through the dynamics of new, postulated gauge interactions. These new gauge interactions are effectively invisible at lower energies, consistent with the low-energy experimental data, which so far show no sign of them.
The early versions of Technicolor were extended so that one could predict the masses of the quarks and leptons. However, these models ran into trouble: they predicted flavor-changing neutral currents in the decays of Technicolor particles at rates that conflict with experimental data. Bob Holdom then proposed a way of avoiding these problems by introducing a type of fine-tuning called walking Technicolor.
A notable feature of the Technicolor theory is that the interactions of the Technicolor particles are intrinsically strong, not weak. Although the Technicolor model is a copy of lower-energy QCD, new strong forces have to be postulated to bind the Technicolor particles together. Therefore, the perturbation methods of quantum field theory used in the standard electroweak model, in QED, and in high-energy QCD cannot be used. Even in low-energy QCD, when it is necessary to explain the confinement of quarks in hadrons, perturbative calculations fail because the QCD coupling constant becomes large, and ill-understood nonperturbative methods have to be used.
We do not understand how to perform calculations in which the coupling strength of the interactions is large. The coupling strength of QED, of photons and electrons, is determined by the fine-structure constant, alpha, which is approximately 1 divided by 137 and therefore small compared with unity. This small coupling strength of photons and electrons allows us to do perturbation calculations in which each order of calculation is under control; the magnitude of each order is smaller than that of the preceding one. For the strong interactions of particles in Technicolor models, however, the coupling strength is larger than unity, and we can no longer use perturbation theory. The problems of unitarity and of the probabilities in cross-section calculations can potentially be resolved in these strong-interaction theories. However, it is difficult to draw any affirmative conclusions about the validity of Technicolor when one cannot calculate anything rigorously with the theory.
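To illustrate the point schematically (the coefficients c_1 and c_2 here are placeholders, not computed values), a QED scattering amplitude can be organized as a power series in alpha:

\[
A \;\approx\; A_0\left(1 + c_1\,\alpha + c_2\,\alpha^2 + \cdots\right),
\qquad \alpha \approx \frac{1}{137},
\]

so each successive term is suppressed by roughly a factor of 100 relative to the one before it. In a Technicolor-like theory, the analogous expansion parameter is of order one or larger, and truncating the series after a few terms is no longer justified.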
Many variations of Technicolor theory have been developed during the past three decades, and the precision electroweak data from experiments at Fermilab have, to some extent, discredited them; these theories do not agree well with the experiments. Somewhat contrived methods have been proposed to counteract the disagreement with experiment, one of them being Holdom’s walking Technicolor model.
Of course, the major problem with Technicolor is that we have to discover all these new Technicolor particles, and the LHC so far appears to have ruled out the obvious lower-energy Technicolor particles such as techni-pions and techni-rhos, which are expected to have masses well below 1 TeV. (Note that, for example, the hypothetical techni-pion, which is the equivalent of the ordinary pi meson, carries color charge whereas the well-known pi meson does not.) The LHC has to explore at high energies whether these Technicolor particles exist. So far, none has been observed, which of course undermines the whole idea of Technicolor. Whenever BSM practitioners are told that a particle they predicted has not been discovered at the LHC, they come up with ways of increasing the mass of the particle beyond the accelerator’s current ability to observe it, thus keeping the theory alive. This is happening with Technicolor and supersymmetry today.
One of the chief justifications for the elementary Higgs boson in the standard model is that it generates the masses of the other elementary particles, the quarks and leptons. This happens when the Higgs scalar field has a nonvanishing vacuum expectation value. In quantum mechanics and particle physics, the particles we have observed do not actually exist in a vacuum, except as virtual particle/antiparticle pairs that annihilate one another continually. The vacuum expectation value is the constant nonvanishing potential energy of the particle field in the vacuum. That is, in contrast to other particle fields, the Higgs scalar field energy does not vanish in the vacuum. When Technicolor includes a composite Higgs particle, that composite field also has a nonvanishing vacuum expectation value.
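As a standard textbook illustration of what a nonvanishing vacuum expectation value means (this sketch applies to the elementary Higgs field of the standard model, with the conventional numbers; it is not specific to Technicolor), the Higgs potential

\[
V(\phi) \;=\; \mu^2\,\phi^\dagger\phi \;+\; \lambda\,(\phi^\dagger\phi)^2,
\qquad \mu^2 < 0,\ \lambda > 0,
\]

has its minimum not at zero field but at a constant background value

\[
v \;=\; \sqrt{\frac{-\mu^2}{\lambda}} \;\approx\; 246\ \mathrm{GeV},
\]

and it is this ever-present background field, filling even empty space, that feeds mass to the W and Z bosons and to the quarks and leptons.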
In the “extended Technicolor model,” which predicts the masses of the quarks and leptons, physicists increase the number of gauge bosons in the Technicolor theory that couple to quarks and leptons, thereby proliferating even further the number of particles needed to make Technicolor a physically viable theory. Again, the LHC will eventually discover or rule out all these hypothesized particles as it reaches higher and higher energies. One of the most severe constraints on all the Technicolor models is the experimental fact that quarks are not observed to change their “flavor” through neutral-current weak interactions. As we recall, the six flavors of quarks make up three generations, with two flavors in each generation. In the standard model, a quark changes flavor only by emitting a charged W boson; there are no flavor-changing neutral currents, so, for example, a bottom quark is not observed to decay directly into a strange quark, which carries the same electric charge. This experimental constraint has to be satisfied in Technicolor models, and it requires a lot of fine-tuning and rather contrived mechanisms to save the model.
ALTERNATIVE COMPOSITE HIGGS MODELS
Besides Technicolor, there are other ways to form composite Higgs models. One is to make the Higgs particle a composite of known quarks and antiquarks, such as the top and antitop quark bound state, called a top quark Higgs condensate. This can be compared with the Cooper-pair condensate of electrons in superconductivity. Because the top quark has a mass of 173 GeV, a strong enough force has to be postulated to bind the top quark and the antitop quark into a condensate. A scalar Higgs condensate, made of quarks, replaces the idea of the Higgs boson as an elementary particle.
Several authors have considered the idea that a top quark condensate would get rid of the fine-tuning problems that have plagued the elementary Higgs boson in the standard model. The original idea of a top–antitop condensate was proposed by Yoichiro Nambu and was elaborated further by Vladimir Miransky and Koichi Yamawaki.6 They used a four-fermion interaction to describe the force that binds the top and antitop quarks into a condensate. Four-fermion interactions, among quarks and leptons, are not mediated by intermediate gauge bosons such as the W and Z. Such an interaction is not renormalizable, so it produces unwanted infinities in calculations. The papers on this idea unfortunately predicted that such a composite Higgs boson would have a mass of about 600 GeV. Clearly, this does not sit well with the mass of the top quark, which is now known accurately to be about 173 GeV: naively, the condensate should weigh roughly twice the top quark mass, or 346 GeV. The difficulty in predicting the mass of the Higgs boson from this model arises because the top quark and antitop quark are bound together through a gauge boson, the gluon, which produces a binding energy that has to be accounted for, leading to serious discrepancies with the observed mass of the top quark. We now know, from fitting the precise electroweak data, that a composite top–antitop quark model must produce not a heavy Higgs, but a light Higgs boson, with a mass between 115 GeV and 135 GeV. Moreover, the discovery of the new Higgs-like boson at 125 GeV forces the top–antitop condensate model to incorporate a light scalar Higgs boson, which it has difficulty accomplishing.
Needless to say, particle physicists with their unbridled imaginations have extended the top quark condensate model of the Higgs boson. For example, they have included neutrino–antineutrino condensates and other possible quark condensates in the model, thereby potentially lowering the predicted mass of the Higgs particle and making it agree with the global fits of the theory to accurate electroweak data.
A problem with the top–antitop quark condensate is that the top quark decays so rapidly into a bottom quark and a positively charged W boson that the bound state, called toponium, cannot exist long enough to be detected. (The lifetime of the top quark is about 5 × 10^-25 seconds.) Although toponium has not been detected, it is still classified as a quarkonium state. In contrast, the bottom–antibottom quarkonium state, bottomonium, has been detected, at Fermilab, with an energy of about twice the bottom quark mass, that is, 9 to 10 GeV.
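As a rough check (using the measured top quark decay width of roughly 1.4 GeV, a value not quoted above), the lifetime follows from the uncertainty relation between width and decay time:

\[
\tau_t \;=\; \frac{\hbar}{\Gamma_t} \;\approx\; \frac{6.6 \times 10^{-25}\ \mathrm{GeV\cdot s}}{1.4\ \mathrm{GeV}} \;\approx\; 5 \times 10^{-25}\ \mathrm{s},
\]

which is shorter than the time the strong force needs to bind a top and an antitop quark into a well-defined toponium state.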
Literally hundreds of papers have been published since 1976 on the topic of creating a physically consistent model of a composite Higgs boson. Recently, physicists have even considered the possibility that the quarks and the W and Z bosons are composites of other particles called preons, which in turn can produce a composite Higgs boson of their own. The postulate that quarks are made of preons introduces new and higher energy scales, which can help to resolve the mass hierarchy problem. These models attempt to remove an unnatural feature of the standard model with an elementary Higgs boson, namely, that the theory has an unstable vacuum. In addition, the standard model with an elementary Higgs boson contains the enormous, unnatural gap between the electroweak energy scale of about 200 to 300 GeV and the Planck energy scale of 10^19 GeV, at which gravity is speculated to become a strong force. The preon models attempt to resolve this energy hierarchy problem. However, experiments at the LHC during 2011 and 2012 searched for a composite structure of quarks but did not discover any such structure, or any evidence of preons, up to energies beyond 1 TeV.
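The size of the hierarchy problem is easy to state numerically (taking an electroweak scale of roughly 250 GeV, from the range quoted above):

\[
\frac{E_{\text{Planck}}}{E_{\text{electroweak}}} \;\approx\; \frac{10^{19}\ \mathrm{GeV}}{250\ \mathrm{GeV}} \;\approx\; 4 \times 10^{16},
\]

a gap of about sixteen orders of magnitude that the standard model with an elementary Higgs boson does not explain, and that preon and composite models try to generate dynamically.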
If the LHC does not find any new particles beyond the new boson at 125 GeV as it increases its energy to a maximum of 14 TeV, then this mountain of papers suggesting alternative theories will have been produced in vain, which only underscores the significance of experimental physics in our quest to understand the nature of matter.
A NON-HIGGS RESONANCE INTERPRETATION
If the new boson at 125 GeV is identified as a spin-0 particle then, as far as I could ascertain, there are only two possible candidates: either it is indeed the elementary scalar Higgs boson, or it is some kind of quark–antiquark resonance that is not formed as a condensate of top and antitop quarks. This is why I investigated the possibility of a quark–antiquark resonance.
Let us look more closely at this model. I postulated the existence of two new quarkonium resonances, which I named zeta (ζ) and zeta prime (ζ′). The zeta, the lighter of the two, can be identified with the new boson discovered at the LHC, with a mass of 125 GeV. Both the zeta and zeta prime are electrically neutral and have isospin 0 (isospin singlets), so they carry the same quantum numbers as the much lighter pseudoscalar mesons, eta (η) and eta prime (η′), which were first detected during the 1960s.7
My new zeta resonances are superheavy quarkonium states, so there is a richer spectroscopy of excited quark–antiquark states associated with their decays, similar to charmonium and bottomonium. The higher excited states of the zeta and zeta prime would be much more difficult to detect than the lowest S-wave state, which has spin 0 and is a pseudoscalar meson; that is, it has negative parity, the same as the eta and eta prime mesons. The zeta and zeta prime mix with one another, and the size of this mixing is measured by a mixing angle of 36 degrees. The mixing angle is calculated using the masses of bottomonium and toponium, which are bound states of bottom–antibottom and top–antitop quarks, respectively. By requiring the zeta mass to be 125 GeV, the predicted mass of the zeta prime resonance is 230 GeV. The strength of the interaction causing the mixing of the zeta and zeta prime is mainly a result of nonperturbative gluon interactions of the top and bottom quarks. This interaction is due to what is called the U(1) axial anomaly, first proposed by Gerard ’t Hooft during the 1970s to explain the large mass difference between the eta and eta prime pseudoscalar mesons. The zeta meson resonance is a bound state of a quark and an antiquark with an effective constituent quark mass of roughly 62 to 63 GeV each, so that the zeta has a mass of approximately 125 GeV once a small binding energy is taken into account.
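The arithmetic behind the lowest zeta state is straightforward; the binding energy here is only indicative, since it arises from nonperturbative gluon dynamics:

\[
m_\zeta \;\approx\; 2\,m_q^{\text{eff}} \;-\; E_{\text{bind}} \;\approx\; 2 \times 62.5\ \mathrm{GeV} \;-\; E_{\text{bind}} \;\approx\; 125\ \mathrm{GeV},
\]

with the heavier partner, the zeta prime, then predicted at about 230 GeV from the 36-degree mixing described above.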
In this quarkonium model, I stress that the predicted pseudoscalar resonance is not a standard-model Higgs boson nor a quark–antiquark composite Higgs particle condensate. Instead, it is a sequential quark–antiquark resonance at a higher energy than the well-known observed quarkonium resonances at lower energies. In part, I constructed this model to serve as a cautionary message that we should not rush to identify the 125-GeV bump as a standard-model elementary Higgs boson.
8 Electroweak Gauge Theories
Progress in physics is often achieved by viewing popular standard theories from a different point of view. This approach tests the robustness of theories. In my research I find that by constructing an alternative model, I achieve a much deeper understanding of the prevailing standard model. It is necessary to have alternative theories so that the standard model can be compared with them, and also to see which theory best explains the data.
During the early 1990s, I was pursuing alternative theories in cosmology and gravity as well as investigating an alternative to the standard Weinberg–Salam model of electroweak theory, which did not include a Higgs particle.
ALTERNATIVE THEORIES AS FOILS TO STANDARD MODELS
In 1992/1993, I published two papers invoking a variable speed of light (VSL) to resolve initial value problems in cosmology: several difficulties in our understanding of events right after the Big Bang.1,2 This was an alternative to the standard inflation model proposed by Alan Guth and others during the early 1980s. VSL could explain, just as well as inflation, the so-called horizon problem: the fact that parts of the early universe that were far removed from each other could apparently have the same temperature without being able to “communicate” with each other. Different parts of the universe could not be in causal contact at the standard measured speed of light, for it would take too long for light to cross the universe. Inflation does away with this problem through a sudden, exponential growth of spacetime in the very early universe after the Big Bang, whereas VSL posits that the speed of light was larger by many orders of magnitude in that era. The VSL model I published predicted a scale-invariant spectrum of matter fluctuations, as did inflation theory, which agreed with observations. Moreover, VSL also explained why the geometry of the universe is observed to be spatially flat. A VSL cosmology required that I modify Einstein’s gravity theory and special relativity. In particular, my theory broke the Lorentz invariance symmetry of special relativity. A critical observation that could discriminate between inflation models and VSL is the detection of primordial gravitational waves. Today, VSL is established as a possible alternative to inflation theory; one of the physicists most closely associated with this development is João Magueijo at Imperial College London, who, in collaboration with Andreas Albrecht, published a paper on VSL.3