The study of gravitational lensing has led to one of the most striking pieces of evidence for dark matter, in the so-called Bullet Cluster. The Bullet Cluster is the aftermath of a collision between two large clusters of galaxies. In this collision, the familiar matter within each cluster has collided, leading to the formation of a visible shock wave where the two clusters have passed through each other. The lensing techniques can however also map the total mass distribution – including the dark matter. The dominant dark components have passed through each other smoothly without colliding, and are well separated from the shock wave of ordinary matter. There is no evidence for any interactions between the dark matter components – but they can still be traced through their ability to bend light.
In this way, dark matter is something defined largely by an absence: it is the difference between what is weighed and what is seen. It is natural to wonder whether this really counts as good evidence for the actual existence of dark matter. For example, some ‘failed stars’ such as brown dwarfs can be compact, massive objects that emit almost no light. However, these do not require any new particles or new physics: they are just known astrophysical objects, made up of familiar constituents, that happen to be both massive and dark. Another possibility might be small black holes: black holes are also dark objects that do not require any new forms of matter for their creation.
While once weighty, these objections are no longer sustainable. The existence of dark matter is no longer inferred simply from the rotational properties of stars in galaxies. Even in the earliest epoch of the universe, when the universe was only a few hundred thousand years old, the existence of dark matter can be established through the spectrum of light present in the cosmic microwave background. This spectrum contains within it evidence for acoustic sound waves in the early universe, oscillations that are analogous to the vibrations of a drumhead and are imprinted within the structure of the microwave background. These oscillations arose in the early universe in a bath of protons, electrons and light. The details of this spectrum tell us how much matter was present in the form of protons and electrons – and how much was not. The part that was not represents new matter with at best very weak interactions with familiar particles – dark matter. In this way, the cosmic microwave background provides additional, independent, evidence for the existence of dark matter.
What do we learn from the three jigsaw pieces in this chapter? We learn that the framework of modern physics, the Standard Model plus general relativity, is not and cannot be complete. It is not complete theoretically: it is a hybrid between quantum and classical theories of forces. It is not complete internally. It contains patterns and structures that must arise from a deeper underlying theory. It is not complete experimentally: it provides no suitable candidate for the dark matter that pervades the universe. We realise that what has been achieved is magnificent, but also that magnificent is not enough – something new is needed. We do not know what that New Thing should be, or what form it should take, but what we know does tell us that it must exist.
None of this serves, as such, as an argument for string theory. It is instead an argument that we cannot rest on our laurels. Physics is not finished. New structures must exist, and it is important to explore what these new structures might be. If string theory is anything, it is a consistent theoretical structure that connects to known ideas in physics while also extending them. We know that deeper truths about nature are required. While not guaranteed, many think string theory will represent some part of these deeper truths.
It is now time to start describing string theory.
1One can of course reasonably view the question of how to organise an enormous number of contributing terms as a conceptual difficulty in itself.
2As a technical aside, these correspond to particles with integer or integer-plus-one-half quanta of angular momentum.
3The most profitable development during this time at Stony Brook, however, was happening around the corner in the math department, where the department chair Jim Simons was formalising his investment hobby into the firm that grew to become Renaissance Technologies, one of the world’s largest and most successful hedge funds.
4The underlying reason for this is that the interactions of particles with the Higgs boson are proportional to their mass, and so the heavier the fourth family is, the more strongly it couples to the Higgs boson. It is this property that implies the impossibility of decoupling any fourth family to arbitrarily large masses, as the decoupling effects of large masses are precisely cancelled by the ever-increasing strength of their coupling to the Higgs boson.
5Pedant’s corner: what the experiments actually measure is the difference in the squares of the masses.
6In this case, there is a prime suspect for one part of the hidden structure. The vanishing of the theta angle can be explained if the Standard Model is extended to include a very light, very weakly interacting particle called an axion (discussed more in chapter 10). While the axion has not been found yet, there is an active experimental program searching for its existence.
II
What?
CHAPTER 5
What Was String Theory?
5.1 THE BIRTH
What is meant by string theory?
It is 1968. In the United Kingdom, England are World Cup holders and the last mainline steam trains take their final journey from London to Carlisle. Enoch Powell speaks about immigration, evoking the river Tiber foaming with much blood, and is dismissed from the Conservative front bench. In Paris, students riot, and in the Vatican, Pope Paul VI issues the encyclical Humanae Vitae. In the United States, the Standard and Poor’s index closes above 100 for the first time, while the rich scents of marijuana and free love merge with chants of ‘Ho Ho Ho Chi Minh, the Viet Cong are going to win!’
At CERN in Switzerland, a 26-year-old Italian physicist named Gabriele Veneziano is trying to understand the properties of the strong nuclear force. The strong nuclear force is at this time entirely baffling. A large number of strongly interacting particles have been discovered, but there is no organising principle for how they behave. Veneziano is trying to understand how strongly interacting particles scatter off each other. In particular, he is thinking about what happens when two pion particles are collided to make one pion and one omega particle. He realises that a single formula can capture many of the features of this process – a single formula that involves a famous function from mathematics, the beta function of Leonhard Euler, the greatest mathematician of the 18th century. Veneziano writes a paper, published on the 1st of September 1968 in the Italian scientific journal Il Nuovo Cimento, in which he puts down this formula, suggests it may describe the strong interactions, and studies many of its interesting properties.
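In modern notation – and only as a sketch, since conventions vary between accounts and Veneziano’s original presentation differed in detail – the formula can be written in terms of Euler’s beta function $B$ and gamma function $\Gamma$ as
\[
% A sketch of the textbook form of the Veneziano amplitude; conventions for \alpha(0) and \alpha' vary.
A(s,t) \;=\; B\bigl(-\alpha(s),\,-\alpha(t)\bigr)
\;=\; \frac{\Gamma\bigl(-\alpha(s)\bigr)\,\Gamma\bigl(-\alpha(t)\bigr)}{\Gamma\bigl(-\alpha(s)-\alpha(t)\bigr)},
\qquad \alpha(x) \;=\; \alpha(0) + \alpha' x,
\]
where $s$ and $t$ encode the collision energy and scattering angle, and $\alpha(x)$ is a linear ‘Regge trajectory’ whose intercept $\alpha(0)$ and slope $\alpha'$ are fitted to the observed strongly interacting particles.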
The article was about the messy properties of the strong force. The words ‘string’ or ‘strings’ appeared nowhere within it, and quantum gravity was not even within the outskirts of the penumbra of the concepts discussed. Nonetheless, this paper was the first-ever paper on string theory, and it acts as an intellectual Eve from which all subsequent work on the subject is descended.
The formula Veneziano wrote down during that heady summer of 1968 is now called the Veneziano amplitude – the word ‘amplitude’ being the conventional scientific term to describe how likely it is that particles will scatter off each other. Even at the time, Veneziano knew he had discovered something important – and so did others. One of the early pioneers of string theory, Joel Shapiro, described it as follows.
This paper arrived at the Lawrence Radiation Lab in Berkeley in the summer of 1968 while I was away… and I returned to find the place in a whirlwind of interest. Everyone had stopped what they were doing, and were asking if this idea could be extended to a more accessible interaction, such as ππ → ππ [the scattering of two pion particles from each other].
Gabriele Veneziano had found a key, and at the time he and others thought that the lock this key turned was in the door to understanding the strong interactions. There was an immediate rush to generalise his formula to other processes, such as the scattering of two pions into two pions, or to extend it further to the scattering of two pions into three or four pions.
The Veneziano amplitude described the scattering of two particles into two particles. Whatever theory the amplitude was part of, this theory should also describe the scattering of two particles into three particles, or three particles into three particles, or twelve particles into seventeen particles. One target was clear: generalise the Veneziano amplitude into an N-particle amplitude. Within months, this problem had been solved by several independent groups, to give formulae that would describe the scattering of N particles within whatever theory Veneziano’s amplitude described.
These results were part of an explosion of interest in this topic, as it became clear that a new theoretical structure had been identified. In various places groups of physicists turned their thoughts to these ideas – in particular at CERN, where a large group under the inspirational and charismatic leadership of Daniele Amati attracted interested minds from around the globe. At first this structure was called the Dual Resonance models. These Dual Resonance models captured some – not all – aspects of the strong interactions, and the models appeared to have some internal problems, but it was hoped that further work would see a solution to these issues.
The Dual Resonance models defined a relativistic quantum theory and contained lots of structure that required more study. It is important to remember that these Dual Resonance models were still exactly that: ‘string theory’ did not exist yet, and the only professed motivation for studying these models was their ability to describe the strong interactions.
It was next necessary to understand the conditions required for the consistency of this structure. It was well understood that quantum theories contain consistency requirements that do not apply to classical theories, but the procedure was thought to be straightforward: there should be no ‘unphysical’ states in the spectrum and no negative probabilities. Through 1970 and 1971 many physicists worked on understanding these new structures, and one of them, Claud Lovelace, came up with a curious and unexpected result: the Dual Resonance models were indeed consistent quantum theories – precisely if they were formulated in twenty-six dimensions (twenty-five spatial dimensions and one time dimension). With hindsight, this was the first inkling that the theory under consideration would have to involve extra dimensions.
Towards the end of 1971 Lovelace accepted a faculty position at Rutgers University in New Jersey. His result about twenty-six dimensions did not lead to him being regarded by his colleagues with awe. In Lovelace’s words,
I was the only professor not being promoted despite the many citations of my papers. However, the jeers of the physics establishment did have one good consequence. When my discovery of the critical dimension turned out to be correct and significant, they remembered that I had said it first. One has to be very brave to suggest that spacetime has 26 dimensions.
We would now say that the discovery of the Veneziano amplitude initiated the first great wave of study of ‘string theory’, from 1968 to 1973, marked by a study of the Veneziano amplitude, its extensions, and their properties. All of the above is now viewed as the first epoch of string theory. However, for much of this period, the response to the question ‘What is string theory?’ would have been ‘String theory? What have strings got to do with anything? We are studying amplitudes to describe the strong interactions!’
5.2 THE NAMING CEREMONY
In science, everything is clear in retrospect. Once you know what is essential and what is a distraction, the path to illumination is easy. Without these benefits, periods of muddle are necessary preliminaries for important results. Discoveries are only major if they resolve previous confusion, and the best researchers seek to maximise, not minimise, their number of ‘How could I have been so bloody stupid!’ moments.
It is therefore strictly the advantage of hindsight that makes it clear that the Dual Resonance models really were a set of amplitudes for the scattering of quantum relativistic strings from each other. However, this realisation did not come until some time after interest in them was well established. It arose independently through the work of Yoichiro Nambu, Holger Bech Nielsen and Leonard Susskind, around 1970 and 1971. These three men came from very different backgrounds. Nambu was born in Japan in 1921 and had to learn his physics among the chaos and destruction of first wartime and then occupied Tokyo. He spent three years having to sleep on a straw mattress on his office desk. Nielsen was Danish, born in 1941 soon after the Nazi occupation of Denmark. Susskind, born a year earlier in 1940, was a product of the straight-talking Bronx Jewish community in New York. The scientific careers of these three men would later go in interesting directions. Almost forty years later, Nambu was the winner of the 2008 Nobel Prize for physics; Nielsen was exploring ideas where a time-travel conspiracy would cause the universe to prevent the discovery of the Higgs boson; and Susskind was writing a popular book, The Cosmic Landscape, explaining how the anthropic multiverse was the scientific answer to intelligent design.
The realisation that there was an underlying theory of strings was not instant and came in stages. How did it occur? Analysis of the Veneziano amplitude made it apparent that the theory it described included particles of progressively increasing mass. From the amplitude it was possible to read off both the number of particles involved and their masses. The first clue to the stringlike nature of the amplitude was that these particle energies (given Einstein’s identification of energy and mass) were also the energies that arise from the vibrations of a string.
Where do the energies of a string come from? A string can be plucked, and can oscillate, in every direction transverse to its length. A string living in two spatial dimensions has one direction it can oscillate in; a string in three spatial dimensions has two directions it can oscillate in; a string in twenty-five spatial dimensions has twenty-four directions it can oscillate in. A plucked string can oscillate both at its lowest, fundamental frequency and also at higher frequencies – harmonics, which can be excited along any of the plucked directions. There is an energy associated to each harmonic: the higher the frequency, the greater the energy. The first inklings of strings came from the fact that counting these harmonics and their energies revealed a precise match with the particle content and energies of the Veneziano model.
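In modern notation this counting can be summarised in a single formula – again only as a sketch, with the symbols defined here rather than drawn from the original papers. For an open string in $D$ spacetime dimensions, the squared mass of a state is fixed by how many quanta $N_{n,i}$ of the $n$-th harmonic are excited along each of the $D-2$ transverse directions:
\[
% A sketch of the open-string mass spectrum; a is a constant offset and \alpha' sets the string scale.
\alpha' M^2 \;=\; \sum_{i=1}^{D-2} \sum_{n=1}^{\infty} n\, N_{n,i} \;-\; a,
\]
where $\alpha'$ sets the overall scale (it is related to the tension of the string) and $a$ is a constant offset. Filling in the possible values of $N_{n,i}$ produces a tower of masses, and it was this tower that matched the particles read off from the Veneziano amplitude.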
‘String theory’ slowly began to take shape as, in 1971, first Nambu and then Tetsuo Goto wrote down a set of equations from which the Veneziano amplitude would emerge. These equations were those of a relativistic string, and they were the starting point from which a systematic calculation procedure could be developed. The Veneziano amplitude had been found essentially by guessing an answer; the Nambu-Goto equations for a string now provided a set of principles from which it could be derived.
We have already seen in chapter 3 that quantum mechanics is more constraining than classical mechanics: a consistent classical theory does not necessarily imply a consistent quantum theory. Was the Nambu-Goto string another such case? It was a nice classical theory, but did it make sense in quantum mechanics? The quantum consistency of the Nambu-Goto string was established in 1973 by Peter Goddard and Charles Thorn.1 Their result was called the No-Ghost Theorem, and what this exercise in ghostbusting involved was showing that the quantum string was indeed consistent and had no physically meaningless configurations – which are known as ghosts for obscure technical reasons. This result held precisely when the number of dimensions was twenty-six, thus confirming the older result of Lovelace for the Dual Resonance models.2
The development of string theory also allowed a further loose end from the Dual Resonance models to be tied up. In addition to the Veneziano amplitude, another amplitude, similar in spirit but different in detail, had been discovered by Joel Shapiro and Miguel Virasoro. This amplitude also described the scattering of two particles into two particles. It shared many similar properties with the Veneziano amplitude, and it also required twenty-six dimensions for consistency. However, this amplitude was different in detail, and it was clearly not describing the same process.
It is an elementary observation that strings come in two kinds – those that join up back to themselves, as in a loop, and those that have endpoints, such as violin strings or shoelaces. The former are called closed strings and the latter are called open strings. With study, it became apparent that the Shapiro-Virasoro amplitude described the behaviour of closed strings while the Veneziano amplitude was the one appropriate for open strings with endpoints: as the strings involved were different, the amplitudes took slightly different forms. In this way both the Veneziano amplitude and the Shapiro-Virasoro amplitude could be understood as arising from theories of strings – and with this realisation, string theory was born and came into its patrimony, while the ‘Dual Resonance models’ were subsumed into this broader framework.
If asked what ‘string theory’ was at this stage, however, the form of the answer would have been the same as for the Dual Resonance models: it was a candidate theory for understanding the strong interactions. This candidate theory, though, was about to be flattened by the combination of new experimental data and a new theory of the strong force, quantum chromodynamics.
It was not just a new social and cultural outlook that was brewing throughout the late 1960s and early 1970s. New accelerators were also being constructed across the world. In California the Stanford Linear Accelerator, operating from 1967 onwards, was accelerating electrons to higher energies than had ever been achieved before, and then colliding them with protons and neutrons. By doing so, it was probing the inner structure of the proton and neutron, the particles held together by the strong force. At CERN the Intersecting Storage Rings had started operating in 1971. These were at the time a technological marvel. Instead of accelerating one beam of particles and firing it at a stationary target, they were able to circulate two distinct beams, one clockwise and one anticlockwise, and focus and collide them into each other. These accelerators were able to probe the strong force at higher energies than ever before, and in doing so, led to the discovery of a novel phenomenon – hard scattering.