The second reason is that the subject contains enough trumpet blowing as it is. It is genuinely true that quantum field theory and quantum gravity are hard and important subjects. It is genuinely true that in order simply to understand these topics, and even more so to contribute to them at a meaningful level, one needs a level of analytical intelligence that is higher than average. It is also true that a prerequisite for a career in the subject is a solid work ethic and a healthy dose of self-confidence – indeed, most young physicists start with the secret hope of contributing at the level of Einstein, Dirac or Feynman.
It does not follow that everyone who has proposed an idea that is not immediately ruled out is therefore a genius. Both an overly large self-regard and a feeling of being set apart from one’s fellow citizens feature heavily in the déformation professionnelle of theoretical physics. It is too easy to think that, through contemplating the laws of nature, one belongs to a higher, more exalted plane of existence than those whose sweat and taxes fund this research. The conviction of genius is commoner than the reality, and physicists solving problems have more in common with sewage engineers clearing blockages than they care to admit.
The third – and best – reason is that physics is done by physicists, and physicists are human. We all have different paths through life, and these are reflected in the varied approaches to physics. Physics is taught largely as an impersonal tale of results in a textbook. In the real world of active research on open problems, personality enters both in the problems people choose to address and the way they go about attacking them. A range of questions benefits from a range of physicists, and without a range of approaches the subject would stagnate. Einstein was a different kind of physicist than Dirac, who was a different kind of physicist than Feynman. No one type is ‘better’ absolutely, any more than a spanner is ‘better’ than a hammer, which is ‘better’ than a screwdriver.
Anyone who holds that science is a social construct is nuts. However, anyone denying that social factors do in part determine which topics are regarded as important is also nuts. In the long term, experiment is the great and impartial judge that depersonalises the subject. In the short term, if you want an honest understanding of the state of current research, then it is good to have a sense of the types of people who do it.
12.1 THE REVOLUTIONARIES
There are few subjects as mythological as the history of science as taught by scientists. The conventional telling of scientific history is a wonderful story. It is a story that crosses hundreds of years and involves diverse languages and cultures. It is a story of brave and unconventional thinkers who have overturned the dead hand of authority through their own independence of mind. It is a story of heroes. There is Christopher Columbus, who set off westwards to reach the East Indies in defiance of the flat-earth dogmatists of the Spanish court. There is Galileo, surrounded by sour-faced cardinals ordering him to recant, but still softly murmuring ‘Eppur si muove’. There is Isaac Newton, struck by a falling apple and inspired to his insight of universal gravitation. There is Charles Darwin and his bulldog Thomas Henry Huxley, putting forward and defending evolution in the teeth of opposition from a clerical establishment insisting on the inerrancy of scripture. There is Einstein’s shock of white hair, and there is the joker Richard Feynman – what do you care what other people think? On this account, the main characteristic of a good scientist is conceptual originality and contempt for authority. This history is magnificent. This history is inspiring. This history is also, sorry to say, bunk.
It is true that almost all discoveries of the first rate fly in the face of conventional wisdom and are contrary to the generally accepted theories of the time. This truth is, however, tautological. A result that only confirms that which everyone already suspected to be true is, entirely reasonably, less important than one which refutes it. The discovery of the Higgs boson at the Large Hadron Collider was widely anticipated. It was expected to happen. It did happen. The result was important – good champagne was uncorked, and Peter Higgs and François Englert got the Nobel Prize. However, it was not revolutionary. The Higgs boson was the last missing ingredient of the Standard Model of particle physics, and its absence would actually have been more surprising than its presence.
The converse is not true. Personally, I am so far a moderately successful physicist with a decent career but no earth-shattering results or any broader name recognition. Yet even I receive, several times a month, emailed accounts of amazing theories that correct the errors of Newton, Einstein and Maxwell – and on occasion all three simultaneously. These theories promise high and deliver low, and they are wonderfully and bizarrely bonkers.1 Regrettably, the fact that an idea is believed nonsense by the scientific establishment and contradicts hundreds of years of theoretical and experimental spadework is not sure and certain evidence that its author can book his tickets to Stockholm.2
Setting aside crackpots, it is still true that a lot of scientific work is drudgery. Drudgery is perhaps an unfair word. The majority of work in theoretical physics requires careful and accurate calculation over a period of months, years or decades. The Standard Model is an example of a quantum field theory, and as discussed in chapters 3 and 8, quantum field theory is complex and has many subtleties.
The great value of physics, and what sets it apart from many other disciplines, is the ability to predict. But when working out these predictions, it is not easy either to obtain the correct answer or to know that the answer is correct once it has been found. The ability to predict is a byproduct of the ability to calculate, and the ability to calculate is a byproduct of the ability to apply, for an extended period, the seat of the pants to the seat of the chair.
Most of the time such calculations only extend the known validity of pre-existing theories such as the Standard Model. This work is important, but in a sense routine. The basic rules are known, and the job is to apply them correctly. It entails accepting the job of cook rather than captain on the Mayflower. You cannot fail spectacularly, but the chances for glory are also limited.
Calculation is also a reminder of the collectivist nature of much of physics. The subject is built up through the labours of many people over many years. The underlying equations of the Standard Model of particle physics were fully written down by the middle of the 1970s. To a divine and omniscient mind, all its logical consequences would have been immediately visible.3 To us, they are not. Obtaining any sort of sight of them requires first many hours of labour and then again many more hours removing the errors that have inevitably crept in.
Some people rebel against this. Undergraduate physics is an unparalleled intellectual experience: it is a smorgasbord of the deepest and most powerful thoughts that have ever been thunk. You learn physics at a rate of a Nobel Prize a week, and the resulting frisson of the mind is at a level that is never experienced again. Four years takes you from Newtonian gravity through the laws of heat and energy, past Maxwell’s synthesis of electromagnetism and into the laws of special and general relativity, from the basics of quantum mechanics to the Standard Model and quantum field theory. It is wonderful and magnificent. After all this, it is easy to feel a sense of ennui at the prospect of becoming merely a cog in the calculating machine. The siren call of Big Ideas is more attractive than the factory hooter summoning workers to the assembly line of computation.
This rebellion carries a dream, the dream of Galileo: to give the finger to the establishment with a new, revolutionary idea that makes previous conceptions redundant. A revolutionary idea needs an appropriate canvas, and this canvas can only be provided by the big questions: Is quantum mechanics correct? What is the correct theory of quantum gravity and what are the principles that govern it? What is the nature of the dark energy that appears to dominate the energy budget of the universe? Are there extra spatial dimensions with sizes possibly as large as a millimetre? Is there a multiverse, or landscape, of universes? Is the anthropic principle responsible for the ‘constants’ of nature?
These big questions offer the chance of altering the entire course of physics. They modify not just the details but, if answered correctly, the whole way one should think about the subject. These are areas where an idea that lasts will survive for decades and centuries, and enter the enduring legacy of physics.
In contrast, no such grand historical narrative is offered to ideas that aim at calculating next-to-next-to-leading order corrections to the production rate of a W boson together with three quarks in proton-proton collisions at the Large Hadron Collider. Big revolutionary thinkers need big revolutionary questions, and they recoil from such detail.
Science always needs a supply of would-be revolutionaries. Most of the time, this approach only allows a researcher to sit on the periphery of the subject while looking down on those who toil in the vineyard. Just occasionally, however, there are discoveries to be made that can only be made in this manner. In all parts of life, true revolutionaries have despised the petite bourgeoisie who take the road more travelled by. In the case of real genius, the road less travelled by leads to glorious discovery. In most cases, however, it peters out to an empty trail in the middle of nowhere.
12.2 VORSPRUNG DURCH TECHNIK
‘The thing that hath been, it is that which shall be; and that which is done is that which shall be done: and there is no new thing under the sun. Is there any thing whereof it may be said, See, this is new? It hath been already of old time, which was before us.’ [Ecclesiastes 1:9-10]
It is not only in physics that an absence of new phenomena depresses the wise. As we saw in the first chapter, the Standard Model of particle physics is contemporaneous with the Vietnam War: it was developed during the 1960s and 1970s, and it was essentially finished by 1975. The theory of electroweak interactions was developed during the 1960s and shown to work as a quantum theory in 1971. This electroweak theory extended the account of quantum electrodynamics that had been developed in the late 1940s, merging it with a theory of the weak force. In a period of ten years spanning the 1960s and 1970s, the strong force also went from being baffling to being understood. The final step was the discovery by David Gross, David Politzer and Frank Wilczek in 1973, encountered in chapter 1, that at shorter and shorter distances the strong force becomes progressively weaker.
The Standard Model was written down by bolting together these various components. At the time, it predicted the existence and interactions of many particles that were yet to be discovered. When first put down on paper, there seemed no reason for the Standard Model to survive five years, let alone thirty-five.
However, since then all the particles whose existences were predicted by the Standard Model have actually gone on to be found. The gluon, the force carrier of the strong interactions, was identified in 1979 by four separate experiments at DESY – the Deutsches Elektronen-SYnchrotron, the German accelerator complex on the outskirts of Hamburg. The tau lepton was discovered in 1975 at the Stanford Linear Accelerator and the bottom quark in 1977 at Fermilab. In 1983 the enormously heavy force carriers of the weak interaction – the W and Z bosons – were found by the Underground Area 1 collaboration at CERN, led by the elemental and volcanic force of nature that was Carlo Rubbia.4 The W and Z are each approximately as massive as an atom of bromine, and at the time of discovery they were twenty times heavier than any other known particle.
Since then, two still heavier particles have been discovered. The first was the top quark, around twice as heavy as the W and Z particles, which was discovered in 1995 at the Fermilab complex near Chicago. The final ingredient of the Standard Model was the Higgs boson, which was discovered in 2012 at the Large Hadron Collider at CERN.
All this time, all these colliders – and the Standard Model has yet to crack. Envisioned as a cardboard shack, the Standard Model has proved Kevlar-plated.
Over this period, the quality of theoretical predictions has improved dramatically. The original predictions of the Standard Model were evaluated at ‘tree-level’ or ‘leading order in perturbation theory’: essentially just the first term in an approximation scheme. ‘Leading order’ is the spherical cow level of approximation – if you had to estimate the weight of a cow, the simplest method is to model the cow as a sphere. These rough approximations were then extended to ‘next-to-leading order in perturbation theory’ – this is the approximation of an ellipsoidal cow. The most recent extension is to ‘next-to-next-to-leading order’, at which point the cow starts sprouting legs and a face. Moving between successive levels of approximation often takes around a decade per step. If this sounds a little lazy, it is because the complexity of the computation grows enormously with each ‘next-to’. If a tree-level computation has ten components, a next-to-leading-order computation may have five hundred parts and a next-to-next-to-leading-order computation twenty thousand elements.
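In symbols, and purely as a schematic sketch (the coupling and the coefficients here stand in for far more complicated real objects), a prediction such as a scattering rate σ is computed as a series in a small coupling α:

    \sigma \;=\; \underbrace{\sigma_0}_{\text{leading order}} \;+\; \alpha\,\underbrace{\sigma_1}_{\text{next-to-leading order}} \;+\; \alpha^2\,\underbrace{\sigma_2}_{\text{next-to-next-to-leading order}} \;+\; \cdots

Each extra power of α buys precision, but at the price of computing vastly more pieces inside each successive coefficient.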
Over this period, the increase in the energies of particle colliders is equivalent to a decrease in the distances probed by a factor of two hundred. The statement that the Standard Model has continued to describe nature successfully across this range of distances is important. It tells us that the Standard Model works, and by doing so it significantly extends our understanding of nature.
It is also important that this statement is true, and so actually can be made. This statement can only be made because we know what the Standard Model predicts. It is only by knowing what the Standard Model predicts that we can compare its predictions to experiment. This knowledge has required work; it has taken tens of thousands of person-years of effort by thousands of individuals to determine and refine the predictions of the Standard Model to a level that can be tested in the progressively more demanding environments that newer and newer colliders have provided.
There are many aspects to this. There is the level of calculational technique – once a calculation involves many thousands of individual elements, it is necessary that the methods used are computationally economical and efficient. As seen in chapter 3, the techniques of renormalisation make the individual elements of the calculation formally infinite, with these formal infinities cancelling between different parts of the computation. It is important that the calculation is structured so that these infinities do manifestly cancel: a single stray infinity makes a mockery of any claims to precision.
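Schematically (this is an illustration of the bookkeeping only, not an actual Standard Model computation), each infinity is regulated into a pole in a small parameter ε, and a well-organised calculation pairs the poles so that they cancel:

    \Big( \frac{A}{\epsilon} + a \Big) \;-\; \Big( \frac{A}{\epsilon} + b \Big) \;=\; a - b \qquad \text{(finite as } \epsilon \to 0\text{)}

If the two coefficients of the 1/ε poles failed to match exactly, the ‘answer’ would diverge as the regulator is removed, and no claim to precision could survive.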
There is the level of verification: once a calculation is done, how do you know that the answer is correct? What tests can you do to ensure that you have not misled yourself, and that the answer is indeed the correct one? What are the equivalents of checksums – consistency checks that ensure the answers obtained are not simply meaningless?
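As a toy illustration of such a checksum (my own sketch in C, not drawn from any real particle physics code), one can compute the same quantity by two genuinely independent methods and demand that they agree; here the quantity is a simple integral whose exact value is 2:

    /* Consistency check: evaluate the integral of sin(x) from 0 to pi
       (exactly 2) by two independent numerical methods and compare. */
    #include <stdio.h>
    #include <math.h>

    #define PI 3.14159265358979323846

    /* Trapezoidal rule with n panels. */
    static double trapezoid(int n) {
        double h = PI / n, sum = 0.5 * (sin(0.0) + sin(PI));
        for (int i = 1; i < n; i++)
            sum += sin(i * h);
        return h * sum;
    }

    /* Composite Simpson's rule with n panels (n must be even). */
    static double simpson(int n) {
        double h = PI / n, sum = sin(0.0) + sin(PI);
        for (int i = 1; i < n; i++)
            sum += (i % 2 ? 4.0 : 2.0) * sin(i * h);
        return h * sum / 3.0;
    }

    int main(void) {
        double a = trapezoid(1000), b = simpson(1000);
        printf("trapezoid = %.8f, simpson = %.8f\n", a, b);
        /* Two independent routes to one number: disagreement beyond the
           expected numerical error means something upstream is wrong. */
        puts(fabs(a - b) < 1e-5 ? "consistency check passed"
                                : "discrepancy: recheck the calculation");
        return 0;
    }

Real verification cross-checks objects vastly more elaborate than this, but the principle – two independent computations of one number – is the same.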
There is the level of coding. Once calculations have gone beyond a certain level of complexity, they must be performed numerically and on a computer. Large codes need to be written to perform these calculations, and large codes also need to be maintained and debugged. Uninitialised variables and the conflation of signed and unsigned integers can render results just as meaningless as if the wrong laws of quantum mechanics had been used.
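As a minimal, hypothetical example of the second trap: in C, comparing a signed with an unsigned integer silently converts the signed value, and the program can take the wrong branch without any run-time complaint:

    /* Hypothetical illustration of a signed/unsigned conflation bug. */
    #include <stdio.h>

    int main(void) {
        unsigned int n_events = 10;  /* an event count, naturally unsigned */
        int offset = -1;             /* a signed quantity, e.g. a sentinel */

        /* The comparison converts 'offset' to unsigned, so -1 becomes an
           enormous positive number and the test goes the wrong way. */
        if (offset < n_events)
            puts("expected branch: -1 is less than 10");
        else
            printf("silent bug: -1 was treated as %u\n", (unsigned int)offset);
        return 0;
    }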
Even once a calculation has been done, it still has to be compared with experiment. Particle physics experiments can be many storeys high and they are packed with cables. Their outputs consist not of ‘Higgs bosons’, but of electronic signals. A primary energetic particle at a collider may manifest itself as a complex jet of hundreds or thousands of secondary particles, approximately collimated along the direction of the original particle. These hundreds of particles in turn deposit their energies in large calorimeters, producing electronic signals that record the locations and amounts of energy deposited. Further modelling is needed to turn the basic results of the calculation into the observables that a collider experiment can actually measure – and large codes are also needed to describe this.
Finally, there is even the level of data storage. The Large Hadron Collider can store only a minute fraction of all the collisions that occur. Most of the data never even makes it to memory, and elaborate ‘triggers’ are used to determine the collisions that get recorded and the collisions that do not. In comparing any prediction to data, one must account for the selection effect that over 99.999 per cent of the ‘data’ was immediately discarded.5
Despite all this, the predictions of the Standard Model are ultimately precise and well-defined. At the end, there is a theoretical number that can be compared to an experimental result. Determining these predictions requires large amounts of work at each of the many levels described above. For this to happen, it needs people – many people! – to devote their entire careers to working on single parts of this process. The physicists who follow these paths spend years calculating the precise implications of the Standard Model for regimes where it has yet to be tested. Their work involves turning the equations of the Standard Model into real predictions for real data.
Their work will never win the Nobel Prize. It involves working within a well-defined framework, in which there is a well-defined answer, and evaluating this answer. The rules of reward are simple. Nobel Prizes and the highest scientific honours are not given to those who refine the theories of others. Their results may be important. The calculations may be essential for confirming any observation of new physics. No matter – the highest awards are reserved for those who develop their own frameworks rather than those of others.
Of course those who choose to work on, for example, the theory of the precise predictions of the Standard Model know the choice they are making. It is a choice not to work on the latest hot topic. The fashionable areas vary hugely with time. At one time it was proton decay and grand unified theories, at another time superstrings and Calabi-Yau compactification, at another time millimetre-sized extra dimensions, and for one very brief and very embarrassing instant it was neutrinos that could travel faster than the speed of light. The precise form of the higher-order corrections to the predictions of the Standard Model is never fully in fashion – but it is never out of fashion either. Some ideas – the existence of millimetre-sized extra dimensions is a good example – can go rapidly from the height of fashion to almost total death. The Standard Model will never suffer this fate and will always be of value.