by Jim Baggott
The correspondence theory of truth
All this talk of verification and falsification implies that ‘truth’ plays a central role in whatever we conclude passes for a scientific methodology. But according to the Reality Principle, the reality we seek to study is a metaphysical concept. The Theory Principle says that the common currency of scientific theories involves yet more metaphysics.
It seems obvious that concepts that start out as purely speculative cannot be considered to be true. By the same token, concepts accepted as true may later turn out to be false. Phlogiston was once believed to be a substance contained in all combustible materials, released when these materials burned. This is not what we believe today.
How should we reconcile all this with the idea of scientific truth?
What convinces us either way is, of course, evidence in the form of facts. Here, we anchor the idea of ‘truth’ firmly to the idea of an empirical reality. A statement is true if, and only if, it corresponds to established facts about the real world derived from our perceptions or measurements.
This correspondence theory of truth implies that there can be no truth — no right or wrong — without reference to facts about the external world.* True statements tell it like it really is. The correspondence theory is the natural choice of a realist.
We appreciate that scientific theories are imperfect in many ways and, as Russell noted, we cannot assume that what we regard as true today will necessarily be true tomorrow. But we can think of scientific theories as possessing a verisimilitude, a truth-likeness, which increases with each successive generation of scientific development.
Scientists are constantly refining and improving their theories, and as it makes no sense to develop new theories that are less useful than their predecessors, we might be prepared to accept that successive developments help take us closer and closer to ‘the truth’ about empirical reality.
The correspondence theory shifts the burden from the meaning of ‘truth’ to the meaning of ‘facts’, which seems both logical and helpful. Most often, scientists who claim to be in pursuit of the truth are actually in pursuit of the facts. An interesting and unexpected experimental result will prompt a flurry of activity in competing laboratories, as scientists rush to confirm or refute the facts and so establish the truth.
For example, on 4 July 2012, scientists at CERN declared that they had discovered a new particle ‘consistent’ with the standard-model Higgs boson. Further research will be required to characterize fully its properties and behaviour. This research will establish empirical facts about the existence of a particle that was first hypothesized in 1964. If these facts emerge broadly as anticipated, we may conclude that the statement ‘the Higgs boson exists in nature’ is true, in much the same way that ‘unicorns exist in nature’ and ‘phlogiston exists in nature’ are false.
The Veracity Principle. It is not possible to verify a scientific theory such that it provides absolute certainty for all time. A theory is instead accepted (or even tolerated), on the basis of its ability to survive the tests and meet additional criteria of simplicity, efficacy, utility, explanatory power and less rational, innately human measures such as beauty. Over time the theory becomes familiar and is accepted as ‘true’ or, at least, as possessing a high truth-likeness or verisimilitude. The scientists’ confidence or degree of belief in the theory grows. It is eventually absorbed into the common body of knowledge which forms the current ‘authorized’ version of empirical reality.
The Copernican attitude
There is one final, critically important ingredient to consider. When scientists go about their business — observing, experimenting, theorizing, predicting, testing and so on — they tend to do so with a certain fixed attitude or mindset. It is this attitude that sets science apart, that lends it its ‘Spock-ness’ and exposes it to occasional criticism as a soulless, somewhat inhuman enterprise.
Let me explain.
The purpose of an organized system of religion is to enable its followers to come to terms with their place in the universe, give meaning to their lives and offer moral instruction and comfort in times of need. Religion is all about ‘us’. It puts us at the centre of things, and although not all religious systems necessarily claim that the external physical world is organized principally for our benefit, many do.
Science is very different. Scientists tend to assume that there is, in fact, nothing particularly special about ‘us’. We are not uniquely privileged observers of the universe we inhabit. We are not at the centre of everything. There is nothing special about the planet on which we exist. Or the rather average class G2 main-sequence star that gives us sunlight. Or the galaxy of between 200 and 400 billion stars in which our sun orbits, about two thirds of the way out from the centre, some 25,000 light years distant. Or the 53 other galaxies which together with the Milky Way form the Local Group. Or the Virgo Supercluster of which the Local Group forms part. I could go on, but I think you’ve got the message.
This is the Copernican Principle.
The Copernican Principle. The universe is not organized for our benefit and we are not uniquely privileged observers. Science strives to remove ‘us’ from the centre of the picture, making our existence a natural consequence of reality rather than the reason for it. Empirical reality is therefore something that we have learned to observe with detachment, without passion. Scientists ask fundamental questions about how reality works and seek answers in the evidence from observation and experiment, irrespective of their own personal preferences, prejudices and beliefs.
The principle is misnamed insofar as Nicolaus Copernicus himself did not view his heliocentric model of the universe as necessarily undermining earth’s unique position. What he offered was a technical improvement over the Ptolemaic system’s obsession with convoluted structures constructed using epicycles. By putting the sun at the centre, the retrograde motions of the other planets in the solar system could be explained as apparent motions when observed from an orbiting (rather than stationary) earth. The planets don’t really move backwards: they only appear to move backwards because we are also being transported around the sun.
But Copernicus sparked a revolution that was to shape the very nature of science itself. It became apparent that science works best when we remove ‘us’ as the primary objective or purpose of the equations of reality.
Nearly five hundred years on, all the scientific evidence gathered thus far suggests that the Copernican Principle is justified. The evidence suggests rather strongly that we are not privileged: we are not at the centre of things. It points to the singular absence of an intelligent force, moving in mysterious ways to organize the world just for our benefit. As French mathematician and astronomer Pierre-Simon, marquis de Laplace, once advised Napoleon: ‘Je n’avais pas besoin de cette hypothèse-là.’*
Now this has been a bit of a whirlwind tour through some aspects of the philosophy of science and scientific methodology. I hope it wasn’t too much like hard work.
I’ve tried to be reasonable. This ‘working model’ of science acknowledges that reality-in-itself is metaphysical, that the objects of scientific study are the shadows, the things-as-they-appear or things-as-they-are-measured. It accepts that the facts that scientists work with are not theory-neutral — they do not come completely free from contamination by theoretical concepts. It accepts that theories are in their turn populated by metaphysical concepts and mathematical abstractions and are derived by any method that works, from induction to the most extreme speculation. It acknowledges that theories can never be accepted as the ultimate truth. Instead, they are accepted as possessing a high truth-likeness or verisimilitude — they correspond to the facts. In this way they become part of the authorized version of empirical reality.
Finally, the model acknowledges the important role played by the Copernican attitude. Science works best when we resist the temptation to see ourselves as the primary objective or purpose of reality.
This is a bit more elaborate than the Science Council’s definition. But this elaboration is more reflective of actual practice. It is necessary if we are to understand how fairy-tale physics is not only possible, but is able to thrive.
* Perhaps we’ve just lost the habit.
* Alas, it’s not possible.
* That’s a lot of trillions, which is why the Large Hadron Collider cost £5 billion to construct. By the way, an electron volt is the amount of energy a single negatively charged electron gains when accelerated through an electric potential difference of one volt. A 100W light bulb burns energy at the rate of about 600 billion billion electron volts per second.
** ATLAS stands for A Toroidal LHC Apparatus. CMS stands for Compact Muon Solenoid.
* We’ll look more closely at the search for the Higgs boson in Chapter 3.
* Although Copernicus had argued for a sun-centred planetary system, the debate was still raging in Kepler’s time. Brahe himself argued for a system in which the planets orbit the sun, which, in turn, orbits a stationary earth.
* For a vivid account of some of the more extreme methods that scientists have used to make discoveries, see Michael Brooks, Free Radical: The Secret Anarchy of Science, (Profile Books, London, 2011).
* For a recent, highly readable tour through the abstract, see Giovanni Vignale, Beautiful Invisible: Creativity, Imagination, and Theoretical Physics, (Oxford University Press, 2011).
* Eddington was also selective with his data, with history rewarding his choices as ‘good judgement’.
** How you pronounce ‘Uranus’ is entirely up to you. I’m old enough to be stuck with the pronunciation I learned at school: ‘your anus’. This does not make me titter, or blush, because I am no longer eight years old.
* There is an alternative coherence theory of truth which asserts that truth is determined by relations between statements rather than correspondence to external facts. We will return to this alternative interpretation of truth in Chapter 10.
* ‘I had no need of that hypothesis.’ From W. W. Rouse Ball, A Short Account of the History of Mathematics, (4th edition, 1908).
Part I
The Authorized Version
2
White Ambassadors of Morning
Light, Quantum Theory and the Nature of Reality
The more success the quantum theory has, the sillier it looks. How non-physicists would scoff if they were able to follow the odd course of developments!
Albert Einstein1
Two years before they hit the stratosphere with Dark Side of the Moon in 1973, the British progressive rock band Pink Floyd released their sixth studio album, called Meddle. Side two is a single 23-minute-long track called ‘Echoes’.* After a short, discordant nightmare sequence in the middle of the track, ‘Echoes’ greets the morning with a return to harmony. Soft, gentle vocals declare: ‘And through the window in the wall, comes streaming in on sunlight wings, a million white ambassadors of morning.’2
This is a particularly pleasant metaphor for photons, the elementary particles that constitute all electromagnetic radiation, including light. However, on a bright morning there are rather more than a million of them streaming through the window. And photons are no ordinary particles. For one thing, they have no mass. They have spin — an intrinsic angular momentum — which we perceive as polarisation. They also have something called phase, which means that they are particles that can also behave like waves.
Let’s look at them a bit more closely.
Einstein’s light quantum hypothesis
In fact, photons were the first ‘quantum particles’, suggested by Einstein in a paper he published in 1905. At that time, light was thought to consist of a series of wave disturbances, with peaks and troughs moving much like the ripples that spread out on the surface of a pond where a stone has been thrown.
The evidence for wave behaviour was very compelling. Light can be diffracted. When forced to squeeze through a narrow aperture or slit in a metal plate, it spreads out (diffracts) in much the same way that ocean waves will spread if forced through a narrow gap in a harbour wall. This is behaviour that’s hard to explain if light is presumed to be composed of particles obeying Newton’s laws of motion and moving in straight lines.
Light also exhibits interference. Shine light on two narrow apertures or slits side by side and it will diffract through both. The wave spreading out beyond each aperture acts as though it comes from a ‘secondary’ source of light, and the two sets of waves run into each other. Where the peak of one wave meets the peak of the other, the result is constructive interference — the waves mutually reinforce to produce a bigger peak. Where trough meets trough the result is a deeper trough. But where peak meets trough the result is destructive interference: the waves cancel each other out.
The result is a pattern of alternating brightness and darkness called interference fringes, which can be observed using photographic film. The bright bands are produced by constructive interference and the dark bands by destructive interference. This is called two-slit interference.
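The geometry behind the fringes can be stated compactly. For two slits a distance $d$ apart, illuminated by light of wavelength $\lambda$, the standard textbook conditions (not spelled out in the main text) are:

```latex
% Bright fringes: the path difference from the two slits to a point on
% the screen is a whole number of wavelengths (constructive interference)
d \sin\theta = m\lambda, \qquad m = 0, \pm 1, \pm 2, \ldots

% Dark fringes: the path difference is an odd number of half-wavelengths
% (destructive interference)
d \sin\theta = \left(m + \tfrac{1}{2}\right)\lambda
```

Here $\theta$ is the angle from the midpoint between the slits to the point on the screen. Peak meets peak where the path difference is a whole wavelength; peak meets trough where it is a half-wavelength out of step.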
The wave model of light was given a compelling theoretical foundation in a series of papers published in the 1860s by Scottish physicist James Clerk Maxwell. He devised an elaborate model which combined the forces of electricity and magnetism into a single theory of electromagnetism. Maxwell’s theory consists of a complex set of interconnected differential equations, but it can be greatly simplified for the case of electromagnetic radiation in a vacuum. When recast, these equations look exactly like the equations of wave motion. Maxwell himself discovered that the speed of these ‘waves of electromagnetism’ is predicted to be precisely the speed of light.
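For readers who want the compact statement: in empty space, free of charges and currents, Maxwell’s equations can be combined into the classical wave equation, and the wave speed drops out as a combination of two measured constants of electricity and magnetism:

```latex
% In a vacuum, the electric field E obeys the wave equation
% (the magnetic field B satisfies an equation of identical form):
\nabla^2 \mathbf{E} = \mu_0 \varepsilon_0 \,
    \frac{\partial^2 \mathbf{E}}{\partial t^2}

% The speed of these waves is fixed by the vacuum permeability \mu_0
% and permittivity \varepsilon_0:
c = \frac{1}{\sqrt{\mu_0 \varepsilon_0}}
  \approx 3 \times 10^8 \ \mathrm{m\,s^{-1}}
```

The striking point is that $\mu_0$ and $\varepsilon_0$ come from tabletop experiments with currents and charges, yet their combination gives the measured speed of light.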
But waves are disturbances in something, and it was not at all clear what light waves were meant to be disturbances in. Some physicists (including Maxwell) argued that these were waves in a tenuous form of matter called the ether, which was supposed to pervade the entire universe. But subsequent experimental searches for evidence of the ether came up empty.
Einstein didn’t believe that the ether existed, and argued that earlier work in 1900 by German physicist Max Planck hinted at an altogether different interpretation. He boldly suggested that Planck’s result should be taken as evidence that light consists instead of independent, self-contained ‘bundles’ of energy, called quanta.
According to the assumption considered here, in the propagation of a light ray emitted from a point source, the energy is not distributed continuously over ever-increasing volumes of space, but consists of a finite number of energy quanta localized at points of space that move without dividing, and can be absorbed or generated only as complete units.3
This was Einstein’s ‘light quantum hypothesis’. He went on in the same paper to predict the outcomes of experiments on something called the photoelectric effect. This effect results from shining light on to the surfaces of certain metals. Light with wave frequencies above a threshold characteristic of the metal will cause negatively charged electrons to be kicked out from the surface. This was a bit of a challenge for the wave theory of light, as the energy in a classical wave depends on its intensity (related to the wave amplitude, the height of its peaks and depth of its troughs), not its frequency. The bigger the wave, the higher its energy.*
Einstein figured that if light actually consists of self-contained bundles of energy, with the energy of each bundle proportional to the frequency of the light, then the puzzle is solved.4 Light quanta with low frequencies don’t have enough energy to dislodge the electrons. As the frequency is increased, a threshold is reached above which the absorption of a light quantum knocks an electron out of the lattice of metal ions at the surface. Increasing the intensity of the light simply increases the number (but not the energies) of the light quanta incident on the surface. He went on to make some simple predictions that could be tested in future experiments.
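Einstein’s proposal can be captured in two short equations, given here in the modern textbook form using Planck’s constant $h$:

```latex
% Energy of a single light quantum of frequency \nu:
E = h\nu

% Maximum kinetic energy of an ejected electron, where \phi (the metal's
% 'work function') is the minimum energy needed to free an electron:
K_{\max} = h\nu - \phi
```

No electrons are ejected below the threshold frequency $\nu_0 = \phi/h$, however intense the light; above it, the electron energy rises with frequency, not intensity. These were the simple, testable predictions.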
These were highly speculative ideas, and physicists did not rush to embrace them. Fortunately for Einstein, his work on the special theory of relativity, published in the same year, was better regarded. It was greeted as a work of genius.
When in 1913 Einstein was recommended for membership of the prestigious Prussian Academy of Sciences, its leading members — Planck among them — acknowledged his remarkable contributions to physics. They were prepared to forgive his lapse of judgement over light quanta:
That he may sometimes have missed the target in his speculations, as, for example, in his hypothesis of light-quanta, cannot be really held against him, for it is not possible to introduce really new ideas even in the most exact sciences without sometimes taking a risk.5
The risk was partly rewarded just two years later. When American physicist Robert Millikan reported the results of further experiments on the photoelectric effect, Einstein’s predictions were all borne out. The results were declared to be supportive of the predictive ability of Einstein’s equation connecting photoelectricity and light frequency, but the light quantum hypothesis remained controversial.
Einstein was awarded the 1921 Nobel Prize for physics for his work on the photoelectric effect (but not the light quantum). Two years later, American physicist Arthur Compton and Dutch theorist Pieter Debye showed that light could be ‘bounced’ off electrons, with a predictable change in light frequency. These experiments appear to demonstrate that light does indeed consist of particles moving in trajectories, like small projectiles. Gradually, the light quantum became less controversial and more acceptable. In 1926, the American chemist Gilbert Lewis coined the name ‘photon’ for it.
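The Compton result also has a compact modern form. The predicted change in wavelength when light scatters from an electron through an angle $\theta$ is:

```latex
% The Compton shift: \lambda is the incoming wavelength, \lambda' the
% scattered wavelength, m_e the electron mass, c the speed of light
\Delta\lambda = \lambda' - \lambda
  = \frac{h}{m_e c}\left(1 - \cos\theta\right)
```

The quantity $h/m_e c$, about 2.43 picometres, is known as the Compton wavelength of the electron. The shift depends only on the scattering angle, exactly as expected if the light quantum collides with the electron like a small projectile.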