

  Jim Al-Khalili

  * * *

  QUANTUM MECHANICS

  with illustrations by

  Jeff Cummins and Dan Newman

  Contents

  Classical physics

  The dawn of modern physics

  The birth of quantum theory

  Particles of light

  Looking inside atoms

  Bohr creates the Copenhagen empire

  Wave-particle duality

  Schrödinger and his wave equation

  Explaining chemistry

  How the Sun shines

  Dirac and antimatter

  Clash of the titans

  Cats in boxes

  Digging deeper

  Spooky action

  Quantum field theory

  But it works! Applying quantum mechanics

  Quantum 2.0

  How to build a quantum computer

  Quantum biology – a new science

  But what does it all mean?

  Quantum gravity – a theory of everything

  Further Reading

  I think I can safely say that nobody understands quantum mechanics.

  Richard Feynman

  Quantum mechanics is certainly imposing. But an inner voice tells me that it is not yet the real thing. The theory says a lot, but does not really bring us any closer to the secret of the old one. I, at any rate, am convinced that He does not throw dice.

  Albert Einstein

  So Einstein was wrong when he said, ‘God does not play dice.’ Consideration of black holes suggests, not only that God does play dice, but that he sometimes confuses us by throwing them where they can’t be seen.

  Stephen Hawking

  For those who are not shocked when they first come across quantum theory cannot possibly have understood it.

  Niels Bohr

  The theory [quantum mechanics] has, indeed, two powerful bodies of fact in its favour, and only one thing against it. First, in its favour are all the marvellous agreements that the theory has had with every experimental result to date. Second, and to me almost as important, it is a theory of astonishing and profound mathematical beauty. The one thing that can be said against it is that it makes absolutely no sense!

  Roger Penrose

  The underlying physical laws necessary for … a large part of physics and the whole of chemistry are thus completely known, and the difficulty is only that the exact application of these laws leads to equations much too complicated to be soluble.

  Paul Dirac

  Series 117

  This is a Ladybird Expert book, one of a series of titles for an adult readership. Written by some of the leading lights and outstanding communicators in their fields and published by one of the most trusted and well-loved names in books, the Ladybird Expert series provides clear, accessible and authoritative introductions, informed by expert opinion, to key subjects drawn from science, history and culture.

  The Publisher would like to thank the following for the illustrative references for this book: here, here: © Getty Images; here: LC-DIG-ggbain-03392, Library of Congress Prints and Photographs Division Washington, D.C. 20540; here: LC-USZ62-87212, Library of Congress, Prints and Photographs division, via Wikimedia Commons; here: AIP Emilio Segre Visual Archives, Lande Collection; here: © Keystone/Hulton Archive/Getty Images; here: photograph by Francis Simon, courtesy AIP Emilio Segre Visual Archives; here: Ramsey et Musprat, courtesy AIP Emilio Segre Visual Archives, Gift of Emil Wolf; here: © Niels Bohr, Niels Bohr Archive, Copenhagen.

  Every effort has been made to ensure images are correctly attributed; however, if any omission or error has been made please notify the Publisher for correction in future editions.

  Classical physics

  By the end of the nineteenth century many physicists believed there really wasn’t much more to learn about the workings of nature and the properties of matter and radiation.

  Two centuries earlier, Isaac Newton (1642–1727) had outlined, in powerful mathematical detail, what became known as classical or Newtonian mechanics. Every school-child learns about his Laws of Motion and Universal Law of Gravitation, which are used to calculate the forces that act on all objects we see around us and to explain the way those objects move – everything from falling apples to the trajectories of the Apollo moon rockets. Newton’s body of work, together with the observations and careful experiments of others before him, such as Galileo (1564–1642), had set the scene for the giants of nineteenth-century physics, such as Michael Faraday (1791–1867) and James Clerk Maxwell (1831–1879), to complete the picture of classical physics.

  This entire body of work, which took three centuries to develop, still provides us today with a description of the universe made up of solid objects obeying Newtonian mechanics, along with Maxwell’s description of electromagnetic waves, fields and radiation that beautifully unified electricity, magnetism and light.

  Physics, so it seemed at the time, was complete. It was expected that everything in the universe could be described precisely, and that it was now simply a matter of dotting the ‘i’s and crossing the ‘t’s. One physicist even remarked, in 1894: ‘It seems probable that most of the grand underlying principles have been firmly established.’

  The dawn of modern physics

  In the final decade of the nineteenth century a number of mysteries and unresolved puzzles in physics were beginning to suggest that science was in crisis.

  For example, there was no proof as yet that matter was ultimately composed of atoms – fundamental building blocks that could not themselves be subdivided. Many physicists and chemists had begun using the idea of the existence of atoms (a rudimentary ‘atomic theory’) as a working assumption while in fact having no idea about their nature or properties.

  There were also puzzling phenomena that could not be explained, such as the photoelectric effect, whereby light shone on metal electrodes could create electricity, as well as black-body radiation (the heat and light given off by certain non-reflecting objects) and the distinctive pattern of lines in the spectrum of light given off by each chemical element.

  Adding to the excitement were three discoveries made in quick succession: first came mysterious X-rays (by the German physicist Wilhelm Röntgen in 1895), followed by the equally mysterious phenomenon of radioactivity (by the Frenchman Henri Becquerel in 1896, for which he would share a Nobel Prize with Marie and Pierre Curie), and, finally, the electron, the very first elementary particle, credited to the English physicist J. J. Thomson in 1897.

  Röntgen’s X-rays allowed us to see through solid matter, as though by magic, while Marie Curie earned her place in history through her work on radioactivity, becoming the first woman to receive a Nobel Prize.

  The birth of quantum theory

  In 1900 the German physicist Max Planck explained the way hot bodies emit electromagnetic radiation (as heat and light) at different wavelengths. He proposed that the energy of this radiation was proportional to its frequency: the higher the frequency (or the shorter the wavelength), the higher its energy, and the ratio of the two quantities (energy divided by frequency) was therefore a constant number, which still bears his name: Planck’s constant.

  The idea led Planck to the conclusion that this radiation could not have just any energy – since only discrete energies were allowed (corresponding to the frequency), the radiation had to be ‘lumpy’, like water dripping from a tap rather than flowing continuously. It was a revolutionary proposal and completely at odds with the then prevailing ideas.

  Planck’s constant is a very tiny number and tells us that there is a limit to how small a lump of heat radiation can be – an indivisible piece of energy, or a ‘quantum’ of radiation.
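  For readers who like to see it in symbols, Planck’s relation can be written in a single line (the book keeps to words; the numerical value quoted here is the modern measured one, not a figure from the book):

  E = h\nu, \qquad h \approx 6.63 \times 10^{-34}\ \text{joule-seconds}

  where E is the energy of a single quantum of radiation, ν (nu) is its frequency and h is Planck’s constant.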

  This was the first indication that different rules applied at the tiniest scales. Planck was a somewhat reluctant revolutionary and was never really happy with his new theory, even though it beautifully explained the way bodies radiated heat, when all previous attempts had failed. Others would, however, take his idea and run with it.

  The word quantum comes from the Latin quantus, meaning ‘how great’, and came into general use in physics in the first few years of the twentieth century to denote the smallest indivisible piece.

  Particles of light

  Planck’s formula was, by and large, ignored for five years until Albert Einstein used it to explain the photoelectric effect – work for which he would win his Nobel Prize, rather than for his far more famous theory of relativity.

  In the photoelectric effect, light shone on an electrically charged metal plate can knock electrons free from its surface. You might think that the energy of the released electrons would depend on the light’s brightness, or intensity. Instead, their energy was found to depend on its frequency. This is an unexpected result if light is a wave, because increasing a wave’s intensity (its height) should increase its energy. Think of water waves crashing against the seashore; they have more energy the higher they are, not the faster they break against the shore. In the photoelectric effect, high-intensity light did not result in electrons being knocked free with more energy, just more electrons being knocked free!

  Einstein successfully explained what was happening by proposing that all electromagnetic radiation (from high-energy gamma rays and X-rays, through visible light, to radio waves) is ultimately made up of tiny lumps of energy: particles we now call photons. Each electron is thus knocked out when it is hit by a single photon, the energy of which depends on its frequency. We do not usually see this particle-like nature of light because of the huge number of photons involved, just as we cannot usually see the individual pixels in a photograph.
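  In the standard textbook form (not spelled out in the book’s prose), Einstein’s account of the photoelectric effect reads:

  K_{\max} = h\nu - \phi

  where K_max is the greatest kinetic energy a freed electron can carry away, hν is the energy of the single photon it absorbs, and φ (the ‘work function’) is the minimum energy needed to pull an electron out of that particular metal. Brighter light means more photons, and therefore more electrons, but only a higher frequency gives each electron more energy.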

  Looking inside atoms

  In 1909, in Manchester, Ernest Rutherford conducted a famous experiment with Ernest Marsden and Hans Geiger into the structure of atoms.

  They discovered that when a beam of alpha particles, tiny pieces of atomic nuclei given off as alpha radioactivity, was fired at a thin gold foil, the majority passed through easily, suggesting that the atoms were mostly empty space. However, a few rebounded. Rutherford exclaimed: ‘It was as though you had fired a fifteen-inch shell at a piece of tissue paper and it had bounced back and hit you.’

  The only explanation was that most of an atom’s mass and electric charge must be concentrated in a tiny volume in its centre, the ‘atomic nucleus’, which is 100,000 times smaller than the atom itself. That is, if an atom were a football stadium, its nucleus would be a garden pea on the centre spot.
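  The rough numbers behind the analogy, using typical textbook sizes rather than figures from the book itself, are:

  r_{\text{atom}} \sim 10^{-10}\ \text{m}, \qquad r_{\text{nucleus}} \sim 10^{-15}\text{–}10^{-14}\ \text{m}, \qquad \frac{r_{\text{atom}}}{r_{\text{nucleus}}} \sim 10^{4}\text{–}10^{5}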

  But there was a problem: if all the positive charge was in this tiny nucleus while the negatively charged electrons orbited around it like planets around the Sun, why did these electrons, which are attracted to the positive charge, simply not fall into the nucleus and neutralize it? Unlike the Earth in its stable orbit around the Sun, electrically charged particles emit radiation and lose energy when forced to move in a circle, and as they do so they should, if they obeyed the laws of classical physics, quickly spiral in towards the nucleus. But they don’t – atoms are stable, or we wouldn’t be here.

  Rutherford observed a fluorescent screen through a microscope and saw pinpricks of light as alpha particles were deflected by a gold film.

  Quantizing atoms

  Rutherford’s problem with his solar-system model of the atom would be tackled by a Danish physicist who is ranked alongside Einstein as one of the greatest thinkers of the twentieth century. His name was Niels Bohr and he is regarded as the father of quantum mechanics.

  Bohr postulated that atomic electrons lose and gain energy only according to certain ‘quantum’ rules. Each electron orbit is associated with a specific energy, so electrons are not free to follow just any orbit. Instead, they remain in fixed orbits, like a train on a circular track. It would later be shown that each of these orbits can accommodate only a precise number of electrons. Therefore, an electron can drop to a lower orbit only if there is space for it and only if it loses a quantum of electromagnetic energy – a photon – that equals the difference in energy between the two orbits. Likewise, it can jump to an outer orbit only if it gains a photon with the precise energy needed for the jump. When it comes to these quantum jumps, there is never any change given; only the exact energy can be exchanged.
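  In symbols – standard textbook notation rather than the book’s own – the bookkeeping of a quantum jump is simply:

  h\nu = E_{\text{outer}} - E_{\text{inner}}

  the photon emitted or absorbed must carry exactly the energy difference between the two orbits. For the hydrogen atom, Bohr’s rules give the allowed orbit energies as E_n = −13.6 eV / n², with n = 1, 2, 3, …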

  We will soon discover, however, that the Rutherford–Bohr model of the atom is not the final word. Bohr’s work on quantized electron orbits brought to an end the first phase of the quantum revolution, which is today referred to as the ‘old quantum theory’.

  The atomic model with concentric electron orbits (shells) that is still taught at school. The number of electrons in each is fixed. Shown are the atoms of four different elements: the lightest, hydrogen, and three of the noble gases.

  Bohr creates the Copenhagen empire

  In 1916 Niels Bohr returned to Denmark from Manchester, where he had helped Rutherford explain the stability of atoms. But apart from these fixed, or quantized, orbits that did not fit into the Newtonian picture, it was still assumed that Newtonian mechanics could describe the rest of the microscopic world. Electrons were thought of as tiny spheres moving along well-defined paths.

  Bohr founded a new Institute for Theoretical Physics in Copenhagen in 1921 and set about gathering around him some of the greatest young geniuses in Europe, most notably Werner Heisenberg and Wolfgang Pauli. These physicists would turn the world of science upside down.

  The mathematical theory of quantum mechanics – so called to distinguish it from classical mechanics – went far beyond simple quantization of electron orbits. It became clear that atoms were not like miniature solar systems at all and that electron orbits had to be replaced by the idea of fuzzy clouds of ‘electroness’.

  The theory, completed by the mid-1920s, described a world far stranger than many could accept. The Copenhagen school of thought that Bohr and others developed to describe the quantum world has shaped many areas of science and philosophy for almost a century.

  On the eightieth anniversary of Bohr’s birth – 7 October 1965 – his institute was officially renamed The Niels Bohr Institute. It was originally funded partly by the charitable foundation of the Carlsberg Brewery, making it … probably … the best quantum institute in the world.

  Wave-particle duality

  In 1924 a young French nobleman called Louis de Broglie made a daring proposal: if a light wave can also behave as a stream of particles, then can moving particles of matter also be made to spread themselves out over space like waves? He suggested that every moving material object could be associated with a ‘matter wave’ whose wavelength depended on its momentum (its mass times its velocity): the more massive, or the faster, a particle, the shorter its wavelength.
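  De Broglie’s relation, in the standard form found in textbooks, is:

  \lambda = \frac{h}{p} = \frac{h}{mv}

  where λ is the matter wavelength, h is Planck’s constant and p = mv is the particle’s momentum. Because h is so tiny, everyday objects have wavelengths unimaginably shorter than their own size, which is why we never notice their wave character; only for electrons and other very light particles does it show up.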

  Experimental confirmation of the wave-like nature of electrons came in 1927, when they were shown to give rise to interference effects, just like light waves, sound waves or water waves. Just how incredible this notion is can be underlined by the famous two-slit experiment.

  Matter particles, such as electrons, are fired one at a time at a screen with two narrow slits. If they obeyed the rules of common sense, then each electron that managed to get through would have necessarily gone through one slit or the other. But instead, the pattern of fringes that builds up on the screen is what is seen when spread-out waves pass through both slits simultaneously, like a single extended sea wave hitting the length of a shore. Each electron must be somehow behaving as a wave as it passes through, even though it hits the back screen as a particle, since the pattern on the screen is built up from individual dots where electrons have landed. If you think this is confusing then congratulations, you see the problem: it is completely crazy.
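  The pattern is quantitative too. Using the de Broglie wavelength above, standard wave optics (not derived in the book) places the bright fringes wherever the two paths from the slits differ by a whole number of wavelengths:

  d \sin\theta = n\lambda, \qquad n = 0, 1, 2, \dots

  where d is the separation of the slits and θ is the angle from the slits to the fringe.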

  Like the particle that can go through both slits at once, the quantum skier can go round both sides of the tree. It looks impossible, but down in the quantum world, this is the equivalent of what we see happening.
  Schrödinger and his wave equation

  While Bohr and Heisenberg were developing their mathematical picture of atoms, an Austrian physicist named Erwin Schrödinger was presenting a different approach, suggesting that the entire subatomic world is made up of waves. His theory became known as wave mechanics and the famous equation that bears his name describes how such ‘quantum’ waves change over time.

  Every student of physics and chemistry will learn about Schrödinger’s equation. It would take too long to try to explain it here, so all you need to know is that it provides us with a mathematical quantity called the wave function. This is essentially a set of numbers that tells us everything we could possibly know about whatever it is the equation is describing, whether this is the state of a single particle, such as an electron, or an entire system of interacting particles.

  Nowadays we regard Schrödinger’s equation as the quantum world’s equivalent of the classical equations of motion, which involve distance, speed and acceleration, but with a crucial difference. If we know where something, say a falling apple, is, and how fast it’s moving at any moment in time, then we can calculate precisely when it will hit Newton on the head. But if we know the state and whereabouts of an electron now and we track how its wave function changes over time, we can only ever compute the probability of the electron being somewhere else in the future. We will not know for sure – we must give up on certainty.
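  For the curious, the equation itself – in its general, time-dependent textbook form, which the book deliberately leaves out – and the rule for turning the wave function into probabilities are:

  i\hbar\,\frac{\partial \Psi}{\partial t} = \hat{H}\,\Psi, \qquad P = |\Psi|^{2}

  where Ψ is the wave function, Ĥ is the Hamiltonian (the total-energy operator for the system in question) and ħ is Planck’s constant divided by 2π. The second expression, Born’s rule, says that the square of the wave function’s magnitude gives the probability of finding the particle at a given place and time.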

  Electron orbits are really more like clouds of probability depicting where the electron is ‘most likely to be’. This is far from the schoolbook picture of a miniature solar system with electrons in circular orbits around the nucleus.

 
