Are we in the same position? All those chapters ago, we were talking about a science of uncertainty: a form of reasoning that would stand next to logical deduction and the scientific method as a means of coming to terms with the world and plotting our course through it. In the intervening pages we have seen its surprising strengths and occasional weaknesses, following the spiral of hope and disillusionment that drives all human discovery. We tend to reset our expectations, discounting our achievements and amplifying our remaining dissatisfactions. So, yes: probability helps us make decisions, it gives us a tool to manage the recurrent but unpredictable; it helps prevent or mitigate disaster, disease, injustice, and the failure of the raisin crop—but do we really believe it? To what degree is it actually true—that is, something innate to the world and experience, not just to urns and wheels?
The Lloyd’s A1 standard of truth for most of us would probably be classical physics. Despite having little personal experience of Newton’s Laws in their purest form, we feel sure of them; we expect them to be as true Out There as they are Around Here. This confidence has two sources: first, most of us stopped studying physics when we mastered Newton, just as we stopped geometry after Euclid, and what one masters last remains most true. Second, humans happen to be a good size for Newtonian mechanics: our billiard tables and tennis courts scale up well to the planetary level.
We also cling to classical physics because Newton’s universe is supremely beautiful, not just in the simplicity and power of his laws, but in the smoothness and grace of their application. The planets progress with irresistible grandeur, without even the tick of clockwork to interrupt the music of the spheres. Smooth fields of force command the motion of masses, of electricity and magnetism. The concept of limit, sketching curves beyond the resolution of any measurement, banishes that childhood terror of infinite time and space and reveals a broad continuum, where all motion has the sense of inevitability.
Except when things get small. Once our imaginations venture below the molecular scale, we find that Newton’s laws no more apply throughout the universe than does the Bill of Rights. What does apply—at every scale—is probability.
One of the most puzzling experiments in physics is also one of the most pleasant: run a hot bath, climb in, and begin wiggling your toes. Wiggle only the left foot and the ripples progress smoothly up toward your nose, forming a smooth wave line along the side of the tub. Wiggle both feet and you see the pattern change: in places the waves reinforce, producing peaks twice as high as before. In others, they cancel out, creating stretches of flat calm. If you are an expert wiggler and keep your toes in sync, you can hold this interference pattern still and steady, the bands of flat and doubly disturbed water extending out toward you like rays.
In 1804 the same experiment was done with light. Cut two thin slits in a window blind and let sunlight (filtered to a single color) project onto a screen in the darkened room and you will see, not twin pools of brightness, but a pattern of alternating bands of light and dark spreading out from the center. Light, therefore, behaves like the waves in the bath, reinforcing and canceling out; no wonder we talk about wavelengths, frequencies, and amplitudes for all the various forms of electromagnetic radiation, from radio to gamma rays.
Yet we also know that light behaves like a stream of bullets, knocking off electrons from exposed surfaces: bleaching our clothes and tanning our skin. Each photon delivers a precisely defined wallop of energy, dependent on the type of radiation: dozy radio goes right through us unperceived; hustling X-rays leave a trail of damage behind. This is practical, not just theoretical, reality: we are now technically adept enough to generate these photons precisely, throttling back the fire-hose delivery of a 60-watt lightbulb (10²⁰ photons per second) to a steady drip of individual light particles.
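That figure of 10²⁰ is easy to check with a back-of-envelope sketch in Python (assuming, purely for illustration, that all 60 watts emerge as green light at 550 nanometers; a real bulb radiates mostly infrared, but the order of magnitude holds):

    # Photon output of an idealized 60-watt lightbulb.
    h = 6.626e-34                  # Planck's constant, joule-seconds
    c = 2.998e8                    # speed of light, meters per second
    wavelength = 550e-9            # meters (illustrative green light)
    energy_per_photon = h * c / wavelength       # about 3.6e-19 joules
    photons_per_second = 60 / energy_per_photon
    print(f"{photons_per_second:.1e} photons per second")    # ~1.7e+20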
You will already be asking what the experimenters next asked: what if we sent these particles one by one toward the pair of slits? For all we know, the interference pattern could have been produced by some kind of jostling among energetic photons eager to squeeze through the crush and get on. Yet even when photons are sent one by one, when there is no other photon to elbow past at the slits, the same interference pattern appears. Nor is this effect restricted to light: individual electrons, too, produce an interference pattern; even the big, soccer-ball-shaped molecules of carbon 60, buckminsterfullerene, behave in a wavelike manner; it is as if they were interfering with themselves, splitting their identities and going through both slits at once. Even odder, if you put detectors at the slits to determine which one the particle has passed through, the effect disappears: the pattern on the screen changes to two pools of light as if no interference had taken place.
What is going on here? We are seeing at first hand the complex interaction of the Newtonian world and the quantum mechanical world. Our assumption from experience of visible, classical physics is a smooth gradation of things: temperature will move from 20 to 25 degrees through all the temperatures in between; a ball will fly from this court to that through all the positions that divide them. Quantum mechanics takes its name from the fact that the phenomena it studies do not behave this way: their fixed quantities admit no intermediate values. Electrons jump from one energy state to another; particles remain “entangled” with one another, mutually influencing observable qualities although separated by great distances. At the quantum scale, “Where is it now?” becomes both as puzzling and as pointless a question as “What does it all mean?” The observer cannot help but be part of the action. Simply looking for something (putting detectors at each slit) changes the nature of the physical system; and asking about the location of a photon without observing it is like asking, without slapping it to your wrist, whether a coin spinning in the air is showing heads or tails.
What can we describe, then, without direct observation? Probability: in quantum mechanics, probability itself is the ether through which these waves propagate. The interference pattern represents the equal probability of the photon’s going through either slit; if we do not fix the photon trace by measurement, its path will follow that field of probability, effectively going through both slits at once. Position, therefore, is a concept with two forms: a wavelike field of probability until a measurement is made, a point in space thereafter.
Why, though, should this be true at the quantum level and not the classical? Why don’t you see an interference pattern when, say, throwing baseballs through your neighbor’s front windows? According to the physicist Roger Penrose, it’s simply a matter of scale. The probability distribution for the position of a given particle includes a cross term which, when we compare one particle path with another, can reinforce or cancel out its equivalent in the second probability distribution, thus generating the interference pattern. But when we scale up to the realm of classical physics, we are dealing with a vast number of simultaneous probability distributions: the cross term effectively averages out to zero, leaving only the individual probability distribution for each path. Your baseballs will go through one window or the other, and there will be no interesting pattern on the far wall with which to distract your angry neighbor.

“Do you really believe that?” The spirit of Ngidi is never far away in quantum mechanics—and if you’re not entirely satisfied that probability is the fundamental reality, you are not alone. Einstein, though he himself had first come up with photons as quanta of radiant energy, profoundly disliked the idea of simply agreeing that such things had no physical presence until observed. Richard Feynman blithely stated: “I think it is safe to say that no one understands quantum mechanics.” Perhaps, like position, understanding—in the sense of making a coherent inner picture of an unseen reality—ceases to have meaning at certain scales. At least the equations work; or, as the physicist David Mermin put it (in a line often credited to Feynman): “Shut up and calculate.”
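Penrose’s point can be made concrete in a toy calculation (a sketch only: the geometry and wavelength are invented for illustration, and each slit is reduced to a single complex amplitude):

    import numpy as np

    # Toy two-slit model: each slit contributes a complex amplitude;
    # the probability of detection is the squared magnitude of the sum.
    wavelength = 1.0
    d = 5.0                          # slit separation
    L = 100.0                        # distance from slits to screen
    x = np.linspace(-20, 20, 9)      # positions along the screen

    def amplitude(x, slit_y):
        path = np.hypot(L, x - slit_y)           # slit-to-screen distance
        return np.exp(2j * np.pi * path / wavelength)

    a1 = amplitude(x, +d / 2)
    a2 = amplitude(x, -d / 2)
    unobserved = np.abs(a1 + a2) ** 2            # cross term survives: fringes
    observed = np.abs(a1) ** 2 + np.abs(a2) ** 2 # detectors at slits: no fringes

The difference between the last two lines is the whole story: leave the paths unobserved and the amplitudes add before squaring, producing the cross term that paints the fringes; determine which slit was used and the probabilities add instead, leaving two plain pools of light.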
Is there any way we can think about this without ending in gibbers and squeaks? Perhaps: there are situations in real life where we are aware of a field of probability separate from particular moments and positions. If you drive to work or school, you probably imagine your route as a probability field, with certain lanes between certain intersections offering a greater potential for getting past that crawling bus than others. This morning or tomorrow morning you may or may not actually be in the left lane as you pass the doughnut store, crowing in triumph or growling in despair—but the route as it exists in your mind, like the two-slit arrangement, is both specific and probabilistic.
Granted, there will always be a slight whiff of medieval theology at the extreme scales of physics, a touch of credo quia impossibile—“I believe because it is impossible.” Let us therefore shift back into the realms of the visible and palpable by selecting a big, bluff, no-nonsense nineteenth-century example: the steam boiler.
What is the source of its power? Motion. Molecules of water vapor, hot and excited, rocket around the boiler’s confined space, caroming into each other and into the sides of the vessel, thus producing pressure. But already, in the course of that sentence, we have run into the necessity of mixing individual and collective description—particles pursuing their frantic courses and pressure measured across them all. Each molecule is a perfect Newtonian agent; each collision obeys the same laws of motion as do the sudden meetings of billiard balls and linebackers. The whole system is classical and deterministic. If you wanted a model position-and-velocity universe for training Laplace’s all-knowing demon, this would be it. Yet, while we could set the demon its task, we could never begin to achieve it ourselves—not even to predict the positions of the molecules in 1 cubic millimeter of steam 1 millisecond from now. We encountered this problem when we looked at the weather: complexity imposes limits on predictability. We can set the equations, but this does not mean we can solve them.
The movement of individual molecules, buffeted by those around them, is essentially deterministic but effectively random. This means that many of the basic qualities we ascribe to physical systems—heat, mechanical work, pressure—are impossible to define except statistically. The great nineteenth-century physicist James Clerk Maxwell considered the properties of a gas (pressure of steam, for example) in terms of a statistical distribution of qualities among its constituent molecules (in this case, their velocity) and found this distribution was the same as the normal curve. To come to this conclusion, he had to make the same kinds of assumptions about the grubby, prosaic boiler that we have been making about our various more rarified examples of probabilistic systems: that the elements are evenly distributed; that the system as a whole is in equilibrium; that each molecule has an equal probability of going in any direction—in other words, that you could represent this system by true, unchanging dice rolled fairly.
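Maxwell’s conclusion is easy to see numerically (a sketch under those same fair-dice assumptions; the temperature and molecular mass are illustrative, roughly room-temperature nitrogen):

    import numpy as np

    # Maxwell's picture: each Cartesian velocity component of a gas
    # molecule in equilibrium follows the normal curve.
    k_B = 1.381e-23             # Boltzmann's constant, joules per kelvin
    T = 300.0                   # temperature, kelvin (illustrative)
    m = 4.65e-26                # molecular mass, kg (roughly N2)
    sigma = np.sqrt(k_B * T / m)     # spread of each velocity component

    rng = np.random.default_rng(0)
    v = rng.normal(0.0, sigma, size=(100_000, 3))    # vx, vy, vz samples
    speeds = np.linalg.norm(v, axis=1)
    print(f"mean molecular speed: {speeds.mean():.0f} m/s")   # ~475 m/s

Pressure, in this picture, is nothing but the statistical drumming of those normal deviates against the boiler wall.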
If this were all, we could still say that probability is just a way of talking about heat, not something intrinsic to reality. But Maxwell discovered a further imp in the boiler. Maxwell’s clever demon was called into being in 1871 to point out an essential difference between what could happen in physical systems and what actually does. Imagine the demon as doorkeeper, guarding the pipe linking two boilers. As Maxwell had shown, the various steam particles have different energies, normally distributed around the constant, mean energy for the whole system. So when a particular molecule approaches the pipe, the demon sizes up its energy: if it is above a certain threshold, he lets the molecule through—otherwise, he remains, arms folded, staring off into space. You can see that given enough time, this selection procedure would produce a marked difference in energy between the two boilers: all the high-energy VIP molecules enjoying a party there beyond the pipe while the low-energy majority lurk resentfully on this side. The energy for the whole system remains the same; but it is now more organized than it was—so much so that you could use the difference in energy to do useful work. This sorting seems to create something out of nothing, making possible a thermodynamic perpetual motion machine.
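A crude simulation shows how effective the doorkeeper is (everything here, from the energy distribution to the threshold, is invented for illustration):

    import random

    # Maxwell's demon as doorkeeper: molecules approach the pipe from
    # boiler A; only those above an energy threshold pass to boiler B.
    random.seed(1)
    boiler_a = [random.gauss(1.0, 0.3) for _ in range(10_000)]  # energies
    boiler_b = []
    threshold = 1.0

    for _ in range(5_000):
        i = random.randrange(len(boiler_a))      # a molecule nears the pipe
        if boiler_a[i] > threshold:              # demon opens the door
            boiler_b.append(boiler_a.pop(i))     # the VIP molecule crosses

    mean_a = sum(boiler_a) / len(boiler_a)
    mean_b = sum(boiler_b) / len(boiler_b)
    print(f"mean energy: A {mean_a:.2f}, B {mean_b:.2f}")   # B runs hotter

No energy has been added, only information applied; yet the gradient between A and B could now drive an engine, which is exactly the free lunch the Second Law forbids.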
Yet as Maxwell took pains to point out, such a sorting never takes place in the physical world. Mix hot and cold, you get lukewarm, which will not then separate again. Heat moves to cold, high pressure to low, energy spreads; you don’t see these qualities concentrating themselves. This is the Second Law of Thermodynamics: entropy (that is, the proportion of energy not available to do work) tends to its maximum in any closed system. Things fall apart; the center cannot hold. Physical systems slump into the most comfortable position possible: that is, where energy gradients approach flatness.
“I offer you something quite modest, admittedly for me all that I have: myself, my entire way of thinking and feeling.” Ludwig Boltzmann’s own energy gradient was always sharply up or down: he was never entirely comfortable, although he looked the archetype of a nineteenth-century Viennese professor—chubby, flowing-bearded, with weak eyes peering behind oval spectacles. He was kind and argumentative, inspired and despairing, daring and doubtful. He blamed his uneasy moods on having been born on the cusp between Mardi Gras and Ash Wednesday.
Boltzmann’s conscience would not allow him to ignore the logical gap between a gas as a collection of individual molecules, colliding according to classical physical rules, and a gas as a collective described in terms of statistical properties. He set about bridging it in 1872, with his “transport equation,” a description of how a chain of collisions would, over time, distribute momentum from particle to particle so that the final result was normally distributed—in effect, operating like a complex three-dimensional version of Galton’s quincunx.
Imagine it this way: every collision between a higher-velocity molecule and a lower one tends to transfer some energy from the one to the other. When my car rear-ends yours, you jerk forward and I slow down. We could start with a system composed half of high-velocity particles and half of near-stationary ones; it has low entropy in that it’s very ordered. As time passes and these particles collide, the proportion of molecules with either high or zero velocity goes down and the proportion of those with something between high and zero velocity rises. Energy will continue to pass across at every collision, like genes at every generation, but the population as a whole maintains a constant, normal distribution. So although we cannot talk about the history and velocity of each particle, we can talk about the proportion of particles that have energies within a set of given ranges; the vast tangle of interconnected functions of motion is organized into a flight of discrete steps, just as the rich complexity of a human population can be organized by measuring chests or taking a poll.
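A caricature of this relaxation, in a few lines (the collision rule, in which each meeting reshuffles the pair’s combined energy at random, is an illustrative stand-in for real mechanics, not Boltzmann’s transport equation itself):

    import random

    # Low-entropy start: half the particles fast, half stationary.
    # Random pairwise "collisions" then redistribute the energy.
    random.seed(0)
    energies = [2.0] * 5_000 + [0.0] * 5_000

    for _ in range(200_000):
        i, j = random.sample(range(len(energies)), 2)
        total = energies[i] + energies[j]
        share = random.random()             # random split of the joint energy
        energies[i], energies[j] = share * total, (1 - share) * total

    # Total energy is conserved, but the sharp two-spike distribution
    # has melted into a smooth, stable equilibrium curve.
    print(sum(energies) / len(energies), max(energies))

In this particular toy the equilibrium distribution of energies is exponential rather than normal (it is the velocities, not the energies, that Maxwell’s curve describes), but the moral is Boltzmann’s: whatever orderly extreme you start from, collisions herd the population toward the one distribution with the most ways of happening.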
In 1877, Boltzmann extended this idea to explain mechanically how entropy tended toward its maximum value for any given state of an isolated system, a feat he achieved by relating the overall state of the system to the sum of all its possible micro-states.
Consider a system with a given total energy. There are many different ways that energy could be distributed: equally among all particles, for example, or with all the energy vested in one hyperactive molecule while all the rest remained in chilly immobility. We can call each of these possibilities a micro-state. Each is equally probable—in the same sense that each of 36 throws is equally probable with a pair of dice. But, you’ll remember, the totals you get from the throws are not equally probable: there are more ways of making 7, for instance, than of making 12. Let a vast room full of craps players throw dice simultaneously, and you will find a symmetrical distribution of totals around seven, for the same reason that you find a normal distribution of velocities in a gas at equilibrium: because there are proportionally more ways to achieve this distribution than, say, all boxcars on this side of the room and all snake eyes on that. The maximum entropy for a physical system is the macro-state represented by the highest proportion of its possible micro-states. It is, in the strictest sense, “what usually happens.”
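The dice arithmetic, spelled out (with, as a coda, Boltzmann’s own formula S = k log W, entropy as the logarithm of the count of micro-states):

    from collections import Counter
    from math import log

    # Each ordered pair of faces is a micro-state; the total shown
    # is the macro-state.  Count micro-states per macro-state.
    ways = Counter(d1 + d2 for d1 in range(1, 7) for d2 in range(1, 7))
    for total in range(2, 13):
        print(total, ways[total], "of 36 micro-states")

    # Entropy, following S = k log W (constant k omitted):
    print("entropy of 7:", log(ways[7]))     # highest: 6 ways
    print("entropy of 12:", log(ways[12]))   # lowest: 1 way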
Boltzmann’s linking of microscopic and macroscopic showed how the countless little accidents of existence tend to a general loss of order and distinction. Things broken are not reassembled; chances lost do not return. You can’t have your life to live over again, for the same reason you can’t unstir your coffee.
But, objected Boltzmann’s contemporaries, you can unstir your coffee—at least in theory. Every interaction in classical physics is reversible: if you run the movie backward all the rules still apply. Every billiard-ball collision “works” just as correctly in reverse as it does forward. True, we live in a mostly dark, cold and empty universe, so we don’t see, for instance, light concentrating from space onto a star, as opposed to radiating out from it. Yet if we were to see this, it would merely be surprising, not impossible. Our sense of the direction of time, our belief that every process moves irreversibly from past to future, has no clearly defined basis in the mechanics of our cosmos.
So how could Boltzmann suggest that, although time has no inherent direction at the microscopic scale, it acquires direction when one adds up all the micro-states? How could a grimy steam boiler hold a truth invisible in the heavens? The objections were formal and mathematically phrased, but you can hear in them the same outrage that warmed the proponents of Free Will when they argued against Quetelet’s statistical constants.
Yet there was more than moral outrage at work: there was genuine puzzlement. Poincaré’s conclusions from studying the three-body problem had included a proof that any physical system, given enough time, will return arbitrarily close to any of its previous states. This is not quite the Eternal Return with which Nietzsche used to frighten his readers—the hopelessness to which the Hero must say “yes”—since only the position, not the path, is repeated: this moment (or something very like it) will recur but without this moment’s past or future. Even so, Poincaré’s proof seems to contradict the idea of ever-increasing entropy, because it says that someday—if you care to wait—the system will return to its low-entropy state: the cream will eventually swirl out from the coffee.
Boltzmann, surprisingly, agreed. Yes, he said, low entropy can arise from high, but low entropy is the same as low probability. We can imagine the state of our system moving through the space representing all its possible states as being like an immortal, active fly trapped in a closed room. Almost every point in the room is consistent with the maximum entropy allowed—just one or two spots in distant corners represent the system in lower entropy. In time, the fly will visit every place in the room as many times as you choose, but most points will look (in terms of their entropy) the same. The times between visits to any one, more interesting point will be enormous. Boltzmann calculated that the molecules of a gas in a sphere of radius 0.00001 centimeter will return to any given configuration only once in 3 × 10⁵⁷ years—some 200,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000 times the age of the universe so far. As comparatively vast a system as a cup of coffee would be more than cold before it spontaneously separated.
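That multiplier is quickly checked (assuming an age of the universe of about 1.4 × 10¹⁰ years):

    # Boltzmann's recurrence time against the age of the universe.
    recurrence_years = 3e57         # Boltzmann's figure
    universe_years = 1.4e10         # assumed age of the universe
    print(f"{recurrence_years / universe_years:.0e}")    # ~2e+47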