Nothing bedevils mathematicians like paradox. Hilbert attempted desperately to stave off all paradox or, better yet, banish it from mathematical systems. Russell, too, looked in horror upon the paradox he had discovered. His answer was laid out in a work that consumed him and Whitehead for the better part of a decade. Principia Mathematica began as a projected one-year, one-volume work. It grew hydra-headed into three volumes (a fourth was planned, but never realized).12 In it, Russell and Whitehead devised a system of symbolic notation, demonstrated the power of logicism, introduced notions of propositional function and logical construction, and set the stage for Gödel's metamathematics. To get around his own paradox, Russell proposed a hierarchy of sets: from simple members, to sets, to sets of sets, and so on. No intermingling was permitted in the hierarchy, thus preventing a “set-of-all-sets not-members-of-themselves” logical catastrophe.
Logicism, thus bolstered, spread throughout the philosophical world. In Vienna, especially, there developed a philosophical school of mathematical logic called the Vienna Circle. Its purpose was to found a modern approach to logic that would rid philosophy of metaphysics. Its founding members were Moritz Schlick (who would later be assassinated by a former student on the steps of the University of Vienna), Hans Hahn, Herbert Feigl, and Rudolf Carnap. As the circle grew in number and stature, it moved from smoke-filled Vienna cafés to the university. Meetings were held regularly on Thursday evenings.
In the midst of these antimetaphysicians came a Platonist in sheep's clothing: the young Kurt Gödel, invited as a matter of course by his dissertation adviser, Hans Hahn. He attended regularly for two years, beginning in 1926. As he sat in the shabby classroom observing the birth of logical positivism, Gödel kept his own counsel. His contributions were brilliant, but few. In the end, he let his proofs speak for him. It is ironic, even paradoxical, that the man who undermined the logic of mathematics was schooled by such devout logicists.13
Sometime in 1928, Gödel's attendance tapered off. He was at work on his dissertation, which proved the completeness of first-order (i.e., limited) logic. Then, he seems to have turned to what would become the first incompleteness theorem. On September 7, 1930, at a conference in Königsberg, Gödel delivered his seminal paper. It was the third and final day—an inauspicious time at any conference, usually reserved for low-impact papers on obscure topics or for organizational housekeeping. Always a man of few words, Gödel whittled his announcement down to a mere sentence. His “shining hour,” notes Rebecca Goldstein, was more like “30 seconds tops.” Subdued, uncharismatic, and, at the time, unknown, Gödel was unlikely to have made an impression. No mention of his sentence made it into the conference proceedings. Great men were at the conference: Rudolf Carnap, the father of logical positivism; Friedrich Waismann, Hans Reichenbach, and Hans Hahn, all members of the Vienna Circle; and John von Neumann, a student of David Hilbert, the grand old man of mathematics and head of mathematics at the University of Göttingen.
The Conference on Epistemology of the Exact Sciences, organized by a group of Berlin positivists, was too modest in scope to have drawn the great Hilbert, widely thought to be the greatest mathematician of his time. Hilbert did attend the umbrella conference of the Society of German Scientists and Physicians, held concurrently in Königsberg. Still, on that third day, as Gödel spoke, Hilbert's presence loomed large. Twice in the previous decades, Hilbert had issued formal challenges to fellow mathematicians and logicians. First, in 1900, speaking at the Second International Congress of Mathematicians in Paris, he had listed twenty-three critical “problems” that remained “unsettled” and exhorted true mathematicians to find the solutions. The most critical of these problems for our purposes was number two: Prove that the axioms of arithmetic are consistent and that, therefore, arithmetic is a formal system without contradiction. Then, at the 1928 International Congress of Mathematicians, held in Bologna, he lectured on “Problems in laying the foundations of mathematics.” Now, in addition to the question of “consistency,” Hilbert raised another fundamental question: Is it possible, using the axioms of a system, to prove or refute any proposition that is made in the language of that system? Without such proof, Hilbert acknowledged, mathematical logic was without bedrock.
At the heart of Gödel's proof is the Liar's paradox: “This statement is false”—a version of the Cretan Epimenides paradox: “All Cretans are liars.” The important thing about the paradox, for Gödel, is its circular, self-referential structure. By definition and design, the paradox references itself. If Epimenides the Cretan is a liar, then the statement “All Cretans are liars” must be false. So Cretans must be truthful—but if so, Epimenides’ proposition, that Cretans are liars, must be true. Likewise for the Liar's paradox: If the proposition “This statement is false” is true, then the statement must be false. In logic, a “well-formed proposition” is either true or false. The paradox does an end run around this either-or structure. The hermetic seal that keeps a logical system airtight is threatened by paradox. Nearly thirty years earlier, Russell had stopped Frege's philosophical program in its tracks with the paradoxical “set of all sets that are not members of themselves”—either it is or it is not; if it is, then it is not, and vice versa. In the context of “true or false,” then, the paradox always contradicts logic. But what if, instead of “true or false,” we substitute “provable or not provable”? The paradox is then stripped of its “content,” as it were, and made analytical. That is what Gödel did in his proofs.
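Gödel's substitution can be put schematically. In modern notation (the symbols below are the standard ones, not drawn from this text), he constructed a sentence G of arithmetic that asserts its own unprovability:

```latex
% The Goedel sentence: within the system, G is equivalent to the
% claim that G itself is not provable. Prov(x) is the formal
% provability predicate; \ulcorner G \urcorner denotes the Goedel
% number encoding the sentence G.
G \;\leftrightarrow\; \neg\,\mathrm{Prov}\bigl(\ulcorner G \urcorner\bigr)
```

If the system is consistent, G can be neither proved nor refuted: were G provable, the system would prove a falsehood; were its negation provable, the system would assert G's provability while refuting G. Where the liar says “I am false,” Gödel's sentence says only “I am unprovable,” and so trades contradiction for incompleteness.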
Gödel thus answered Hilbert's questions with a resounding “No.” His first theorem showed that arithmetic, if consistent, contains propositions that can be neither proved nor refuted within the system; his second, that arithmetic cannot prove its own consistency using only the tools of arithmetic. In a way, Gödel's discovery sits well with common sense: It is not possible to see the whole if one is part of the whole. Only from without can we discern the trees as a forest.
Unlike Heisenberg's “uncertainty,” word of which had sped around the world of physics, it took several years before “incompleteness” found its audience. This may have been because the proofs were very difficult to understand, even for mathematicians. But resistance, as we have seen, played a part. When Gottlob Frege received Bertrand Russell's bombshell letter regarding the paradox that upended Frege's arithmetic program, the hapless Frege responded within a day: “Your discovery… left me thunderstruck,” he confessed. Yet he called it “a very remarkable one… [that] may perhaps lead to a great advance in logic, undesirable as it may seem at first sight.”14 It was, as Russell later said, a testament to Frege's “dedication to truth. His entire life's work was on the verge of completion… and upon finding that his fundamental assumption was in error, he responded with intellectual pleasure clearly submerging any feelings of personal disappointment.”15 By contrast, Hilbert's initial response to Gödel was anger, according to Paul Bernays.16 He made no attempt to contact Gödel or respond to the proof. Yet he must have understood how deleterious the impact was on his wish to solidify mathematical logic.
When, more than a decade later, Gödel and Russell met (however briefly) and corresponded (however obliquely), it seemed that Russell did not understand the proofs—indeed, in a letter written in 1963, he confessed to their having “puzzled” him.17 By mischance—or, more accurately, by way of Gödel's nearly pathological perfectionism—we will never know how Russell might have responded to Gödel's critique of him. In November 1942, Paul Arthur Schilpp, editor of The Library of Living Philosophers, asked Gödel to contribute to a volume dedicated to Russell's philosophy. Gödel accepted, to Schilpp's (initial) delight, then proceeded to tinker with his draft until the end of the following September, by which time Russell had finished his responses to the other essays and could spare “no leisure” for a reply to this latecomer. Gödel's effort elicited no more than a brief note of general praise for Gödel and an offhanded acknowledgement that the “Principia Mathematica was completed thirty-three years ago, and obviously, in view of subsequent advances in the subject, it needs amending in various ways.” Gödel's hopes for a “discussion” were dashed.
THE MECHANICAL WORLD
In 1912, Bertrand Russell asked the question: “Is there any knowledge in the world which is so certain that no reasonable man could doubt it?”18
This was, of course, the same question Descartes had asked three centuries earlier. To escape from the deadening authority of traditional theology, metaphysics, and morality, Descartes began by doubting everything: the existence of his body, of other people, of the world, of his own sanity. His “methodological skepticism” left him with but one certainty: He was thinking. Hence, “I think, therefore I am.” The operations of his mind, distinct from all other matter and perception, were his starting point. From there, Descartes meant to rebuild philosophy afresh by asserting substance dualism. He split all reality into mind on one side and matter on the other, with no bridge between them (although Descartes provisionally invoked God to fill the gap). During the philosophical wars that followed, Cartesian rationalists and empiricists like John Locke waged their battles against traditional authority with a flaming, righteous sword. Matter was to be explained only by a deterministic physics, without appeal to morality, religion, aesthetics, or other such “mental” qualities (“value-free,” as is now said).
In these wars for truth, modern science stood for doubt. Descartes urged us to doubt all—and doubt became the modern mode. Physics is a highly organized method for doubting appearance. Its purpose is to question what it encounters until what remains is what must be. A microscope strips away what we see to reveal the unseen. The seemingly “real” world is replaced by the scientifically “real” one. That scientifically “real” world, so different from that of the spirit, seems to work with regularity, according to invariable rules. Thus were born the underpinnings of the mechanical worldview—the “classical physics” which held sway until the twentieth century.
In 1874, Max Planck, the young son of a theology professor, entered the University of Munich. He had just graduated from the Königliches Maximilians Gymnasium, where he had excelled at mathematics and astronomy, and where he learned the fundamentals of physics, particularly the law of conservation of energy.
It is not unusual for students to be guided by the biases and visions of their advisers. Planck was the exception. In no uncertain terms did his University of Munich physics professor, Philipp von Jolly, warn him away from the field. All had been discovered, said Jolly. All that remained in physics was the mopping up of a few dusty corners. Planck's stubbornness was later vindicated when he became the default founder of quantum theory and a Nobel laureate.19
Jolly was hardly alone. Most physicists agreed—physics was a done deal. After all, Newton's equations had taken care of gravity and motion. As for the remaining forces, in 1864, James Clerk Maxwell presented his famous set of eponymous equations to the Royal Society. They identified light as an “electromagnetic wave,” and they laid down the basic laws for the forces of electricity and magnetism. The primary forces of nature were thus accounted for. What more could physics do?
RELATIVITY OF TIME AND SPACE
Mathematics is curiously intimate with, and revealing of, physical reality. In modern physics, we have discovered the universe's geometric structure and its knotted energies more by way of skeletal equations than by giant telescopes. The more physics uses mathematics, the more physical reality seems to oblige by offering its deepest secrets.
Common sense tells us that physics depends upon empirical data. Galileo's Tower of Pisa, Newton's prism, Young's double-slit box, Foucault's pendulum: Classical physics looked to the physical world for inspiration and confirmation. After Einstein, whose special and general relativity theories were, as he said, “wrested from Nature,” all of this would change. Theoretical physics is subject to the “lure of numbers,” argues David Lindley. Although physical theories require experimental verification, mathematical structure can make or break a hypothesis. For Lindley, Einstein's “general theory of relativity is the prime example of an idea that convinces by its mathematical structure and power.” The theory has been tested, most notably by Sir Arthur Eddington's measurements, taken during a solar eclipse on the island of Principe off Equatorial Guinea, of starlight bending around the sun. Yet its authority, Lindley believes, comes primarily from its “beautiful theoretical framework.”20
How did mathematics gain the upper hand in physics? Perhaps it always had an advantage. Plato's Republic disdains the empirical: “Geometry is knowledge of the eternally existent.”21 Aristotle fixed his gaze scrupulously on the physical world. From Plato we inherit the suspicion that numbers are magical. Only when Descartes made the connection between “mathematics and the sensory world,” as Arthur I. Miller says, did mathematics (and especially geometry) suddenly emerge as a tool for deciphering the physical world. Pythagoras found harmony in the integers in music and in the spheres. Thus it seemed, too, for quantum mechanics. As Miller points out, Max Born called Bohr's analogy of the solar system to the atom a kind of “magic.”22
Modern science has long been rightly seen as a dissolvent of all certainties—especially physics, which posited uncertainty and wave-particle duality, split the once-solid atom, and discovered the esoteric geometries of space-time. Yet physics remains a citadel of eternity in its ever-unchanging numbers. As we shall see, physicists have, in their theories of relativity and uncertainty, discovered unchanging quantities called “universal constants”: the speed of light and Planck's constant, for instance. That is some consolation of an “eternal” kind, though ordinary humanity may not be much moved.
The word “relativity” eventually weighed upon Einstein like an albatross of imprecision. It implies opposition to “absolute”—yet Einstein's relativity theories are anchored to the absolute of absolutes: the speed of light. That speed alone remains absolute; time and space are relative to it. In later years, when Einstein fought his rearguard action against the relativism of quantum mechanics, his early discoveries came back to haunt him. “God does not play dice with the universe,” he said. But a devil's advocate might say, with Banesh Hoffmann, that Einstein had loosed the demons himself. His quantification of light via Planck is often deemed the inaugural leap into quantum theory. His special theory, with its equivalences of mass and energy, has as its legacy particle physics, the arena for quantum mechanics. Einstein died believing quantum physics to be incomplete in its description.
It is the blessing of youth that its energy is greater than its foresight. Einstein's miracle year produced four extraordinary papers: the first on light quanta, the second on the size of molecules, the third on Brownian motion and the existence of the atom, and the fourth on moving bodies—“special relativity.”
In 1896, the discovery of radioactivity inaugurated a search for the nucleus. In 1898, Marie Curie found two radioactive elements, and Ernest Rutherford started sorting out the alpha, beta, and gamma rays from radiation. In 1903, with Frederick Soddy, Rutherford explained radioactive decay, and, in 1911, he finally discovered the atomic nucleus. This set off the next wave of discoveries: Bohr's quantum theory of the atom in 1913; Chadwick's discovery of the neutron in 1932; artificial radioactivity, produced by the Joliot-Curies in 1934; and nuclear fission, discovered by Hahn and Strassmann in 1938 and explained by Meitner and Frisch in 1939.
In 1905, the young Einstein had launched both relativity and quantum physics. (Planck had discovered the quantum phenomenon, but Einstein started quantum physics by applying the concept to light quanta.) Relativity, special and general, was Einstein's single-handed achievement; he worked on it by himself, and very few physicists specialized in it until after his death.
Quantum physics, however, attracted a crowd and needed them: The implications went in every direction. Einstein himself remained a most important contributor, continuing to publish important work on quantum problems even while laboring away at general relativity. In March 1916, he finally published the complete gravitational theory; in July, he began to publish three papers on quantum theory. Late in life, he told a friend that he had thought a hundred times more about quantum physics than about relativity. As usual, his thinking took a quite individual turn.
Accounts of Einstein's work usually pass quickly over this longest and, in some ways, most ambitious part of his career. For one thing, the period is easily dismissed as evidence of declining genius. Indeed, this last effort turned out to be a failure, adding little to the progress of physics. It opened no paths for the future. Recent unifying attempts go in an entirely different direction.
But the question of what happened in Einstein's search for unity may cast light on a neglected side of science. Science is collective and cumulative. Its processes ensure that even its surpassing contributions will ultimately “fail.” We rarely see this side of science. Instead, science is presented as a series of dramatic breakthroughs, new pathways, inventions, new frontiers. True, the naturalist Robert Hooke and the chemist Robert Boyle were once in the vanguard of discovery; now they have moved back into the fabric of the grand design. Historians of science know that Boyle discovered the relationship between pressure and volume (Boyle's law), but how many working physicists could fairly describe the achievements of the Swedish chemist Svante Arrhenius, whose work on ions won a Nobel Prize and whose calculations predicted the greenhouse effect? Does it matter? Physics is in many ways a self-erasing discipline, concerned only with the latest leading edge of research.
In later life, Einstein was overtaken by history twice. First, by his personal history: In his forties and fifties, his gifts, quickness, and prowess inevitably faded. This happens to everyone. His extraordinary discovery of general relativity may well have made him too confident that he could then master the intricacies of a unified theory. But scientists are overtaken by history in the special way just noted: Sooner or later, the most surpassing achievements will be modified, supplanted, or rebuilt. Newton's gravitational theory eventually became a special case of Einstein's general relativity. If that could happen to Newton, it could happen to Einstein—and indeed, Einstein predicted that it would.