REBECCA GOLDSTEIN
Incompleteness
The Proof and Paradox
of Kurt Gödel
ATLAS BOOKS
W. W. NORTON & COMPANY
NEW YORK • LONDON
Dedication
For Yael
the child is mentor to the mother
Epigraph
But every error is due to extraneous factors (such as emotion and education); reason itself does not err.
—KURT GÖDEL
29 November 1972
Contents
Cover
Title Page
Dedication
Epigraph
Introduction
I A Platonist among the Positivists
II Hilbert and the Formalists
III The Proof of Incompleteness
IV Gödel’s Incompleteness
Notes
Suggested Reading
Acknowledgments
Index
Praise
Copyright
Other Works by Rebecca Goldstein
Introduction
Exiles
It’s late summer in suburban New Jersey. Down a secluded road two men are strolling, hands clasped behind their backs, quietly speaking. Above them a thick canopy of trees shelters them from the sky. Stately old homes stand far back from the road, while on the other side, just beyond the elms, the lush green carpet of a golf course rolls away, the muted voices of men at play coming as if from a great distance.
Yet, appearances to the contrary notwithstanding, this is not just one more suburban enclave strictly populated by the country club set, with men commuting daily into the city to support the affluence. No, this is Princeton, New Jersey, home of one of the great universities of the world, and so possessed of a far more eclectic population than a first glance suggests. At this moment that finds these two men strolling home on a quiet back road, Princeton’s population has become even more cosmopolitan, with many of Europe’s finest minds on the run from Hitler. As one American educator put it, “Hitler shakes the tree and I gather the apples.” Some of the choicest of apples have ended up rolling into this little corner of the world.
So it is not so surprising that the language in which the two strollers are conversing is German. One of the men, dapperly dressed in a white linen suit with a matching white fedora, is still in his thirties while the other, in baggy pants held up by old-world-style suspenders, is approaching seventy. Despite the difference in their ages, they seem to be talking to one another as peers, though occasionally the older man’s face crinkles up into a well-worn matrix of amusement and he shakes his head as if the other has now said something wirklich verrückt, really cracked.
At one end of the leafy road, in the direction away from which the two are heading, the spanking new red-brick Georgian building of the Institute for Advanced Study is laid out on a great expanse of lawn. The Institute has been around now for over a decade, renting space in Princeton University’s Gothic mathematics building. But the brainy influx from Europe has boosted the Institute’s prestige, and now it has moved a few short miles from the university onto its own spacious campus, which includes a pond and a small forest, crisscrossed by paths, where fugitive ideas can be run to ground.
The Institute for Advanced Study is already, in the early 1940s, an American anomaly, peopled with a few select thinkers. Perhaps part of the explanation for the Institute’s uniqueness lies in its having developed out of the visionary ideas of a single man. In 1930, educational reformer Abraham Flexner had persuaded two New Jersey department store heirs, Louis Bamberger and his sister Mrs. Felix Fuld, to charter a new type of academy, dedicated to the “usefulness of useless knowledge.” The two retail magnates, motivated by their philanthropic intent, had sold their business to R. H. Macy and Co. just weeks before the stock market crash; with a fortune of $30 million, they had turned to Flexner to advise them on how to apply it to the betterment of mankind’s mind.
Flexner, the son of Eastern European immigrants, had taken it upon himself some years before single-handedly to expose the shoddiness of American medical education. Around the turn of the century there was a surplus of medical schools, granting medical degrees that often indicated little more than that the recipient had paid the required tuition. The state of Missouri alone had 42 medical schools, the city of Chicago 14. Flexner’s report, exposing the sham and published by the Carnegie Foundation for the Advancement of Teaching, had made a difference. Some of the worst of the institutions folded up their tents and snuck off into the night.
The Bamberger/Fulds were grateful to their former New Jersey patrons and wanted to give them something back. Their first thought was a medical school, and so they sent their representatives to speak with the man who knew so much about how medicine ought to be taught. (Flexner’s brother was head of Rockefeller University’s medical school, which served Flexner as a model.) But Flexner had been harboring even more utopian dreams than ensuring that American doctors know something about medicine. His thoughts on educational reform had taken a decided turn away from the applied and practical. His idea was to create a haven for the purest of thinkers, to realize the proverbial ivory tower in solid red brick: in short, to create what would come to be known as the Institute for Advanced Study.
Here the reverentially chosen faculty would be treated as the princes of Reine Vernunft, of pure reason, that they were. They would be given generous remuneration (so that some dubbed the place “the Institute for Advanced Salaries”), as well as the priceless luxury of limitless time in which to think, unburdened of the need to prepare classroom lectures and correct student exam booklets—in fact unburdened of the presence of students altogether. Instead a constantly replenished stream of gifted younger scholars, eventually known as the “temporary members,” would visit for one or two years, injecting the bracing tonic of their energy, youth, and enthusiasm into the ichor of genius. “It should be a free society of scholars,” Flexner wrote. “Free, because mature persons, animated by intellectual purposes, must be left to pursue their own ends in their own way.” It ought to provide simple, though spacious, surroundings “and above all tranquility—absence of distraction either by worldly concerns or by parental responsibility for an immature student body.” The Bamberger/Fulds had originally wanted to locate their school in Newark, New Jersey, but Flexner persuaded them that Princeton, with its centuries-old traditions of scholarship and insulated layers of serenity, would be far more conducive to drawing forth the desired results from unfettered genius.
Flexner decided to establish his vision on the firm foundations of mathematics, “the severest of all disciplines,” in his words. Mathematicians, in a certain sense, are the farthest removed of all academics from thoughts of “the real world”—a phrase which, in this context, means more than merely the practical world of current affairs. The phrase is meant to cover just about everything that physically exists, aside from ideas, concepts, theories: the world of the mind. Of course, the world of the mind can certainly be, and typically is, about the real world; however, not, typically, in mathematics. Mathematicians, in their extreme remoteness, may not enjoy (or suffer) much notice from the public at large; but, among those who live the life of the mind, they are regarded with a special sort of wonder for the rigor of their methods and the certainty of their conclusions, unique features that are connected with some of the very reasons that make them largely useless (“useless” in the sense that the knowledge of mathematics leads, in and of itself, to no practical consequences, no means of changing our material condition, for better or for worse).
The rigor and certainty of the mathematician is arrived at a priori, meaning that the mathematician neither resorts to any observations in arriving at his or her mathematical insights1 nor do these mathematical insights, in and of themselves, entail observations, so that nothing we experience can undermine the grounds we have for knowing them. No experience would count as grounds for revising, for example, that 5 + 7 = 12. Were we to add up 5 things and 7 things, and get 13 things, we would recount. Should we still, after repeated recounting, get 13 things we would assume that one of the 12 things had split or that we were seeing double or dreaming or even going mad. The truth that 5 + 7 = 12 is used to evaluate counting experiences, not the other way round.
The a priori nature of mathematics is a complicated, confusing sort of a thing. It’s what makes mathematics so conclusive, so incorrigible: Once proved, a theorem is immune from empirical revision. There is, in general, a sort of invulnerability that’s conferred on mathematics, precisely because it’s a priori. In the vaulting tower of Reine Vernunft the mathematicians stand supreme on the topmost turret, their methods consisting of thinking, and thinking alone; this is partly what Flexner meant by calling their discipline the most severe.
Despite their intellectual stature, mathematicians are relatively cost-effective to maintain, requiring, again in Flexner’s words, only “a few men, a few students, a few rooms, books, blackboards, chalk, paper, and pencils.” No expensive laboratories, observatories, or heavy equipment is required. Mathematicians carry all their gear in their craniums, which is another way of saying that mathematics is a priori. Calculated also in Flexner’s practical reasoning was the fact that mathematics is one of the few disciplines in which there is almost total unanimity on the identity of the best. Just as mathematics, alone among the disciplines, is able to establish its conclusions with the unassailable finality of a priori reason, so, too, the ranking of its practitioners follows with almost mathematical certitude. Flexner, functioning not only as the Institute’s designer but as its first director, would know exactly after whom to go.
He soon loosened the requirements sufficiently to allow for the most theoretical of physicists and mathematical of economists, and late in 1932 was able to make the triumphant announcement that the first two employees he had hired were Princeton’s own Oswald Veblen, a mathematician of the highest rank, and none other than Germany’s Albert Einstein, the scientist whose near-cult status had made him a prominent target for the Nazis. Einstein’s revolutionary theories of special and general relativity had been attacked by German scientists as representative of pathologically “Jewish physics,” corrupted by the Jewish infatuation with abstract mathematics. Even before the genocidal plans kicked into full operation, the physicist had been placed on the Third Reich’s special hit list.
As would be expected, a host of universities were more than willing to open their doors to so prestigious a refugee; in particular, Pasadena’s California Institute of Technology was vigorously trying to recruit him. But Einstein favored Princeton, some say because it was the first American university to show interest in his work. His friends, casting their cosmopolitan eyes at the New Jersey seat of learning, convinced of its essential provinciality, asked him “Do you want to commit suicide?” But with a homeland suddenly turned maniacally hostile, perhaps Princeton’s early and lasting friendliness proved irresistible. Einstein asked Flexner for a salary of $3000 and Flexner countered with $16,000. Soon the famous head with the ion-charged hair was strolling the suburban sidewalks, so that at least on one occasion a car hit a tree “after its driver suddenly recognized the face of the beautiful old man walking along the street.”
Other luminaries from Europe followed Einstein to New Jersey, including the dazzling Hungarian polymath, John von Neumann, who would begin construction of the world’s first computer while at the Institute, scandalizing those members who shared Flexner’s commitment to keeping the Institute free of any “useful” work.2 But it is Albert Einstein who has been immortalized, even while still very much alive,3 as the apotheosis of the man of genius, so that townspeople have, almost since the day of his arrival, taken to calling Flexner’s establishment “the Einstein Institute.”
Sure enough the older of the two strollers glimpsed on the leafy road leading from the Institute is none other than Princeton’s most famous denizen, his face once again registering wry amusement at something his walking partner has just propounded in all apparent seriousness. The younger man, a mathematical logician, acknowledges Einstein’s reaction by producing a faint, crooked smile of his own, but continues to deduce the implications of his idea with unflappable precision.
The topics of their daily conversations range over physics and mathematics, philosophy and politics, and in all of these areas the logician is more than likely to say something to startle Einstein in its originality or profundity, naïveté or downright outlandishness. All of his thinking is governed by an “interesting axiom,” as Ernst Gabor Straus, Einstein’s assistant from 1944 to 1947, once characterized it. For every fact, there exists an explanation as to why that fact is a fact; why it has to be a fact. This conviction amounts to the assertion that there is no brute contingency in the world, no givens that need not have been given. In other words, the world will never, not even once, speak to us in the way that an exasperated parent will speak to her fractious adolescent: “Why? I’ll tell you why. Because I said so!” The world always has an explanation for itself, or as Einstein’s walking partner puts it, Die Welt ist vernünftig, the world is intelligible. The conclusions that emanate from the rigorously consistent application of this “interesting axiom” to every subject that crosses the logician’s mind—from the relationship between the body and the soul to global politics to the very local politics of the Institute for Advanced Study itself—often and radically diverge from the opinions of common sense. Such divergence, however, counts as nothing for him. It is as if one of the unwritten laws of his thought processes is: If reasoning and common sense should diverge, then . . . so much the worse for common sense! What, in the long run, is common sense, other than common?
This younger man is known to far fewer people, in his own day as well as in ours. Yet his work was, in its own way, as revolutionary as Einstein’s, to be grouped among the small set of the last century’s most radical and rigorous discoveries, all with consequences seeming to spill far beyond their respective fields, percolating down into our most basic preconceptions. At least within the mathematical sciences, the first third of the twentieth century made almost a habit of producing conceptual revolutions. This man’s theorem is the third leg, together with Heisenberg’s uncertainty principle and Einstein’s relativity, of that tripod of theoretical cataclysms that have been felt to force disturbances deep down in the foundations of the “exact sciences.” The three discoveries appear to deliver us into an unfamiliar world, one so at odds with our previous assumptions and intuitions that, nearly a century on, we are still struggling to make out where, exactly, we have landed.
It is much in the remote nature both of the man and of his work that he will never approach the celebrity status of his Princeton walking partner nor of the author of the uncertainty principle, who is almost certainly engaged at this same moment in history in the effort to produce the atom bomb for Nazi Germany. Einstein’s walking partner is a revolutionary with a hidden face. He is the most famous mathematician that you have most likely never heard of. Or if you have heard of him, then there is a good chance that, through no fault of your own, you associate him with the sorts of ideas—subversively hostile to the enterprises of rationality, objectivity, truth—that he not only vehemently rejected but thought he had conclusively, mathematically, discredited.
[Photograph caption: The logician and the physicist on one of their daily walks to and from the Institute for Advanced Study, Princeton.]
He is Kurt Gödel, and in 1930, when he was 23, he had produced an extraordinary proof in mathematical logic for something called the incompleteness theorem—actually two logically related incompleteness theorems.
Unlike most mathematical results, Gödel’s incompleteness theorems are expressed using no numbers or other symbolic formalisms. Though the nitty-gritty details of the proof are formidably technical, the proof’s overall strategy, delightfully, is not. The two conclusions that emerge at the end of all the formal pyrotechnics are rendered in more or less plain English. The Encyclopedia of Philosophy’s article “Gödel’s Theorem” opens with a crisp statement of the two theorems:
By Gödel’s theorem the following statement is generally meant:
In any formal system adequate for number theory there exists an undecidable formula—that is, a formula that is not provable and whose negation is not provable. (This statement is occasionally referred to as Gödel’s first theorem.)
A corollary to the theorem is that the consistency of a formal system adequate for number theory cannot be proved within the system. (Sometimes it is this corollary that is referred to as Gödel’s theorem; it is also referred to as Gödel’s second theorem.)
These statements are somewhat vaguely formulated generalizations of results published in 1931 by Kurt Gödel then in Vienna. (“Über formal unentscheidbare Sätze der Principia Mathematica und verwandter Systeme I,” received for publication on November 17, 1930.)
Though one might not guess it from this terse statement of them, the incompleteness theorems are extraordinary for (among other reasons) how much they have to say. They belong to the branch of mathematics known as formal logic or mathematical logic, a field which was viewed, prior to Gödel’s achievement, as mathematically suspect;4 yet they range far beyond their narrow formal domain, addressing such vast and messy issues as the nature of truth and knowledge and certainty. Because our human nature is intimately involved in the discussion of these issues—after all, in speaking of knowledge we are implicitly speaking of knowers—Gödel’s theorems have also seemed to have important things to say about what our minds could—and could not—be.
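For readers who would like to see the two theorems in symbols, the following LaTeX sketch gives one standard modern rendering of the statements quoted above. The system name F, the Gödel sentence G_F, and the consistency statement Con(F) are conventional textbook shorthand, not notation drawn from Goldstein’s text or from the encyclopedia article, and the side conditions (consistency, effective axiomatizability, enough arithmetic) are the usual ways of making the informal phrase “adequate for number theory” precise.

\documentclass{article}
\usepackage{amsmath,amssymb,amsthm}
\newtheorem{theorem}{Theorem}

\begin{document}

% One standard modern rendering of Gödel's two incompleteness theorems.
% The notation F, G_F, and Con(F) is conventional shorthand, not taken
% from the encyclopedia article quoted in the text.

\begin{theorem}[First incompleteness theorem]
If $F$ is a consistent, effectively axiomatized formal system containing
enough elementary arithmetic, then there is a sentence $G_F$ in the
language of $F$ such that
\[
  F \nvdash G_F
  \qquad\text{and}\qquad
  F \nvdash \lnot G_F .
\]
\end{theorem}

\begin{theorem}[Second incompleteness theorem]
For any such $F$, the arithmetized statement of $F$'s own consistency,
$\operatorname{Con}(F)$, is itself one of the unprovable sentences:
\[
  F \nvdash \operatorname{Con}(F).
\]
\end{theorem}

\end{document}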