Questioning the Millennium

by Stephen Jay Gould


  I had intended to spend only a few hours in research for this chapter, but as I looked up documents from century transitions, I noticed something interesting in this sociological realm. The two positions—I have called them “logical” and “common sensible” so far in this chapter—also have clear social correlations that I had not anticipated. The logical position—that centuries must have one hundred years and transitions must therefore occur, because Dionysius started at one rather than zero, between ’00 and ’01 years—has always been overwhelmingly favored by scholars and by people in power (the press and business in particular), representing what we may call “high culture.” The common sensible position—that we must honor the appearance of maximal change between ’99 and ’00 years and not fret overly about Dionysius’s unfortunate lack of foresight—has been the perpetual favorite of that mythical composite once designated as John Q. Public, or the “man in the street,” and now usually called vernacular or pop culture.

  The distinction goes back to the very beginning of this perpetually recurring debate about century transitions. Hillel Schwartz traces the first major hassle to the 1699–1701 passage (place the moment where you wish), the incarnation that prompted Samuel Sewall’s trumpeting in Boston. Interestingly, part of the discussion then focused upon an issue that has been persistently vexatious ever since: viz., did the first millennial transition of 999–1001 induce a period of fear about imminent apocalyptical endings of the world?

  I discussed this topic in Part 1 and wish now only to point out that the first published claim for a panic terror, a late sixteenth-century work by Cardinal Cesare Baronio, also addressed the great issue of endings for centuries—as this document of undoubted high culture favored the end of the year 1000 for apocalyptic expectations, while most popular writing has always focused on the end of 999 (as in the newspaper quotation cited on this page). Thus, whether by anachronism or direct testimony, this debate has always been with us. Hillel Schwartz writes:

  Sarcastic, bitter, sometimes passionate debates in re a terminus on New Year’s Eve ’99 vis-à-vis New Year’s ’00, have been prosecuted since the 1690’s and confusion has spread to the mathematics of the millennial year. For Baronio and his (sparse) medieval sources, the excitements of the millennium were centered upon the end of the year 1000, while the end of 999 has figured more prominently in the legend of the panic terror.

  The pattern has held ever since, as the debate bloomed in the 1690s, spread in the 1790s with major centers in newspapers of Philadelphia and London (and with added poignancy as America mourned the death of George Washington at the very end of 1799), and burst out all over the world in a frenzy of discussion during the 1890s.

  The 1890s version displays the clearest division of high versus vernacular culture. A few high culture sources did line up behind the pop favorite of 1899–1900. Kaiser Wilhelm II of Germany officially stated that the twentieth century had begun on January 1, 1900. A few barons of scholarship, including such unlikely bedfellows as Sigmund Freud and Lord Kelvin, agreed. But high culture overwhelmingly preferred the Dionysian imperative of 1900–1901. An assiduous survey showed that the presidents of Harvard, Yale, Princeton, Cornell, Columbia, Dartmouth, Brown, and the University of Pennsylvania all favored 1900–1901—and with the entire Ivy League so firmly behind Dionysius, why worry about a mere Kaiser?

  In any case, 1900–1901 won decisively, in the two forums that really matter. Virtually every important public celebration for the new century, throughout the world (and even in Germany), occurred from December 31, 1900, into January 1, 1901. Moreover, essentially every major newspaper and magazine officially welcomed the new century with their first issue of January 1901. I made a survey of principal sources and could find no exceptions. The Nineteenth Century, a leading British periodical, changed its name to The Nineteenth Century and After, but only with the January 1901 issue, which also featured a new logo of bifaced Janus, with an old bearded man looking down and left into the nineteenth century and a bright youth looking right up into the twentieth. Such reliable standards as The Farmer’s Almanack and The Tribune Almanac declared their volumes for 1901 as “first number of the twentieth century.” On December 31, 1899, The New York Times began a story on the nineteenth century by noting: “Tomorrow we enter upon the last year of a century that is marked by greater progress in all that pertains to the material well-being and enlightenment of mankind than all the previous history of the race.” A year and a day later, on January 1, 1901, the lead headline proclaimed “Twentieth Century’s Triumphant Entry” and described the festivities in New York City: “The lights flashed, the crowds sang, the sirens of craft in the harbor screeched and roared, bells pealed, bombs thundered, rockets blasted skyward, and the new century made its triumphant entry.” Meanwhile, poor Carry Nation never got to watch the fireworks, or even to raise a glass, for a small story on the same first page announced: “Mrs. Nation Quarantined—Smallpox in jail where Kansas saloon wrecker is held—says she can stand it.”

  The Nineteenth Century and After: A Monthly Review (1901).

  Tribune Almanac and Political Register (1901).

  Thus, the last time around, high culture still held the reins of opinion—even in such organs of pop culture as The Farmer’s Almanack, no doubt published by men who considered themselves among the elite. But consider the difference as we approach the millennium—for who can doubt that pop culture will win decisively on this most important of all replays? Oh, to be sure, the “official” sources of a waning purity in high culture will make their customary noises. Indeed, as I was revising this essay, I noted the following headline in The New York Times for December 8, 1996: “British Observatory Takes Stand on When Millennium Begins.” The story begins by acknowledging the fait accompli of pop culture’s imposition this time around:

  When the clock strikes midnight on December 31, 1999, billions of people around the world will celebrate the dawn of a new millennium—a year too early, some experts say. As the champagne flows and kisses mark the start of the new age, the revelers will actually be welcoming the last year of the present millennium, not the first year of the next, they say.

  The Times then reports that the most official of all conceivable sources—the gold standard that could easily have imposed its will in centuries past—has thrown down the gauntlet for high culture’s perennial favorite, Dionysius Exiguus’s unpopular solution: “The start of the new millennium is January 1, 2001—not the year 2000, say researchers at the Royal Greenwich Observatory in Cambridge, England.”

  Times have changed, however, and the Times quickly acknowledged why high culture’s Greenwich solution cannot prevail. First of all, no one now wields an imprimatur in our decentralized world:

  In addition to no longer being in Greenwich, the observatory is no longer the world’s timekeeper. “Coordinated universal time” measured by some 150 atomic clocks around the world has replaced Greenwich mean time as the standard.

  Second, pop culture’s preferences can no longer be denied. Even once mighty Greenwich has been reduced to impotent tut-tutting! The Times story continues:

  The year 2000 “will certainly be celebrated, as is natural for a year with such a round number,” a statement issued by the observatory said. “But, accurately speaking,” it said, “we will be celebrating the 2,000th year, or the last year of the millennium, not the start of the new millennium.”

  True to form, but armed this time with the invincible authority of new social relations, pop culture will have none of John Bull’s bushwa. Take cover: the perennial (or rather percenturial) debate is on once again! Two letters appeared in the Times on December 12, one announcing with bored insouciance that it’s all over anyway because more than two thousand years have elapsed since Christ’s actual birth; the other responding with scorn and vigor to the old guard of Greenwich:

  Enough already with the sophistic explanations of why the year 2000 is not the beginning of the new millennium. Popular wisdom will make it so, even if the astronomers disbelieve. Their argument that there is no year zero is silly; we can have a year zero any time we want. The sequence of years can be redefined as 3 B.C., 2 B.C., 1 B.C. or 0 A.D., 1 A.D., 2 A.D.… Then the year 1 B.C. would merely have different names in the A.D. system and the B.C. system.

  This letter provides yet another clever, and perfectly adequate, rationale for celebrating in 2000—a lovely solution akin to my informant’s conviction (cited previously) that the first century had only ninety-nine years. As I have emphasized throughout, arbitrary problems without conceivable final answers require consistent but arbitrary solutions.
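
  The renumbering proposed in the letter is, in fact, a convention astronomers themselves already use: astronomical year numbering, in which 1 B.C. becomes year 0 and 2 B.C. becomes year -1, so that arithmetic runs smoothly across the era boundary. A minimal sketch of that conversion (the function name and sample years below are merely illustrative, not part of the letter):

```python
# Illustrative sketch of the letter's renumbering: "astronomical year numbering,"
# in which 1 B.C. becomes year 0, 2 B.C. becomes year -1, and so on.

def to_astronomical(year: int, era: str) -> int:
    """Convert a labeled historical year (era 'BC' or 'AD') to an astronomical year."""
    if era == "AD":
        return year       # A.D. years keep their numbers
    return 1 - year       # 1 B.C. -> 0, 2 B.C. -> -1, 3 B.C. -> -2

for year, era in [(3, "BC"), (2, "BC"), (1, "BC"), (1, "AD"), (2, "AD")]:
    print(f"{year} {era}  ->  astronomical year {to_astronomical(year, era)}")
```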

  In any case, and in the truly decisive court of culture and sociology, who can doubt that 2000 will win this time? Arthur C. Clarke and Stanley Kubrick stood by Dionysius in book and film versions of 2001, but I can hardly think of another source that does not specify the inception of 2000 as the great moment of transition. All book titles of our burgeoning literature honor pop culture’s version of maximal numerical shift—including Ben Bova’s Millennium: A Novel about People and Politics in the Year 1999; J. G. de Beus’s Shall We Make the Year 2000; Raymond Williams’s The Year 2000; and even Richard Nixon’s 1999: Victory Without War. Prince’s album and lead song 1999 cite the same date from this ne plus ultra of pop sources.

  Cultural historians have often remarked that the expansion of pop culture, including both respect for its ways and diffusion of its influence, marks a major trend of the twentieth century. Musicians from Benny Goodman to Wynton Marsalis play their instruments in jazz bands and classical orchestras. The Metropolitan Opera has finally performed Porgy and Bess—and bravo for them. Scholars write the most damnedly learned articles about Mickey Mouse.

  This remarkable change has been well documented and much discussed, but commentary has so far missed this important example from the great century debate. The distinction still mattered in 1900, and high culture won decisively by imposing January 1, 1901 as the inception of the twentieth century. Pop culture (or the amalgam of its diffusion into courts of decision makers) may already declare clear victory for the millennium, which will occur at the beginning of the year 2000 because most people so feel it in their bones, Dionysius notwithstanding—and again I say bravo. My young friend wanted to resolve the debate by granting the first century only ninety-nine years; now ordinary humanity has spoken for the other end—and the transition from high culture dominance to pop culture diffusion will resolve this issue of the ages by granting the twentieth century but ninety-nine years! The old guard of Greenwich may pout to their heart’s content, but the world will rock and party on January 1, 2000.

  How lovely—for eternal debates about the unresolvable really do waste a great deal of time, put us in bad humor, and sap our energy from truly important pursuits. Let us, instead, save our mental fight—not to establish the blessed millennium (for I doubt that humans are capable of such perfection) but at least to build Jerusalem upon our planet’s green and pleasant land.

  * In this book’s spirit of dispelling a standard set of confusions that have surrounded the forthcoming millennium, may I at least devote a footnote to the most trivial but also the most unambiguously resolvable. Millennium has two n’s—honest to God, it really does, despite all the misspellings, even in most of the books and product names already dedicated to the event. The adjective millennial also has two, but the alternative millenarian only has one. (The etymologies are slightly different. Millennium is from the Latin mille, “one thousand,” and annus, “year”—hence the two n’s. Millenarian is from the Latin millenarius, “containing a thousand (of anything),” hence no annus, and no two n’s.)

  PART ONE:

  BLOODY-MINDED NATURE

  We have a false impression, buttressed by some famously exaggerated testimony, that the universe runs with the regularity of an ideal clock, and that God must therefore be a consummate mathematician. In his most famous aphorism, Galileo described the cosmos as “a grand book written in the language of mathematics, and its characters are triangles, circles, and other geometrical figures.” The Scottish biologist D’Arcy Thompson, one of my earliest intellectual heroes and author of the incomparably well-written Growth and Form (first published 1917 and still vigorously in print, the latest edition with a preface by yours truly), stated that “the harmony of the world is made manifest in Form and Number, and the heart and soul and all the poetry of Natural Philosophy are embodied in the concept of mathematical beauty.”

  Many scientists have invoked this mathematical regularity to argue, speaking metaphorically at least, that any creating God must be a mathematician of the Pythagorean school. For example, the celebrated physicist James Jeans wrote: “From the intrinsic evidence of his creation, the Great Architect of the Universe now begins to appear as a pure mathematician.” This impression has also seeped into popular thought and artistic proclamation. In a lecture delivered in 1930, James Joyce defined the universe as “pure thought, the thought of what, for want of a better term, we must describe as a mathematical thinker.”

  The Vision of Saint John (1608–1614), El Greco. (illustration credit 3.1)

  If these paeans and effusions were invariably true, I could compose my own lyrical version of the consensus, and end this book forthwith. For I have arrived at the last great domain for millennial questions—calendrics. I need to ask why calendrical issues have so fascinated people throughout the ages, and why so many scholars and mathematicians have spent so much time devising calendars and engaging in endless debates about proper versus improper, elegantly simple versus overly elaborate, natural versus contrived systems for counting seconds, minutes, hours, days, weeks, months, lunations, years, decades, centuries and millennia, tuns and baktuns, tithis and karanas, ides and nones. Our culturally contingent decision to recognize millennia, and to impose divisions by 1,000 upon a solar system that includes no such natural cycle, adds an important ingredient to this maelstrom of calendrical debate.

  If God were Pythagoras in Galileo’s universe, calendrics would never have become an intellectual subject at all. The relevant cycles for natural timekeeping would all be nice, crisp, easy multiples of each other—and any fool could simply count. We might have a year (earth around sun) with exactly ten months (moon around earth), and with precisely one hundred days (earth around itself) to the umpteenth and ultimate decimal point of conceivable rigor in measurement. But God, thank goodness, includes both Loki and Odin, the comedian and the scholar, the jester and the saint. God did not fashion a very regular universe after all. And we poor sods of his image are therefore condemned to struggle with calendrical questions till the cows come home, and Christ comes round again to inaugurate the millennium.

  Oh, I don’t deny that some corners of truly stunning mathematical regularity grace the cosmos in domains both large and small. The cells of a honeybee’s hive, and the basalt pillars of the Giant’s Causeway in Northern Ireland, make pretty fair and regular hexagons. Many “laws” of nature can be written in an astonishingly simple and elegant mathematical form. Who would have thought that E = mc² could describe the unleashing of the prodigious energy in an atom?

  But we have been oversold on nature’s mathematical regularity—and my opening quotations stand among the worst offenders. If anything, nature is infinitely diverse and constantly surprising—in J. B. S. Haldane’s famous words, “not only queerer than we suppose, but queerer than we can suppose.” I call this section “Bloody-Minded Nature” because I wish to specify the two opposite domains of nature’s abject refusal to be mathematically simple for meaningful reasons. The second domain forces every complex society—as all have independently done, from Egypt to China to Mesoamerica—to struggle with calendar-making as a difficult and confusing subject, not a simple matter of counting. Many questions about the millennium—Why do we base calendars on cycles at all? Why do we recognize a thousand-year interval with no tie to any natural cycle?—arise directly from these imposed complexities. Any adequate account of our current millennial madness therefore requires that we understand why calendrics has been such a troubling and fascinating subject for all complex human societies.

  In the first domain, apparent regularities turn out to be accidental—and the joke is on us. In the most prominent example, consider the significance and importance that traditional culture invested in the equal size of the sun and moon in the sky—a major source of richness for our myths and sagas, and a primary ingredient in our recipe for meaningful order in the heavens: “And God made two great lights: the greater light to rule the day, and the lesser light to rule the night” (Genesis 1:16). But the equality in observed size is entirely fortuitous, and not a consequence of any mathematical regularity or law of nature. The sun’s diameter is about four hundred times larger than the moon’s, but the sun is also about four hundred times more distant—so the two discs appear nearly identical in size to an observer on earth.
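
  To see how nearly the coincidence works out, divide diameter by distance to get each disc’s apparent (angular) size; the two ratios of roughly four hundred to one nearly cancel. The figures in the little computation below are rounded, and the sketch is only an illustration of the arithmetic:

```python
import math

# Back-of-the-envelope check with approximate round figures (in kilometers).
SUN_DIAMETER, SUN_DISTANCE = 1_392_000, 149_600_000
MOON_DIAMETER, MOON_DISTANCE = 3_474, 384_400

# Small-angle rule: apparent (angular) size is roughly diameter / distance.
sun_deg = math.degrees(SUN_DIAMETER / SUN_DISTANCE)     # about 0.53 degrees
moon_deg = math.degrees(MOON_DIAMETER / MOON_DISTANCE)  # about 0.52 degrees

print(f"Sun:  {sun_deg:.2f} degrees of sky")
print(f"Moon: {moon_deg:.2f} degrees of sky")
```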

  In the second and opposite domain, deeply useful and earnestly sought regularities simply do not exist—and we must resort to inconvenient approximations and irreducible unevenness. The complexities of calendrics arise almost entirely within this domain—and I shall illustrate this essential point with the two primary examples that have dogged humanity ever since Og the Caveperson first recognized that his full-moon symbols, all neatly and carefully inscribed on his mammoth-shoulderblade scratching board, did not line up evenly with the day symbols carved into the row just below. So Og scratched his head, decided that he must have made a mistake, kept his records even more carefully, and always got the same uneven result. (Og either went mad, became a crashing bore to his fellows and ended up in exile, or went with the empirical flow and became the first architect of a complex and approximate calendar.)
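
  Og’s frustration is easy to reproduce with modern mean values: neither the day nor the lunation divides evenly into the solar year. A minimal sketch, using approximate figures:

```python
# Approximate mean cycle lengths (in days): the cycles refuse to come out even.
TROPICAL_YEAR = 365.2422   # earth around sun
SYNODIC_MONTH = 29.5306    # full moon to full moon

print(TROPICAL_YEAR / SYNODIC_MONTH)   # ~12.37 lunations per year, not a whole number
print(TROPICAL_YEAR % SYNODIC_MONTH)   # ~10.9 days left over after 12 full lunations
print(TROPICAL_YEAR % 1)               # ~0.2422 day left over each year: hence leap days
```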

 
