For some reason, as I scanned this issue, my main thought went forward to the year 2000. My third grade mathematics told me that I would then be fifty-eight years old, while two living grandparents testified to the high probability that I would witness this far more interesting event. I have been buoyed by this lovely idea ever since—that I would enjoy the rare privilege of experiencing a transition that (however arbitrary) would rivet the attention of nearly all nations. Most folks live and die in years of little numerical distinction. I figured that I was one helluva lucky guy. When I should have died of cancer in the mid-1980s, but recovered instead, I listed only two items as placeholders of all the reasons for cherishing life in our times: “I dwelled on many things—that I simply had to see my children grow up, that it would be perverse to come this close to the millennium and then blow it” (from the preface to The Flamingo’s Smile, 1985).
There will be an orgy of millennial books, and I hate to follow crowds. What then, beyond the indulgence of a little boy’s whim dating from January 1950, can possibly justify my addition to this ephemeral genre? In one sense, this little book rests its case for distinctiveness upon an omission. I will eschew, absolutely and on principle, the two staples of fin de siècle literature, especially of the apocalyptic sort inspired by a millennial transition. I regard these subjects as speculative, boring, and basically silly—for they rank as primary examples of “punditry’s” fundamental error: the fatuous notion that a head-on rush at the biggest questions will automatically yield the deepest insights.
I shall, first of all, make no predictions about human futures, either for years, decades, millennia, or geological ages; or for individuals, family lineages, or races; or for cities, nations, hemispheres, or galaxies. (I limit myself to predicting the aforementioned glut of books about the millennium.) Second, I refuse to speculate about the psychological source either for the angst that always accompanies the endings of centuries (not to mention millennia) or for the apocalyptic beliefs that have pervaded human cultures throughout recorded history, particularly among the miserable and malcontented.
Instead, I will confine myself to a set of related millennial questions that may seem paltry or laughably limited compared with the grandeur of unknowable futures, but that (as I hope to convince you) gain greater potential import by their definability and their exemplification, in fruitful ways, of questions as general as the nature of truth and the mechanisms of human knowledge. God bless all the precious little examples and all their cascading implications; without these gems, these tiny acorns bearing the blueprints of oak trees, essayists would be out of business. I want to talk about calendars and numbers; about fingers, toes and the perception of “evenness”; about the sun and the moon, the age of the earth, and the birth of Jesus.
These preciously definite, but wondrously broad, calendrical questions all arise from a foible of human reasoning, and also underlie all the passionate arguments now swirling around the impending millennial transition. In a famous motto, the Roman dramatist Terence stated in the second century B.C.: “Homo sum: humani nihil a me alienum puto” (I am a man, and nothing human can therefore be alien to me). Our urge to know is so great, but our common errors cut so deep. You just gotta love us—and you gotta view misguided millennial passion as a primary example of our uniqueness and our absurdity—in other words, of our humanity.
The astronomical, historical, and calendrical questions of this book all rest upon the distinction between nature’s factual status and our arbitrary definitions within these constraints—in other words, the interaction of undeniable reality and the flexibility of human interpretation. Some things in nature just are—even though we can parse and interpret such real items in wildly various ways. A lion is a lion is a lion—and lions are more closely tied by genealogy to tigers than to earthworms. (Of course, I recognize that some system of human thought might base its central principle upon a spiritual or metaphorical tie between lion and earthworm—but nature’s genealogies would not be changed thereby, even though the evolutionary tree of life might be utterly ignored or actively denied.)
But other important categories in our lives, however precisely definable and however objectively ascertainable, must be judged as arbitrary in the crucial sense that nature permits a plethora of equally reasonable alternatives, while providing no factual basis for a preferred choice. For example, each pitched baseball crosses home plate in a particular location of undeniable factuality—but the definitions for balls and strikes are human decisions, entirely arbitrary with respect to the physics of projection, however sensible within a system of rules and customs regulating this popular sport. (These definitions can also change—and have often done so—when circumstances favor an alteration.) Similarly, although nature dictates days by a full rotation of the earth, the parsing of days into packages of seven, called weeks, represents an arbitrary decision of some human cultures.
Millennial questions record our foibles, rather than nature’s dictates, because they all lie at the arbitrary end of this spectrum. At the opposite and factual end, nature gives us three primary cycles—days as earthly rotations, lunations (we define our months slightly differently, and for interesting reasons) as revolutions of the moon around the earth, and years as revolutions of the earth around the sun. (God—who, on this issue, is either ineffable, mathematically incompetent, or just plain comical—also arranged these primary cycles in such a way that not a one of them works as a simple multiple of any other—the major theme of Part 3 and a source of many millennial issues.)
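The incommensurability of the three primary cycles can be checked with a few lines of arithmetic. This is my own sketch, using standard mean astronomical values for the tropical year and the synodic month (not figures quoted in the text):

```python
# Mean lengths of nature's primary cycles, in days (standard astronomical values):
TROPICAL_YEAR = 365.2422   # one revolution of the earth around the sun
SYNODIC_MONTH = 29.5306    # one lunation: new moon to new moon

# Neither larger cycle is a whole-number multiple of a day:
year_leftover = TROPICAL_YEAR % 1        # fraction of a day left over each year
month_leftover = SYNODIC_MONTH % 1       # fraction of a day left over each lunation

# Nor does the year contain a whole number of lunations:
lunations_per_year = TROPICAL_YEAR / SYNODIC_MONTH   # about 12.37, not 12 exactly
twelve_lunations = 12 * SYNODIC_MONTH                # about 354.37 days, short of a year

print(year_leftover, month_leftover, lunations_per_year, twelve_lunations)
```

The leftover quarter-day per year is what leap days patch over, and the mismatch between twelve lunations and one solar year is the root of the lunisolar calendar problem taken up in Part 3.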
In an intermediary position, definitions are surely arbitrary, but nature’s factuality nudges independent cultures toward common (but by no means universal) resolutions. The solar year, for example, does not fall naturally into four equal periods called seasons, but the existence of two solstices and two equinoxes—ascertainable with reasonable ease in most places where people live at high density, and truly important to know for such basic activities as hunting and gathering, and the later development of agriculture—may impose a slight natural bias for division by four.
Nonetheless, many cultures use other systems more attuned to immediate surroundings. In many tropical regions, for example, day lengths and temperatures don’t vary drastically, and solstices and equinoxes may regulate nothing of great importance—whereas a two (or more) fold division of predictable rainy and dry times within the solar year makes far more sense as a basis for divisions. I once spent several months on Curaçao, the formerly Dutch island off the coast of Venezuela. Here no prominent seasonality exists in any natural form (though an indirect surrogate might be found in fluctuating numbers of tourists from lands with pronounced climatic cycles), for the trade winds blow all year from the east, and dryness always prevails. The daily newspaper doesn’t even include a weather report, for nothing much varies. Any notable fluctuation—a hurricane, or even an extensive storm—is treated as news, not weather.
Millennial madness (or at least fascination) surely lies at the arbitrary end of this spectrum, for nature recognizes no divisions by thousands. The intrinsic advantages of decimal mathematics have often been noted, and our Arabic numerology surely gives 1,000 that nice look of evenness (enhanced in our century by the active turning of automobile odometers). But we also recognize that these advantages do not arise from nature’s construction, and we know that several cultures developed entirely functional (and beautifully complex) mathematical systems on bases other than 10—and, therefore, with no special status attached to the number 1,000 at all.
Perhaps the old saw that links decimal mathematics to our ten fingers has validity after all, and perhaps, for this reason, systems based on ten do follow a natural bias. But Mayan culture, for example, developed an elegant vigesimal mathematics based on 20—perhaps they counted both fingers and toes!—and this complex numerical system honored many cycles and “evennesses,” but not millennia or any multiples of 1,000. Besides, and in any case, our ten fingers represent an evolutionary contingency that might easily have settled upon a different and equally functional outcome. Darwinian processes did not confer ten fingers upon early reptiles because, more than 300 million years later, a brainy species would walk upright, separate fingers from toes, and then recognize that ten fingers imply the most convenient mathematics! The first terrestrial vertebrates had six, seven, or eight digits on each limb—the Eight Little Piggies of one of my previous books. Base 8 isn’t bad either—but vertebrates followed a different evolutionary pathway.
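That 1,000 owes its roundness entirely to decimal notation can be verified directly. In this sketch of mine (the `to_base` helper is my own illustration, not anything from the text), 1,000 comes out round only in base 10, while a vigesimal culture would find its "even" milestone at 20 cubed instead:

```python
def to_base(n, base):
    """Return the digits of n in the given base, most significant first."""
    if n == 0:
        return [0]
    digits = []
    while n > 0:
        digits.append(n % base)
        n //= base
    return digits[::-1]

print(to_base(1000, 10))  # [1, 0, 0, 0]: the round number of millennial fame
print(to_base(1000, 20))  # [2, 10, 0]: unremarkable in Mayan vigesimal notation
print(to_base(1000, 8))   # [1, 7, 5, 0]: likewise unremarkable in a base-8 system
print(to_base(8000, 20))  # [1, 0, 0, 0]: a vigesimal "millennium" would be 20**3
```

The same digits-of-n routine works for any base, which is the point: "evenness" is a property of the chosen notation, not of the quantity itself.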
And maybe, on a plausible alternative earth, the horse would not have become extinct in North America. The Mayans might then have domesticated a beast of burden, invented the wheel, and maybe even those two great and dubious innovations of ultimate domination—efficient oceanic navigation and gunpowder. Europe was a backwater during the great Mayan age in the midst of the first millennium of our Christian era. Continue the reverie, and Mesoamerica moves east to conquer the Old World, makes a concordat with Imperial China—and vigesimal mathematics rules human civilization for the foreseeable everafter. The millennium—the blessed thousand-year reign of a local god known as Jesus Christ—then becomes a curious myth of a primitive and conquered culture, something that kids learn in their third grade unit on global diversity.
But decimal Europe prevailed instead. And decimal Europe became Christian for other contingent reasons. And Christianity has maintained an interesting historical myth about a millennium. Western culture married this particular apocalyptic tale with a focus on intervals of 1,000 that any decimal system might be prone to favor. So here we are, engulfed in a millennial madness utterly unrelated to anything performed by the earth and moon in all their natural rotations and revolutions. People really are funny—and fascinating beyond all possible description.
This book, then, focuses on the three great questions that motivate details of millennial madness. My subjects are calendrics, astronomy, and history—not prediction or psychology. I pose, in turn, three of the standard W questions. Their resolution should clarify all the major muddles that fuel so much fruitless debate about the millennium in popular media. First, what is the millennium after all—and how did the name for a future thousand-year reign of Christ on earth get transferred to the passage of a secular period of a thousand years in current human history? (The connection, both intimate and interesting, forms the subject of Part 1.) Second, when does the millennium begin—on January 1, 2000; or on January 1, 2001? (This issue is not nearly so trivial or nitpicking as it might seem, and the nonresolution tells an interesting story about the cultural history of the twentieth century. This section is a revised and extended version of an essay previously published in Dinosaur in a Haystack, Harmony Books, 1995. All other material is new and appears here for the first time.) Third, why are we so fascinated with calendrical issues about such preferred or “even” transitions as the forthcoming millennial inception (whenever it occurs)? If the universe works like Galileo’s grand mechanical clock, regulated by evident mathematical cycles, why does calendrics amount to anything more challenging than simple counting?
We will all end this exploration, I hope, by affirming an amalgam of Einstein’s two most famous quotations—both invoking a metaphorical deity to represent nature’s elegant order (or lack thereof). God, indeed, does not play dice with the universe. He is also not at all malicious, though ever so subtle! And, I might add, ever so sly—or do we only see ourselves in a mirror held up to the cosmos?
REDEFINING THE MILLENNIUM: FROM SACRED SHOWDOWNS TO CURRENT COUNTDOWNS
OUR NEED FOR MEANING
We inhabit a world of infinite and wondrous variety, a source of potential joy, especially if we can recapture childhood’s fresh delight for “splendor in the grass” and “glory in the flower.” Robert Louis Stevenson caught the essence of such naive pleasure in a single couplet—this “Happy Thought” from A Child’s Garden of Verses:
The world is so full of a number of things,
I’m sure we should all be as happy as kings.
But sheer variety can also be overwhelming and frightening, especially when, as responsible adults, we must face the slings and arrows of (sometimes) outrageous fortune. In taking arms against this sea of troubles, no tool can be more powerful, or more distinctly human, than the brain’s imposition of meaning upon the world’s confusion. This need for meaning becomes especially acute when we fear the accuracy of two great statements fed by Eastern influences into primary documents of Western culture—for these quotations epitomize our suspicion that the cosmos may feature (in our terms) neither sense nor direction, while we humans may inhabit this planet for no special reason and with no goal ordained by nature.
Edward FitzGerald, publishing in the same year (1859) as another revolutionary document filled with challenges to traditional notions of intrinsic meaning, Darwin’s Origin of Species, freely translated the Rubaiyat of the eleventh century Persian poet Omar Khayyam:
Into this universe, and why not knowing
Nor whence, like water willy-nilly flowing.
While the preacher of Ecclesiastes had written, more than a thousand years earlier but with similar doubts about inherently congenial natural order:
I returned, and saw under the sun, that the race is not to the swift, nor the battle to the strong, neither yet bread to the wise, nor yet riches to men of understanding, nor yet favor to men of skill; but time and chance happeneth to them all.
But why invoke such general themes of mental ordering and natural randomness to begin a small book on particular questions about the millennium? I start here because the basic concept of the millennium in Western culture arose from two of the great mental strategies that we use to wrest order and meaning from a recalcitrant world. Moreover, and more particularly, the central shift of meaning that defines our current millennial madness—from millennium as apocalypse to millennium as calendrics—can best be understood as a change of emphasis from one mental strategy to the other.
The First Strategy, Classification
Among the devices that we use to impose order upon a complicated (but by no means unstructured) world, classification—or the division of items into categories based on perceived similarities—must rank as the most general and most pervasive of all. And no strategy of classification cuts deeper—while providing such an even balance of benefits and difficulties—than our propensity for division by two, or dichotomy.
Some basic attributes of surrounding nature do exist as complementary pairings—two large lights in the sky representing day and night; two sexes that must couple their opposing parts to produce a continuity of generations—so we might argue that dichotomization amounts to little more than good observation of the external world. But far more often than not, dichotomization leads to misleading or even dangerous oversimplification. People and beliefs are not either good or evil (with the second category ripe for burning); and organisms are not either plant or animal, vertebrate or invertebrate, human or beast. We seem so driven to division by two, even in clearly inappropriate circumstances, that I must agree with several schools of thought (most notably Claude Lévi-Strauss and the French structuralists) in viewing dichotomization more as an inherent mechanism of the brain’s operation than as a valid perception of external reality.
I mention dichotomization as the chief rule of classification because millennial definitions hinge upon our standard (and oversimplified) pairwise divisions for the two most general subjects of all: time and change. For time, Western culture has favored a division between arrows and cycles—or inherently directional versus predictably recurrent sequences of events. (See Mircea Eliade, The Myth of the Eternal Return, 1954, for the classic statement, with sources reaching back to Plato and earlier; and my own Time’s Arrow, Time’s Cycle, 1987, for a scientist’s perspective on the subject.) For change, we have emphasized the distinction between the gradual and continuous versus the sudden, cataclysmic and revolutionary.
We hold tight to both ends of these dichotomies because each provides part of the psychological comfort needed to survive and prosper in this vale of tears. We need time’s arrow to assure us that sequences of events tell meaningful stories and promise hope for improvement. We need time’s cycle for an ordered rescue from the fear that history might feature no more than a random and senseless jumble of events without meaning or guidance—just one damn thing after another, in the old cliché. If events recur in predictable ways (as days must follow nights, and new births compensate old deaths), then life includes pattern amidst the flux.
As for time, so also for the dichotomy of change. We need a concept of gradual alteration to sustain hope that what we have built through struggle might persist and even augment—in short, to have some sense of continuity. But we also need the possibility of cataclysm, so that, when situations seem hopeless, and beyond the power of any natural force to amend, we may still anticipate salvation from a messiah, a conquering hero, a deus ex machina, or some other agent with power to fracture the unsupportable and institute the unobtainable.
From these themes of hope and order—and especially from the notion of divine intervention at the end of a determined cycle—we derive one of the most popular and potent beliefs in Western (and many other, perhaps even universal) traditions: apocalypticism, defined by Webster’s as “a doctrine distinguished by the expectation of an imminent end of the present temporal world, the final destruction of the unrighteous in a purging holocaust engulfing the earth, and the resurrection of the righteous to a purified world of bliss.” (The word comes from a Greek verb meaning “to uncover” or “to reveal”; Webster’s particular definition may rely too closely upon a specific Christian myth, but the basic elements of apocalyptic belief surely transcend any particular culture.)