How We Believe, 2nd Ed.


by Michael Shermer


  Contingency and necessity, long seen to be opposites on a continuum, are not mutually exclusive models of nature from which we must choose. Rather, they are descriptions of change that vary in the amount of their influence in the historical sequence. No one denies that such forces as politics, economics, religion, demographics, and geography impact individuals falling within their purview. Contingencies, however, exercise power sometimes in spite of these forces. At the same time they reshape new and future paths to be taken—think of cassette tapes winning out over eight-tracks, or VHS tapes defeating Beta. It is not that the victor is absolutely superior (and that the invisible hand of the free market always selects the best product), but that quirky events may give one a market edge over the other, and once we start down that path it may be difficult to leap the ever-deepening trough—the QWERTY Principle in action.
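The lock-in dynamic described above can be sketched as a Polya urn, a standard toy model of path dependence (my illustration, not part of Shermer's text): each new buyer picks a format with probability proportional to its current market share, so quirky early purchases get amplified and one standard entrenches itself even though neither is intrinsically superior.

```python
import random

def polya_urn(steps, seed):
    """Simulate format lock-in as a Polya urn: each new adopter picks a
    format with probability proportional to its current share, so early
    chance events get frozen into the final outcome (the QWERTY effect)."""
    random.seed(seed)
    vhs, beta = 1, 1  # start with one adopter of each format
    for _ in range(steps):
        if random.random() < vhs / (vhs + beta):
            vhs += 1
        else:
            beta += 1
    return vhs / (vhs + beta)

# Different random histories lock in to very different final shares,
# even though the two formats are identical in quality.
shares = [round(polya_urn(10_000, seed), 3) for seed in range(5)]
print(shares)
```

Rerunning with different seeds replays the "tape" with different early contingencies; the late trajectory of each run, by contrast, barely moves.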

  There is in this system a rich matrix of interactions among contingencies and necessities, varying over time, in what I call the model of contingent-necessity, which states: In the development of any historical sequence the role of contingencies in the construction of necessities is accentuated in the early stages and attenuated in the later.

  There are six corollaries to the model:

  Corollary 1: The earlier in the development of any historical sequence, the more chaotic the actions of the individual elements of that sequence are; and the less predictable are future actions and necessities. In other words, chaos reigns early, making long-term prediction all but impossible—think of the initial stages in the development of a storm and how poor meteorologists are at predicting when, where, and how strong the weather pattern will be.

  Corollary 2: The later in the development of any historical sequence, the more ordered the actions of the individual elements of that sequence are; and the more predictable are future actions and necessities. In other words, order reigns late, increasing predictive power—think of the late stages in a weather system and how accurate meteorologists are at pinpointing when, where, and how strong the storm will be.

  Corollary 3: The actions of the individual elements of any historical sequence are generally postdictable but not specifically predictable, as regulated by Corollaries 1 and 2. In other words, in all stages in a sequence, early and late, it is much easier to look back to reconstruct how and why it unfolded as it did, but always difficult to say what is going to happen next—think of the fall of the Berlin Wall in November of 1989 and the subsequent collapse of the Soviet Union, neither of which was anticipated by even the most seasoned politicians and authoritative political scientists.

  Corollary 4: Change in historical sequences from chaotic to ordered is common, gradual, followed by relative stasis, and tends to occur at points where poorly established necessities give way to dominant ones, so that a contingency will have little effect in altering the direction of the sequence. In other words, historical pathways are cut gradually and deeply, stabilizing the system so that order dominates over chaos—think of how most countries usually are stable, secure, and resist change of all sorts.

  Corollary 5: Change in historical sequences from ordered to chaotic is rare, sudden, followed by relative nonstasis, and tends to occur at points where previously well-established necessities have been challenged by others so that a contingency may push the sequence in one direction or the other. In other words, when historical pathways change, they do so quickly and only under conditions where the system becomes unbalanced—think of the sociopolitical conditions of the summer of 1914, when the assassination of the Austrian Archduke Franz Ferdinand triggered the outbreak of World War I.

  Corollary 6: Between origin and bifurcation, sequences self-organize through the interaction of contingencies and necessities in a feedback loop driven by the rate of information exchange. In other words, the hewing of a historical channel is driven by a feedback mechanism between the forces within the system and the forces without—think of mass hysterias and witch hunts that feed on themselves, with the exchange of information among accusers, informants, victims, and bystanders driving the system faster and deeper until it collapses.

  At the beginning of a historical sequence, actions of the individual elements (atoms, molecules, organisms, people) are chaotic, unpredictable, and have a powerful influence on the future development of that sequence. But as the sequence slowly but ineluctably evolves, and the pathways become more worn, the chaotic system self-organizes into an orderly one. The individual elements sort themselves, and are sorted into their allotted positions, as dictated by what came before, with the conjuncture of events compelling a certain course of action by constraining prior conditions—contingent-necessity.

  In the language of contingent-necessity, a bifurcation, or “trigger of change,” is any stimulus that causes a shift from the dominance of necessity and order to the dominance of contingency and chaos in a historical sequence, such as inventions, discoveries, economic and political revolutions, war, famine and disease, immigrations and emigrations, and so on. A trigger of change, however, will not cause a shift at just any point in the sequence. Corollary 5 states that it will be most effective when well-established necessities have been challenged by others so that a contingency may push the sequence in one direction or the other. A trigger point, then, is any point in a historical sequence where previously well-established necessities have been challenged by others so that a trigger of change (a contingency) may push the sequence in one direction or the other. Similarly, the butterfly effect, or the trigger effect—described in Corollaries 1 and 2—is the cascading consequences of a contingent trigger of change in a historical sequence. The power of the trigger depends on when in the chronological sequence it enters. The flap of the butterfly’s wings in Brazil may indeed set off a tornado in Texas, but only when the system has started anew or is precariously hanging in the balance. Once the storm is well under way, the flap of a million butterfly wings would not alter the outcome for the tornado-leery Texans. The potency of the sequence grows over time.
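The timing argument here can be illustrated with the logistic map, a standard chaotic system (a sketch of my own, not Shermer's formalism): the same microscopic "flap" injected early has a hundred steps in which to compound, while one injected near the end barely registers in the outcome.

```python
def logistic(x, r=4.0):
    # One step of the chaotic logistic map x -> r * x * (1 - x).
    return r * x * (1 - x)

def final_state(x0, steps, kick_at, kick=1e-9):
    """Iterate the map for `steps` steps, injecting a tiny perturbation
    (the butterfly flap) at step `kick_at`, and return the final state."""
    x = x0
    for t in range(steps):
        if t == kick_at:
            x += kick
        x = logistic(x)
    return x

baseline = final_state(0.2, 100, kick_at=-1)  # no flap at all
early    = final_state(0.2, 100, kick_at=0)   # flap at the very start
late     = final_state(0.2, 100, kick_at=97)  # flap three steps from the end

# The same 1e-9 nudge: injected early, the trajectories decorrelate
# completely; injected late, the final states are nearly identical.
print(abs(early - baseline), abs(late - baseline))
```

The perturbation roughly doubles each step, so its effect on the outcome is set almost entirely by when it enters the sequence, not by its size.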

  Corollary 6 describes feedback systems whose outputs are connected to their inputs in such a manner that there is constant change in response to both, like microphone feedback in a P.A. system. The mechanism that drives the feedback loop is the rate of information exchange, as in the stock market that booms and busts in response to a flurry of buying or selling, or social movements such as witch crazes that self-organize, grow, reach a peak, and then collapse, all described by Corollaries 1 to 6.
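A minimal sketch of such a self-feeding system, assuming a simple contagion in which new accusations scale with both the current accusers and the remaining pool of targets (the parameters and the model itself are my illustration, not Shermer's):

```python
def witch_craze(pop=10_000, contact_rate=0.00005, steps=40):
    """Toy feedback loop: each step, new accusations grow in proportion
    to current accusers times remaining targets -- the rate of information
    exchange drives the loop until the pool is exhausted and it collapses."""
    accused = 1.0
    new_per_step = []
    for _ in range(steps):
        new = contact_rate * accused * (pop - accused)
        accused += new
        new_per_step.append(new)
    return new_per_step

waves = witch_craze()
peak = max(range(len(waves)), key=waves.__getitem__)
# New accusations accelerate (positive feedback), peak, then collapse
# as the system runs out of fuel.
print(peak, round(waves[peak]))
```

The output-feeds-input structure produces exactly the rise-peak-collapse shape the text describes: slow start, explosive middle, exhaustion at the end.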

  Chaos theory and the model of contingent-necessity describe change in the same manner, as the Nobel laureate Ilya Prigogine notes when observing that in chaos the “mixture of necessity and chance constitutes the history of the system.” Similarly, necessity and contingency are the shaping forces for historical sequences—humans making their own history but not just as they please. According to Prigogine, all systems, including historical ones, contain subsystems that are “fluctuating.” As long as the fluctuations remain modest and constant, relative stasis in the system is the norm. If the fluctuation becomes powerful enough that it upsets the preexisting organization and balance, a major change or revolution may occur, at which point the system may become chaotic. Necessity takes a system down a certain path until it reaches a bifurcation point. At this time contingency plays an exaggerated role in nudging the system down a new path, which in time develops its own powerful necessities such that contingency is attenuated until the next bifurcation. It is the alloy of contingency and necessity that guides and controls the presence or absence of these bifurcations, and elsewhere I have provided numerous historical examples.

  GLORIOUS CONTINGENCY: A LITTLE TWIG CALLED HOMO SAPIENS

  The model of contingent-necessity, with its corollaries, is a formalization of Gould’s dangerous idea. In an essay entitled “Fungal Forgery,” Gould applied the model to a complex insect-flower system to show how it could have evolved, but in a very unpredictable manner in its early stages: “Fungal pseudoflowers are late necessities, and they give us no reason to suppose that the complex contingent prerequisite for this sensible story—the evolution of the insect-flower system—has any similar predictability.” Before turning from this fascinating particular to broader generalities about contingency, Gould offered his usual caveat: “I do not, of course, deny that the history of life includes predictable events and recurrent patterns. I do, however, suspect that most predictable aspects of life lie at too ‘high’ a level of generality to validate what really stirs and troubles our souls—the hope that we might ratify as a necessary event the evolutionary origin of a little twig called Homo sapiens.” But beyond such anthropomorphic concerns, Gould shows why necessities may not always dominate:

  As an interesting consequence of Shermer’s model, we may ask why life as a whole doesn’t finally settle down to globally predictable unrolling, whatever the massive contingency of initial stages. Shermer points, correctly I think, to the importance of infrequent and highly disturbing events (such as mass extinction for faunas or punctuated equilibria for lineages) in derailing the stasis or predictable unrolling of systems otherwise stabilized. The theoretical importance of rare, and sometimes cataclysmic, events—as the preservers and reinvigorators of global contingency—may best be appreciated in the light of such historical models.

  Gould then returns to his familiar metaphor of the tape, applying the model to the entire history of life:

  But if I could rerun the tape of life from the origin of unicellular organisms, what odds would you give me on the reevolution of this complex and contingent insect-flower system…? Would we see anything like either insects or flowers in the rerun? Would terrestrial life originate at all? Would we get mobile creatures that we could call animals? Fine-scale predictability only arises when you are already 99 percent of the way toward a particular result—and the establishment of this 99 percent lies firmly in the domain of unrepeatable contingency.

  The contingent evolution of insect-flower systems, however, is not what makes contingency dangerous. It is that contingent little twig called Homo sapiens that tasks us. We want to be special. We want our place in the cosmos to be central. We want evolution—even Godless evolution—to have been directed toward us so that we stand at the pinnacle of nature’s ladder of progress. Rewind that tape of life and we want to believe that we (Homo sapiens) would appear again and again. Would we? Most likely not. There are simply too many contingent steps along the way, too many trigger points where the sequence could have bifurcated down some other equally plausible path. Alfred Russel Wallace, the codiscoverer of natural selection, toward the end of his life realized this in his book, Man’s Place in the Universe: “The ultimate development of man has, therefore roughly speaking, depended on something like a million distinct modifications, each of a special type and dependent on some precedent changes in the organic and inorganic environments, or in both. The chances against such an enormously long series of definite modifications having occurred twice over … are almost infinite.” And Wallace did not know what we know about human evolution: His “million distinct modifications” is probably off by orders of magnitude. We now know that human evolution goes back millions of years, and that is just for the lineage leading to us. What if we rewound the tape to include the evolution of all primates, or all mammals, or all life on Earth? Trillions of distinct modifications over the last three billion years since life began would need to proceed along similar lines to produce our little twig a second time.

  Is the cosmos itself so contingent? If we rewound the tape back to the beginning of the universe would there be another Big Bang, another universe just like ours? No one knows, but if recent cosmological models pan out it would appear that there are a near infinite number of bubble universes all with slightly different laws of nature. Chances are another universe like ours would reappear, which means that galaxies like ours with stars like ours would form again and again. Recent evidence also leads us to believe that planetary formation is a commonplace event in the galaxy. It is still a little soon to be drawing any definite conclusions, but with enough stars (roughly 400 billion in our galaxy alone), chances are there will be other Earth-like planets, maybe hundreds of thousands of them, the right distance from the home star to give rise to life. It would appear that physical systems are more governed by necessity, while living systems are more governed by contingency.

  But this is oversimplifying matters. The actual evolution of life on a planet is really governed by contingent-necessity, and since we cannot remove living organisms from their physical environment, these relative estimates of potential “other Earths” depend on when in the sequence the tape begins again. Moreover, since no one really cares about whether cockroaches would reappear, let’s cut to the chase and ask whether a primate species with a big enough brain to have consciousness, symbolic language, religion, awareness of its own mortality, and a developed enough system of thought to ask this very question would evolve again. We cannot run the experiment, of course, but we do not need to, because history has done it for us. The fossil record, while still fragmented and desultory, is complete enough now to show us that over the past thirty million years we can conservatively estimate that hundreds of primate species have lived out their lives in the nooks and crannies of rain forests around the world; over the past ten million years dozens of great ape species have forged specialized niches on the planet; and over the last six million years, since the hominid split from such great apes as gorillas, chimps, and orangutans occurred, dozens of bipedal, tool-using hominid species have struggled for survival.

  If these hominids were so necessitated by the laws of evolutionary progress, why is it that only a handful of those myriad pongids and hominids have survived? If braininess is such an inevitable product of necessitating trends of nature, then why has only one hominid species managed to survive long enough to ask the question? What happened to those big-brained hominids Homo habilis, Homo rudolfensis, Homo ergaster, Homo erectus, Homo heidelbergensis, and Homo neanderthalensis? If big brains are so great, why did all but one of their owners go extinct (including the Neanderthals, whose brains were slightly larger than our own)? And before them, what happened to the bipedal, tool-using Australopithecines: anamensis, afarensis, africanus, aethiopicus, robustus, boisei, and, most recently, garhi? Discovery after discovery coming out of Africa reveals our ancestors to be puny, small-brained creatures walking upright, using tools, and eating meat, allegedly the ingredients that go into making big brains. If necessitating evolutionary progress were so potent, then why aren’t there a dozen modern humanlike species that should have arisen out of these Australopithecine ancestors? Historical experiment after experiment reveals the same answer: We are a fluke of nature, a quirk of evolution, a glorious contingency.

  THE FULL IMPACT OF CONTINGENCY

  It is not surprising that the idea of glorious contingency does not have a wide following among the religious. But what is unexpected is that many scientists still cling to a more sophisticated notion of progress as “trends,” where humans—or sentience, cognition, big brains, or some other form of advanced mentation—sit atop the phylogenetic bush because evolution “moves” in this direction. In more extreme versions, such as in Freeman Dyson’s Infinite in All Directions or Frank Tipler’s The Physics of Immortality, it seems as if the universe “knew” we were coming, as argued in the strong anthropic principle. Even more modest progressivists manage to find a special place for humans on an evolutionary pedestal. Evolution does not “know” we are coming, but run that tape of life again and a species very like us would once again sit atop the heap. Philosopher of science Michael Ruse calls such evolutionism the “secular religion of progress.” Surveying the writings of some of today’s leading evolutionary biologists, and reading “the message between as well as on the lines,” Ruse concludes: “If one came away thinking that evolution is progressive and that natural selection is the power behind the throne, one would be thinking no more than what one had been told.” The full impact of contingency is that even this belief in progress is wrong. There is no evolutionary trend toward us.

  As Gould shows in his 1996 book, Full House, these “apparent trends can be generated as by-products, or side consequences, of expansions and contractions in the amount of variation within a system, and not by anything directly moving anywhere.” Gould claims that things like .400 hitting in baseball are not “things” at all, in the Platonic sense of fixed “essences.” They are artifacts of trends, which disappear when the overall structure of the system changes over time. No one has hit .400 in baseball since Ted Williams did it in 1941 (for every ten times at bat he got four hits), and this unsolved mystery continues to generate arguments about why it hasn’t happened since. The mystery is now solved, says Gould. It is not because players were better then (what he calls the Genesis myth: “There were giants on the earth in those days”—or as Williams himself put it: “The ball isn’t dead, the hitters are, from the neck up”), or because players today have tougher schedules, night games, and cross-country travel. (Rod Carew says night games are easier on the eyes and travel by jet beats a train any day.) It is because the overall level of play—by everyone from Tony Gwynn and Eddie Murray to Backup Bob and Dugout Doug—has inexorably marched ever upward toward a hypothetical outer wall of human performance. Paradoxically, .400 hitting has disappeared because today’s players are better, not worse. But all of them are better, making the crème de la crème stand out from the mediocre far less than before. The best players may be absolutely better (better training, equipment, diet) than players fifty years ago, but they are relatively worse compared to the average level of play. It was easier for Ted Williams to “hit ’em where they ain’t” fifty years ago than it is for Wade Boggs today, because every position in the field is manned by players whose average level of play is much better than before.
  Consider these numbers: Only seven other players have hit .400 since 1900, and three of those in one year (1922). Add Williams in 1941 and the list is complete at eight, out of tens of thousands who have played. And the difference between .400 and George Brett’s .390 in 1980, for example, based on his 175 hits in 449 at-bats, is five hits! That computes to only one hit in every thirty-two games. How many times did Brett face top relievers in late innings, or defensive alignments (based on computer analyses of his hitting style) that Williams and Cobb never faced? Surely at least once every thirty-two games. Williams’s feat of 1941 would not be discussed today except for three hits (the difference between .406 and .399 in his 185 hits out of 456 at-bats). Would Williams have been deprived of one hit per fifty-four games by today’s players? Most assuredly.
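The hit arithmetic above is easy to verify from the at-bat figures quoted in the text (the 162-game season length used for the per-game spread is my assumption, not a figure from the passage):

```python
import math

# At-bat figures quoted in the text.
brett_hits, brett_ab = 175, 449        # George Brett, 1980
williams_hits, williams_ab = 185, 456  # Ted Williams, 1941

# Hits Brett needed for a .400 season, versus what he got.
hits_needed = math.ceil(0.400 * brett_ab)
gap = hits_needed - brett_hits
print(round(brett_hits / brett_ab, 3))  # Brett's actual average
print(gap)                              # the gap in hits

# Spread over a 162-game season (my assumption), the gap works out
# to roughly one extra hit every thirty-two games.
print(round(162 / gap))

# Williams: three fewer hits drops him from over .400 to under it.
print(round(williams_hits / williams_ab, 3))
print(round((williams_hits - 3) / williams_ab, 3))
```

The same three lines of division reproduce the five-hit and three-hit margins on which Gould's whole argument about shrinking variation turns.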

 
