The modern version chucks foreordination in favor of predictability. It abandons the idea that the germ of Homo sapiens lay embedded in the primordial bacterium, or that some spiritual force superintended organic evolution, waiting to infuse mind into the first body worthy of receiving it. Instead, it holds that the fully natural process of organic evolution follows certain paths because its primary agent, natural selection, constructs ever more successful designs that prevail in competition against earlier models. The pathways of improvement are rigidly limited by the nature of building materials and the earth’s environment. There are only a few ways—perhaps only one—to construct a good flyer, swimmer, or runner. If we could go back to that primordial bacterium and start the process again, evolution would follow roughly the same path. Evolution is more like turning a ratchet than casting water on a broad and uniform slope. It proceeds in a kind of lock step; each stage raises the process one step up, and each is a necessary prelude to the next.
Since life began in microscopic chemistry and has now reached consciousness, the ratchet contains a long sequence of steps. These steps may not be “preparations” in the old sense of foreordination, but they are both predictable and necessary stages in an unsurprising sequence. In an important sense, they prepare the way for human evolution. We are here for a reason after all, even though that reason lies in the mechanics of engineering rather than in the volition of a deity.
But if evolution proceeded as a lock step, then the fossil record should display a pattern of gradual and sequential advance in organization. It does not, and I regard this failure as the most telling argument against an evolutionary ratchet. As I argue in essay 21, life arose soon after the earth itself formed; it then plateaued for as long as three billion years—perhaps five-sixths of its total history. Throughout this vast period, life remained on the prokaryotic level—bacterial and blue-green algal cells without the internal structures (nucleus, mitochondria, and others) that make sex and complex metabolism possible. For perhaps three billion years, the highest form of life was an algal mat—thin layers of prokaryotic algae that trap and bind sediment. Then, about 600 million years ago, virtually all the major designs of animal life appeared in the fossil record within a few million years. We do not know why the “Cambrian explosion” occurred when it did, but we have no reason to think that it had to happen then or had to happen at all.
Some scientists have argued that low oxygen levels prevented a previous evolution of complex animal life. If this were true, the ratchet might still work. The stage remained set for three billion years. The screw had to turn in a certain way, but it needed oxygen and had to wait until prokaryotic photosynthesizers gradually supplied the precious gas that the earth’s original atmosphere had lacked. Indeed, oxygen was probably rare or absent in the earth’s original atmosphere, but it now appears that large amounts had been generated by photosynthesis more than a billion years before the Cambrian explosion.
Thus, we have no reason to regard the Cambrian explosion as more than a fortunate event that need not have occurred, either at all or in the way it did. It may have been a consequence of the evolution of the eukaryotic (nucleate) cell from a symbiotic association of prokaryotic organisms within a single membrane. It may have occurred because the eukaryotic cell could evolve efficient sexual reproduction, and sex distributes and rearranges the genetic variability that Darwinian processes require. But the crucial point is this: if the Cambrian explosion could have occurred any time during more than a billion years before the actual event—that is, for about twice the amount of time that life has spent evolving since then—a ratchet scarcely seems to be an appropriate metaphor for life’s history.
If we must deal in metaphors, I prefer a very broad, low and uniform slope. Water drops randomly at the top and usually dries before flowing anywhere. Occasionally, it works its way downslope and carves a valley to channel future flows. These myriad valleys could have arisen anywhere on the landscape. Their current positions are quite accidental. If we could repeat the experiment, we might obtain no valleys at all, or a completely different system. Yet we now stand at the shore line contemplating the fine spacing of valleys and their even contact with the sea. How easy it is to be misled and to assume that no other landscape could possibly have arisen.
I confess that the metaphor of the landscape contains one weak borrowing from its rival, the ratchet. The initial slope does impart a preferred direction to the water dropping on top—even though almost all drops dry before flowing and can flow, when they do, along millions of paths. Doesn’t the initial slope imply weak predictability? Perhaps the realm of consciousness occupies such a long stretch of shoreline that some valley would have to reach it eventually.
But here we encounter another constraint, the one that prompted this essay (though I have been, I confess, a long time in getting to it). Almost all drops dry. It took three billion years for any substantial valley to form on the earth’s initial slope. It might have taken six billion, or twelve, or twenty for all we know. If the earth were eternal, we might speak of inevitability. But it is not.
Astrophysicist William A. Fowler argues that the sun will exhaust its central hydrogen fuel after ten to twelve billion years of life. It will then explode and transform to a red giant so large that it will extend past the orbit of Jupiter, thus swallowing the earth. It is an arresting thought—the kind that makes you stop and contemplate, or that sends shivers up and down your spine—to recognize that humans have appeared on earth at just about the halfway point of our planet’s existence. If the metaphor of the landscape be valid, with all its randomness and unpredictability, then I think we must conclude that the earth need never have evolved its complex life. It took three billion years to go beyond the algal mat. It might as well have taken five times as long, if only the earth had endured. In other words, if we could run the experiment again, the most spectacular event in the history of our solar system, the explosive exhaustion of its parent, might just as well have had an algal mat as its highest, mute witness.
Alfred Russel Wallace also contemplated the eventual destruction of life on earth (though, in his day, physicists argued that the sun would simply burn out and the earth freeze solid). And he could not accept it. He wrote of “the crushing mental burthen imposed upon those who…are compelled to suppose that all the slow growths of our race struggling towards a higher life, all the agony of martyrs, all the groans of victims, all the evil and misery and undeserved suffering of the ages, all the struggles for freedom, all the efforts towards justice, all the aspirations for virtue, and the wellbeing of humanity, shall absolutely vanish.” Wallace eventually opted for a conventional Christian solution, the eternity of spiritual life: “Beings…possessing latent faculties capable of such noble development, are surely destined for a higher and more permanent existence.”
I would venture a different argument. The average species of fossil invertebrate lives five to ten million years, as documented in the fossil record. (The oldest may go back, though I doubt the story myself, more than 200 million years.) Vertebrate species tend to live for shorter times. If we are still here to witness the destruction of our planet some five billion years or more hence, then we will have achieved something so unprecedented in the history of life that we should be willing to sing our swan song with joy—sic transit gloria mundi. Of course, we might also fly off in those legions of space ships, only to be condensed a bit later into the next big bang. But then, I never have been a keen student of science fiction.
4 | Science and Politics of Human Differences
13 | Wide Hats and Narrow Minds
IN 1861, FROM February to June, the ghost of Baron Georges Cuvier haunted the Anthropological Society of Paris. The great Cuvier, Aristotle of French biology (an immodest designation from which he did not shrink), died in 1832, but the physical vault of his spirit lived on as Paul Broca and Louis Pierre Gratiolet squared off to debate whether or not the size of a brain has anything to do with the intelligence of its bearer.
In the opening round, Gratiolet dared to argue that the best and brightest could not be recognized by their big heads. (Gratiolet, a confirmed monarchist, was no egalitarian. He merely sought other measures to affirm the superiority of white European males.) Broca, founder of the Anthropological Society and the world’s greatest craniometrician, or head measurer, replied that “study of the brains of human races would lose most of its interest and utility” if variation in size counted for nothing. Why, he asked, had anthropologists spent so much time measuring heads if the results had no bearing upon what he regarded as the most important question of all—the relative worth of different peoples:
Among the questions heretofore discussed within the Anthropological Society, none is equal in interest and importance to the question before us now…. The great importance of craniology has struck anthropologists with such force that many among us have neglected the other parts of our science in order to devote ourselves almost exclusively to the study of skulls…. In such data, we hope to find some information relevant to the intellectual value of the various human races.
Broca and Gratiolet battled for five months and through nearly 200 pages of the published bulletin. Tempers flared. In the heat of battle, one of Broca’s lieutenants struck the lowest blow of all: “I have noticed for a long time that, in general, those who deny the intellectual importance of the brain’s volume have small heads.” In the end, Broca won, hands down. During the debate, no item of information had been more valuable to Broca, none more widely discussed or more vigorously contended, than the brain of Georges Cuvier.
Cuvier, the greatest anatomist of his time, the man who revised our understanding of animals by classifying them according to function—how they work—rather than by rank in an anthropocentric scale of lower to higher. Cuvier, the founder of paleontology, the man who first established the fact of extinction and who stressed the importance of catastrophes in understanding the history both of life and the earth. Cuvier, the great statesman who, like Talleyrand, managed to serve all French governments, from revolution to monarchy, and die in bed. (Actually, Cuvier passed the most tumultuous years of the revolution as a private tutor in Normandy, although he feigned revolutionary sympathies in his letters. He arrived in Paris in 1795 and never left.) F. Bourdier, a recent biographer, describes Cuvier’s corporeal ontogeny, but his words also serve as a good metaphor for Cuvier’s power and influence: “Cuvier was short and during the Revolution he was very thin; he became stouter during the Empire; and he grew enormously fat after the Restoration.”
Cuvier’s contemporaries marveled at his “massive head.” One admirer affirmed that it “gave to his entire person an undeniable cachet of majesty and to his face an expression of profound meditation.” Thus, when Cuvier died, his colleagues, in the interests of science and curiosity, decided to open the great skull. On Tuesday, May 15, 1832, at seven o’clock in the morning, a group of the greatest doctors and biologists of France gathered to dissect the body of Georges Cuvier. They began with the internal organs and, finding “nothing very remarkable,” switched their attention to Cuvier’s skull. “Thus,” wrote the physician in charge, “we were about to contemplate the instrument of this powerful intelligence.” And their expectations were rewarded. The brain of Georges Cuvier weighed 1,830 grams, more than 400 grams above average and 200 grams larger than any nondiseased brain previously weighed. Unconfirmed reports and uncertain inference placed the brains of Oliver Cromwell, Jonathan Swift, and Lord Byron in the same range, but Cuvier had provided the first direct evidence that brilliance and brain size go together.
Broca pushed his advantage and rested a good part of his case on Cuvier’s brain. But Gratiolet probed and found a weak spot. In their awe and enthusiasm, Cuvier’s doctors had neglected to save either his brain or his skull. Moreover, they reported no measures on the skull at all. The figure of 1,830 g for the brain could not be checked; perhaps it was simply wrong. Gratiolet sought an existing surrogate and had a flash of inspiration: “All brains are not weighed by doctors,” he stated, “but all heads are measured by hatters and I have managed to acquire, from this new source, information which, I dare to hope, will not appear to you as devoid of interest.” In short, Gratiolet presented something almost bathetic in comparison with the great man’s brain: he had found Cuvier’s hat! And thus, for two meetings, some of France’s greatest minds pondered seriously the meaning of a worn bit of felt.
Cuvier’s hat, Gratiolet reported, measured 21.8 cm in length and 18.0 cm in width. He then consulted a certain M. Puriau, “one of the most intelligent and widely known hatters of Paris.” Puriau told him that the largest standard size for hats measured 21.5 by 18.5 cm. Although very few men wore a hat so big, Cuvier was not off scale. Moreover, Gratiolet reported with evident pleasure, the hat was extremely flexible and “softened by very long usage.” It had probably not been so large when Cuvier bought it. Moreover, Cuvier had an exceptionally thick head of hair, and he wore it bushy. “This seems to prove quite clearly,” Gratiolet proclaimed, “that if Cuvier’s head was very large, its size was not absolutely exceptional or unique.”
Gratiolet’s opponents preferred to believe the doctors and refused to grant much weight to a bit of cloth. More than twenty years later, in 1883, G. Hervé again took up the subject of Cuvier’s brain and discovered a missing item: Cuvier’s head had been measured after all, but the figures had been omitted from the autopsy report. The skull was big indeed. Shaved of that famous mat of hair, as it was for the autopsy, its greatest circumference could be equaled by only 6 percent of “scientists and men of letters” (measured in life with their hair at that) and zero percent of domestic servants. As for the infamous hat, Hervé pleaded ignorance, but he did cite the following anecdote: “Cuvier had a habit of leaving his hat on a table in his waiting room. It often happened that a professor or a statesman tried it on. The hat descended below their eyes.”
Yet, just as the doctrine of more-is-better stood on the verge of triumph, Hervé snatched potential defeat from the jaws of Broca’s victory. Too much of a good thing can be as troubling as a deficiency, and Hervé began to worry. Why did Cuvier’s brain exceed those of other “men of genius” by so much? He reviewed both details of the autopsy and records of Cuvier’s frail early health and constructed a circumstantial case for “transient juvenile hydrocephaly,” or water on the brain. If Cuvier’s skull had been artificially enlarged by the pressure of fluids early during its growth, then a brain of normal size might simply have expanded—by decreasing in density, not by growing larger—into the space available. Or did an enlarged space permit the brain to grow to an unusual size after all? Hervé could not resolve this cardinal question because Cuvier’s brain had been measured and then tossed out. All that remained was the magisterial number, 1,830 grams. “With the brain of Cuvier,” wrote Hervé, “science has lost one of the most precious documents it ever possessed.”
On the surface, this tale seems ludicrous. The thought of France’s finest anthropologists arguing passionately about the meaning of a dead colleague’s hat could easily provoke the most misleading and dangerous inference of all about history—a view of the past as a domain of naïve half-wits, the path of history as a tale of progress, and the present as sophisticated and enlightened.
But if we laugh with derision, we will never understand. Human intellectual capacity has not altered for thousands of years so far as we can tell. If intelligent people invested intense energy in issues that now seem foolish to us, then the failure lies in our understanding of their world, not in their distorted perceptions. Even the standard example of ancient nonsense—the debate about angels on pinheads—makes sense once you realize that theologians were not discussing whether five or eighteen would fit, but whether a pin could house a finite or an infinite number. In certain theological systems, the corporeality or noncorporeality of angels is an important matter indeed.
In this case, a clue to the vital importance of Cuvier’s brain for nineteenth-century anthropology lies in the last line of Broca’s statement, quoted above: “In such data, we hope to find some information relevant to the intellectual value of the various human races.” Broca and his school wanted to show that brain size, through its link with intelligence, could resolve what they regarded as the primary question for a “science of man”—explaining why some individuals and groups are more successful than others. To do this, they separated people according to a priori convictions about their worth—men versus women, whites versus blacks, “men of genius” versus ordinary folks—and tried to demonstrate differences in brain size. The brains of eminent men (literally males) formed an essential link in their argument—and Cuvier was the crème de la crème. Broca concluded:
In general, the brain is larger in men than in women, in eminent men than in men of mediocre talent, in superior races than in inferior races. Other things equal, there is a remarkable relationship between the development of intelligence and the volume of the brain.
Broca died in 1880, but disciples continued his catalog of eminent brains (indeed, they added Broca’s own to the list—although it weighed in at an undistinguished 1,484 grams). The dissection of famous colleagues became something of a cottage industry among anatomists and anthropologists. E.A. Spitzka, the most prominent American practitioner of the trade, cajoled his eminent friends: “To me the thought of an autopsy is certainly less repugnant than I imagine the process of cadaveric decomposition in the grave to be.” The two premier American ethnologists, John Wesley Powell and W J McGee, made a wager over who had the larger brain—and Spitzka contracted to resolve the issue for them posthumously. (It was a toss-up. The brains of Powell and McGee differed very little, no more than varying body size might require.)