
Pandora's Seed


by Spencer Wells


  Complex speech allowed early humans to communicate new ways of solving the problems that came their way. If our species had been highly adapted to a single ecological zone, such as the tropical forest or the high mountains of eastern Africa, it is likely that we never would have developed the complex minds we possess today. In fact, we have always been a species living in environmental flux. The savannas of Africa, with their complex mix of grasses, trees, and extreme seasonal changes, were the perfect breeding ground for adaptability. As the climate changed over the past few hundred thousand years, the shifts in the extent of the savanna encouraged population expansion, followed by contraction as the changing climate reduced the size of the environmental resource base. This led some early humans to live by the sea, eating the plentiful shellfish, as early as 120,000 years ago. It also drove some to seek greener pastures in other places.

  As with the dawn of the Neolithic era, our migrations out of Africa and throughout the rest of the world were probably set in motion by necessity—in this case, dwindling supplies of food and water. There is evidence for a small population migration around 110,000 years ago, when Homo sapiens skeletons turn up in the Middle East. Data from African lake sediments suggests that the continent was becoming much drier around this time, and this dry spell might have encouraged early human populations to migrate in order to find the food resources they needed. Ultimately, these pioneers disappeared from the Middle East after 80,000 years ago, replaced by the cold-adapted Neanderthals at cave sites such as Qafzeh and Skhul. Why? It was around this time that the ice age really began to take hold—ice would eventually cover most of North America. And this cooling process seems to have been kick-started by a wrench that was thrown into the global climate works around this time.

  VOLCANOES AND MACROMUTATIONS

  Today, Lake Toba, in northern Sumatra, is one of the jewels of western Indonesia. From the air it looks like a beautiful blue oval ring of water, around sixty miles long, with a large green island in the center. Its appearance seems somewhat odd until you learn how it was formed. Lake Toba is actually the water-filled caldera of a dormant volcano, known as Mount Toba, and is much like Crater Lake in Oregon—except Toba is ten times larger. In fact, it is the largest volcanic lake in the world, which means that the volcano it sits in must have been quite big as well. Toba was one of the largest volcanoes ever, and when it last roared to life its eruption was the largest in the past 2 million years, spewing out nearly three thousand times as much material as the eruption of Mount Saint Helens in 1980. And when did this extraordinary calamity happen? Sometime between 75,000 and 70,000 years ago.

  When volcanoes such as Toba erupt—and it isn’t very often—the material they spew into the atmosphere does more than simply blanket the surrounding region with a layer of ash (parts of central India were apparently buried under eighteen feet of Toba’s). The force of the eruption is such that much of the material, as well as the sulfuric acid from the blast, is forced into the higher layers of the atmosphere, where strong winds disperse it around the world. Since Toba is near the equator, its ash would have been dispersed quite easily throughout the region that receives the most sunlight: the tropics. The result was an instantaneous cooling effect, due to the particle-obscured sunlight, that lasted for several years—a “volcanic winter,” in which global temperatures dropped somewhere between nine and twenty-seven degrees Fahrenheit. This brief episode, devastating though it must have been, was then followed by approximately a thousand years of substantially cooler temperatures, among the coldest of the last ice age. Africa, as the most tropical continent on earth (85 percent of it lies between the Tropics of Cancer and Capricorn), would have experienced this cooling, but it would also have become drier as more of the earth’s moisture was tied up in expanding ice sheets. Overall, the effects of the Toba eruption, coupled with the sudden onset of an extremely cold period during the last ice age, would have been devastating to early human populations.

  Genetic evidence suggests that around this time the total number of people alive fell to fewer than ten thousand—perhaps, as I mentioned in Chapter 1, to as few as two thousand, according to a recent paper by the geneticist Marcus Feldman and his colleagues at Stanford University. The genetic and climatic data both paint a picture of a human population teetering on the brink of extinction. It is likely that the cataclysmic climatic shift created a scenario in which humanity had to adapt or die. And the response of these humans—as with the response of the Natufians in the Middle East 60,000 years later—was to change their culture.

  It began slowly, with mere hints of the extraordinary changes to come. The first pieces of evidence show up around the time of the Toba eruption, perhaps even a bit before. Beginning around 75,000 years ago, pieces of soft stone known as ochre, carved by human hands with complex geometric motifs, start to appear in the archaeological record. While obviously not the work of a precocious Michelangelo, they are the first evidence of art in the history of our (or any other) species. The creation of a clearly decorative motif on a stone that may have been used as some sort of primitive counting tool represents a distinct break with the culture of our hominid ancestors.

  Many readers will have already heard about the “Great Leap Forward,” as Jared Diamond has called the abrupt change in behavior that heralds the Upper Paleolithic period, or the Late Stone Age. For many years archaeologists believed that these early signs of fully modern human behavior—art, finely crafted stone tools, advanced hunting techniques—appeared only between 50,000 and 40,000 years ago. With the new discoveries of decorative African art dating back more than 70,000 years, what now seems clear is that these younger archaeological finds mark the expansion of people who had been refining their culture for tens of millennia prior to this. Instead of having to account for a sudden change in human behavior—one that explodes onto the scene within a few thousand years at the start of the Upper Paleolithic and, judging from the suddenness with which it appeared, was probably caused by a few genetic mutations—archaeologists now increasingly believe that the process was much more gradual. Humans, it seems, were probably preadapted to develop the material culture of the Upper Paleolithic period, and all they needed was the impetus—in the form of the intense selective pressure provided by the last ice age and the eruption of Mount Toba—to make use of their ability to solve problems in novel ways.

  FIGURE 22: THE WORLD’S EARLIEST ARTWORK. ENGRAVED OCHRE FROM BLOMBOS CAVE IN SOUTH AFRICA, DATING TO AROUND 75,000 YEARS AGO.

  The old model of a genetic “revolution” leading to the Upper Paleolithic period was influenced by the work of Richard Goldschmidt, an important twentieth-century German-born American geneticist. Research on gypsy moth evolution and sex determination led Goldschmidt to question the Darwinian model of how species evolve. This is actually one of the least understood processes in evolutionary biology, one even Darwin recognized as a challenge for his theory of evolution by natural selection. How does one species suddenly become two? Traditional Darwinian theory argues that this process—known to evolutionary biologists as macroevolution—follows the same mechanism as change within a species, which is known as microevolution. If two populations of the same species become geographically isolated from each other, it is easy to see how, over time, small changes could eventually lead to enough genetic divergence that the populations become separate species. However, much of macroevolution seems to have taken place without such geographic isolation, through a process known as sympatric speciation, where the daughter species aren’t separated by geographic barriers. How might this occur?

  Goldschmidt proposed the idea of “macromutations”—genetic changes of large effect that create what he called “hopeful monsters.” In this model, he argued, not all mutations are created equal; some have a larger effect on the organism than others. For instance, some plants have formed separate species by doubling their chromosome number—the genetic process that led to the polyploid wheat and corn genomes we learned about in Chapter 2. In these cases the vastly different chromosome numbers in the plants prevent them from interbreeding, so two species are effectively formed instantaneously. Although this appears to have been a relatively common way for plants to speciate, it is unlikely to have been widespread in animals because they are generally incapable of reproducing through fusing sperm and eggs produced by the same organism (a process known as selfing). If you don’t have anyone who is reproductively compatible with you, your lineage dies out. But perhaps the macromutations envisioned by Goldschmidt weren’t this extreme, and they only made reproducing with members of the original population more difficult. The bearers of such genetic changes—the “hopeful monsters”—might be able to produce enough offspring to found a new population, one in which reproductive compatibility was normal among its own members. Selection would then act to discourage attempts at interbreeding, and the populations would be on their way to becoming separate species.

  While the utility of polyploidization for explaining the sympatric speciation process in plants is straightforward, we need another type of macromutation for sexually reproducing animals. It is certainly true, though, that seemingly trivial changes in an organism’s DNA can have an enormous effect on its physiology or behavior. Sickle-cell anemia, a devastating disease common in West Africa, is caused by a single nucleotide change. And of course a single mutation in FOXP2, as we have just seen, can destroy someone’s ability to speak. Was the change that first allowed our ancestors to speak an example of such a macromutation? Possibly, although as we just learned, Neanderthals had the human form of the gene and yet left no archaeological evidence that they were capable of complex thought and speech. How, then, did the behaviors that define the Upper Paleolithic era—and fully modern human behavior—come about?

  In a 2003 paper in the scientific journal Nature, the American evolutionary geneticist Richard Lenski and his colleagues suggested a model for how this transition might have occurred. The paper described a computer program they called Avida and its utility in explaining the evolution of complex traits such as eyes. Darwin had described these as “organs of extreme perfection and complication,” and their existence seems difficult to explain using microevolutionary processes—what use is 10 percent of an eye, after all? Avida modeled the evolution of digital organisms—sequences of 0’s and 1’s—over many generations of mutation, reproduction, and natural selection. Starting from an identical pool of organisms, the researchers let Avida’s computationally simulated evolution take its course, and gave “rewards” to their digital menagerie based on how well its members performed certain preassigned computational functions in the digital universe, with more complex functions given higher rewards. The most complex of these, known as EQU, required a minimum of five “mutations” from the ancestral state, but of course you would have to know exactly what you were mutating toward in order to achieve this in only five steps. As in nature, Avida’s mutations were random, so the actual number of generations and mutational steps required was far higher than this theoretical minimum.

  What Lenski and his coauthors showed in this simple computer experiment was that extremely complex traits—such as EQU, the abstract computer-code equivalent of an eye—could evolve from preexisting changes that occurred in an organism’s evolutionary past. Macromutations do occur—and, in fact, Lenski and his colleagues did see examples of multistep changes in their computer model—but they weren’t necessary to explain the evolution of a complex, multifaceted trait like EQU. Rather, it was the gradual accumulation of a few such changes of small effect, in the right combination and with strong natural selection, that eventually created the trait. What this showed was that microevolution could explain traits like eyes after all, and thus in the end Darwin was right.
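The flavor of this result can be captured in a toy simulation. The sketch below is not the actual Avida software; it is a minimal stand-in in which digital organisms are bitstrings, a "complex trait" is an arbitrary 20-bit target pattern (playing the role of EQU), and a fitness function rewards partial matches so that small-effect mutations can accumulate under selection. All names and parameters here (`TARGET`, `MUT_RATE`, population size) are illustrative assumptions, not values from the paper.

```python
import random

random.seed(42)  # deterministic run for reproducibility

TARGET = [1] * 20   # stands in for a complex trait like EQU
POP_SIZE = 50
MUT_RATE = 0.02     # per-bit flip probability per generation

def fitness(genome):
    # Reward partial matches: intermediate steps toward the full
    # trait are useful, so selection can accumulate them gradually.
    return sum(g == t for g, t in zip(genome, TARGET))

def mutate(genome):
    # Random point mutations, as in nature: no foresight about the target.
    return [1 - g if random.random() < MUT_RATE else g for g in genome]

# Identical ancestral pool, none expressing the trait at all.
pop = [[0] * 20 for _ in range(POP_SIZE)]

gen = 0
while max(fitness(g) for g in pop) < len(TARGET) and gen < 5000:
    # Fitness-proportional selection (+1 so zero-fitness ancestors
    # can still reproduce), then mutation of each offspring.
    weights = [fitness(g) + 1 for g in pop]
    pop = [mutate(random.choices(pop, weights=weights)[0])
           for _ in range(POP_SIZE)]
    gen += 1

print(f"full trait evolved after {gen} generations")
```

Even though no single mutation produces the full trait, the combination of many small random changes plus strong selection finds it reliably, and in far more generations than the 20 flips a "designed" path would need, mirroring the gap between EQU's five-step minimum and the actual evolutionary trajectories Avida produced.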

  The obvious implication for our story is that a complex modern human trait like the capacity for abstract thought, first recognizable in the fossil record through artistic depictions, could have arisen in the same way, through small, incremental steps that eventually led to the right combination for natural selection to act on. This model might explain why we see sporadic evidence of modernity prior to 70,000 years ago, but only see it explode afterward. The individual genetic mutations that made such behavior possible had existed for perhaps tens of thousands of years, but their combination in such a way as to produce abstract thought was strongly selected for only after this time. The extreme climatic changes brought about by the ice age and, in all likelihood, the eruption of Mount Toba would have exerted strong selection on the human species for innovation and speedy adaptation—selection so strong, in fact, that we developed a new culture. Like the dawn of the Neolithic era 60,000 years later, a climatic crisis paved the way for cultural innovation.

  The fact that the same skills that allowed us to be more effective hunters and gatherers also gave us the eventual ability to write sonnets and compose electronic music is perhaps not such a mystery after all. What evolved around 70,000 years ago in the human lineage was the ability to adapt quickly—to innovate—using our culture, as opposed to our biology, as was the case with Neanderthals. Innovation is a complex process, but at its most basic it involves imagining new ways of solving a problem and then implementing them. The first step requires the sort of imagination that is reflected in the creation of art, like that produced by the untrained Gugging artists, and the second requires some way of explaining the innovation to others. The very process of imagining new possibilities is accelerated by cross-fertilization, in the same way that early agriculturalists crossbred different strains of wheat, rice, and corn from the fragmented, mountainous habitat in which the plants had originally evolved in order to create the traits they wanted. This process of trial and error (often using seemingly crazy insights) coupled with better communication became the model for human innovation—the first time such a successful system for problem solving had ever evolved. The change in human behavior at the dawn of the Upper Paleolithic period can be explained only by these two abilities working in concert.

  Modern hunter-gatherers provide a wonderful case study of this behavior. Everyone sits around the fire at the end of the day, telling stories, laughing, and discussing the day’s events. Some of these stories become part of group mythology—an account of a particularly successful hunt, perhaps—while some are a way of testing and refining new ideas. It’s a sort of “innovation think tank,” with the members of the group performing thought experiments as they discuss, dissect, and decide on narratives about their lives. This process of narrative refinement is not unlike the internal cellular process by which our short-term memories are turned into long-term ones—through reiteration of the story, reinforcing the neural connections that tell the narrative in the way we want it to be told. Modern humans, in effect, evolved to be social machines that produce and refine ideas, and perhaps this explains why management studies show that people seem to work best in the kind of small, focused teams where this ancient hunter-gatherer process can take place.

  This model for the evolution of a rapid innovation helps to explain why the Neanderthals were doomed from the moment modern behavior appeared. The changes that led us to innovate also made us curious, probably created a sense of wanderlust, and enabled us to change our culture incredibly quickly in response to new conditions. In effect, the seeds of the Neolithic era were really sown 70,000 years ago; it was just a matter of finding the right set of conditions in which the next step could occur, which would only come when the climatic conditions were right and human population densities had increased to the point where hunting and gathering were no longer viable alternatives. In this model, art and other nonadaptive accoutrements of modern behavior are what evolutionary biologists Stephen Jay Gould and Richard Lewontin would have called spandrels—by-products of other evolutionary forces, and not necessarily ends in themselves.

  There is a problem with the success of our cultural adaptability, though. In the process of creating a densely populated, agricultural way of life, we were forced to subsume our individual desires for novelty to the desires of the broader culture. In contrast to our hunter-gatherer ancestors, who were free to explore any and all cultural possibilities, from fishing for salmon to hunting mammoths on the central Asian steppes to creating beautiful artistic depictions on the walls of French caves, their Neolithic descendants were forced to channel theirs in order to suit the broader needs of society. In effect, minds that had once been free, with the endless territory of the Paleolithic globe in which to realize their musings, were now caged, limited in both geography and focus. We had gone from living in “the original affluent society,” as anthropologist Marshall Sahlins famously referred to hunter-gatherer populations, with free time to devote to seemingly idle activities, to being a group of worker bees with looming deadlines to meet.

  This process of behavioral specialization was codified in the Hindu caste system of India, in the hierarchy of the Catholic Church, in the rigid Confucian meritocracy of China, and in the feudal systems of medieval Europe. Society achieved its goals only if the individual components worked together in intricate harmony—a cog couldn’t strive to become a bolt or the whole machine would seize up. The process accelerated during the Industrial Revolution, as the benefits of specialization became even more apparent. In effect, people started to merge with their machines, spending their whole lives performing repetitive tasks that, while wonderful at producing large quantities of standardized, inexpensive goods, effectively robbed the average factory worker of his or her individuality and creativity.

 
