Before the development of modern molecular biology, there were severe limitations on our ability to study human evolution. Then, all we had to work with were the principles of genetics, easily observable differences between peoples such as skin color, and detailed knowledge of a limited number of genes, mostly blood proteins and those causing genetic diseases such as sickle-cell anemia.
But even then, we knew from our experience with animal and plant breeding, along with observation of many examples of rapid evolution in nature, that there could be significant evolutionary change in 10,000 years or less. It was also clear that modest genetic differences between groups could cause big trait differences. Indeed, entirely divergent life strategies can be caused by differences in a single gene, as we see in fire ants, where ants with one version of a pheromone receptor live in independent colonies, each having a single queen, while those with the other version live in a sprawling metacolony with many queens.17 Well before the revolution in genomics, it was clear enough that there could be significant differences between human populations in almost any trait, despite recent common ancestry. It was clear that this was entirely compatible with what we knew of genetics, and it was also clear that at least some such differences existed in skin color, size, morphology, and metabolism.
But as the molecular revolution has unfolded over the past few years, we have learned a great deal more. Recent studies have shown that many genes are currently being replaced by new variants, most strongly in Eurasians—and that those genes favored by recent selection are for the most part different in different populations. The obvious between-population differences that we knew of a few years ago were only the tip of the iceberg.
LINKAGE
Most of the recent studies have used data from the HapMap, a database of common patterns of human genetic variation produced by an international group of researchers. Four populations were selected for the HapMap—ninety Nigerians, ninety Americans of European ancestry, forty-five individuals from Tokyo, and forty-five from Beijing. For some purposes we will group the Japanese and Chinese together as an "East Asian" sample.
The human genome has about 3 billion bases (the four molecular building blocks that make up DNA) organized into forty-six separate bundles of DNA called chromosomes. For the most part, DNA sequences are the same in all humans, but every few hundred bases, a variable site crops up. These are the only sites in which the bases of DNA are likely to vary from one individual to another.
A particular pattern of variation at these sites is called a haplotype. Imagine three successive variable sites—the first can be G or C (representing guanine or cytosine), the second can be A (adenine) or G, and the third can be T (thymine) or C. A particular individual might have C in the first site, A in the second site, and T in the third site—his haplotype would be CAT—while another person has the haplotype cytosine-guanine-thymine, or CGT. A haplotype is like a hand of poker, while the bases in the variable sites are like individual cards.
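To make the card analogy concrete, here is a minimal sketch in Python (the sites and individuals are invented for illustration, not drawn from the HapMap) showing that a haplotype is nothing more than the ordered pattern of bases a person carries at a run of variable sites:

```python
# Illustrative only: three hypothetical variable sites, each with two possible bases.
variable_sites = [("G", "C"), ("A", "G"), ("T", "C")]

# Two hypothetical individuals, recorded by the base each carries at each site.
person_1 = ["C", "A", "T"]   # haplotype "CAT"
person_2 = ["C", "G", "T"]   # haplotype "CGT"

def haplotype(bases):
    """A haplotype is just the ordered pattern of bases at the variable sites."""
    return "".join(bases)

print(haplotype(person_1))  # CAT
print(haplotype(person_2))  # CGT
```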
And just like cards, haplotypes are shuffled. In each generation, a new chromosome is assembled from the inherited parental chromosomes in much the same way that one can cut two decks of cards and assemble a new deck, a process we call recombination. There can be multiple cuts: in humans, an average of one to three per chromosome.
This means that haplotypes are partially broken down every generation: The complete pattern that existed over the whole of the parent's chromosome will no longer be intact after recombination. However, smaller parts of that pattern are likely to remain unchanged, since a chromosome is millions of bases long and the few breaks that occur are likely to be far away.
Over many generations, any haplotype will eventually be completely reshuffled. But if a favorable mutation occurs on a chromosome, people with that mutation will have more children survive than average, so over time, more and more people will bear that mutation. If the advantage is large enough, the mutation can rapidly become common, before recombination completely reshuffles its original haplotype, rapidly enough that people bearing that mutation will also carry the original local haplotype that surrounded it when it first came into existence. The longer the shared haplotype, the younger the mutation. It's as if part of your last hand of cards showed up again in the new deal: You would guess that there hadn't been much shuffling, and you'd be right.
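A toy simulation can make this logic concrete. The model below uses a hypothetical chromosome length and mutation position, not the actual HapMap method: it tracks the stretch of the original haplotype that remains attached to a new mutation as random crossovers accumulate. The older the mutation, the shorter the surviving shared segment.

```python
import random

# Toy model (not the actual HapMap analysis): a chromosome 100 million bases long
# with a new favorable mutation at the 50-million-base mark. Each generation we
# place one to three crossovers at random positions; any crossover falling between
# the mutation and the edge of the still-intact ancestral segment trims that segment.
CHROM_LEN = 100_000_000
MUTATION_POS = 50_000_000

def shared_segment_after(generations, seed=0):
    """Length of the original haplotype still co-inherited with the mutation."""
    random.seed(seed)
    left, right = 0, CHROM_LEN
    for _ in range(generations):
        for cut in (random.randrange(CHROM_LEN) for _ in range(random.randint(1, 3))):
            if left < cut < MUTATION_POS:
                left = cut       # crossover on the left trims the ancestral segment
            elif MUTATION_POS < cut < right:
                right = cut      # crossover on the right trims it from that side
    return right - left

for gens in (10, 100, 1000):
    print(f"{gens:5d} generations -> about {shared_segment_after(gens):,} bases still shared")
```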
The HapMap studies looked for long haplotypes (long unshuffled regions) that existed in a number of individuals in the dataset. Any such shared pattern would be a sign of recent strong selection—quite recent, since recombination eventually breaks down all such patterns.
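In spirit, though much simplified relative to the statistics those studies actually used, the scan amounts to looking for stretches of consecutive variable sites at which many individuals carry an identical pattern. A toy version, with invented data:

```python
from collections import Counter

# Hypothetical data (not HapMap genotypes): each string is one individual's bases
# at twelve consecutive variable sites along the same stretch of chromosome.
individuals = [
    "CATGGTACCTGA",
    "CATGGTACGTTC",
    "CATGGTACCTGA",
    "GCTAGTACCTGA",
    "CATGGTACATGC",
]

def longest_shared_run(haplotypes, min_share=0.6):
    """Longest run of consecutive sites at which one pattern is carried by at least
    min_share of the individuals: a crude stand-in for the long shared haplotypes
    that signal recent selection."""
    n_sites = len(haplotypes[0])
    best = 0
    for start in range(n_sites):
        for end in range(start + 1, n_sites + 1):
            counts = Counter(h[start:end] for h in haplotypes)
            if counts.most_common(1)[0][1] / len(haplotypes) >= min_share:
                best = max(best, end - start)
    return best

print(longest_shared_run(individuals), "consecutive variable sites shared by most individuals")
```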
One well-known example is the gene that makes lactase, the enzyme that digests milk sugar. In most humans, and in mammals generally, lactase production stops around the age of weaning, but in many Europeans and some other peoples, production continues throughout life. This adaptation lets adults drink milk. Lactose-tolerant Europeans carry a particular mutation that is only a few thousand years old, and so those Europeans also carry much of the original haplotype. In fact, the shared haplotype around that mutation is over 1 million bases long.
Recent studies have found hundreds of cases of long haplotypes indicating recent selection: Some have almost reached 100 percent frequency, more have intermediate frequencies, and most are regional. Many are very recent: The rate of origination peaks at about 5,500 years ago in the European and Chinese samples, and at about 8,500 years ago in the African sample. Again and again over the past few thousand years, a favorable mutation has occurred in some individual and spread widely, until a significant fraction of the human race now bears that mutated allele. Sometimes almost everyone in a large geographic region, such as Europe or East Asia, shares a trait that goes back to one such allele. The mutation can affect many different things—skin color, metabolism, defense against infectious disease, central nervous system features, and any number of other traits and functions.
Since we have sequenced the chimpanzee genome, we know the size of the genetic difference between chimps and humans. Since we also have decent estimates of the length of time since the two species split, we know the long-term rate of genetic change. The rate of change over the past few thousand years is far greater than this long-term rate over the past few million years, on the order of 100 times greater. If humans had always been evolving this rapidly, the genetic difference between us and chimpanzees would be far larger than it actually is.18
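A back-of-the-envelope calculation shows why. The figures below are round, commonly cited estimates rather than numbers taken from the text, but they illustrate the arithmetic: sustaining the recent rate over the whole period since the human-chimp split would imply a divergence far beyond what we actually observe.

```python
# Rough illustrative figures (assumed for this sketch, not stated in the book).
genome_size = 3.0e9   # bases in the human genome
divergence = 0.012    # roughly 1.2 percent of sites differ between humans and chimps
split_years = 6.0e6   # human-chimp split, on the order of six million years ago

long_term_rate = divergence / (2 * split_years)  # per-site change per year, per lineage
recent_rate = 100 * long_term_rate               # the ~100x acceleration described in the text

hypothetical_divergence = recent_rate * 2 * split_years
print(f"observed human-chimp divergence:    ~{divergence:.1%}")
print(f"if the recent rate had always held: ~{hypothetical_divergence:.0%}")
```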
In addition, we see far more recent alleles at moderate frequencies (20 percent to 70 percent) than we do with frequencies close to 100 percent. Since a new favored allele spends a long time at low frequencies (starting with a single copy), a short time at moderate frequencies, and then a long time closing in on 100 percent, the only explanation is that this rush of selection began quite recently, so that few selected genes are in that final phase of increase.
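The shape of that trajectory follows from the textbook single-locus selection recursion (an approximation used here for illustration, not a calculation from the book): the favored allele's frequency p grows by roughly s times p times (1 - p) each generation, so it crawls while rare, races through intermediate frequencies, and crawls again as it closes in on 100 percent.

```python
# Textbook single-locus selection recursion: each generation the favored allele's
# frequency p increases by roughly s * p * (1 - p), where s is its selective advantage.
def generations_spent(low, high, s=0.05, p_start=0.001, p_stop=0.999):
    """Generations the allele frequency spends in the band [low, high)."""
    p, count = p_start, 0
    while p < p_stop:
        if low <= p < high:
            count += 1
        p += s * p * (1 - p)
    return count

print("generations below 20%:           ", generations_spent(0.0, 0.2))
print("generations between 20% and 70%: ", generations_spent(0.2, 0.7))
print("generations above 70%:           ", generations_spent(0.7, 1.0))
```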
The ultimate cause of this accelerated evolution was the set of genetic changes that led to an increased ability to innovate. Sophisticated language abilities may well have been the key. We would say that the new alleles (the product of mutation and/or genetic introgression) that led to this increase in creativity were gateway mutations because innovations they made possible led to further evolutionary change, just as the development of the first simple insect wings eventually led to bees, butterflies, and an inordinate number of beetles.
Every major innovation led to new selective pressures, which led to more evolutionary change, and the most spectacular of those innovations was the development of agriculture.
2
THE NEANDERTHAL WITHIN
In their expansion out of Africa, modern humans encountered and eventually displaced archaic humans such as Neanderthals: You can't make an omelet without breaking eggs. Moderns showed up in Europe about 40,000 years ago, arriving first in areas to the east and north of Neanderthal territory, the mammoth steppe that Neanderthals had failed to settle permanently. A superior toolkit—in particular, needles for sewing clothes—may have made this possible.
Later, modern humans moved south and west, displacing the Neanderthals. This is more or less what one would expect to happen, since the two sister species were competing for the same kinds of resources—ecological theory says that one will win out over the other. It took just 10,000 years for modern humans to completely replace Neanderthals, with the last Neanderthals probably living in what is now southern Spain.
Judging by outcomes, modern humans were competitively superior to Neanderthals, but we don't know what their key advantage was, any more than we know what drove the expansion of modern humans out of Africa. Several explanations have been suggested, and some or all of them may be correct.
One idea is that modern humans had projectile weapons, in contrast to the thrusting spears used previously. If lightly built modern humans could hunt just as well as Neanderthals while requiring fewer calories, strongly built Neanderthals would have become obsolete. Even if Neanderthals had managed to copy that technology, they would have expended more energy in hunts because of their heavier bodies. Finds of small stone points in the Aurignacian (a culture that existed in Europe between 32,000 and 26,000 BC) suggest that something like this scenario may have occurred, but the earliest known spearthrowers, or atlatls, were made considerably later. Another idea is that modern humans were smarter—which might have been the case, but it is hard to prove.
Probably the most popular and attractive hypothesis is that modern humans had developed advanced language capabilities and therefore were able to talk the Neanderthals to death. This idea has a lot going for it. It's easy to imagine ways in which superior language abilities could have conferred advantages, particularly at the level of the band or tribe. For example, hunter-gatherers today are well known for having a deep knowledge of the local landscape and of the appearance and properties of many local plants and animals. This includes knowledge of rare but important events that happened more than a human lifetime ago, which may have been particularly important in the unstable climate of the Ice Age. It is hard to see how that kind of information transmission across generations would be possible in the absence of sophisticated language. Without it, there may have been distinct limits on cultural complexity, which, among other things, would have meant limits on the sophistication of tools and weapons.
Beginning in Africa, and continuing in the European archaeological record, we see signs of long-distance trade and exchange among modern and almost-modern humans in the form of stone tools made out of materials that originated far away. The Neanderthals never did this: To the extent that such trade was advantageous, it would have favored moderns over Neanderthals, and it is easy to imagine how enhanced language abilities would have favored trade. Those trade contacts (and the underlying language ability) might have allowed the formation of large-scale alliances (alliances of tribes), and societies with trade and alliances would have prevailed over opponents that couldn't organize in the same way.
Whatever the driving forces, this population replacement was slow, at least when compared to the time scale of recorded history, and was most likely undramatic. The distance from Moscow to Madrid is a little over 2,000 miles; that's not a lot of ground to cover in 10,000 years. The actual advance of modern humans in Europe may have taken the form of occasional skirmishes in which moderns won more often than not. Perhaps modern humans were better hunters and made big game scarce, so that neighboring Neanderthal bands suffered. Perhaps moderns, with their less bulky bodies and more varied diet (including fish), were better at surviving hard times. Quite possibly, the actual advance was made up of a mix of all these patterns.
Of course, there are other possibilities. Biological advantages take many forms, and they don't have to be admirable—in fact, they can be downright embarrassing, disgusting, or, worst of all, boring. One realistic and embarrassing possibility is that modern humans expanding out of Africa carried with them some disease or parasite that was fairly harmless to them but deadly to Neanderthals and the hominid populations of East Asia—the "cootie" theory. There is no direct evidence for this, but then it's hard to see how there would be: Germs seldom leave fossils. We know of natural examples of this mechanism, however. White-tailed deer carry a brain worm that is fairly harmless to them but fatal to moose.1 So white-tailed deer are pretty good at displacing moose populations and have been doing so since their traditional enemies, such as wolves, have mostly disappeared. Another example: Since people imported American gray squirrels to England, the native red squirrel has declined dramatically. The gray squirrels carry a virus that they survive but that devastates the native red squirrels.2
We have heard charmingly goofy criticisms of the idea that Neanderthals were competitively inferior to modern humans. It has been suggested that such a position is racist. Somehow, saying that a population that split off from modern humans half a million years ago (one generally considered a separate species) had some kind of biological disadvantage is beyond the pale, even though we're here and Neanderthals are not. For that matter, we've seen people argue that the idea that some genes were picked up from archaic humans is racist, while others have argued that the idea that humans didn't pick up Neanderthal genes is racist.
Although the archaeological evidence suggests that moderns and Neanderthals did not coexist for very long in any one place during this replacement, there is reason to believe there was some contact between the two different populations. In several places, most clearly in central and southwestern France and part of northern Spain, we find a tool tradition that lasted from about 35,000 to 28,000 years ago (the Chatelperronian) that appears to combine some of the techniques of the Neanderthals (the Mousterian industry) with those of modern humans (the Aurignacian). The name Chatelperronian comes from the site of the Grotte des Fées (Fairy Grotto) near the French town of Chatelperron. Chatelperronian deposits contain flint cores characteristic of the Neanderthals' Mousterian technology mixed with more modern tools. One characteristic tool was a flint knife with a single cutting edge, in contrast with the double-edged knives we see in the Aurignacian industry. Most important, there are several skeletons clearly associated with the Chatelperronian industry, and all are Neanderthal. This strongly suggests that there were interactions between the populations, enough that the Neanderthals learned some useful techniques from modern humans. If this is the case, it tells us something about Neanderthal cognitive capabilities—mainly that they can't have been all that far behind modern humans. At minimum, they were much better at learning new things than chimpanzees.
There may have been important consequences from such interactions; familiarity may breed contempt, but lack of familiarity breeds nothing at all.
THE "BIG BANG"
"The Upper Paleolithic," according to Stanford anthropologist Richard Klein, "signals the most fundamental change in human behavior that the archaeological record may ever reveal, barringonly the primeval development of those human traits that made archaeology possible."3
He's not kidding. The archaeological record of the Upper Paleolithic, or last phase of the Old Stone Age—the product of the modern humans who displaced the Neanderthals in Europe 30,000 to 40,000 years ago—is qualitatively different from anything that came before. With the advent of modern humans in Europe, innovation was bustin' out all over.
Many of the new features that marked this "great leap forward" were impressive—cave paintings, sculpture, jewelry, dramatically improved tools and weapons. Some of them brought significant changes in the practical matters of daily life, but most important, from our point of view, is that they show an extraordinary increase in the human capacity to create and invent.
What's more, the innovations show that profound social and cultural changes were taking place. We developed new social arrangements as well as new tools: The spearpoints and scrapers of this period often used materials from hundreds of miles away, which must have been acquired through some form of trade or exchange. Before, tools were almost entirely made from local materials. We also see the beginnings of cultural variation: Tools and weapons started showing regional styles.
At this point, people—some of them anyhow—were acting wildly different from their forebears of even 20,000 years earlier. The spark of innovation was taking them in all kinds of new directions. We're not saying that every Tom, Dick, and Harry was an inventor, but at least some people were coming up with new ideas—and doing so perhaps 100 times more often than in earlier times. The natural question is, "Why?" It doesn't really look as if being a modern human, in the sense of having ancestors who were anatomically modern and who had originated in Africa, was enough, by itself, to trigger this change. We don't see this storm of innovation in Australia. Obviously, something important, some genetic change, occurred in Africa that allowed moderns to expand out of Africa and supplant archaic sapiens. Equally obviously, judging from the patchy transition to full behavioral modernity, there was more to the story than that. So probably being an "anatomically modern" human was a necessary but not sufficient condition for full behavioral modernity.
More generally, behavior has a physical substrate: Biology keeps culture on a leash, which is why you can't teach a dog to play poker, never mind all those lying paintings. We have every reason to think that back in the Eemian period (the interglacial period of about 125,000 years ago), the leash was too short for agriculture. Humans did not develop agriculture anywhere on earth during the Eemian, but they did so at least seven times independently in the Holocene, the most recent interglacial period, which began 10,000 years ago. Not only that, in the Eemian the leash was too short to allow for the expansion of anatomically modern humans out of Africa into cooler climates. In that period, biology somehow kept people from making atlatls or bows, and from sewing clothes or painting, all of which are routinely performed and highly valued by contemporary hunter-gatherers. People were different back then—significantly different, biologically different.