Why the West Rules—for Now


by Ian Morris


  Some paleoanthropologists even think that Neanderthals’ big brains and wide neural canals allowed them to talk more or less like us. Like modern humans they had hyoid bones, which anchor the tongue and let the larynx make the complex movements needed for speech. Other scholars disagree, though, noting that Neanderthal brains, while big, were longer and flatter than ours, and that the speech areas were probably less developed. They also point out that although the relevant areas survive on the bases of only three skulls, it looks as if Neanderthals’ larynxes were very high in their necks, meaning that despite their hyoid bones they could vocalize only a narrow range of sounds. Maybe they could just grunt single syllables (what we might call the “me Tarzan, you Jane” model), or maybe they could express important concepts—“come here,” “let’s go hunting,” “let’s make stone tools/dinner/love”—by combining gestures and sounds (the Clan of the Cave Bear model, where Neanderthals have an elaborate sign language).

  In 2001 it began to look like genetics might settle things. Scientists found that one British family that for three generations had shared a speech disorder called verbal dyspraxia also shared a mutation on a gene called FOXP2. This gene, it turned out, codes for a protein influencing how the brain processes speech and language. This does not mean that FOXP2 is “the language gene”: speech is a bewilderingly complex process involving countless genes working together in ways we cannot yet fathom. FOXP2 came to geneticists’ attention because sometimes it just needs one thing to go wrong for a whole system to crash. A mouse chews through a two-cent wire and my twenty-thousand-dollar car won’t start; FOXP2 malfunctions and the brain’s elaborate speech networks seize up. All the same, some archaeologists suggested, maybe random mutations producing FOXP2 and related genes gave modern humans linguistic skills that earlier species, including Neanderthals, lacked.

  But then the plot thickened. As everyone now knows, deoxyribonucleic acid—DNA—is the basic building block of life, and in 2000 geneticists sequenced the modern human genome. What is less well known is that back in 1997, in a scene reminiscent of Jurassic Park, scientists in Leipzig, Germany, extracted ancient DNA from the arm of the original Neanderthal skeleton found in the Neander Valley in 1856. This was an extraordinary feat, since DNA begins breaking down immediately upon death, and only tiny fragments survive in such ancient material. The Leipzig team is not about to clone cavemen and open a Neanderthal Park, so far as I know,* but in 2007 the process of sequencing a draft of the Neanderthal genome (which was completed in 2009) produced a remarkable discovery—that Neanderthals shared the modern human variant of FOXP2.

  Maybe this means that Neanderthals were as chatty as us; or maybe that FOXP2 was not the key to speech. One day we will surely know, but for now all we can do is observe the consequences of Neanderthals’ interactions. They lived in bigger groups than earlier types of ape-men, hunted more effectively, occupied territories for longer periods, and cared for one another in ways earlier ape-men could not.

  They also deliberately buried some of their dead, and perhaps even performed rituals over them—the earliest signs of that most human quality of all, a spiritual life, if we are interpreting the evidence correctly. At Shanidar, for instance, several bodies had definitely been buried, and the soil in one grave contained high concentrations of pollen, which might mean that some Neanderthals laid a loved one’s body on a bed of spring flowers. (Rather less romantically, some archaeologists point out that the grave was honeycombed with rodent burrows, and that the gerbil-like Persian jird often carries flowers into its lair.)

  In a second case, at Monte Circeo near Rome, construction workers in 1939 exposed a cave that had been sealed by a rockfall fifty thousand years ago. They told archaeologists that a Neanderthal skull sat on the floor in the middle of a circle of rocks, but because the workers moved the skull before experts saw it, many archaeologists harbor doubts.

  Finally, there is Teshik-Tash in Uzbekistan. Here the Soviet archaeologist Alexei Okladnikov found the skeleton of a boy encircled, he said, by five or six pairs of wild goat horns. However, the deposits at Teshik-Tash are full of goat horns, and Okladnikov never published plans or photographs of the finds to convince skeptics that these particular ones were in a meaningful pattern.

  We need clearer evidence to lay this question to rest. Personally, I suspect that there is no smoke without fire, and that Neanderthals did have some kind of spiritual life. Perhaps they even had medicine women and shamans like Iza and Creb in The Clan of the Cave Bear. Whether that is right or not, though, if the time machine I invoked earlier could transport you to Shanidar as well as to Zhoukoudian, you would see real behavioral differences between Eastern Peking Man and Western Neanderthals. You would also be hard-pressed to avoid concluding that the West was more developed than the East. This may already have been true 1.6 million years ago, when the Movius Line took shape, but it was definitely true a hundred thousand years ago. Again the specter of a racist long-term lock-in theory rears its head: Does the West rule today because modern Europeans are the heirs of genetically superior Neanderthal stock, while Asians descend from the more primitive Homo erectus?

  BABY STEPS

  No.

  Historians like giving long, complicated answers to simple questions, but this time things really do seem to be straightforward. Europeans do not descend from superior Neanderthals, and Asians do not descend from inferior Homo erectus. Starting around seventy thousand years ago, a new species of Homo—us—drifted out of Africa and completely replaced all other forms.* Our kind, Homo sapiens (“Wise Man”), did interbreed with Neanderthals in the process. Modern Eurasians share 1 to 4 percent of their genes with the Neanderthals, but everywhere from France to China it is the same 1 to 4 percent.† The spread of modern humans wiped the slate clean. Evolution of course continues, and local variations in skin color, face shape, height, lactose tolerance, and countless other things have appeared in the two thousand generations since we began spreading across the globe. But when we get right down to it, these are trivial. Wherever you go, whatever you do, people (in large groups) are all much the same.

  The evolution of our species and its conquest of the planet established the biological unity of mankind and thereby the baseline for any explanation of why the West rules. Humanity’s biological unity rules out race-based theories. Yet despite the overwhelming importance of these processes, much about the origins of modern humans remains obscure. By the 1980s archaeologists knew that skeletons more or less like ours first appeared around 150,000 years ago on sites in eastern and southern Africa. The new species had flatter faces than earlier ape-men, retracted further under their foreheads. They used their teeth less as tools, had longer and less muscular limbs, and had wider neural canals and larynxes positioned better for speaking. Their brain cavities were a little smaller than Neanderthals’ but their skullcaps were higher and more domed, leaving room for bigger speech and language centers and stacked layers of neurons that could perform massive numbers of calculations in parallel.

  The skeletons suggested that the earliest Homo sapiens could walk the walk just like us, but—oddly—the archaeology suggested that for a hundred thousand years they stubbornly refused to talk the talk. Homo sapiens tools and behavior looked much like those of earlier ape-men, and—again like other ape-men, but utterly unlike us—early Homo sapiens seemed to have had just one way of doing things. Regardless of where archaeologists dug in Africa, they kept coming up with the same, not particularly exciting, kinds of finds. Unless, that is, they excavated Homo sapiens sites less than fifty thousand years old. On these younger sites Homo sapiens started doing all kinds of interesting things, and doing them in lots of different ways. For instance, archaeologists identify no fewer than six distinct styles of stone tools in use in Egypt’s Nile Valley between 50,000 and 25,000 BCE, whereas before then a single fashion prevailed from South Africa to the shores of the Mediterranean.

  Humans had invented style. Chipping stone tools this way, rather than that way, now marked a group off as different from their neighbors; chipping them a third way marked a new generation as different from their elders. Change remained glacial by the standards we are used to, when pulling out a four-year-old cell phone that can’t make movies, locate me on a map, or check e-mail makes me look like a fossil, but it was meteoric compared to all that had gone before.

  As any teenager coming home with hair dyed green or a new piercing will tell you, the best way to express yourself is to decorate yourself, but until fifty thousand years ago, it seemed that almost no one had felt this way. Then, apparently, almost everyone did. At site after site across Africa after 50,000 BCE archaeologists find ornaments of bone, animal tooth, and ivory; and these are just the kinds of decoration that leave remains for us to excavate. Most likely all those other forms of personal adornment we know so well—hairstyles, makeup, tattoos, clothes—appeared around the same time. A rather unpleasant genetic study has suggested that human body lice, which drink our blood and live in our clothes, evolved around fifty thousand years ago as a little bonus for the first fashionistas.

  “What a piece of work is a man!” gasps Hamlet when his friends Rosencrantz and Guildenstern come to spy on him. “How noble in reason! how infinite in faculty! in form and moving how express and admirable! in action how like an angel! in apprehension how like a god!” And in all these ways, how unlike an ape-man. By 50,000 BCE modern humans were thinking and acting on a whole different plane from their ancestors. Something extraordinary seemed to have happened—something so profound, so magical, that in the 1990s it moved normally sober scientists to flights of rhetoric. Some spoke of a Great Leap Forward;* others of the Dawn of Human Culture or even the Big Bang of Human Consciousness.

  But for all their drama, these Great Leap Forward theories were always a little unsatisfactory. They required us to imagine not one but two transformations, the first (around 150,000 years ago) producing modern human bodies but not modern human behavior, and the second (around 50,000 years ago) producing modern human behavior but leaving our bodies unchanged. The most popular explanation was that the second transformation—the Great Leap—began with purely neurological changes that rewired the brain to make modern kinds of speech possible, which in turn drove a revolution in behavior; but just what this rewiring consisted of (and why there were no related changes to skulls) remained a mystery.

  If there is anywhere that evolutionary science has left room for supernatural intervention, some superior power breathing a spark of divinity into the dull clay of ape-men, surely it is here. When I was (a lot) younger I particularly liked the story that opens Arthur C. Clarke’s science-fiction novel 2001: A Space Odyssey (and Stanley Kubrick’s memorable, if hard to follow, movie version). Mysterious crystal monoliths drop from outer space to Earth, come to upgrade our planet’s ape-men before they starve into extinction. Night after night Moon-Watcher, the alpha ape-man in one band of earthlings, feels what Clarke calls “inquisitive tendrils creeping down the unused byways of his brain” as a monolith sends him visions and teaches him to throw rocks. “The very atoms of his simple brain were being twisted into new patterns,” says Clarke. And then the monolith’s mission is done: Moon-Watcher picks up a discarded bone and brains a piglet with it. Depressingly, Clarke’s vision of the Big Bang of Human Consciousness consists entirely of killing things, culminating in Moon-Watcher murdering One-Ear, the top ape-man in a rival band. Next thing the reader knows, we are in the space age.

  Clarke set his 2001 moment 3 million years ago, presumably to account for the invention of tools by Homo habilis, but I always felt that the place where a good monolith would really do some work was when fully modern humans appeared. By the time I started studying archaeology in college I had learned not to say things like that, but I couldn’t shake the feeling that the professionals’ explanations were less compelling than Clarke’s.

  The big problem archaeologists had in those far-off days when I was an undergraduate was that they simply had not excavated very many sites dating between 200,000 and 50,000 years ago. As new finds accumulated across the 1990s, though, it began to become clear that we did not need monoliths after all; in fact, the Great Leap Forward itself began to dissolve into a series of Baby Steps Forward, spread across tens of thousands of years.

  We now know of several pre-50,000-BCE sites with signs of surprisingly modern-looking behavior. Take, for instance, Pinnacle Point, a cave excavated in 2007 on the South African coast. Homo sapiens moved in here about 160,000 years ago. This is interesting in itself: earlier ape-men generally ignored coastal sites, probably because they could not work out how to find much food there. Yet Homo sapiens not only headed for the beach—distinctly modern behavior—but when they got there they were smart enough to gather, open, and cook shellfish. They also chipped stones into the small, light points that archaeologists call bladelets, perfect as tips for javelins or arrows—something that neither Peking Man nor Europe’s Neanderthals ever did.

  On a handful of other African sites people engaged in different but equally modern-looking activity. About a hundred thousand years ago at Mumbwa Cave in Zambia people lined a group of hearths with stone slabs to make a cozy nook where it is easy to imagine them sitting around telling stories, and at dozens of sites around Africa’s coasts, from its southern tip to Morocco and Algeria in the north (and even just outside Africa, in Israel), people were sitting down and patiently cutting and grinding ostrich eggshells into beads, some of them just a quarter of an inch across. By ninety thousand years ago people at Katanda in the Congo had turned into proper fishermen, carving harpoons out of bone. The most interesting site of all, though, is Blombos Cave on Africa’s southern coast, where in addition to shell beads, excavators found a 77,000-year-old stick of ocher (a type of iron ore). Ocher can be used for sticking things together, waterproofing sails, and all kinds of other tasks; but in recent times it has been particularly popular for drawing, producing satisfyingly bold red lines on tree bark, cave walls, and people’s bodies. Fifty-seven pieces turned up at Pinnacle Point, and by 100,000 BCE it shows up on most African sites, which probably means that early humans liked drawing. The truly remarkable thing about the Blombos ocher stick, though, is that someone had scratched a geometric pattern on it, making it the world’s oldest indisputable work of art—and one made for producing more works of art.

  At each of these sites we find traces of one or two kinds of modern behavior, but never of the whole suite of activities that becomes familiar after 50,000 BCE. Nor is there much sign yet that the modern-looking activities were cumulative, building up gradually until they took over. But archaeologists are already beginning to feel their way toward an explanation for these apparent baby steps toward fully modern humanity, one in which climate change plays the leading part.

  Geologists realized back in the 1830s that the miles-long, curving lines of rubble found in parts of Europe and North America must have been created by ice sheets pushing debris before them (not, as had previously been thought, by the biblical flood). The concept of an “ice age” was born, although another fifty years passed before scientists understood exactly why ice ages happen.

  Earth’s orbit around the sun is not perfectly round, because the gravity of other planets also pulls on us. Over the course of a hundred thousand years our orbit goes from being almost circular (as it is now) to being much more elliptical, then back again. Earth’s tilt on its axis also shifts, on a 41,000-year rhythm, as does the way the planet wobbles around this axis, this time on a 22,000-year scale. Scientists call these Milankovich cycles, after a Serbian mathematician who worked them out, longhand, while interned during World War I (this was a very gentlemanly internment, leaving Milankovich free to spend all day in the library of the Hungarian Academy of Sciences). The patterns combine and recombine in bewilderingly complex ways, but on a roughly hundred-thousand-year schedule they take us from receiving slightly more solar radiation than the average, distributed slightly unevenly across the year, to receiving slightly less sunlight, distributed slightly more evenly.
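  (A crude way to see how cycles of three different lengths add up into such complicated patterns is simply to sum three regular waves. The short Python sketch below does just that; the 100,000-, 41,000-, and 22,000-year periods are the rough figures given above, while the cosine shapes and equal amplitudes are simplifying assumptions of mine, nothing like real orbital mechanics.)

```python
# Illustrative only: three cosine waves with the rough Milankovich periods
# from the text. Equal amplitudes are an assumed simplification.
import math

PERIODS_YEARS = [100_000, 41_000, 22_000]  # eccentricity, tilt, wobble

def combined_signal(t_years: float) -> float:
    """Sum of three unit-amplitude cosines: a stand-in for how the cycles
    push solar radiation above and below its long-term average."""
    return sum(math.cos(2 * math.pi * t_years / p) for p in PERIODS_YEARS)

# Print the combined signal every 10,000 years as a crude text plot.
for t in range(0, 200_001, 10_000):
    value = combined_signal(t)
    bar = "#" * int((value + 3.0) * 5)  # offset so the bar length stays positive
    print(f"{t:>7} years: {value:+.2f} {bar}")
```

  Run it and the bars rise and fall in an irregular-looking rhythm even though each ingredient is perfectly regular, which is the essential point about how the cycles combine and recombine.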

  None of this would matter much except for the way Milankovich cycles interact with two geological trends. First, over the last 50 million years continental drift has pushed most land north of the equator, and having one hemisphere mostly land and the other mostly water amplifies the effects of seasonal variations in solar radiation. Second, volcanic activity has declined across the same period. There is (for the time being) less carbon dioxide in our atmosphere than there was in the age of the dinosaurs, and because of this the planet has—over the very long run and until very recently—steadily cooled.

  Through most of Earth’s history the winters were cold enough that it snowed at the poles and this snow froze, but normally the sun melted this ice every summer. By 14 million years ago, however, declining volcanic activity had cooled Earth so much that at the South Pole, where there is a large landmass, the summer sun no longer melted the ice. At the North Pole, where there is no landmass, ice melts more easily, but by 2.75 million years ago temperatures had dropped enough for ice to survive year-round there, too. This had huge consequences, because now whenever Milankovich cycles gave Earth less solar radiation, distributed more evenly across the year, the North Pole ice cap would expand onto northern Europe, Asia, and America, locking up more water, making the earth drier and the sea level lower, reflecting back more solar radiation, and reducing temperatures further still. Earth then spiraled down into an ice age—until the planet wobbled, tilted, and rotated its way back to a warmer place, and the ice retreated.
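  (The spiral into an ice age described above is a positive feedback loop, and a toy sketch can make the mechanism concrete. The Python below is emphatically not a climate model; every number in it is invented purely for illustration. A scripted dip in sunlight stands in for the cool phase of the cycles, spreading ice raises reflectivity and cools the planet further, and only a stronger scripted warm phase finally melts the ice back.)

```python
# A toy feedback loop, not a climate model: all numbers are invented.
ice = 0.1  # fraction of the planet under ice, between 0 and 1

# Scripted orbital forcing: a cool phase, then a stronger warm phase.
forcings = [-1.0] * 6 + [+4.5] * 8

for step, forcing in enumerate(forcings):
    # Temperature responds to the forcing and to how much sunlight the ice reflects.
    temperature = 11.0 + forcing - 5.0 * ice
    # Ice grows when the planet is colder than a freezing threshold, melts when warmer.
    ice = min(1.0, max(0.0, ice + 0.1 * (10.0 - temperature)))
    print(f"step {step:2d}: forcing {forcing:+.1f}, T = {temperature:5.2f}, ice = {ice:.2f}")
```

  In the cool phase the temperature falls faster and faster as the ice spreads, exactly the downward spiral in the paragraph above; in the warm phase the same loop runs in reverse and the ice retreats.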

  Depending on how you count, there have been between forty and fifty ice ages, and the two that spanned the period from 190,000 through 90,000 BCE—crucial millennia in human evolution—were particularly harsh. Lake Malawi, for instance, contained just one-twentieth as much water in 135,000 BCE as it does today. The tougher environment must have changed the rules for staying alive, which may explain why mutations favoring braininess began flourishing. It may also explain why we have found so few sites from this period; most protohumans probably died out. Some archaeologists and geneticists in fact estimate that around 100,000 BCE there were barely twenty thousand Homo sapiens left alive.

 
