Lone Survivors

by Chris Stringer


  Next we will look at the question of the sexual division of labor in early humans and the different views that have emerged.

  At one extreme, the archaeologist Lewis Binford provocatively suggested modern humans might have been the first to “invent” the nuclear family, and that Neanderthal social structures could have been more like those of some mammalian carnivores, with packs of males roaming the landscape for meat and living largely separate lives from females (apart from occasional visits to exchange meat for sex). The women were left to bring up their children on what they could gather close to their home bases and nurseries. At the other extreme is the view of the archaeologists Steven Kuhn and Mary Stiner, who argued that hunting big game was a family affair for the Neanderthals, with the women and children joining in, and that by contrast modern humans were the first to develop the patterns of division of labor and distinct roles that we see in hunter-gatherers today. In their opinion, the archaeological record of the Neanderthals showed little evidence of role specialization, and instead the population lived fast, burning energy from a high-calorie diet obtained from hunting large herbivores. Such high-yield food was rich and rewarding, but not always easy to obtain, so the Neanderthals were at the top of their food chain and could only persist in relatively small numbers, at low population densities.

  Several lines of argument support such a view. For one thing, various data suggest that the Neanderthals showed low levels of sexual dimorphism—that is, males and females were nearly equal in size—which would not be expected if they had very different roles, including male specialization in hunting big game. Second, there are data from chemical analyses of Neanderthal bones (see chapter 3) that suggest they were indeed highly carnivorous, at least in the northern parts of their range. Third, research by Thomas Berger and Erik Trinkaus examined patterns of injury on Neanderthal skeletons and found a high frequency of lesions and fractures, particularly in the head and neck. When they compared the pattern with those in recent and archaeological samples of modern humans, they could not match it, and it was not until they turned to data on injuries in athletes that they could—in rodeo riders, of all people! This did not mean that Neanderthals regularly rode wild animals, but it did indicate that they shared a close proximity to hostile wild animals that might bite, butt, kick, roll, or fall on them—and the pattern was found throughout the Neanderthal sample of men, women, and children.

  The anthropologist Steve Churchill and the archaeologist John Shea followed the biologist Valerius Geist in arguing that Neanderthals engaged in confrontational hunting, at close range, with wooden thrusting spears—a much more dangerous hunting method than “killing at a distance” with throwing projectiles, arrows, or blowpipes. So if Neanderthal women and children were involved in the hunt, even if only as drivers or beaters, they would have carried a risk of injury from large prey. In contrast, Kuhn and Stiner argue that early modern humans in Africa were able to exist in larger numbers at a greater density than Neanderthals, and in an environment with greater biodiversity. This would have encouraged a varied rather than single approach to procuring food, and the evolution of much more distinct roles for different components of human groups, especially males and females.

  As we discussed earlier, meat from large mammals provides rich returns, but it is a supply with inherent risks, both in obtaining it and in relying on such an unpredictable resource. By dividing labor and diversifying food intake to the maximum, modern humans were better able to guarantee where the next meal was coming from, and their reproductive core—women and children—was at reduced risk. Compared with other primate relatives and with what we know of the earliest humans, recent hunter-gatherers draw on diverse sources of animal protein and fat, and on diverse methods of gathering, processing, and storing food. Much of this comes from the activities of the old, and from women and children, using snares, nets, and traps to collect small game and tools to extract plant staples. Among recent hunter-gatherers in Australia, Africa, and the Americas, nets have been effective in capturing prey from lizards and small birds up to the size of large deer, and a drive toward the nets is something that almost everyone can collaborate in and enjoy, whether on land or in shallow water. If a surplus of prey is obtained, it can be eaten in ceremonial feasts, traded with neighboring groups, or preserved through drying, smoking, or underground storage.

  In the late 1960s the archaeologists Lewis Binford and Kent Flannery both proposed that the “Broad Spectrum Revolution” of recent hunter-gatherers first developed in the last 20,000 years in the Middle East, under the pressure of climate change and increasing population density. In a sense it was forced on the late Paleolithic peoples of western Asia as a way of increasing the carrying capacity of the land off which they lived, and it was seen as a prelude to the domestication of plants and animals that followed soon afterward. But Stiner and Kuhn compared site data covering a much wider range in time and space, and they believe that this ratcheting up of resource exploitation began earlier in human evolution. Archaeological evidence for the broadening of Paleolithic diets in early moderns appears from at least 40,000 years ago, and there is supporting isotopic evidence for this, which I discussed in chapter 3. Grinding tools (sometimes just cobbles) become more common, and, as discussed earlier, they would have been useful for obtaining the maximum benefit (and sometimes also minimum risk from natural toxins) from energy-rich nuts, seeds, and tubers. While large game was still hunted, mainly using projectile spears but later enhanced by atlatls (spear throwers) and bows and arrows, evidence for the exploitation of small game such as tortoises, rabbits, wild fowl, and eggs increased. In addition, food from the sea, the shore, rivers, and lakes became more important.

  All of these elements of the diet were already there in some areas and, to some extent, in earlier moderns and in the Neanderthals. (Our work on Neanderthal sites in Gibraltar shows that they were well aware of the value of shellfish, marine mammals, rabbits, nuts, and seeds.) But it seems that for modern humans, such items started to form significant constituents of their diet. And the increasing breadth and processing of plant resources could also have been important in another way. For many hunter-gatherers, starchy foods could have been used to make pastes and gruels for baby food, accelerating the process of weaning, freeing up the mother’s time, and allowing alloparents a greater role. In turn, an earlier cessation of breast-feeding potentially returned the mother to the reproductive cycle—a significant factor in the close birth-spacings achieved by modern hunter-gatherers. This, too, could have been a key to the success of the modern human species.

  The archaeologist Olga Soffer collaborated in studies of many Czech Upper Paleolithic sites, and she challenged the prevailing view that their extensive mammoth bone accumulations record the main component of the diet 30,000 years ago: mammoth meat from hunting carried out by Cro-Magnon men. Instead, study of the accumulations suggested that many were probably from animals that had died naturally and were then scavenged some time afterward for their bones and ivory. Moreover, many of those that had been butchered or cooked were either very young or very old individuals, that is, those that were the most vulnerable to natural deaths or other predators, or that would have been the easiest and least dangerous to catch. The implication was that mammoth meat may not have been the main or most reliable item on the menu for a lot of the time. But if not, what supported these large and sophisticated communities in the windswept plains of the last Ice Age? Well, the bones of hares and foxes were common, but microscopic studies of the hearths also revealed residues of plants, fruits, seeds, and roots that were full of starch.

  But Soffer also spotted something remarkable impressed on some of the clay fragments that littered these sites: delicate parallel lines. The archaeologist Jim Adovasio subjected these to detailed study and found not only many more lines but also crisscross patterns forming a mesh—the telltale traces of woven fibers. Further study revealed the marks of textiles, basketry, nets, cord, and knots. And for those who may be skeptical about interpretations from impressions in clay, fragments of actual flax fibers were discovered in the unusually dry environment of Upper Paleolithic levels at Dzudzuana Cave in Georgia. The work, led by Eliso Kvavadze and Ofer Bar-Yosef, dated some of these fibers as far back as 35,000 years ago. Some had been twisted to make cord or knotted, and others had apparently been dyed in colors ranging from pink to black. The sediments that yielded up the fibers also contained traces of hair and wool from wild ox and goat, as well as the remains of beetles, moths, and mold commonly associated with textiles today.

  So the production of twine and other material for sewing clothing and skins, for fastening composite tools together, and for containers, ropes, and netting also seems to have been part of the repertoire of some of the first modern humans in the Caucasus region of western Asia. Such materials would have provided protection from the environment, as well as containers to hold food, and they would have greatly expanded the methods of prey acquisition available to the Cro-Magnons. As we saw, woven nets and traps would also have allowed a wider range of group members to take part in the hunting process, since patience and planning are more important in their use than long-distance travel and physical strength. And such technological changes would also have brought social transformations in the development of specialized roles for the production of things like nets, clothing, and baskets, and the beginning of a whole new range of fashion accessories.

  But to return to the Neanderthals and their hunting patterns: despite Kuhn and Stiner’s careful arguments, I think that the risks would have been too great for Neanderthal women with young children to have moved far in support of hunting, and, as I explain later in this chapter, there are alternative explanations for the widespread patterns of trauma in their skeletons. Steve Churchill, working with fellow anthropologist Andrew Froehle, expanded on the Neanderthal–modern contrast in subsistence by bringing into the equation climate and the extent of cultural buffering (cultural protection from environmental extremes through heating, clothing, insulated dwellings, et cetera). They suggested that Neanderthals living in Ice Age Europe would typically have needed an extra 250 kilocalories a day compared with a modern human in the same situation, given the higher energetic demands of their lifestyle and the need to fuel their greater body—and, in particular, muscle—mass. Lower adult energy needs could have given modern humans a breeding and hence a competitive advantage through reduced birth-spacing and greater survivorship compared with the Neanderthals, who were more demanding both on their bodies and on what they needed to extract from their environment. Also, if modern humans were generalists living off a wider range of resources than the more carnivorous Neanderthals, they would have been better able to cope in stressful times.

  As we saw, data on injuries in Neanderthals were used by Berger and Trinkaus to suggest that many of these resulted from confrontational hunting, with the further implication that it was not just adult males who suffered in this way. However, they recognized that there were alternative explanations, and I think one in particular needs to be considered, at least as an additional factor: that of interpersonal violence. The pattern of trauma when humans attack each other varies, of course, depending on the weapon used (if not simply the hands and feet) and on any defense mounted by the recipient. But upper body and head damage invariably predominates, and such injuries may unfortunately be inflicted on women and children as well as men. If a weapon was used, further forensic clues may be left in the form of the weapon itself or traces from it, and we have such data for a couple of Neanderthal wounds, with some interesting speculation surrounding the nature of the assailant.

  In chapter 4, I discussed the impact that the discovery of a burial at Saint-Césaire in France had on our views of the Neanderthals in the early 1980s, since it was late in time and associated with the Upper Paleolithic Châtelperronian industry. Recently, Christoph Zollikofer (see chapter 3) and his colleagues studied a scalp injury on the skull, resulting from a slash that was apparently caused by a blade-shaped object. The wound was not deep, although it would certainly have caused blood loss, and it had healed over quite well, suggesting that the individual survived for at least a few months after the incident, perhaps providing evidence for social support among Neanderthals (for more on this, see later in this chapter). Its position suggested that it was not caused by an accident such as a fall or a rock fall, and if the individual was standing upright, it was probably inflicted by a high-energy impact or thrust to the head from the front or back, perhaps by a hafted stone tool such as a spear point.

  A second Neanderthal from Shanidar Cave in Iraq, known as the Shanidar 3 man, also carries the mark of a spear wound, this time in his rib cage. The partially healed wound was noted by Trinkaus in his study of the skeleton some thirty years ago, but Churchill and his colleagues conducted more detailed studies on the sharp and deep slice in his ninth rib on the left side, including experiments with crossbows that involved firing stone points into pig carcasses. The wound had started to heal, but unlike the Saint-Césaire case, it was probably ultimately fatal, either through lung damage or infection, as the spear point may have lodged in the body (although it was apparently not recovered, or at least recognized, during the original excavations). Possible scenarios include a stone knife wound, a hunting injury, or even self-inflicted trauma, but the experiments suggested that the most probable cause was a spear that impacted at a downward angle of about forty-five degrees, most likely one that had been thrown rather than thrust. Speculating further, Churchill and his team favored the idea that only modern humans had throwing spears with stone tips, and thus they suggested that a modern human rather than a Neanderthal could have been responsible, in an act of interspecies aggression. But could a modern human have been around at the time the Shanidar 3 man was wounded? That is a major uncertainty, since the incident can only be dated to roughly 50,000 years ago, and we cannot reliably place modern humans in Iraq that far back. Equally, it is just possible that the Saint-Césaire individual was confronted by an early Cro-Magnon in France, and these cases can be added to the claimed cannibalism of a Neanderthal child at Les Rois (discussed in chapter 4) as slender evidence that the two species may have had unfriendly encounters.

  So we know that the Neanderthals suffered many bodily injuries, and in some cases it seems they must have had social support from others in their group to recover, or at least to prolong their survival. There is a particularly early example of this from the Sima de los Huesos site at Atapuerca in Spain, dating from about 400,000 years ago, where a child with a deformed skull and brain, perhaps caused by an injury sustained before birth, was almost certainly disabled physically and mentally; yet this individual was not rejected at birth and survived the most dependent stages of infancy, dying at around the age of eight, for reasons that may or may not have been connected with the disability. As we discussed earlier, the Atapuerca population lay at the very beginnings of Neanderthal evolution, and it seems that the Neanderthals continued this kind of social support, as may well have been the case for the wounded Saint-Césaire and Shanidar individuals.

  Another individual from Shanidar may well demonstrate even higher and longer-lasting levels of social care: the Shanidar 1 man was probably about forty when he died, a very respectable age for a Neanderthal. Yet he had suffered a heavy blow to the left side of his skull and face—perhaps from a rock fall—and as a result may have been partly blind and deaf. Possibly connected with the incident, his right arm had been severely damaged: the upper arm had a badly healed fracture and was withered to a thin stump, and he had completely lost his lower arm and hand. His legs show that he was disabled in walking too, perhaps because the blow to the left side of the brain had caused paralysis on his right side, as may happen in modern injuries of this kind. Despite all those difficulties, he had apparently survived for many years, implying assistance and provisioning from others. Apes with arm or leg fractures or amputations can sometimes survive in the wild without social support, but for a Neanderthal living in the Zagros Mountains it seems likely that his injuries would have been an immediate death sentence without consistent help from his group.

  There are several other examples of survival with impairment in Neanderthals, and also comparable examples from Africa: the 400,000-year-old Salé cranium from Morocco and the Singa cranium (more than 130,000 years old) from Sudan both show evidence of long-lasting and probably disabling deformation, yet these individuals survived into adulthood. In my view, this level of social support probably led to the practice of intentional burial, since, for example, leaving a body on the floor of a cave to which you might return could entail seeing your father, mother, or siblings picked over by hyenas or vultures. Later, with repetition and the addition of ritual, the rise of symbolic burials could have followed, with grave goods as tributes or offerings to help passage to the spirit world.

  To what extent the Neanderthals shared this behavior is still hotly argued, and a few archaeologists, like Robert Gargett, even doubt that Neanderthals buried their dead at all, in which case all the supposed burials in caves were either accidental or the result of roof falls and the like. But I think there is sufficient evidence for some level of ritual behavior in the later Neanderthals at least, including infants being buried with simple grave goods. However, it seems likely that one of the most famous examples, which gave rise to the notion that the Neanderthals were the first “flower people,” was the result of other, rather surprising, agencies. After the Shanidar 4 burial was excavated from this Iraqi cave in 1960, analyses showed that the sediments contained clusters of pollen, suggesting that bright flowers (perhaps even some with medicinal properties) had been strewn around the body. But the zooarchaeologist Richard Redding subsequently excavated a number of burrows of a gerbil-like rodent found in the Zagros Mountains near Shanidar and noted that these animals stored flower heads in their tunnels. In turn, the anthropologist Jeffrey Sommer noticed that the original excavators had reported rodent bones and burrows around the Neanderthal skeletons; thus it seems likely that the supposed flower burial of the Shanidar 4 man had a more prosaic and less romantic explanation.

 
