The Origin of Humankind

by Richard Leakey


  In the majority of surviving hunter-gatherer societies that anthropologists have studied, there is a clear division of labor, with males responsible for hunting and females for gathering plant foods. The camp is a place of intense social interaction, and a place where food is shared; when meat is available, this sharing often involves elaborate ritual, which is governed by strict social rules.

  To Westerners, the eking out of an existence from the natural resources of the environment by means of the simplest of technologies seems a daunting challenge. In reality, it is an extremely efficient mode of subsistence: foragers can often collect sufficient food for the day in three or four hours. A major research project of the 1960s and 1970s, conducted by a team of Harvard anthropologists, showed this to be true of the !Kung San, whose homeland in the Kalahari Desert of Botswana is marginal in the extreme. Hunter-gatherers are attuned to their physical environment in a way that is difficult for the urbanized Western mind to grasp. As a result, they know how to exploit what to modern eyes seem meager resources. The power of their way of life lies in this exploitation of plant and animal resources within a social system that fosters interdependence and cooperation.

  The notion that hunting was important in human evolution has a long history in anthropological thought, going back to Darwin. In his 1871 book The Descent of Man, he suggested that stone weapons were used not only for defense against predators but also for bringing down prey. The adoption of hunting with artificial weapons was part of what made humans human, he argued. Darwin’s image of our ancestors was clearly influenced by his experience while on his five-year voyage on the Beagle. This is how he described his encounter with the people of Tierra del Fuego, at the southern tip of South America:

  There can hardly be any doubt that we are descended from barbarians. The astonishment which I felt on first seeing a party of Fuegians on a wild and broken shore will never be forgotten by me, for the reflection at once rushed into my mind—such were our ancestors. These men were absolutely naked and bedaubed with paint, their long hair was tangled, their mouths frothed with excitement, and their expression was wild, startled and distrustful. They possessed hardly any arts, and like wild animals lived on what they could catch.

  The conviction that hunting was central to our evolution, and the conflation of our ancestors’ way of life with that of surviving technologically primitive people, imprinted itself firmly on anthropological thought. In a thoughtful essay on this issue, the biologist Timothy Perper and the anthropologist Carmel Schrire, both at Rutgers University, put it succinctly: “The hunting model assumes that hunting and meat-eating triggered human evolution and propelled man to the creature he is today.” According to this model, the activity shaped our ancestors in three ways, explain Perper and Schrire, “affecting the psychological, social, and territorial behavior of early man.” In a classic 1963 paper on the topic, the South African anthropologist John Robinson expressed the importance the field accorded to hunting in human prehistory:

  [T]he incorporation of meat-eating in the diet seems to me to have been an evolutionary change of enormous importance which opened up a vast new evolutionary field. The change, in my opinion, ranks in evolutionary importance with the origin of mammals—perhaps more appropriately with the origin of tetrapods. With the relatively great expansion of intelligence and culture it introduced a new dimension and a new evolutionary mechanism into the evolutionary picture, which at best are only palely foreshadowed in other animals.

  Our assumed hunting heritage took on mythic aspects, too, becoming equivalent to the original sin of Adam and Eve, who had to leave Paradise after eating of the forbidden fruit. “In the hunting model, man ate meat in order to survive in the harsh savanna, and by virtue of this strategy became the animal whose subsequent history is etched in a medium of violence, conquest, and bloodshed,” observe Perper and Schrire. This was the theme taken up by Raymond Dart in some of his writings in the 1950s and, more popularly, by Robert Ardrey. “Not in innocence, and not in Asia, was mankind born,” is the famous opening to Ardrey’s 1961 book African Genesis. The image proved to be powerful in the minds of both the public and the profession. And, as we shall see, this imagery has been important in the way the archeological record has been interpreted in this respect.

  A 1966 conference on “Man the Hunter” at the University of Chicago was a landmark in the development of anthropological thinking about the role of hunting in our evolution. The conference was important for several reasons, not least for its recognition that the gathering of plant foods provided the major supply of calories for most hunter-gatherer societies. And, just as Darwin had done almost a century earlier, the conference equated what we know of the lifeways of modern hunter-gatherers with the behavior patterns of our earliest ancestors. As a result, apparent evidence of meat-eating in the prehistoric record—in the form of accumulations of stone tools and animal bones—had a clear implication, as my friend and colleague the Harvard University archeologist Glynn Isaac observed: “Having, as it were, followed an apparently uninterrupted trail of stone and bone refuse back through the Pleistocene it seemed natural ... to treat these accumulations of artifacts and faunal remains as being ‘fossil home base sites.’” In other words, our ancestors were considered to have lived as modern hunter-gatherers do, albeit in a more primitive form.

  Isaac promulgated a significant advance in anthropological thinking with his food-sharing hypothesis, which he published in a major article in Scientific American in 1978. In it he shifted the emphasis away from hunting per se as the force that shaped human behavior and toward the impact of the collaborative acquisition and sharing of food. “The adoption of food-sharing would have favored the development of language, social reciprocity and the intellect,” he told a 1982 gathering that marked the centenary of Darwin’s death.

  Five patterns of behavior separate humans from our ape relatives, he wrote in his 1978 paper: (1) a bipedal mode of locomotion, (2) a spoken language, (3) regular, systematic sharing of food in a social context, (4) living in home bases, and (5) the hunting of large prey. These describe modern human behavior, of course. But, Isaac suggested, by 2 million years ago “various fundamental shifts had begun to take place in hominid social and ecological arrangements.” They were already hunter-gatherers in embryo, living in small, mobile bands and occupying temporary camps from which the males went out to hunt prey and the females to gather plant foods. The camp provided the social focus, at which food was shared. “Although meat was an important component of the diet, it might have been acquired by hunting or by scavenging,” Isaac told me in 1984, a year before his tragically early death. “You would be hard pressed to say which, given the kind of evidence we have from most archeological sites.”

  Isaac’s viewpoint strongly influenced the way the archeological record was interpreted. Whenever stone tools were discovered in association with the fossilized bones of animals, it was taken as an indication of an ancient “home base,” the meager litter of perhaps several days’ activity of a band of hunter-gatherers. Isaac’s argument was plausible, and I wrote in my 1981 book The Making of Mankind that “the food-sharing hypothesis is a strong candidate for explaining what set early humans on the road to modern man.” The hypothesis seemed consistent with the way I saw the fossil and archeological records, and it followed sound biological principles. Richard Potts, of the Smithsonian Institution, agreed. In his 1988 book titled Early Hominid Activities at Olduvai, he observed that Isaac’s hypothesis “seemed a very attractive interpretation,” noting:

  The home-base, food-sharing hypothesis integrates so many aspects of human behavior and social life that are important to anthropologists—reciprocity systems, exchange, kinship, subsistence, division of labor, and language. Seeing what appeared to be elements of the hunting-and-gathering way of life in the record, in the bones and stones, archeologists inferred that the rest followed. It was a very complete picture.

  In the late 1970s and early 1980s, however, this thinking began to change, prompted by Isaac and by the archeologist Lewis Binford, then at the University of New Mexico. Both men realized that much of the prevailing interpretation of the prehistoric record rested on unspoken assumptions. Independently, they began to separate what could truly be known from the record from what was simply assumed. The reexamination began at the most fundamental level, with the significance of finding stones and animal bones in the same place. Did this spatial coincidence imply prehistoric butchery, as had been assumed? And if butchery could be proved, did that imply that the people who did it lived as modern hunter-gatherers do?

  Isaac and I talked often about various subsistence hypotheses, and he would create scenarios in which bones and stones might finish up in the same place but have nothing to do with a hunting-and-gathering way of life. For instance, a group of early humans might have spent some time beneath a tree simply for the shade it afforded, knapping stones for some purpose other than butchering carcasses—for example, they might have been making flakes for whittling sticks, which could be used to unearth tubers. Some time later, after the group had moved on, a leopard might have climbed the tree, hauling its kill with it, as leopards often do. Gradually, the carcass would have rotted and the bones would have tumbled to the ground to lie amid the scatter of stones left there by the toolmakers. How could an archeologist excavating the site 1.5 million years later distinguish between this scenario and the previously favored interpretation of butchering by a group of nomadic hunters and gatherers? My instinct was that early humans did in fact pursue some version of hunting and gathering, but I could see Isaac’s concern over a secure reading of the evidence.

  Lewis Binford’s assault on conventional wisdom was rather more acerbic than Isaac’s. In his 1981 book Bones: Ancient Men and Modern Myth, he suggested that archeologists who viewed stone-tool and bone assemblages as the remains of ancient campsites were “making up ‘just-so’ stories about our hominid past.” Binford, who has done little work of his own on early archeological sites, derived his views initially from a study of the bones of Neanderthals, who lived in Eurasia between about 135,000 and 34,000 years ago.

  “I became convinced that the organization of the hunting and gathering way of life among these relatively recent ancestors was quite different than that among fully modern Homo sapiens,” he wrote in a major review in 1985. “If this was true then the almost ‘human’ lifeways depicted in the ‘consensus’ view of the very early hominids stood out as an extremely unlikely condition.” Binford suggested that systematic hunting of any kind began to appear only when modern humans evolved, an event he dates to between 45,000 and 35,000 years ago.

  None of the early archeological sites could be regarded as remains of living floors from ancient campsites, argued Binford. He reached this conclusion through analyzing other people’s data on the bones at some of the famous archeological sites in Olduvai Gorge. They were the kill sites of nonhuman predators, he said. Once the predators, such as lion and hyena, had moved on, hominids came to the site to pick up what scraps they could scavenge. “The major, or in many cases the only, usable or edible parts consisted of bone marrow,” he wrote. “There is no evidence supporting the idea that the hominids were removing food from the locations of procurement to a base camp for consumption. . . . Similarly, the argument that food was shared is totally unsupported.” This idea presents a very different picture of our forebears, 2 million years ago. “They were not romantic ancestors,” wrote Binford, “but eclectic feeders commonly scavenging the carcasses of dead ungulates for minor food morsels.”

  In this view of early human prehistory, our ancestors become much less humanlike, not just in their mode of subsistence but also in other elements of behavior: for instance, language, morality, and consciousness would be absent. Binford concluded: “Our species had arrived—not as a result of gradual, progressive processes but explosively in a relatively short period of time.” This was the philosophical core of the debate. If early Homo displayed aspects of a humanlike way of life, then we have to accept the emergence of the essence of humanity as a gradual process—one that links us to the deep past. If, however, truly humanlike behavior emerged rapidly and recently, then we stand in splendid isolation, disconnected from the deep past and the rest of nature.

  Although Isaac shared Binford’s concerns about past overinterpretation of the prehistoric record, he took a different approach to rectifying it. Where Binford worked largely with other people’s data, Isaac decided he would excavate an archeological site, looking at the evidence with new eyes. Although the distinction between hunting and scavenging was not crucial to Isaac’s food-sharing hypothesis, it became important in reexamining the archeological record. Hunter or scavenger? This was the crux of the debate.

  In principle, hunting should imprint itself in a different way on the archeological record from scavenging. The record of the difference should be evident in the body parts left behind by the hunter and the scavenger. For instance, when a hunter secures a kill, he has the option of carrying the entire carcass, or any part of it, back to camp. A scavenger, by contrast, has available only whatever he might find at an abandoned kill site: the choice of body parts he can take back to camp will be more limited. The variety of bones found at the camp of a hominid hunter should therefore be wider—including, at times, an entire skeleton—than that at the camp of a hominid scavenger.

  There are, however, many factors that can confound this neat picture. As Potts has observed: “If a scavenger finds the carcass of an animal that has just died of natural causes, then all the body parts are available to the scavenger, and the bone pattern that results will look just like hunting. And if a scavenger manages to drive a predator off its kill very early, the pattern will again look like hunting. What are you to do?” The Chicago anthropologist Richard Klein, who has analyzed many bone assemblages in southern Africa and Europe, believes the task of distinguishing between the two subsistence methods may be impossible: “There are so many ways that bones can get to a site, and so many things that can happen to them, that the hunter-versus-scavenger question may never be resolved for hominids.”

  The excavation Isaac embarked upon to test the new thinking was known as site 50, which is located near the Karari Escarpment about 15 miles east of Lake Turkana, in northern Kenya. During a period of three years, beginning in 1977, he and a team of archeologists and geologists exposed an ancient land surface, the sandy bank of a small stream. Carefully, they unearthed 1405 stone artifacts and 2100 fragments of bone, some large, most small, which had been buried some 1.5 million years ago when a seasonal stream had flooded early in a rainy season. Today, the region is arid, with bush and scrub interspersed among badlands carved by eons of erosion. The goal Isaac and his team set themselves was to discover what had occurred 1.5 million years ago, when stone artifacts and many animal bones came to rest in the same place.

  In his earlier critiques, Binford had suggested that many co-occurrences of bone and stone were the result of water action. That is, a fast-running stream can carry pieces of bone and stone along and then dump them at a point of low energy, such as where the stream widens or at the inside bank of a bend. In this case, the accumulation of bone and stone in the same location would be the result of chance, not hominid activity. The “archeological site” would be no more than a hydraulic jumble. Such an explanation seemed unlikely for site 50, because the ancient land surface had been on the bank of a stream, not in it, and because clues from geology showed that the site had been buried very gently. Nevertheless, a direct association between bone and stone had to be demonstrated, not assumed. That demonstration came in a most unexpected way and formed one of the landmark discoveries in archeology in recent times.

  When an animal is dismembered or a bone is defleshed with a knife, either of metal or of stone, the butcher inevitably slices into the bone occasionally, leaving long grooves or cut marks. During dismemberment, the cut marks would be concentrated around the joints, while in defleshing they would be inflicted elsewhere, too. When the University of Wisconsin archeologist Henry Bunn was examining some of the bone fragments from site 50, he noticed such grooves. Under the microscope, they could be seen to be V-shaped in cross section. Were these cut marks, made 1.5 million years ago by a hominid forager? Experiments with modern bone and stone flakes confirmed it, proving conclusively a causal relationship between the bone and the stone at the site: hominids had brought them there and processed them for a meal. This discovery was the first direct demonstration of a behavioral link between bones and stones at an early archeological site. It was the smoking gun in the mystery of ancient sites.

  It often happens in science that important discoveries are made independently at about the same time. So it was with cut marks. Working with bones from archeological sites around Lake Turkana and at Olduvai Gorge, Richard Potts and the Johns Hopkins archeologist Pat Shipman also found cut marks. Their methods of study were slightly different from Bunn’s, but the answer was the same: hominids close to 2 million years ago were using stone flakes to dismember carcasses and deflesh bones (see figure 4.1). In retrospect, it is surprising that cut marks had not been discovered earlier, because the bones examined by Potts and Shipman had been studied many times by many people. A moment’s reflection would have convinced the alert mind that, if the prevailing archeological theory was correct, signs of butchery must be present on some fossil bones. But no one had looked assiduously, because the answer was assumed. Once the unspoken assumptions of prevailing theory were questioned, however, the time was right to look for and find them.

  Site 50 yielded further evidence of hominids’ using stone on bone as part of their daily life. Some of the long bones at the site were shattered into pieces, the result, as it turned out, of someone placing the bone on a stone, like an anvil, and then delivering a series of blows along the bone, thus giving access to the marrow inside. This scenario was reconstructed from a paleolithic jigsaw puzzle, in which the fragments were assembled to re-form the entire bone and the pattern of breakage analyzed, revealing characteristic signs of percussion. “Finding the fitting pieces of hammer-shattered bone shafts invites one to envisage early proto-humans in the very act of extracting and eating marrow,” Isaac and his colleagues wrote in a paper describing their findings. Of the cut marks they said: “Finding an articular end of bone, with marks apparently formed when a sharp-edged stone was used to dismember an antelope leg, cannot but conjure up very specific images of butchery in progress.”

 
