Evolving Brains, Emerging Gods


by E. Fuller Torrey


  Although human figures are relatively rare in cave art, human handprints are very common, especially in the earlier painted caves. Chauvet Cave, with drawings dated to 36,000 years ago, has hundreds of handprints, the largest number being palm prints that are now faded and look like red dots. It also has complete handprints, made by covering the hand with pigment and then placing the hand against the wall, as well as “negatives” made by holding a hand against the wall and outlining it by blowing red ochre pigment over it. Gargas Cave in southwestern France, with drawings dated to 27,000 years ago, has over 200 handprints. The handprints in these caves are similar to those dating to later periods found in at least 30 other European caves and at rock art sites in South Africa, Indonesia, Australia, Papua New Guinea, Argentina, and the United States.28

  In addition to animal figures and human handprints, the third category of commonly found figures in cave art is geometric figures. These are extremely numerous and found in almost all caves and rock art sites from this period. They vary from small dots and lines to circles and spirals to claviform (club-shaped) and tectiform (hut-shaped) drawings. They are commonly included on panels that depict animals but also stand alone. In some cases, the geometric shapes are superimposed on the animals; straight lines so placed have been thought to represent spears or arrows. These geometric figures are often referred to as “signs” or “symbols,” but it is unknown what they symbolize. They have been called “the most mysterious figures in cave art.”29

  Improved and novel forms of tools and weapons; memory devices; diversified and widespread self-ornamentation; intentional human burials with grave goods; musical instruments; painted caves; sculptures and figurines; decorated objects of all kinds—it was an outpouring of human creativity unlike anything seen in the six-million-year hominin history. As Randall White summarized it, “material forms of representation exploded onto the scene between 40,000 and 30,000 years ago in Europe.” When these developments are placed on a timeline (see figure 5.1), many of them are seen to have occurred at approximately the same time in geographically disparate parts of the world. Some writers have referred to this period as a “human revolution.”30

  But was this really a “human revolution”? In 2000, two anthropologists, Sally McBrearty and Alison Brooks, published an influential article titled “The Revolution That Wasn’t.” They argued that many of the developments described as occurring about 40,000 years ago had, in fact, been seen 40,000 to 60,000 years earlier, including the use of bone tools, fishing, the use of body ornaments, human burials, and trade networks. Rather than a revolution, they argued, it had been “an accretionary process, a gradual accumulation of modern behaviors” that had taken place primarily in Africa over 200,000 years.31

  FIGURE 5.1  Time Line: 45,000 to 13,000 years ago.

  McBrearty and Brooks are correct that many of the developments seen beginning about 40,000 years ago had been seen thousands of years earlier, even if they were seen less commonly. I believe they are incorrect, however, in not viewing these changes as a human revolution. As noted in the previous chapter, it seems likely that a major cognitive change—the acquisition of introspection—occurred approximately 100,000 years ago and that this cognitive change largely accounts for the new behaviors seen at that time. If that is true, is it possible that another major cognitive change took place approximately 40,000 years ago? If so, what was it?

  MASTERING THE FUTURE: THE EVOLUTION OF AUTOBIOGRAPHICAL MEMORY

  At about age four, children develop the first phases of what is known as autobiographical memory, sometimes also referred to as episodic memory. Prior to age four, a child is said to live “in a comparatively foreshortened world of time. The present is what is outstanding for it. Its life goes neither far into the past nor far into the future.” This was illustrated by experiments carried out by Daniel Povinelli and his colleagues in which they asked at what age young children “come to conceive of the self as possessing explicit temporal dimensions.” They assessed time sense in children between ages two and five by placing a large sticker on their foreheads, then showing the children videotapes of themselves with the sticker, with a delayed time interval between placing the sticker and showing the videotape. Almost none of the two- and three-year-old children reached up to remove the sticker, but most of the four-year-olds did. Povinelli et al. concluded that “the younger children possess a different understanding of the self than the older children. In particular, they may not readily appreciate that past events in which they participated (and hence of which they have memories) happened to them … although the events depicted may be recalled, they were not encoded as autobiographical memories and hence the children do not understand that they happened to them.” As children get older, they are able “to knit together historical instances of him- or herself into a unique, unduplicated self.” The result, in the words of psychologist and philosopher William James, is an “unbrokenness in the stream of selves,” one able to project experiences from the past and present into the future.32

  Studies of children also suggest that the development of basic cognitive skills, including an ability to compare a drawing or picture with what the person has seen in the past, may be necessary for an understanding of art. Such studies have demonstrated that children younger than two do not understand the nature of a picture; for example, they may try to pick up a ball that is in a picture. At age three, some children still believe that a picture of an ice cream cone will feel cold and a picture of a rose will smell sweet. Until age four, “many children think that turning a picture of a bowl of popcorn upside down will result in the depicted popcorn falling out of the bowl,” according to University of Illinois psychologist Judy DeLoache, who has pioneered research on children’s understanding of pictures. This suggests that the outpouring of art that occurred beginning about 40,000 years ago may have been dependent on the cognitive developments taking place at that time.33

  Autobiographical memory is one of two types of long-term memory. Short-term, also called working, memory serves to “hold and handle information in the mind needed for the execution of cognitive tasks such as reasoning, comprehension, learning and carrying out sequences of action.” Short-term memory is the memory used when you try to remember a new telephone number as you dial it. Long-term memory, by contrast, consists of memory “traces” that may be stored for decades. One type of long-term memory is called semantic memory. This is the long-term memory that stores facts, such as the capital of France. The second type of long-term memory is the autobiographical form. In contrast to semantic memory, autobiographical memory is a reliving of past events both sensually and emotionally. The difference has been described as follows: “It is our semantic memory that allows us to state the name and location of the high school we attended, [but] it is episodic [autobiographical] memory that allows us to re-experience the emotions and events during our first day at this school.” Marcel Proust, in his Remembrance of Things Past, provided one of literature’s best examples of autobiographical memory:

  One day in winter, as I came home, my mother, seeing that I was cold, offered me some tea, a thing I did not ordinarily take.… I raised to my lips a spoonful of the tea in which I had soaked a morsel of the cake. No sooner had the warm liquid, and the crumbs with it, touched my palate than a shudder ran through my whole body, and I stopped, intent upon the extraordinary changes that were taking place. An exquisite pleasure had invaded my senses, but individual, detached, with no suggestion of its origin.… And suddenly the memory returns. The taste was that of the little crumb of madeleine which on Sunday mornings at Combray … my aunt Léonie used to give me, dipping it first in her own cup of real or of lime-flower tea.… But when from a long-distant past nothing subsists … the smell and taste of things remain poised a long time, like souls, ready to remind us, waiting and hoping for their moment, amid the ruins of all the rest; and bear unfaltering, in the tiny and almost impalpable drop of their essence, the vast structure of recollection.34
  Although researchers have largely focused on the past dimensions of autobiographical memory, there also is a future dimension. For example, your semantic memory will tell you the address of a four-star restaurant at which you have a reservation, but the future equivalent of your autobiographical memory will allow you to anticipate the visual and gustatory delights you hope to experience there. This has been called “pre-experiencing an event.” Studies of the development of autobiographical memory in children have shown that the past and future dimensions develop simultaneously and are cognitively integrated. Together, they form the temporal self, enabling the person to use the past to master the future. This linking of the past with the future, according to Sir John Eccles, demonstrates “the extraordinary ability of humans to plan for the future while profiting from the memory of past experiences.” Eccles added that “we live in a time paradigm of past-present-future. When humans are consciously aware of the time NOW, this experience contains not only the memory of past events, but also anticipated future events.” It has even been claimed that “the primary role of episodic [autobiographical] memory … may be to provide information from the past for the simulation of the future.”35

  Several writers have noted the future as well as the past dimensions of autobiographical memory. The opening lines of T. S. Eliot’s Four Quartets describe it succinctly:

  Time present and time past

  Are both present in time future,

  And time future contained in time past.

  And in Lewis Carroll’s Through the Looking Glass, the White Queen instructs Alice that “one’s memory works both ways.”

  “I’m sure mine only works one way,” Alice remarked. “I can’t remember things before they happen.”

  “It’s a poor sort of memory that only works backwards,” the Queen remarked.

  “What sort of things do you remember best?” Alice ventured to ask.

  “Oh, things that happened the week after next,” the Queen replied in a careless tone.36

  Both semantic and autobiographical past memories may be lost in individuals who have Alzheimer’s disease and, significantly, such individuals also lose their ability to envision the future. Occasional individuals with other brain abnormalities have been described who retain their semantic memory but lose their autobiographical memory. One such man “still had memories of brain facts,” such as how to make a long-distance telephone call, but he “could not recall a single event from his own life.” When asked about the future, he said his mind was just “blank”—“it’s like being in a room with nothing there and having a guy tell you to go find a chair.” Another man, who suffered brain damage following a heart attack, retained his semantic memory for past public events but “was unable to consciously bring to mind a single thing he had done or experienced before his heart attack.” Thus, “he knew the name of the company where he had worked … but he could not recall a single occasion when he was at work or a single event that occurred there.” Similarly, he cited global warming as a threat to the future but “had severe difficulty imagining what his experiences might be like in the future.” The authors of this study concluded that autobiographical memory “enables a person to mentally travel back in time to relive previously experienced personal events,” which in turn provides “a foundation for imagining what one’s experiences might be like in the future.”37

  Do animals other than humans have an autobiographical memory? Many animals prepare for the future, storing food and migrating, but it is thought that they do so automatically, by instinct. Some researchers have claimed that chimpanzees have an ability to use the past to plan the future, since they have been known to save tools for possible future use. And then there is Santino, a chimpanzee at a Swedish zoo, who is said to collect stones in a pile so he can throw them at visitors when the zoo opens later in the morning. Other researchers have claimed that scrub jays have an autobiographical memory, since they not only store food but also anticipate when other birds are likely to steal their stored food. Most recently, some researchers have claimed that rats have an autobiographical memory based on the activation of their hippocampus when they are running mazes. The debate whether such behaviors represent true autobiographical memory continues, with the majority of researchers calling the evidence equivocal.38

  The acquisition of autobiographical memory by modern Homo sapiens would have provided them with a significant evolutionary advantage over Neandertals and the other remaining species of Archaic Homo sapiens, who apparently did not possess this cognitive skill. It allowed humans to flexibly consider a variety of past events in planning future behaviors. To illustrate, consider the difference between hunters equipped with only semantic memory 75,000 years ago and hunters equipped with both semantic and autobiographical memory 25,000 years ago. A hunter 75,000 years ago might have planned as follows: “I remember that the reindeer came down the valley and crossed the river when the sun went down over the hill. I killed two of them and will hunt again next year.”

  In contrast, a hunter 25,000 years ago might have planned as follows:

  I remember that the reindeer came down the valley and crossed the river at the time when the sun went down next to the large tree on the hill, because that was the same time my sister died giving birth. We only killed twelve reindeer because my brother-in-law’s clan, with which we were hunting, brought along young boys who made too much noise and couldn’t follow orders. So next fall, we will not hunt with them but rather with the clan of my mother’s sister. And we will station women downstream around the bend of the river to drag the dying reindeer out of the water so the men don’t have to take the time to do that but can keep on killing the reindeer. My brother-in-law may be angry with me, but I will give him my fox-teeth pendant, which he has admired in the past, so then we will remain on good terms. If we plan the hunt carefully and give everyone assigned tasks, we should be able to kill thirty or more reindeer. This will give us good food to store for winter.

  This hypothetical scenario illustrating the advantages of an autobiographical memory is consistent with the fact that hunting at this time is said to have “shifted from hunting individual and small groups of animals to slaughtering mass herds of reindeer and red deer … likely to have been attacked at critical points on their annual migration routes when the animals were constrained in narrow valleys, or when crossing rivers.” Modern humans were probably keeping precise records telling them when the animal migration was likely to occur, and they could then envision various scenarios predicting where the animals would be most vulnerable. Thus, during the spring salmon run or annual deer migrations, modern Homo sapiens could use their autobiographical memory to maximize their acquisition of food.39

  There are additional suggestions in the later years of this era, between approximately 18,000 and 11,000 years ago, that such mass killings of animals were carried out by large groups of people in a cooperative manner. Instead of each small group of hunter-gatherers going its own way, they increasingly joined together at preappointed times to hunt cooperatively. As South African archeologist David Lewis-Williams described it: “We also need to note the presence of large Upper Paleolithic settlements.… These settlements were probably aggregation sites. Communities split up into small bands during some seasons of the year and then united at recognized aggregation sites at others.” Some of these aggregation sites have been identified in France and Spain, such as the area around the cave at Altamira, and include structures suggesting that large numbers of people lived in what appear to have been permanent shelters.40

  Such cooperative hunting would also have been facilitated by the combination of autobiographical memory with language. As Australian psychologist Thomas Suddendorf and colleagues noted: “The evolution of language itself is intimately connected with the evolution of mental time travel.… Language allows personal episodes and plans to be shared, enhancing the ability to plan and construct viable futures.” Suddendorf even claimed that “mental time travel was a prime mover in human evolution.”41

  THE EMERGENCE OF RELIGIOUS THOUGHT 1: THE MEANING OF DEATH

  Using an evolutionary perspective, the emergence of religious thought was described by British anthropologist Edward B. Tylor in his book Primitive Culture, published in 1871. Tylor had been strongly influenced by Charles Darwin and his publication On the Origin of Species, published in 1859. As Darwin theorized that modern Homo sapiens had evolved from earlier hominins and primates, so Tylor theorized that “higher” cultures had evolved from “lower” or “primitive” cultures such as those Tylor had studied in Mexico. Tylor and Darwin corresponded, and Tylor cited Darwin’s cultural findings in his book. Tylor believed that “primitive” people had initially developed religious ideas based on their understanding of death and dreams. Such an understanding would have been made possible by the acquisition of an autobiographical memory.42

  Prior to about 40,000 years ago, hominins had been observing other hominins die for more than six million years. They were intimately acquainted with death as something that happened to others. They observed people die within their living group—children from disease, women from childbirth, men from hunting accidents, and older adults from starvation. They also occasionally encountered deceased hominins as they foraged for food or followed herds of deer. Unlike today, when the biological realities of death are relegated to the offices of medical examiners and morticians, early hominins saw corpses in all stages of decomposition, since even the occasional burial of bodies was apparently not practiced until the last 100,000 years.

 
