
Evolving Brains, Emerging Gods

by E. Fuller Torrey


  The simultaneous development of an introspective self and language would also be synergistic from an evolutionary point of view. Each would independently improve genetic fitness, but the people who could both think introspectively and talk about these thoughts would be able to discuss complex behaviors and therefore be more successful in passing on their genes. Early Homo sapiens was thus the first hominin who had a lot to talk about. And, as Steven Mithen noted, “Once Early Humans started talking, they just couldn’t stop.”25

  THE INTROSPECTIVE SELF AND THE GODS

  The acquisition of the introspective self was a defining event in the cognitive development of hominins. As Teilhard de Chardin noted, it is “no longer merely to know, but to know oneself; no longer merely to know, but to know that one knows.”26 At the developmental equivalent of about two years of age, we had acquired the ability to think about ourselves; at about the developmental equivalent of four years of age, we had acquired the ability to think about other people’s thoughts. Now, at the developmental equivalent of six years of age, we had acquired a second-order theory of mind, an ability to think about what another person thinks about us.

  At first glance, this cognitive ability would appear to make it possible for early Homo sapiens to conceive of, indeed to worship, the gods. By acquiring a theory of mind, archaic Homo sapiens acquired the ability to appreciate that gods also had thoughts. Then, by acquiring a second-order theory of mind—introspection—early Homo sapiens acquired an ability to think about the fact that the gods may be thinking about us, and what they may be thinking, and what we think about what the gods are thinking about us. In short, early Homo sapiens had acquired the cognitive ability to enter into a conversation with the gods, just as modern Homo sapiens does today.

  But wait—where did the gods come from 100,000 years ago? Early Homo sapiens certainly had conversations with other early Homo sapiens regarding what they thought of one another, and what they thought about a third early Homo sapiens who had insulted them, and why they were no longer on speaking terms with the other person, ad infinitum, just as happens today. But you cannot have such conversations about the gods or with the gods unless you have gods.

  One theory regarding the origins of gods is the human tendency to anthropomorphize—to attribute human agency to—inanimate things or events. Thus we assume that thunder and lightning, floods and droughts, the rising of the sun and the phases of the moon must all be caused by some superhuman or divine power. This pattern-seeking tendency has given rise to multiple theories of the origins of gods and religions, as will be discussed in chapter 8. Perhaps 100,000 years ago early Homo sapiens listened to the thunder, watched the lightning, and decided that there must be gods living in the sky who were watching them.

  Since early Homo sapiens had acquired an awareness of the thoughts of others as well as the ability to think about what others were thinking, such a scenario for the creation of gods is theoretically possible. For a variety of reasons, however, it seems unlikely. First, why should an explanation for thunder and lightning require the concept of gods or other unseen spirits rather than phenomena that were more familiar to early Homo sapiens, such as large animals that lived in the sky or trees falling in an unseen world? Second, no religious symbols, effigies, or other artifacts that might have had some religious meaning have been found from this period. Much later, in periods when gods are known to have existed, such artifacts became very common. Third, an understanding of the importance of recurrent natural phenomena requires a cognitive ability to fully integrate the past and present into thoughts about the future. As will be described in the next chapter, early Homo sapiens apparently had not yet acquired this ability. Fourth, it may be questioned whether understanding natural phenomena is a sufficient stimulus, by itself, to elicit the creation of gods. When the gods finally emerged, they impelled some believers to build pyramids and cathedrals, spend long periods praying, forgo sexual pleasure by remaining celibate, and lay down their lives in warfare to defend their gods. Pattern-seeking alone would not seem powerful enough to elicit such personal sacrifice. For all of these reasons, it seems doubtful that gods existed among early Homo sapiens 100,000 years ago.

  THE BRAIN OF EARLY HOMO SAPIENS

  In view of the impressive behavior exhibited by early Homo sapiens, one would expect to find equally impressive changes in their developing brain. But since the brain had already reached an average of 1,350 cubic centimeters at least 100,000 years earlier, it could not grow larger, or the heads of newborn babies would no longer fit through the bony outlet of a woman’s birth canal. The brain changes leading to early Homo sapiens, therefore, involved not an increase in brain size but rather internal changes. This is an answer to the problem posed by linguist Derek Bickerton: “Any adequate account of how our species came into existence has to explain how it was that the brain grew to at least its present size without changing the hominid [hominin] way of life in any significant manner, and then, without further increase, made possible the stunning explosion of creativity that characterized our species.”27

  The brain areas associated with an introspective self have been well studied in recent years using neuroimaging techniques. Typically in such studies, “subjects are presented with trait adjectives or sentences and are asked whether the trait or sentence applies to them” while their brains are being scanned by a PET or functional MRI machine. A meta-analysis of 20 such studies, done between 1999 and 2009, identified four major brain clusters that are activated by introspective thinking, as seen in figure 4.1.28

  One cluster is the anterior cingulate (BA 24, 32) and insula, areas that are also activated by an awareness of self and by an awareness of others’ thinking. Thus, it would be surprising if these areas were not also activated by introspective thinking. A second brain cluster activated by introspective thinking includes parts of the prefrontal cortex, including the frontal pole (BA 10), the lateral prefrontal cortex (BA 9, 46), and the orbitofrontal region (BA 47). This is consistent with observations claiming that “self-awareness, consciousness, or self-reflectedness” is “the highest psychological attribute of the frontal lobe.” Similarly, a review of studies of “social cognition,” defined as including “self-reflection, person perception, and making inferences about others’ thoughts,” concluded that the broadly defined medial prefrontal area has “a unique role” in social cognition.29

  FIGURE 4.1  Early Homo sapiens: an introspective self.

  A third brain area activated by introspective thinking is the posterior cingulate (BA 23), situated in the midline behind the anterior cingulate. It is strongly connected to many parts of the prefrontal cortex as well as to the temporo-parietal junction. The posterior cingulate is said to have become activated “when subjects had to indicate whether a word or statement was self-descriptive or not.” The final brain area activated by introspective thinking is the most anterior portion of the temporal lobe, called the temporal pole (BA 38). This is an older brain area that is not well understood but is known to play some role in thinking about other people’s thinking. It has been shown in neuroimaging studies to be especially activated by “tasks that require one to analyse other agent’s emotions, intentions or belief.”30

  At the same time that the brain areas associated with introspective thinking were developing, it seems likely that the white matter connecting tracts were also continuing to develop, especially the superior longitudinal, uncinate, and arcuate fasciculi. As described in the previous chapters, these tracts connect the anterior cingulate, insula, and prefrontal cortex anteriorly with the parietal and temporal regions posteriorly. One can easily imagine these areas becoming progressively interconnected as the white matter tracts matured, initially making possible first-order theory of mind thinking, then second-order theory of mind thinking, then increasingly complex theory of mind assessments, and finally the ability of individuals to think about themselves thinking about themselves thinking about themselves ad infinitum.

  Indeed, one of the great mysteries in human evolution is how human behavior changed so dramatically over the past 100,000 years, a brief moment in time compared to the entire span of human evolution. It seems likely that a major answer to this question will be found in the development of the white matter tracts and the cognitive skills and behaviors that became possible as the disparate brain areas became increasingly interconnected.

  Thus, by 60,000 years ago, when early Homo sapiens are thought to have left Africa, they had apparently acquired intelligence, an awareness of self, an awareness of others’ thinking, and, most remarkably, the ability to think about themselves thinking about themselves. They had acquired the cognitive skills they would need to displace all other existing hominins and to become the lords of the earth. But they had apparently not yet acquired one additional skill they would need to become truly modern Homo sapiens. That would come next.

  5

  MODERN HOMO SAPIENS

  A Temporal Self

  Time present and time past

  Are both perhaps present in time future,

  And time future contained in time past.

  —T. S. Eliot, Four Quartets, 1952

  The Homo sapiens who exited Africa about 60,000 years ago, perhaps earlier, were remarkable creatures—intelligent, self-aware, empathic, and introspective. In the following few thousand years, they spread widely, from Australia and Papua New Guinea in the east to Europe in the west, interbreeding with, and ultimately displacing, the older hominin species as they went.

  But as remarkable as these people were, you would not want your son or daughter to marry one of them. Although impressive as hominins, they were still missing the one critical cognitive skill that would be needed to become truly modern Homo sapiens and to worship the gods. It seems likely that the fundamental brain changes needed for the development of this skill were well underway at the time these Homo sapiens left Africa. Thus, this cognitive skill continued to evolve over the next several thousand years, no matter where they ultimately settled, another apparent example of parallel evolution.

  Beginning approximately 40,000 years ago, several new behaviors that we associate with modern Homo sapiens appeared. A date such as 40,000 years ago is, of course, merely a milestone. Evolution is a continuous process, and hominin brains were undergoing constant change. It should also be remembered that the dating of events in the distant past, even within the past 40,000 years, is subject to at least a 10 percent error; thus, something dated to 37,000 years ago may actually have taken place earlier than something dated to 40,000 years ago, since the two error ranges (roughly 33,300 to 40,700 years ago and 36,000 to 44,000 years ago) overlap.

  Nevertheless, there is clear evidence that after Homo sapiens left Africa, the behavior of both those who left and those who remained underwent significant changes. For example, improvements were seen in the development of tools and weapons. Beginning about 49,000 years ago, there was said to be “a gradual abandonment of the technology and tool types” that had been used by early Homo sapiens and their predecessors. Although bone tools had occasionally been used by early Homo sapiens in South Africa as far back as 75,000 to 100,000 years ago, a “bone industry” emerged at this time. For the first time, animal bones, reindeer antlers, and mammoth ivory tusks became widely used as raw material for making tools and weapons. Bone, antlers, and tusks had, of course, been available for hundreds of thousands of years, but they had never previously been widely exploited as raw material. According to University of Alaska biologist Dale Guthrie, “Bone, antler and ivory are composites … [making them] harder and more durable than wood yet lighter and less subject to breakage than stone.” What followed was the emergence of a much broader variety of tools and weapons, including “spear points, chisels, wedges, spatulas, awls, drills, needles, perforated antlers used as shaft straighteners, and, later on, harpoons and spear propellers.”1

  Needles, for example, permitted the sewing of clothing; evidence from caves in the Republic of Georgia indicates that this was occurring approximately 35,000 years ago. Wool fibers from goat hair and plant fibers from flax, some of them dyed, were apparently used to make ropes, nets, and baskets as well as clothing. The sewn clothing would have been especially useful during the Ice Age that was to follow. As Brian Fagan noted: “For the first time, women could fashion garments tailored to size for infants, children, and growing youths, as well as for adults and old people. They could also sew clothes of all kinds.”2

  Another tool that came into use at this time was the lamp. The controlled use of fire had been known for at least 500,000 years, but the use of lamps was new. Many lamps have been found in caves in France and Spain where drawings and paintings were created, as will be discussed later. Most lamps were stones with a natural depression that could be filled with tallow; lichen, moss, conifers, and juniper were used for the wick. A few of the lamps that have been found have engravings on them, and one even has a handle.3

  In addition to the tools, new and improved weapons were developed beginning about 40,000 years ago. Spear throwers, also called atlatls, appeared about 30,000 years ago and could propel a spear faster, farther, and more accurately. They were especially effective for hunting dangerous animals, since a hunter using an atlatl could throw the spear from a safer distance.

  The bow and arrow, which had been used occasionally in earlier millennia, came into widespread use by at least 20,000 years ago; it also allowed a hunter to remain at a safer distance from the prey and enabled humans to kill birds in flight. Given weapons like the atlatl and bow and arrow, modern Homo sapiens began to hunt animals that had previously been inaccessible, such as the dangerous Cape buffalo in South Africa and the elusive ibex in Spain. The use of fish hooks and netting also enabled Homo sapiens to fish in deeper waters. In Indonesia, for example, the bones of tuna and shark dated to more than 40,000 years ago provide evidence for pelagic fishing at that time.4

  It was not merely the use of new tools and weapons that was impressive but also the speed with which they were introduced and improved. This is said to have demonstrated “an extraordinary increase in the human capacity to create and invent.” In The Prehistory of the Mind, Steven Mithen likens this period to “a Paleolithic arms race”:

  It is not simply the introduction of new tools at the start of the Upper Palaeolithic [45,000–11,000 years ago] which is important. It is how these were then constantly modified and changed. Through the Upper Palaeolithic we can see the processes of innovation and experimentation at work, resulting in a constant stream of new hunting weapons appropriate to the prevailing environmental conditions and building on the knowledge of previous generations.

  The speed of such technological innovation and experimentation stands in sharp contrast to the preceding hundreds of thousands of years in which change was glacially slow, when it occurred at all.5

  Other items that became more widespread beginning about 40,000 years ago are what appear to be memory devices. These include pieces of bone that have been engraved with a series of lines or dots, similar to the engraved ochre dating to more than 90,000 years ago found in South Africa and described in the last chapter. Alexander Marshack, a self-taught archeologist at Harvard’s Peabody Museum, carried out extensive studies of these engraved bones, examining the incisions with a microscope and speculating on how they had been used. The best-known example, found in France, dates to approximately 30,000 years ago and was originally interpreted by Marshack as representing the phases of the moon during two lunar cycles. Marshack claimed the engraver of the bone “not only had an image of the waxing and waning of the moon, but he had also created an abstracted image of the continuity and periodicity of time itself.” Sir John Eccles put a picture of this bone on the cover of his book Evolution of the Brain: Creation of the Self. Marshack referred to the bones as “notational devices,” and in his book The Roots of Civilization, published in 1972, suggested that they represent “the evolved, time-factored, and time-factoring human capacity, the cognitive ability to think sequentially in terms of process within time and space.” In more recent years, critics have charged Marshack with overinterpreting the meaning of the incised bones, even as most agree that the bones represent some sort of “external memory devices,” although not necessarily for recording lunar cycles.6

  In addition to using new tools, weapons, and memory devices, beginning about 40,000 years ago modern Homo sapiens exhibited increasingly varied and sophisticated forms of self-ornamentation. Randall White, an archeologist at New York University, described it as an “explosion of items of bodily adornment at the beginning of the Upper Paleolithic.… The emergence of technology for the manufacture of body ornaments in Europe is sudden and the technology itself is complex and full-blown from the beginning.” Whereas the necklaces and bracelets found earlier in South Africa had been made with only seashells, the newer forms of self-ornamentation also used animal teeth, animal bones, antlers, ivory tusks, snail shells, bird claws, ostrich eggshells, and colorful stones to make rings, pins, and pendants in addition to necklaces and bracelets. One necklace, for example, was made from “nearly 150 perforated antler, bone and stone beads and 5 pendants, some of them incised and decorated.” Some sites in the Dordogne region of France are said to have been “factories” for making beads and pendants, where “the manufacture of items of personal adornment became an industry in itself.”7

  Examples of self-ornamentation during this period are geographically widespread in Europe and the Middle East, having been found in France, Spain, the Czech Republic, Bulgaria, Lebanon, and Turkey. In Turkey, necklaces made of snail shells and bird claws have been dated to 43,000 years ago. Similar examples of self-ornamentation have been found in Africa, including in Morocco, Algeria, Kenya, Tanzania, and South Africa. The use of ostrich eggshells was especially common in Africa. Everywhere there appear to have been distinct regional styles, depending on the availability of materials, with the most unusual materials, such as luminous white or brightly colored shells, being most highly valued.8

 
