A Troublesome Inheritance: Genes, Race and Human History
Social behavior changes because, over a period of generations, genes and culture interact. “The genes hold culture on a leash,” Wilson writes. “The leash is very long but inevitably values will be constrained in accordance with their effects on the human gene pool.” 19 Harmful cultural practices may lead to extinction, but advantageous ones create selective pressures that can promote specific genetic variants. If a cultural practice provides a significant survival advantage, genes that enable a person to engage in that practice will become more common.
This interaction between the genome and society, known as gene-culture evolution, has probably been a powerful force in shaping human societies. At present it has been documented for only minor dietary changes, but these establish the principle. The leading example is that of lactose tolerance, the ability to digest milk in adulthood by means of the enzyme lactase, which breaks down lactose, the principal sugar in milk.
Figure 3.1. Distribution of lactose tolerance in present-day Europe (dark gray = 100%). Dotted area shows homeland of Funnel Beaker Culture, which flourished 6,000 to 5,000 years ago.
SOURCE: ALBANO BEJA-PEREIRA, NATURE GENETICS 35 (2003), PP. 311–15
In most human populations, the lactase gene is permanently switched off after weaning so as to save the energy required to make the lactase enzyme. Lactose, the sugar metabolized by the lactase enzyme, occurs only in milk, so that when a person has finished breast-feeding, lactase will never be needed again. But in populations that learned to herd cattle and drink raw cow’s milk, notably the Funnel Beaker Culture that flourished in north central Europe from 6,000 to 5,000 years ago, there was a great selective advantage in keeping the lactase gene switched on. Almost all Dutch and Swedish people today are lactose tolerant, meaning they carry the mutation that keeps the lactase gene permanently switched on. The mutation is progressively less common in Europe with increasing distance from the core region of the ancient Funnel Beaker Culture.
Three different mutations that have the same result have been detected in pastoral peoples of eastern Africa. Natural selection has to work on whatever mutations are available in a population, and evidently different mutations were available in the European and various African peoples who took up cattle raising and drinking raw milk. The mutations that prolong lactase production conferred an enormous advantage on their carriers, letting them leave ten times more surviving children than those without the mutation.20
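To see how a reproductive advantage of this kind spreads a variant through a population, here is a minimal sketch in Python of the textbook one-locus selection recursion for a dominant advantageous allele. It is not taken from the studies cited here; the selection coefficient and starting frequency are deliberately modest, arbitrary illustrative values.

```python
# A minimal sketch (illustrative, not from the book) of how a reproductive advantage
# spreads an allele. Standard one-locus recursion with a dominant advantageous allele A;
# the values of s and the starting frequency are hypothetical assumptions.

def next_frequency(p: float, s: float) -> float:
    """Allele frequency of A after one generation of selection favoring carriers of A."""
    w_AA = w_Aa = 1.0 + s          # carriers of at least one A copy gain the advantage
    w_aa = 1.0                     # non-carriers
    mean_w = p * p * w_AA + 2 * p * (1 - p) * w_Aa + (1 - p) ** 2 * w_aa
    return (p * p * w_AA + p * (1 - p) * w_Aa) / mean_w

p = 0.001                          # hypothetical starting frequency of the new mutation
s = 0.05                           # hypothetical 5 percent per-generation advantage
for gen in range(301):
    if gen % 50 == 0:
        print(f"generation {gen:3d}: frequency {p:.3f}")
    p = next_frequency(p, s)
```

Even with this deliberately modest 5 percent advantage, the allele climbs from one copy in a thousand to majority frequency in under two hundred generations, and by three hundred generations nearly everyone carries at least one copy; a far larger advantage, of the kind described above, compresses the same rise into far fewer generations.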
Lactose tolerance is a fascinating example of how a human cultural practice, in this case cattle raising and drinking raw milk, can feed back into the human genome. The genes that underlie social behavior have for the most part not yet been identified, but it’s a reasonable assumption that they too would have changed in response to new social institutions. In larger societies requiring a higher degree of trust, people who trusted only their close kin would have been at a disadvantage. People who were more trusting would have had more surviving children, and any genetic variation that promoted this behavior would become more common in each successive generation.
The Shaping of Human Social Behavior
Changes in the social behaviors that underlie a society’s institutions take many generations to accomplish. It may have been in hunting or scavenging that early humans first faced strong selective pressure to cooperate. Hunting is much more efficient when done as a group; indeed that’s the only way that large game can be taken down, butchered and guarded from rivals. Hunting may have induced the shared intentionality that is characteristic of humans; groups that failed to cooperate closely did not survive. Along with cooperativeness emerged the rules for sharing meat in an equitable way and the gossip machinery that punished bragging and stinginess.
A hunter-gatherer society consists of small, egalitarian bands without leaders or headmen. This was the standard human social structure until 15,000 years ago. That it took 185,000 years for people to take the seemingly obvious step of settling down and putting a permanent roof over their heads strongly suggests that several genetic changes in social behavior had to evolve first. The aggressive and independent nature of hunter-gatherers, accustomed to trusting only their close kin, had to yield to a more sociable temperament and the ability to interact peaceably with larger numbers of people. A foraging society that turns to agriculture must develop a whole new set of institutions to coordinate people in the unaccustomed labor of sowing and harvesting crops.
In this novel environment, people skilled in farming and in operating in larger communities prospered and left more children; those whose only skill was in hunting did less well and placed fewer of their children and genes in the next generation. In time, the nature of the society and its members changed as its institutions were transformed to serve the new way of life.
After the first settlements, a wave of new societies then came into being in response to population pressure and new ways of gathering food. The anthropologist Hillard Kaplan and colleagues have worked out the dynamics of several of these adaptations.21
One reason why hunter-gatherer societies are egalitarian is that their usual food sources—game animals, tubers, fruits and nuts—tend to be dispersed and are not easily monopolized. In tribal horticulture, as practiced in New Guinea and parts of South America, people live in settled villages with gardens that must be planted and defended. This mode of life requires more structure than a hunter-gatherer band. People accept the governance of a headman to organize defense and conduct diplomatic relations with neighboring groups.
Tribal pastoralism creates an even greater demand for military leadership because the tribe’s chief resource, herds of cattle or sheep, can easily be captured and driven off. Competition for grassland is another source of friction. Pastoralists have developed the necessary institutions for frequent warfare, which often include the social segregation of young warrior classes and expansionary male lineages.
The rise of the first city-states, founded on large-scale agriculture, required a new kind of social structure, one built around large, hierarchically organized populations ruled by military leaders. The states overlaid their own institutions on those of the tribe. They used religion to legitimate the ruler’s power and maintain a monopoly of force.
The common theme of all these developments is that when circumstances change, when a new resource can be exploited or a new enemy appears on the border, a society will change its institutions in response. Thus it’s easy to see the dynamics of how human social change takes place and why such a variety of human social structures exists. As soon as the mode of subsistence changes, a society will develop new institutions to exploit its environment more effectively. The individuals whose social behavior is better attuned to such institutions will prosper and leave more children, and the genetic variations that underlie such a behavior will become more common. If the pace of warfare increases, a special set of institutions will emerge so as to increase the society’s military preparedness. These new institutions will feed back into the genome over the course of generations, as those with the social behaviors that are successful in a militaristic society leave more surviving children.
This process of continuous adaptation has taken a different course in each region of the world because each region differed in its environment and exploitable resources. As population increased, coordinating the activities of larger numbers of people required more complex social structures. Tribes merged into archaic states, states became empires, and empires rose and fell, leaving behind the large-scale structures known as civilizations.
The process of organizing people in larger and larger social structures, with accompanying changes in social behavior, has most probably been molded by evolution, though the underlying genetic changes have yet to be identified. This social evolution has proceeded roughly in parallel in the world’s principal populations or races, those of Africans, East Asians and Caucasians. (Caucasian includes Europeans, the peoples of the Indian subcontinent and Middle Easterners.) The same process is visible in a fourth race, the natives of North and South America. Because the Americas were populated much later than Africa and Eurasia—the first settlers crossed the Bering Strait from Siberia only 15,000 years ago—social evolution got a much later start, and the great empires of the Incas and Mayans emerged several thousand years later than their counterparts in Eurasia. In a fifth race, the peoples of Australia and Papua New Guinea, population numbers were always too low to ignite the processes of settlement and state building.
How Evolution Creates Different Societies
People are entirely different from ants, yet there is something to be learned from the creatures that occupy the other pinnacle of social evolution in nature. An ant is an ant is an ant, yet natural selection has crafted a profusion of widely different ant societies, each adapted to its own ecological niche. Leaf-cutter ants are superb agriculturalists, tending underground gardens of a mushroomlike fungus which they protect with special antibiotics. There are ants that live in the hollow thorns provided for them by acacia trees. Some ants specialize in preying on termite nests. Weaver ants sew leaves together to construct shelters for their colonies. Army ants kill every living thing that cannot escape from their intense raiding parties.
In the case of ants, evolution has generated their many different kinds of society by keeping the ant body much the same and altering principally the behavior of each society’s members. People too live in many different types of society, and evolution seems to have constructed these with the same strategy—keep the human body much the same but change the social behavior.
A principal difference is that people, with their far greater intelligence, construct societies full of complex interactions in which an individual with stereotyped behavior like an ant’s would be at a severe disadvantage. Learned behavior, or culture, plays a dominant role in human societies, shaped by a small, though critical, set of genetically influenced social behaviors. In ant societies, by contrast, social behavior is dominated by the genes and the genetically prescribed pheromones that govern the major activities of an ant society.
In human societies, individuals’ behavior is therefore flexible and generalist, with much of a society’s specificity being embedded in its culture. Human societies are not nearly as diverse as those of ants because evolution has had a mere 50,000 years in which to shape modern human populations, compared with the 100 million years of ant evolution.
Another major difference is that among people, individuals can generally move easily from one society to another. Ants will kill ants from other species or even a neighboring colony of the same species. Apart from slavery—some species of ant will enslave other species—ant societies are immiscible. The institutions of ant societies are shaped almost entirely by genetics and little, if at all, by culture. There is no way that army ants can be trained to stop raiding and turn to peaceful horticulture like leaf-cutter ants. With human societies, institutions are largely cultural and based on a much smaller genetic component.
In the case of both ants and people, societies evolve over time as natural selection modifies the social behavior of their members. With ants, evolution has had time to generate thousands of different species, each with a society adapted to survival in its particular environment. With people, who have only recently dispersed from their ancestral homeland, evolution has so far generated only races within a single species, but with several major forms of society, each a response to different environments and historical circumstances. New evidence from the human genome now makes it possible for the first time to examine this differentiation of the human population at the genetic level.
4
THE HUMAN EXPERIMENT
There is, however, no doubt that the various races, when carefully compared and measured, differ much from each other. . . . The races differ also in constitution, in acclimatisation and in liability to certain diseases. Their mental characteristics are likewise very distinct; chiefly as it would appear in their emotional, but partly in their intellectual faculties.
—CHARLES DARWIN1
Through independent but largely parallel evolution among the populations of each continent, the human species has differentiated into races. This evolutionary process is hard to explore, however, when the question of race is placed under taboo or its existence is denied outright.
Many scholars like to make safe nods to multicultural orthodoxy by implying that human races do not exist. Race? Debunking a Scientific Myth is the title of a recent book by a physical anthropologist and a geneticist, though their text is not nearly so specific.2 “The concept of race has no genetic or scientific basis,” writes Craig Venter, who was the leading decoder of the human genome but has no known expertise in the relevant discipline of population genetics.3
Only people capable of thinking the Earth is flat believe in the existence of human races, according to the geographer Jared Diamond. “The reality of human races is another commonsense ‘truth’ destined to follow the flat Earth into oblivion,” he asserts.4 For a subtler position, consider the following statement, which seems to say the same thing. “It is increasingly clear that there is no scientific basis for defining precise ethnic or racial boundaries,” writes Francis Collins, director of the National Human Genome Research Institute, in a review of the project’s implications.5 This form of words, commonly used by biologists to imply that they accept the orthodox political take on the nonexistence of race, means rather less than meets the eye. When a distinct boundary develops between races, they are no longer races but separate species. So to say there are no precise boundaries between races is like saying there are no square circles.
A few biologists have begun to agree that there are human races, but they hasten to add that the fact means very little. Races exist, but the implications are “not much,” says the evolutionary biologist Jerry Coyne.6 Too bad—nature has performed this grand 50,000-year experiment, generating scores of fascinating variations on the human theme, only to have evolutionary biologists express disappointment at her efforts.
From biologists’ obfuscations on the subject of race, sociologists have incorrectly inferred that there is no biological basis for race, confirming their preference for regarding race as just a social construct. How did the academic world contrive to reach a position on race so far removed from reality and commonsense observation?
The politically driven distortion of scientific views about race can be traced to a sustained campaign from the 1950s onward by the anthropologist Ashley Montagu, who sought to make the word race taboo, at least when referring to people. Montagu, who was Jewish, grew up in the East End district of London, where he experienced considerable anti-Semitism. He was trained as a social anthropologist in London and New York, where he studied under Franz Boas, a champion of racial equality and the belief that culture alone shapes human behavior. He began to promote Boas’s ideas with more zeal than their author did. Montagu developed passionate views on the evils of race. “Race is the witchcraft, the demonology of our time, the means by which we exorcise the imagined demoniacal powers among us,” he wrote. “It is the contemporary myth, humankind’s most dangerous myth, America’s Original Sin.” 7
In the postwar years, with the horror of the Holocaust weighing on people’s minds, Montagu found ready acceptance of his views. These were prominent in the influential UNESCO statement on race, first issued in 1950, which he helped draft. He believed that imperialism, racism and anti-Semitism were driven by notions of race and could be undermined by showing that races did not exist. However much one may sympathize with Montagu’s motives, it is perhaps simplistic to believe that an evil can be eliminated by banning the words that conceptualize it. But suppression of the word was Montagu’s goal, and to a remarkable extent he succeeded.
“The very word race is itself racist,” he wrote in his book Man’s Most Dangerous Myth: The Fallacy of Race.8 Many scholars who understood human races very well began to drop the use of the term rather than risk being ostracized as racists. In a survey taken in 1987, only 50% of physical anthropologists (researchers who deal with human bones) agreed that human races exist, and among social anthropologists (who deal with people) just 29% did so.
The physical anthropologists best acquainted with race are those who do forensics. Human skulls fall into three distinctive shapes, which reflect their owners’ degree of ancestry in the three main races, Caucasian, East Asian and African. African skulls have rounder nose and eye cavities, and jaws that protrude forward, whereas Caucasians and East Asians have flatter faces. Caucasian skulls are longer, have larger chinbones and tear-shaped nose openings. East Asian skulls tend to be short and broad with wide cheekbones. There are many other features characteristic of the three skull types. As is often the case, there is no single feature that suffices to assign a skull to a particular racial type; rather, each feature is more common in one race than the others, allowing a combination of such features to be diagnostic.
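The reasoning here is that of combining many weak clues. As a rough illustration, the following sketch, which is not drawn from the book or from forensic practice and uses entirely hypothetical groups, features and probabilities, combines several individually inconclusive binary features in a naive-Bayes fashion to produce a confident overall assignment.

```python
# A toy sketch, not taken from forensic practice, showing how several features that are
# individually weak indicators can combine into a much more confident call.
# The groups, features and probabilities below are all hypothetical illustrations.

from math import prod

# P(feature is present | group) for three hypothetical groups and four features.
FEATURE_RATES = {
    "group_1": [0.75, 0.70, 0.65, 0.80],
    "group_2": [0.30, 0.55, 0.40, 0.35],
    "group_3": [0.45, 0.30, 0.70, 0.40],
}

def classify(observed):
    """Combine per-feature likelihoods (naive Bayes with equal priors) and normalize."""
    likelihoods = {
        group: prod(rate if seen else 1.0 - rate for rate, seen in zip(rates, observed))
        for group, rates in FEATURE_RATES.items()
    }
    total = sum(likelihoods.values())
    return {group: value / total for group, value in likelihoods.items()}

# All four features present: no single feature is decisive, but together they point
# strongly to group_1.
print(classify([1, 1, 1, 1]))
```

With these made-up numbers, no single feature on its own would raise the probability of group_1 much above one half, but the four taken together push it past 80 percent, which is the general logic behind a combination of individually inconclusive measurements yielding a better-than-80-percent call.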
By taking just a few measurements, physical anthropologists can tell police departments the race of a skull’s former owner with better than 80% accuracy. This ability has occasioned some anguish among those persuaded by Montagu that human races shouldn’t be acknowledged. How could they identify a skull’s race so accurately if race doesn’t exist? “That forensic anthropologists place our field’s stamp of approval on the traditional and unscientific concept of race each time we make such a judgement is a problem for which I see no easy solution,” wrote one physical anthropologist. His suggestion was to obfuscate, by retaining the concept but substituting a euphemism for the word race, such as ancestry.9 This advice has been followed by a wide range of researchers who, while retaining the necessary concept of race, refer to it in print with bland periphrases like “population structure” or “population stratification.” As for the actual DNA elements now used by biologists to assign people to their race, or races if of mixed parentage, these are known discreetly as AIMs, or ancestry informative markers.