The 10,000 Year Explosion


by Gregory Cochran and Henry Harpending


  Agriculture comprised what was surely the most important set of innovations since the expansion of modern humans out of Africa, resulting in changes in human diet, disease exposure, and social structure. Another consequence (one of great evolutionary significance) was a huge population boom. Human numbers had already been on the increase since the advent of behavioral modernity, partly as the result of migration into the far northern regions of Asia, over the sea into Australia, and across a land bridge into the Americas—all places that archaic humans had been unable to settle—and partly because of improvements in food production technology (such as nets and bows). An educated guess puts the total population of the world 100,000 years ago at half a million, counting both anatomically modern humans in Africa and archaic humans (Neanderthals and evolved erectus) in Eurasia. By the end of the Ice Age some 12,000 years ago, there may have been as many as 6 million modern humans—still hunter-gatherers, but far more sophisticated and effective hunter-gatherers than ever before.

  Farming, which produces 10 to 100 times more calories per acre than foraging, carried this trend further. Over the period from 10,000 BC to AD 1, the world population increased approximately a hundredfold (estimates range from 40 to 170 times). That growth in itself transformed society—sometimes, quantity has a quality all its own. And as we have pointed out, this larger population was itself an important factor in evolution.
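
  That hundredfold increase sounds explosive, but the implied average growth rate is tiny. A back-of-the-envelope sketch (the 10,000-year span and the 40- to 170-fold range come from the text; the code itself is just our illustration):

```python
# Average annual growth rate r for a population that multiplies by
# `factor` over `years` years: factor = (1 + r) ** years.
def annual_growth_rate(factor, years):
    return factor ** (1.0 / years) - 1.0

years = 10_000  # roughly 10,000 BC to AD 1
for factor in (40, 100, 170):  # low, central, and high estimates
    r = annual_growth_rate(factor, years)
    print(f"{factor:>3}-fold increase -> {r * 100:.3f}% per year")

# Even the high estimate implies only about 0.05 percent growth per
# year: negligible by modern standards, transformative over millennia.
```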

  The advent of agriculture changed life in many ways, not all of them obvious. It vastly increased food production, but the nutritional quality of the food was worse than it had been among hunter-gatherers. It did not materially increase the average standard of living for long, since population growth easily caught up with improvements in food production. Moreover, higher population density, permanent settlements, and close association with domesticated animals greatly increased the prevalence of infectious disease.

  The sedentary lifestyle of farming allowed a vast elaboration of material culture. Food, shelter, and artifacts no longer had to be portable. Births could be spaced closer together, since mothers didn't have to continually carry small children. Food was now storable, unlike the typical products of foraging, and storable food could be stolen. For the first time, humans could begin to accumulate wealth. This allowed for nonproductive elites, which had been impossible among hunter-gatherers. We emphasize that these elites were not formed in response to some societal need: They took over because they could.

  Combined with sedentism, these developments eventually led to the birth of governments, which limited local violence. Presumably, governments did this because it let them extract more resources from their subjects, the same reason that farmers castrate bulls. Since societies were generally Malthusian, with population growth limited by decreasing agricultural production per person at higher human density, limits on interpersonal violence ultimately led to a situation in which a higher fraction of the population died of infectious disease or starvation.

  All these changes generated new selective pressures, which is another way of saying that humans didn't fit the new environment they had created for themselves, so the species was under pressure to adapt. Because of the newness of the environment, genetic improvements were relatively easy to find—definitely easier at this point than finding ways to become better hunter-gatherers. Modern humans had been adapting to their hunting-gathering lifestyle for a very long time and had already exhausted most such possibilities. Adaptation to the farming life was doable, but as always, it would require concrete genetic changes.

  GENETIC RESPONSE

  When agriculture was new, natural selection must have operated with the genetic variation that already existed, just as it does in small-scale artificial-selection experiments. Such experiments cause changes in the frequency of existing alleles.

  Most preexisting genetic variation must have taken the form of a few neutral variants of each gene—variants that are not significantly different from each other. They may well do something, but the neutral alleles all do the same thing. We doubt if many of those neutral genes turned out to be the solution for the problems faced by the future farmers of Eurasia. More likely, preexisting functional variation mattered more. For example, there is a gene whose ancestral form helps people to conserve salt. Since humans spent most of their history in hot climates, this variant was generally useful. A high frequency of this ancestral allele among African Americans probably plays a role in their increased risk of high blood pressure today. In tropical Africa, in fact, almost everyone has the ancestral version of the gene. In Eurasia, a null variant (one that does nothing at all) becomes more and more common as one moves north.4 Perhaps the gene's action of promoting salt conservation becomes harmful—by causing higher blood pressure—in cooler areas, where people sweat less and lose less salt.

  Significantly, the null allele is the same in both Europe and eastern Asia—which suggests that it originated in Africa and is ancient. If it had separate European and Asian origins, then we would expect to see different versions in the two regions, just as different broken pigment genes lead to light skin in the two regions.

  The most reasonable explanation for this dud salt-conservation gene is that parts of Africa (before the expansion out of Africa) were cool enough that salt retention was not a major concern, so that in these regions an inactive form of the gene was in fact advantageous. This might have happened in Ethiopia during glacial periods, considering that the climate on the Ethiopian plateau is moderate even today. If so, the null allele would represent preexisting adaptive variation driven by environmental differences within Africa, rather than neutral variation. Such internal variation must have helped prepare humans for environments outside Africa.

  Another kind of preexisting genetic variation would have consisted of balanced polymorphisms. A balanced polymorphism exists when a population stably maintains two different alleles of a gene. One way this can happen is heterozygote advantage: Individuals carrying one copy of each allele are fitter than either kind of homozygote, as in sickle cell and other malaria defenses. Another is frequency-dependent selection: Some alleles have positive effects when rare, but their advantages shrink as they become common, eventually turning negative. Some of the most interesting examples involve behavior and lend themselves to a game-theory analysis.
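
  The stable balance under heterozygote advantage follows from a standard textbook formula, sketched below. The fitness costs are illustrative stand-ins, roughly in the range often quoted for sickle cell in high-malaria regions, not measured values:

```python
# Heterozygote advantage with genotype fitnesses AA = 1 - s, Aa = 1,
# and aa = 1 - t. The balanced equilibrium frequency of the A allele
# is p* = t / (s + t): selection pushes the frequency back toward p*
# from either side, which is what keeps the polymorphism stable.
def equilibrium_frequency(s, t):
    return t / (s + t)

s = 1.0   # assumed cost to sickle-cell homozygotes (nearly lethal)
t = 0.15  # assumed malaria cost to non-carriers
print(f"equilibrium sickle-allele frequency: {equilibrium_frequency(s, t):.2f}")
# ~0.13, in the neighborhood of observed frequencies in high-malaria regions
```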

  The best-known model is the hawk-dove game, where some individuals are genetically aggressive while others are genetically peaceful. When hawks are rare, they easily defeat doves and have higher fitness. As they become more common, however, they run into other hawks more often and have costly fights that decrease their fitness. At some frequency, the fitness of hawks and doves is the same, leading to a balanced polymorphism.5
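
  The balance point in the standard version of this game can be computed directly. A minimal sketch; the payoff values V (the contested resource) and C (the cost of injury in an escalated fight) are arbitrary choices for illustration:

```python
# Hawk-dove game. Expected payoffs against a population with a
# fraction p of hawks:
#   hawk meets hawk: (V - C) / 2    hawk meets dove: V
#   dove meets hawk: 0              dove meets dove: V / 2
def hawk_fitness(p, V, C):
    return p * (V - C) / 2 + (1 - p) * V

def dove_fitness(p, V, C):
    return (1 - p) * V / 2

V, C = 2.0, 4.0  # illustrative payoffs; C > V, so neither pure type wins out
p_star = V / C   # the mixed equilibrium: hawks settle at frequency V / C
print(f"equilibrium hawk frequency: {p_star:.2f}")
print(f"hawk fitness at balance: {hawk_fitness(p_star, V, C):.3f}")
print(f"dove fitness at balance: {dove_fitness(p_star, V, C):.3f}")
# The two fitnesses match at p*, so neither type gains by becoming
# more common: a balanced polymorphism.
```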

  Balanced behavioral polymorphisms could respond quickly to new selective pressures. If the original mix was 50 percent doves and 50 percent hawks, an environmental change that raised the costs of aggressive behavior would lead to a shift in frequency—say to 70 percent doves and 30 percent hawks. This kind of evolutionary change is very rapid, especially when compared to new sweeping genes, which are rare in the beginning and take thousands of years to reach frequencies of 20 percent or more. If the doves acquired a selective advantage of 5 percent, that change (from 50 percent to 70 percent) could occur in fewer than twenty generations.
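
  The speed of such a shift is easy to check under the simplest model of selection, in which the favored type's odds grow by a factor of (1 + s) every generation. Treating "dove" as a simple heritable type (an obvious simplification), a sketch:

```python
# Frequency change under constant genic selection: each generation the
# favored type is reweighted by its relative fitness (1 + s).
def generations_to_reach(p0, target, s):
    p, gens = p0, 0
    while p < target:
        p = p * (1 + s) / (p * (1 + s) + (1 - p))
        gens += 1
    return gens

# Doves start at 50 percent with a 5 percent advantage, as in the text.
print(generations_to_reach(0.5, 0.7, 0.05))  # -> 18 generations
```

  Eighteen generations is perhaps 450 years, an eyeblink compared to the thousands of years a new sweeping gene needs just to become noticeable.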

  Human genetic variation was limited in the days before agriculture, in part because populations were small, and it was often not useful, since many of the changes that were favored among agriculturalists would actually have been deleterious among their hunter-gatherer ancestors. This means that some of the alleles with the right effects in farmers would have been extremely rare or nonexistent in their hunter-gatherer ancestors. For example, variants of G6PD (for glucose-6-phosphate dehydrogenase) with reduced function protect against falciparum malaria but also have negative effects, especially in men. Today, those G6PD variants have a net positive effect in malarious regions and have become common in many populations. Before the spread of falciparum malaria, those variants likely had a net negative effect in all populations, and so were extremely rare.

  Therefore, new mutations must have played a major role in the evolutionary response to agriculture—and as luck would have it, there was a vast increase in the supply of those mutations just around this time because of the population increase associated with agriculture. We're not saying that the advent of agriculture somehow called forth mutations from the vasty deep that fitted people to the new order of things. Mutations are random, and as always, the overwhelming majority of them had neutral or negative effects. But more mutations occurred in large populations, some of them beneficial. Increased population size increased the supply of beneficial mutations just as buying many lottery tickets increases your chance of winning the prize.
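
  The lottery analogy can be made concrete: If each new genome has some tiny chance of carrying a particular beneficial mutation, the expected number of winning tickets per generation scales directly with population size. A sketch, with a made-up round number for the mutation rate:

```python
import math

# In a diploid population of size N, a mutation that arises at rate mu
# per gamete per generation appears in about 2 * N * mu new copies each
# generation. Modeling occurrences as Poisson, the chance that at least
# one copy shows up in a given generation is 1 - exp(-2 * N * mu).
def p_at_least_one(N, mu):
    return 1.0 - math.exp(-2 * N * mu)

mu = 1e-8  # assumed rate for one specific beneficial mutation
for N in (10_000, 1_000_000, 100_000_000):  # forager band to farming world
    print(f"N = {N:>11,}: P(new mutant this generation) = {p_at_least_one(N, mu):.4f}")
# Going from ten thousand people to a hundred million multiplies the
# supply of lottery tickets ten-thousandfold.
```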

  By the beginnings of recorded history some 5,000 years ago, new adaptive mutations were coming into existence at a tremendous rate, roughly 100 times more rapidly than in the Pleistocene. This means that recent human evolution differs qualitatively from typical artificial selection acting on domesticated animals. The difference is simply a matter of scale. In the artificial-selection experiments, which typically involve no more than tens or hundreds of animals, very few new favorable mutations occur, and selection must act primarily on preexisting genetic variation. In recent human evolution, we're talking anywhere from millions to hundreds of millions of individuals, all of them potential mutants, so most of the advantageous variants would have been new.

  You might think that alleles that were already common would be more likely than new variants to grow to high frequency under agriculture. It stands to reason that the new mutations, which would start out with a single copy, would face disadvantages. But that reasoning underestimates the effect of the advantage that the mutation conferred on the individual who carried it and his or her descendants. Even a single copy of an advantageous gene has a fair chance of succeeding (10 percent for a gene with a 5 percent advantage), and exponential growth allows it to spread rapidly. Many new mutations must have occurred in those large farming populations, and the great majority of the sweeping genes must have been new.
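
  That 10 percent figure is a classical result in population genetics: A new mutation with advantage s escapes random loss with probability of about 2s (Haldane's approximation). The sketch below checks it by solving the branching-process extinction equation, assuming each copy leaves a Poisson-distributed number of descendant copies:

```python
import math

# A lone beneficial allele founds a branching process in which each
# copy leaves Poisson(1 + s) descendant copies. The extinction
# probability q is the smallest solution of q = exp((1 + s) * (q - 1));
# iterating from q = 0 converges to it from below.
def survival_probability(s, iterations=1000):
    q = 0.0
    for _ in range(iterations):
        q = math.exp((1 + s) * (q - 1))
    return 1.0 - q

s = 0.05
print(f"branching-process survival: {survival_probability(s):.3f}")  # ~0.091
print(f"2s approximation:           {2 * s:.3f}")                    # 0.100
# Once established, the number of copies grows roughly like (1 + s)**t,
# which is the exponential spread described above.
```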

  Not only did post-agricultural evolution involve much higher numbers than would be possible in any artificial-selection experiment, it also involved a much longer time frame. Post-agricultural evolution occurred over some 400 generations, which would be impractical for selection experiments using mammals. That long time scale also makes for a qualitative difference, since it is long enough to allow new mutations to rise to high frequency and make up a major part of adaptive variation.

  Recent studies have found hundreds of ongoing sweeps—sweeps begun thousands of years ago that are still in progress today. Some alleles have gone to fixation, more have intermediate frequencies, and most are regional. Many are very recent: The rate of origination peaks at about 5,000 years ago in the European and Chinese samples, and at about 8,500 years ago in the African sample. There are so many sweeps under way, in fact, that we can do some useful statistical analysis. Often we have some idea of a gene's function—for example, by seeing what tissues it is highly expressed in, or by knowing what goes wrong when it's inactivated. Using that information, we can look at the hundreds of genes undergoing sweeps and see what kinds of jobs they do. And when we do that kind of analysis, we see that most of the sweeping alleles fall into a few functional categories: Many involve changes in metabolism and digestion, in defenses against infectious disease, in reproduction, in DNA repair, or in the central nervous system.

  YOU ARE WHAT YOU EAT

  Early farmers ate foods that hunter-gatherers did not eat, or at least they ate them in much greater quantities, and at first they were not well adjusted to the new diet. In Europe and western Asia, cereals became the dietary mainstay, usually wheat or barley, while millet and rice became the primary foods in eastern Asia. Those early farmers raised other crops, such as peas and beans, and they ate some meat, mostly from domesticated animals, but it looks as if the carbohydrate fraction of their diet almost tripled, while the amount of protein tanked.6 Protein quality decreased as well, since plant foods contained an undesirable mix of amino acids, the chemical building blocks from which proteins are made. Almost any kind of meat has the right mix, but plants often do not—and trying to build muscle with the wrong mix is a lot like playing Scrabble with more Q's than U's.

  Shortages of vitamins are also likely to have been a problem among those early farmers, since the new diet included little fresh meat and was primarily based on a very limited set of crops. Hunter-gatherers would rarely have suffered vitamin-deficiency diseases such as beri-beri, pellagra, rickets, or scurvy, but farmers sometimes did. There is every reason to think that early farmers developed serious health problems from this low-protein, vitamin-short, high-carbohydrate diet. Infant mortality increased, and the poor diet was likely one of the causes. You can see the mismatch between the genes and the environment in the skeletal evidence. Humans who adopted agriculture shrank: Average height dropped by almost five inches.7

  There are numerous signs of pathology in the bones of early agriculturalists. In the Americas, the introduction of maize led to widespread tooth decay and anemia due to iron deficiency, since maize is low in bioavailable iron. This story is not new: Many researchers have written about the health problems stemming from the advent of agriculture.8 Our point is that, over millennia, populations responded to these new pressures. People who had genetic variants that helped them deal with the new diet had more surviving children, and those variants spread: Farmers began to adapt to an agricultural diet. Humanity changed.

  We are beginning to understand some of the genetic details of these dietary adaptations, which took several forms. Some of the selected alleles appear to have increased efficiency—that is to say, their bearers were able to extract more nutrients from an agricultural diet. The most dramatic examples are mutations that allow adults to digest lactose, the main sugar in milk. Hunter-gatherers, and mammals in general, stop making lactase (the enzyme that digests lactose) in childhood. Since mother's milk was the only lactose-containing "food" available to humans in days of yore, there wasn't much point in older children or adults making lactase—and shutting down production may have decreased destructive forms of sibling rivalry. But after the domestication of cattle, milk was available and potentially valuable to people of all ages, if only they could digest it. A mutation that caused continued production of lactase originated some 8,000 years ago and has spread widely among Europeans, reaching frequencies of over 95 percent in Denmark and Sweden. Other mutations with a similar effect have become common (despite starting several thousand years later) in some of the cattle-raising tribes in East Africa, so that 90 percent of the Tutsi are lactose tolerant today. These mutations spread quite rapidly and must have been very advantageous.
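
  It is worth sketching how fast such a sweep runs under simple genic selection, where the log-odds of the allele rise by about ln(1 + s) per generation. The starting frequency, the selective advantage, and the generation time below are all illustrative assumptions, not figures from the text:

```python
import math

# Generations for an allele to climb from frequency p0 to p1 under
# genic selection, using the logistic approximation in which
# logit(p) = ln(p / (1 - p)) increases by ln(1 + s) per generation.
def generations(p0, p1, s):
    logit = lambda p: math.log(p / (1 - p))
    return (logit(p1) - logit(p0)) / math.log(1 + s)

p0 = 0.001  # assumed frequency once the allele is safely established
p1 = 0.95   # roughly the modern frequency in Denmark and Sweden
s = 0.05    # assumed selective advantage
g = generations(p0, p1, s)
print(f"~{g:.0f} generations, ~{g * 25:.0f} years at 25 years per generation")
# ~202 generations, roughly 5,000 years: an advantage of a few percent
# fits comfortably inside the 8,000 years available.
```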

  When you think about it, the whole process is rather strange: Northern Europeans and some sub-Saharan Africans have become "mampires," mutants that live off the milk of another species. We think lactose-tolerance mutations played an important role in history, a subject we will treat at some length in Chapter 6.

  Some genetic changes may have helped to compensate for shortages in the new diet. For example, we see changes in genes affecting transport of vitamins into cells.9 Similarly, vitamin D shortages in the new diet may have driven the evolution of light skin in Europe and northern Asia. Vitamin D is produced by ultraviolet radiation from the sun acting on our skin—an odd, plantlike way of going about things. Less is therefore produced in areas far from the equator, where UV flux is low. Since there is plenty of vitamin D in fresh meat, hunter-gatherers in Europe may not have suffered from vitamin D shortages and thus may have been able to get by with fairly dark skin. In fact, this must have been the case, since several of the major mutations causing light skin color appear to have originated after the birth of agriculture. Vitamin D was not abundant in the new cereal-based diet, and any resulting shortages would have been serious, since they could lead to bone malformations (rickets), decreased resistance to infectious diseases, and even cancer. This may be why natural selection favored mutations causing light skin, which allowed for adequate vitamin D synthesis in regions with little ultraviolet radiation.

  There were other changes that ameliorated nasty side effects of the new unbalanced diets. The big increase in carbohydrates, especially carbohydrates that are rapidly broken down in digestion, interfered with the control of blood sugar and appears to have caused metabolic problems such as diabetes. A high-carbohydrate diet also apparently causes acne and tooth decay, both of which are rare among hunter-gatherers. More exactly, both are caused by infectious organisms, but those organisms only cause trouble in the presence of a high-carbohydrate diet.

  Some of the protective changes took the form of new versions of genes involved in insulin regulation. Researchers in Iceland have found that new variants of a gene regulating blood sugar protect against diabetes.10 Those variants have different ages in the three populations studied (Europeans, Asians, and sub-Saharan Africans), and in each population the protective variant is roughly as old as agriculture. Alcoholic drinks, also part of the new diet, had plenty of bad side effects, and in East Asia there are strongly selected alleles that are known to materially reduce the risk of alcoholism.

  Clearly, the evolutionary responses to an agricultural diet must differ, since different peoples adopted different kinds of agriculture at different times and in different environments. This variation has caused biological differences in the metabolic responses to an agricultural diet that persist today, but it has also generated differences in every other kind of adaptive response to the new society. Agriculture began in the Middle East 10,000 years ago and took almost 5,000 years to spread throughout Europe. Amerindians in the Illinois and Ohio river valleys adopted maize agriculture only 1,000 years ago, but the Australian Aborigines never domesticated plants at all. Peoples who have farmed since shortly after the end of the Ice Age (such as the inhabitants of the Middle East) must have adapted most thoroughly to agriculture. In areas where agriculture is younger, such as Europe or China, we'd expect to see fewer adaptive changes—except to the extent that the inhabitants were able to pick up genes from older farming peoples. And we'd expect to see fewer adaptive changes still among the Amerindians and sub-Saharan Africans, who had farmed for even shorter times and were genetically isolated from older civilizations by geographical barriers. In groups that had remained foragers, there would presumably be no such adaptive changes—most certainly not in isolated forager populations.

 
