Populations that have never farmed or that haven't farmed for long, such as the Australian Aborigines and many Amerindians, have characteristic health problems today when exposed to Western diets. The most severe such problem currently is a high incidence of type 2 diabetes. Low physical activity certainly contributes to that problem today, but genetic vulnerability is a big part of the story: Navajo couch potatoes are far more likely to get adult-onset diabetes than German or Chinese couch potatoes. The prevalence of diabetes among the Navajo is about two and a half times that of their European-descended neighbors, and diabetes is about four times more common among Australian Aborigines than in other Australians. We think this is a consequence of a lesser degree of adaptation to high-carbohydrate diets. Interestingly, Polynesians are also prone to diabetes (with roughly three times European rates), even though they practiced agriculture, raising crops such as yams, taro, bananas, breadfruit, and sweet potato. We believe that their case still fits our general picture of incomplete adaptation, however. Among the Polynesians, adaptation would have been limited by the relatively small population size and the low rate of protective mutations it would have generated. In addition, settlement bottlenecks and limited contacts between the populations of the far-flung Polynesian islands would have interfered with the spread of any favorable mutations that did occur.
Our explanation of this susceptibility pattern differs from the well-known "thrifty genotype" hypothesis originally promulgated by James Neel. He suggested that pre-agricultural peoples were especially prone to famine and that the metabolic differences that led to diabetes in modern environments had helped people survive food shortages in the past.11 This seems unlikely. The lower rungs of agricultural societies in Europe and East Asia usually suffered food shortages severe enough to cause below-replacement fertility, and weather-related crop failure often struck whole nations or even larger regions. Sometimes this led to famines severe enough to lead to widespread cannibalism, as seems to have occurred in the great famine that struck most of northern Europe from 1315 to 1317.
Hunter-gatherers should have been, if anything, less vulnerable to famine than farmers, since they did not depend on a small set of domesticated plant species (which might suffer from insect pests or fungal blights even in a year with good weather), and because local violence usually kept their populations well below the local carrying capacity.12 State societies limited local violence, but in a Malthusian world, something always limits population growth. In this case, fewer deaths by violence meant more deaths due to starvation and infectious disease. Moreover, hunter-gatherer societies do not appear to have been divided into well-fed elites and hungry lower classes, a situation that virtually guarantees malnourishment and/or famine among a significant fraction of the population, whereas agricultural societies did have divisions of this sort. We believe that our explanation, based on the evolutionary response to a well-established increase in carbohydrate consumption among farmers, is more likely to be correct than an explanation based on the idea that hunter-gatherers were particularly prone to famine, a notion that has no factual support.
Most populations that are highly vulnerable to type 2 diabetes also have increased risks of alcoholism. This is no coincidence. It's not that the same biochemistry underlies both conditions, but that both stem from the same ultimate cause: limited previous exposure to agricultural diets, and thus limited adaptation to such diets.
Booze inevitably accompanies farming. People have been brewing alcoholic beverages since the earliest days of agriculture: Beer may date back more than 8,000 years. There's even a hypothesis that barley was first domesticated for use in brewing beer rather than bread. Essentially all agricultural peoples developed and routinely consumed some kind of alcoholic beverage. In those populations with long exposure, natural selection must have gradually increased the frequency of alleles that decreased the risk of alcoholism, due to its medical and social disadvantages. This process would have gone furthest in old agricultural societies and presumably would not have occurred at all among pure hunter-gatherers.
We must wonder why farming peoples didn't just evolve an aversion to alcohol. That would probably have been a bad strategy, since moderate consumption of traditional, low-proof alcoholic drinks was almost certainly healthful. People who drank wine or beer avoided waterborne pathogens, which were a lethal threat in high-density populations. Alleles that reduced the risk of alcoholism, rather than alleles producing outright aversion, therefore prevailed.
There is also some reason to believe that populations that have been drinking alcohol for hundreds of generations may have also evolved metabolic changes that reduced some of alcohol's other risks. In particular, we know that alcohol consumption by pregnant women can have devastating effects on their offspring. Those effects, called fetal alcohol syndrome, or FAS, include growth deficiency, facial abnormalities, and damage to the central nervous system. FAS is, however, far more common in some populations than in others: Its prevalence is almost thirty times higher in African American or Amerindian populations in the United States than it is among Europeans—even though the French, for example, have been known to take a drink or two. Some populations, such as those of sub-Saharan Africa and their diaspora, may run higher risks of suffering from FAS than others consuming similar amounts of alcohol. If so, study of the alleles protecting against FAS in resistant populations might lead to greater understanding of the biochemical mechanisms underlying the syndrome. With luck, we might be able to use that information to decrease the incidence of FAS in vulnerable populations.
This picture of adaptation to agricultural diets has two important implications: Populations today must vary in their degree of adaptation to such diets, depending on their historical experience, and populations must have changed over time.
For example, there must have been a time when no one was lactose tolerant, a later time in which the frequency was intermediate, and finally a time when it reached modern levels. In this instance, we have hard evidence of such change. In a 2007 study, researchers examined DNA from the skeletons of people who died between 7,000 and 8,000 years ago. These skeletons were from central and northern Europe, where today the frequency of the lactase-persistence variant is around 80 percent. None of those ancient northern Europeans had that allele.13 In another study, a different group of researchers looked at central
European skeletons from the late Bronze Age, some 3,000 years ago. Back then, the gene frequency (judging from their sample) was apparently around 25 percent.14 This shows that the frequency of lactose tolerance really has changed over time in the way indicated by the HapMap genetic data. The theory made sense, but experimental confirmation is always welcome. We expect that there will be many similar results (showing ongoing change in sweeping genes) in studies of ancient DNA over the next few years.
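To get a feel for how strong that selection must have been, here is a minimal sketch in Python of the textbook single-locus selection model. It assumes a dominant advantage for carriers of the lactase-persistence allele, a 25-year generation time, and a rare starting allele around 7,500 years ago; those particular values are our illustrative assumptions, not figures from the studies above.

```python
def next_freq(p, s):
    """One generation of selection on a dominant advantageous allele:
    carriers (one or two copies) have fitness 1 + s, non-carriers 1."""
    q = 1.0 - p
    mean_fitness = 1.0 + s * (1.0 - q * q)
    return p * (1.0 + s) / mean_fitness

def freq_after(p0, s, generations):
    """Allele frequency after the given number of generations of selection."""
    p = p0
    for _ in range(generations):
        p = next_freq(p, s)
    return p

# Illustrative values only: a rare allele ~7,500 years ago (about 300
# generations at 25 years each), checked against the ~25 percent seen in
# the Bronze Age samples (~180 generations in) and ~80 percent today.
p0 = 0.001
for s in (0.02, 0.03, 0.05, 0.10):
    print(f"s = {s:.2f}: Bronze Age {freq_after(p0, s, 180):.2f}, "
          f"today {freq_after(p0, s, 300):.2f}")
```

In this toy model, an advantage of a few percent per generation is enough to carry a rare allele to frequencies like these within a few hundred generations; no single constant value fits both ancient data points exactly, but the order of magnitude is the point.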
Over time, if our argument is correct, farming peoples should have become better adapted to their agricultural diets in many ways, and we might expect that some of the skeletal signs of physiological stress would have gradually decreased. Although such genetic adaptation clearly occurred, cultural changes that improved health must have occurred as well. For example, the adoption of new crops and new methods of food preparation would have improved the nutritional quality of the average peasant's diet. Of course, some of those new methods (polishing rice) and new crops (sugarcane) actually made things worse. Adaptive change is slow and blind, but it is also sure and steady. Cultural change is less reliable.
But cultural change is important. Although many traditional archaeologists and anthropologists will probably see us as biological imperialists out to explain everything that ever happened with our pet genetic theories, we firmly believe that cultural change—new ideas, new techniques, new forms of social organization—was a powerful influence on the historical process. We're simply saying that the complete historical analyst must consider genetic change as well as social, cultural, and political change. Once a list of battles and kings seemed plenty good enough, but life keeps getting more complicated.
4
CONSEQUENCES OF AGRICULTURE
Agriculture reshaped human society, resulting in selective pressures that changed us in many ways. Some of those changes involved fairly obvious accommodations to new problems in nutrition and infectious disease. Others consisted of subtle psychological and cognitive changes, some of which eventually led to revolutionary social innovations—possibly including the birth of science. In this chapter, we discuss many of those evolutionary responses.
INFECTIOUS DISEASE
Of course, diet is not the only thing that changed under agriculture. Farming revolutionized human infectious disease—but not in a good way.
The population expansion associated with farming increased crowding, while farming itself made people sedentary. Mountains of garbage and water supplies contaminated with human waste favored the spread of infectious disease. As farmers, humans acquired new commensals, animals that lived among them. We already had ride-along commensals such as lice and intestinal worms—now we had rats and mice as well, which spread devastating diseases such as typhus and bubonic plague.
Quantitative changes in population density and disease vectors resulted in qualitative changes in disease prevalence—not only did old infectious diseases become a more serious threat, but entirely new ones appeared.
Most infectious diseases have a critical community size, a number and concentration of people below which they cannot persist. The classic example is measles, which typically infects children and remains infectious for about ten days, after which the patient has lifelong immunity. In order for measles to survive, the virus that causes it, a paramyxovirus, must continually find unexposed victims—more children. Measles can only persist in a large, dense population: Populations that are too small or too spread out (under half a million in close proximity) fail to produce unexposed children fast enough, so the virus dies out. This means that measles, at least in the form we know it today, could not have existed in the days before agriculture—there was no concentrated population that large anywhere on earth. (The virus that causes chicken pox is different: It lingers in the nervous system and often reemerges late in life in the form of shingles, which can be incredibly painful. Children can catch chicken pox from their grandparents—cycle of life! Since the critical community size of chicken pox is less than 100 people, epidemiologists judge that it has probably been around for a long time.)
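The critical-community-size idea can be illustrated with a toy simulation. The Python sketch below is a stochastic SIR model with births, using rough textbook-style parameters for a measles-like infection (an R0 of about 15 and a ten-day infectious period); the parameters and the cutoff it produces are our own illustrative assumptions, not figures from the book, and the real half-million threshold also reflects seasonality and age structure. The qualitative pattern is the point: in small populations the chain of transmission keeps breaking, while in large ones the infection settles in as a permanent, endemic presence.

```python
import numpy as np

rng = np.random.default_rng(1)

def years_to_fadeout(pop_size, years=50, r0=15, infectious_days=10,
                     annual_birth_rate=0.02):
    """Toy stochastic SIR model with births: how long does a measles-like
    infection persist before the chain of transmission breaks?
    Starts near the endemic equilibrium so we measure long-run persistence,
    not the size of a first epidemic. All parameters are rough guesses."""
    births_per_day = pop_size * annual_birth_rate / 365.0
    S = int(pop_size / r0)                               # susceptibles at equilibrium
    I = max(1, int(births_per_day * infectious_days))    # infectious at equilibrium
    R = pop_size - S - I                                 # recovered and immune
    beta = r0 / infectious_days                          # transmissions per infective per day
    for day in range(years * 365):
        if I == 0:
            return day / 365.0                           # infection has died out
        N = S + I + R
        new_inf = min(S, rng.poisson(beta * I * S / N))
        recov = rng.binomial(I, 1.0 / infectious_days)
        births = rng.poisson(births_per_day)
        S += births - new_inf
        I += new_inf - recov
        R += recov - births                              # deaths balance births, keeping N fixed
    return None                                          # persisted for the whole run

# Small populations lose the infection within a few years; large ones keep it
# going indefinitely. The exact cutoff here is not the real-world figure.
for n in (10_000, 50_000, 500_000):
    print(n, years_to_fadeout(n))
```

Starting the run near the endemic steady state, rather than with a fully susceptible population, is a deliberate choice: it isolates the question the critical-community-size concept asks, namely whether births can resupply susceptible children fast enough to keep the virus alive.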
In any case, the new conditions that accompanied agriculture brought more than measles. Many other diseases that just didn't exist in hunter-gathering days could now thrive as well. Some originated as mutated versions of milder infectious diseases that already existed in humans; we picked up others (probably most) from animals, especially domesticated herd animals. Later, as trade and travel increased, civilizations exchanged some of their regional diseases, with disastrous results.
Infectious disease was thus a far bigger threat to farmers than it had been to hunter-gatherers—which meant that farmers experienced strong selective pressures on that account. They eventually developed genetic defenses against infectious disease that were much more effective than those of their Neolithic ancestors, and far more effective than those of peoples who remained hunter-gatherers.
The best-understood genetic defenses are those that protect people against falciparum malaria. There are several kinds of malaria, but falciparum is the most serious and accounts for the most deaths. Increased population density and the use of slash-and-burn agricultural techniques (cutting and burning forests to create fields) may have favored the spread of this virulent form of malaria. The trend was particularly unpleasant in Africa, where mosquitoes that preferred humans to animals evolved, facilitating transmission of this deadly disease.
Wherever falciparum malaria has existed for a long time, mainly in the tropical areas of the Old World, people have developed genetic defenses against it, and the side effects of those defenses account for most cases of genetic disease in populations originating in those regions. We know a lot about malaria defenses because their side effects cause illness, and far more time and effort has gone into medical research on those illnesses than into understanding their evolutionary underpinnings. This is not surprising, since these illnesses have been so troubling in tropical areas. But understanding the root causes of these medical conditions may be worthwhile: A little more evolutionary thought in medicine might actually have practical payoffs.
The most important mutations that protect against malaria are those that change some feature in the red blood cells that are the primary target of the malaria parasite—usually, the hemoglobin molecule (for example, sickle cell hemoglobin [HbS], hemoglobin C [HbC], hemoglobin E [HbE], alpha- and beta-thalassemia, Melanesian ovalocytosis, and glucose-6-phosphate dehydrogenase [G6PD] deficiency). We also know of a number of alleles (such as the glycophorin C variant in New Guinea1) that are almost certainly malaria defenses but do not cause noticeable disease as side effects. In fact, it looks as if the well-known defenses, such as sickle cell, that cause obvious disease are only the tip of the iceberg.
The expensive malaria defenses (defenses with serious side effects) are far more common than any single genetic disease caused by random mutations. Some 400 million people, 7 percent of the world's population, have G6PD deficiency, which can be serious. About 250,000 children are born with sickle-cell anemia each year (which is very serious), while about 20,000 boys are born with Duchenne muscular dystrophy, one of the most common of all mutation-driven genetic diseases.2
These malaria defenses became common because they gave an advantage to carriers (people with one copy of the gene variant); however, they cause problems (from mild to lethal) in people with two copies. This is unusual: We seldom see such crude adaptations in other functions. For example, humans don't have an allele that makes carriers run faster while crippling those with two copies. Normally, genes work together in an efficient and coordinated way. We think that this evolutionary sloppiness exists because falciparum malaria, as we know it today, has not been around very long—perhaps as little as 4,000 years. The same appears to be true of the antimalaria genetic defenses. For example, the main African variety of G6PD deficiency is roughly 2,500 years old, and HbE in Thailand is roughly 2,000 years old.3
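That balance, in which carriers gain and homozygotes lose, has a standard quantitative form. As a rough illustration, if non-carriers lose a fraction s of their fitness to malaria and sickle-cell homozygotes lose a fraction t to anemia, the protective allele settles at an equilibrium frequency of s / (s + t). The Python sketch below plugs in purely illustrative guesses, not measured values from the book.

```python
def sickle_equilibrium(s, t):
    """Equilibrium under heterozygote advantage (fitnesses:
    non-carrier 1 - s, carrier 1, homozygote 1 - t)."""
    q = s / (s + t)                 # frequency of the protective allele
    carriers = 2 * q * (1 - q)      # fraction of people carrying one copy
    affected = q * q                # fraction born with two copies (sickle-cell anemia)
    return q, carriers, affected

# Illustrative guesses: malaria costs non-carriers 10 percent of fitness,
# sickle-cell anemia costs homozygotes 80 percent.
q, carriers, affected = sickle_equilibrium(0.10, 0.80)
print(f"allele {q:.2f}, carriers {carriers:.2f}, affected at birth {affected:.3f}")
```

With guesses like these, roughly one person in five ends up a protected carrier while a bit over 1 percent of births are severely affected homozygotes, which is in the general neighborhood of what is observed in the most malarial parts of Africa.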
These adaptations to falciparum malaria were both recent and local. They occurred in the tropical and subtropical areas of the Old World: Peoples who lived in the cooler parts of Eurasia, in Australia, and in the Americas either remained unexposed or were exposed even more recently. Malaria reshaped the human genome, but only in some peoples. It has been one of the forces differentiating human populations over the past few thousand years.
Malaria defenses are only one example of a more widespread phenomenon. Recent whole-genome selection scans suggest that there have been many other genetic changes related to defense against disease. Again, the extent of these adaptations has varied regionally.
We see evidence of a number of cases in which new alleles related to pathogen defense and the immune system have rapidly reached high frequency: These alleles involve the production of antibodies, control of white cells that attack intruder organisms and infected cells, genes affecting viral infection, and cellular interaction with pathogens such as Helicobacter pylori, the bacterium that causes most stomach ulcers and stomach cancer. Again, most such changes are regional. But even before we began to discover these new defenses, it was obvious that something of the sort must exist, since some populations were much more vulnerable than others to diseases such as smallpox and influenza.
It's time to address the old chestnut that biological differences among human populations are "superficial," only skin-deep. It's not true: We're seeing genetically caused differences in all kinds of functions, and every such difference was important enough to cause a significant increase in fitness (number of offspring)—otherwise it wouldn't have reached high frequency in just a few millennia. These were not just superficial changes affecting things like hair color, skin color, and the
shape of the nose, although even those apparently superficial differences sometimes had important consequences. Some of these differences were far from being superficial or insignificant and profoundly affected the populations in which they appeared, sometimes in unexpected ways. They had a major influence on history; some continue to shape the course of events today.
Populations that experienced different ecological histories had different evolutionary responses. In the case of infectious disease, it was in the main population centers of the Old World that human populations developed the strongest defenses. Populations isolated from the Old World diseases did not have an opportunity to develop such protections.
Amerindians, for example, experienced very little infectious disease. The story is similar in other isolated populations, such as the Australian Aborigines, Polynesians, and the inhabitants of the Andaman Islands: They didn't experience millennia of infectious disease, didn't evolve improved defenses as most Old Worlders did, and were decimated upon contact with the wider world.