Pandora's Seed

by Spencer Wells


  FIGURE 15: THE CHANGING “WAVES” OF DISEASE BURDEN OVER THE PAST 15,000 YEARS.

  This progression from one disease threat to another seems to be one of the laws of human development, and lets us make predictions about what will happen as the world’s less developed countries move from subsistence agriculture, with a way of life not unlike that of our ancestors during the Neolithic period, to enjoying the wonders of the modern world. At the moment, however, infectious disease is still the main threat to most of what used to be called the Third World. And one infectious disease in particular kills more people than any other in the world, more than two million per year, 90 percent of whom are African children under the age of five: malaria. But malaria, unlike most other infectious diseases afflicting us, did not originate in one of our domestic animals. It was probably around, albeit at lower levels, even in the Paleolithic. Its rise to infamy illustrates another continuing effect of the Neolithic Revolution, and shows how human architectural and city planning decisions can have disastrous unintended consequences. The evidence comes not from the archaeological record, though, but from DNA.

  LANDSCAPING THE GENOME

  It is perhaps fitting that the man who is credited with discovering Angkor, the magnificent temple complex in Cambodia, died of malaria in the jungles of Laos in 1861. Henri Mouhot, a French explorer, had spent the previous three years exploring the monuments of Thailand, Cambodia, and Laos, and certainly popularized them for Western readers in his book Travels in the Central Parts of Indo-China, Cambodia and Laos During the Years 1858, 1859 and 1860, published in 1863. Of course the local people knew about the existence of the monuments, but in the age of colonialism Mouhot was credited with being the first to discover them. His poetic descriptions led to restoration efforts, and to this day they remain one of the world’s great sights.

  The Angkor complex was built in stages by the rulers of the Khmer Empire, the most powerful in mainland Southeast Asia, between the ninth and fifteenth centuries. It includes the famous temples of Angkor Wat and Bayon, with their huge stone constructions and immense Buddha heads. Satellite surveys have recently revealed that Angkor covered a staggering 380 square miles, making it the largest preindustrial city in the world—larger even than modern New York. This complex of temples and houses was home to more than 750,000 people at its peak, but was abandoned in the fifteenth century, all of its temples left to the creeping jungle except for Angkor Wat.

  Several theories have been advanced for why its residents deserted this huge city, but no one knows for sure. A war with the Thai Empire, based northwest of the Khmer, is thought to have contributed, but most scholars now think that ecological stresses had a deciding effect on the fortunes of the Khmer capital. To feed 750,000 people and support the complex political and religious infrastructure of Angkor required enormous quantities of natural resources, as with any city today. The most important of these would have been water, which was necessary not only for drinking but also for the wet-field rice agriculture that fed the city’s population. To provide such enormous amounts of water, a complex system of canals carried water from the Puok, Roluos, and Siem Reap Rivers to huge reservoirs known as barays. And to grow the rice, large portions of Angkor were given over to open fields.

  FIGURE 16: ANGKOR, THE LARGEST PREINDUSTRIAL CITY IN THE WORLD.

  It seems that climatic shifts in the Northern Hemisphere associated with the end of the Medieval Warm Period and the onset of the Little Ice Age, in the fourteenth through seventeenth centuries, may have changed the monsoonal pattern in Southeast Asia enough to create a shortage of water for Angkor, which would have led to some of the rice paddies being abandoned when keeping them irrigated became untenable. Religious changes during this period, perhaps in part a response to the more challenging climatic conditions, also probably led some residents to leave the Khmer capital. But the human-engineered landscape probably contributed in an unforeseen way.

  Malaria is the best-known vector-borne human disease (“vector-borne” means that it is transmitted from one person to another by another animal—the “vector”). In this case, the vector is a mosquito, in particular those belonging to the genus Anopheles. Female Anopheles mosquitoes, in order to have enough amino acids to produce eggs, drink blood from mammals through modified mouthparts that function as a kind of hypodermic syringe. In order to keep the victim’s blood from forming a clot that would clog the insect’s blood-collection system, the mosquito injects a small amount of her saliva, containing anticoagulants, into the tissue around the bite. This allows her unfettered access to the animal’s blood, and it also serves as a wonderful opportunity for microorganisms to move from host to host.

  We now know that malaria is caused by protozoa of the genus Plasmodium. These small single-celled organisms were probably originally spread, like most other infectious agents, through direct ingestion of infectious material, such as waste or tainted water. At some point, however, one of their distant ancestors developed the ability to infect red blood cells, which opened up a whole new route of transmission. When an infected animal was bitten by a bloodthirsty Anopheles female, its parasite-tainted blood would be taken up by the mosquito; the parasites would then make their way to her salivary glands, ready to be injected into the next animal bitten. A wonderful example of evolution in action, and the root cause of the scourge that now kills millions of people every year.

  Until the twentieth century, though, malaria was thought to be caused by a bad landscape—mala aria is Italian for “bad air,” particularly the damp air around swampy areas, and it was this that early European explorers sought to avoid. Mosquitoes were certainly annoying, but their link to the awful fevers brought on by this killer disease was not recognized until the era of modern medicine. However, in a way the early Europeans were correct: bad geography did lead to the disease, although not through the poor air quality surrounding swamps. Rather, it was the presence of wet, dank mosquito breeding grounds in the swampy areas that made for a large number of hungry female mosquitoes, and therefore more likely malarial transmission.

  In an article published in 1992, the French epidemiologist Jacques Verdrager suggested that as Angkor’s rice paddies were abandoned, they served as a perfect breeding habitat for malarial mosquitoes, particularly the common Southeast Asian species A. dirus. As the malaria-transmitting species increased in numbers, the human population became widely infected, and a vicious cycle ensued. More people died or left Angkor, which led to more paddies being abandoned, and within a few generations Angkor had been nearly completely depopulated. While the degree to which this process contributed to Angkor’s demise is uncertain, it seems plausible that a nagging insect helped to bring about the ruin of the greatest city in the world.

  Anopheles mosquitoes are found worldwide, and nearly forty species can harbor Plasmodium. Largely tropical or subtropical species, they are particularly common in Africa, Southeast Asia, and Latin America. In part because of the mosquitoes’ ubiquity and their role in transmitting malaria, many scholars of human prehistory have long assumed that malaria has been a scourge of humanity throughout most of its evolutionary history. Recent work in genetics, however, is revealing a more complex story.

  Studies of DNA from Plasmodium parasites by National Institutes of Health researcher Deirdre Joy and her colleagues have revealed that worldwide populations of Plasmodium falciparum, which causes the most dangerous form of malaria, have been diverging for at least 50,000 years. This date suggests that early humans took African malaria with them on their journeys out of Africa to populate the globe, the earliest of which began at that time. What is perhaps more interesting is that Joy and her colleagues also found evidence of a massive expansion of falciparum malaria out of Africa within the past 10,000 years—the same time as the expansion of agriculture during the Neolithic period.
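  For readers curious about where a date like 50,000 years comes from, the underlying logic is a molecular clock: mutations accumulate at a roughly constant rate, so the fraction of DNA sites at which two lineages differ is proportional to the time since they split. The sketch below illustrates only that arithmetic; the numbers in it are hypothetical, and the actual analysis by Joy and her colleagues was considerably more sophisticated.

```python
# Molecular-clock arithmetic, in minimal form. This illustrates the general
# logic only; it is not the method used by Joy and colleagues, and all
# numbers below are hypothetical.

def divergence_time_years(pairwise_diff, mut_rate_per_site_per_year):
    """Time since two lineages split, assuming a constant mutation rate.

    Differences accumulate along both branches, so d = 2 * mu * t,
    which rearranges to t = d / (2 * mu).
    """
    return pairwise_diff / (2.0 * mut_rate_per_site_per_year)

# Two lineages differing at 0.1 percent of sites, with an assumed rate of
# 1e-8 mutations per site per year, would have split about 50,000 years ago.
print(divergence_time_years(0.001, 1e-8))  # 50000.0
```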

  Another genetic study has revealed a complementary insight into a change in the recent past, but this time from patterns detected in the human genome. University of Pennsylvania geneticist Sarah Tishkoff and her colleagues, through a careful analysis of genetic variation associated with the G6PD gene, found evidence of strong selection acting on the gene in the past 10,000 years. G6PD is an enzyme that helps to convert glucose—sugars in the diet—into the subcellular energy packets known as nicotinamide adenine dinucleotide phosphate, or NADPH. NADPH is one of the energy currencies of the cell, and it and its biochemical brethren NADH and ATP are ultimately where all of the energy in your food ends up. G6PD is a rather important enzyme, in other words, and its function has been fine-tuned over hundreds of millions of years of evolutionary history. In some human populations, though, G6PD carries mutations that reduce its ability to function. Favism is the common name for the disease that results, so called because its symptoms often manifest themselves when fava beans are eaten. These symptoms include anemia, jaundice, and kidney failure. Full-blown favism is a nasty disease, something you wouldn’t wish on your enemies, let alone your children.

  However, some reduction in G6PD function has an interesting side effect: because G6PD seems to be particularly active in red blood cells, the effects of a deficiency are felt most acutely there—exactly where the Plasmodium parasites are also active. It seems that during their reproductive cycle in the red blood cells, the malarial parasites siphon off the NADPH supply for their own uses—they are parasites, after all—and this stresses the cell’s metabolism enough so that the cell essentially commits suicide, killing the parasites in the process. Children who inherit these defects in G6PD function, while inheriting the predilection to favism, also gain protection from the malaria parasite.

  Tishkoff and her colleagues applied methods similar to those used by Jonathan Pritchard, which we learned about in Chapter 1, to look at variation surrounding the G6PD variants. They looked at two in particular, one common in Africa and the other in Mediterranean populations (typically found at frequencies of 20 percent or so, depending on the population). By assessing the pattern of genetic variation linked to the variants, the geneticists estimated the African version to be between 3,840 and 11,760 years old; the Mediterranean form appeared to be even younger—between 1,600 and 6,640 years old. In other words, both had arisen in the past 10,000 years. It was a startling discovery and, coupled with the results from the Plasmodium genome, suggested that malaria had become a significant human scourge only within the past 10,000 years.
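  The dating trick here relies on recombination steadily breaking up the stretch of chromosome on which a new variant first appeared: the younger the variant, the more of its original flanking haplotype remains intact in today's carriers. A minimal sketch of that logic follows, with hypothetical numbers; the actual analysis used many markers and far more elaborate statistics.

```python
import math

# Allele-age estimation from linkage-disequilibrium decay, in minimal form.
# This sketches the general logic only; the numbers are hypothetical, and
# this is not Tishkoff and colleagues' actual procedure.

def allele_age_years(frac_intact, rec_frac, years_per_generation=25.0):
    """Rough age of a variant from haplotype decay at a linked marker.

    frac_intact: fraction of variant-bearing chromosomes still carrying
                 the ancestral allele at a nearby marker
    rec_frac:    recombination fraction between the variant and the marker

    A haplotype survives g generations of recombination with probability
    (1 - r)**g, so g = ln(P) / ln(1 - r).
    """
    generations = math.log(frac_intact) / math.log(1.0 - rec_frac)
    return generations * years_per_generation

# If roughly 80 percent of carrier chromosomes remain intact at a marker
# with r = 0.002, the variant arose on the order of 2,800 years ago.
print(round(allele_age_years(0.8, 0.002)))  # 2786
```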

  As those of you who have been paying attention will realize, this is precisely the time of the great changes in human society brought on during the Neolithic period. Malaria, a very old disease that probably afflicted hunter-gatherers living in the tropics tens of thousands of years ago, became a much greater threat once we settled down and started farming. While part of this increase probably stems from the increasing population density in farming communities, some of it also seems attributable to the farming methods themselves. In particular, the landscaping choices made by early farmers in malarial regions almost certainly led to a greater incidence of the disease. As at Angkor, the creation of open areas in the forest (in the case of tropical Africa) and reservoirs and slow-moving canals (in the Middle East) would have resulted in ideal breeding grounds for mosquitoes. The Anopheles mosquito needs shallow, sunlit water in which to breed, and there weren’t nearly as many such pools in the preagricultural era. Once humans started to clear the forest and plant crops, these insects would have become much more common. By growing food and reengineering the landscape, it seems, early African farmers were probably sowing the seeds of a new epidemic as well.

  Malaria, then, seems to fit the pattern of other infectious diseases, which increased in frequency during the Neolithic period. The second wave had arrived in full force, and its effects are seen today both in the genome of the disease-causing organisms and in our own genome. The increase in mobility over the past two centuries is stirring up the infectious disease pot at an alarming rate, and the airplane is today’s equivalent of the irrigation canal. Emerging infectious diseases promise to be a serious challenge over the next few decades, as swine flu, new-variant Creutzfeldt-Jakob disease, and HIV all attest. But the longer-term battle will not be with these microorganisms; rather, it will be with our own biology. This third wave of chronic diseases is still cresting, and its rise from obscurity is where we’re headed next.

  CARBS AND CAVITIES

  The village of Mehrgarh in present-day Pakistan lies in the shadow of the Toba Kakar Range of the western Himalayas, whose peaks rise to over ten thousand feet. These mountains are extremely remote and rarely visited, and in 2004 they were thought to be where Osama bin Laden was hiding from Western forces. The Bolan Pass, the only relatively easy route through the range, has served as a gateway to southern Asia for millennia. Travelers making the journey from central Asia down to India would leave the mountains and find themselves on the dry Kachi Plain to the west of the great Indus River, which meanders two thousand miles from its source in Tibet to the Arabian Sea. It seems an unlikely place to build a village, but this is where Mehrgarh was founded—over 9,000 years ago.

  Mehrgarh is one of the oldest Neolithic settlements in the world, and the oldest in southern Asia. A precursor to the third-millennium B.C. Indus Valley civilization, which included the extensive settlements of Harappa and Mohenjo Daro to the east, Mehrgarh has been the subject of extensive archaeological excavations since the 1970s. As with Neolithic settlements in the Middle East, the pattern is one of ever-greater reliance on domesticated animals and plants over time. Judging from the oldest layers of the excavations, people still hunted large game, but this ended abruptly as domesticated animals made their appearance. The inhabitants of Mehrgarh raised wheat and barley, and they kept cattle, goats, and sheep. They lived in mud-brick houses and made pottery. They worked metal and traded with the surrounding regions—lapis lazuli from the Pamir Plateau, five hundred miles to the northeast, has been found, as have seashells from the Indian Ocean. The people of Neolithic Mehrgarh had what in many ways was a typical Neolithic lifestyle. They also had cavities.

  FIGURE 17: NEOLITHIC DRILLED MOLAR FROM MEHRGARH, PAKISTAN. (PHOTO COURTESY OF DR. LUCA BONDIOLI AND DR. ROBERTO MACCHIARELLI.)

  One of the most intriguing things to come out of the work on Mehrgarh is the earliest evidence yet found for dental work. This evidence comes from the very earliest layers at the site, dating to between 9,000 and 7,500 years ago. As this was the Neolithic era, the tool used to drill the holes would have been made of stone. The authors of a study of teeth from the site suggest that a bow was used to rotate a fine drill bit, which the Neolithic dentists would have wielded to produce a hole in a few seconds. The fact that the holes are limited to the rear molars shows that the goal was not to decorate the teeth—these were not early examples of hip-hop tooth art. They also see evidence of wear around the holes that could have been produced only by chewing after the hole was made, proving that the drilling was done on living people. The implication is that people undertook this painful treatment to try to alleviate the pain of a cavity.

  Cavities, extremely rare in earlier, Paleolithic teeth (as well as in modern hunter-gatherers), show a marked increase in Neolithic communities. The best-studied example comes from Clark Spencer Larsen’s work on ancient Native Americans. The earliest populations, living a hunter-gatherer lifestyle, have cavities in fewer than 5 percent of the teeth studied. Nearly a quarter of teeth from the period after the adoption of agriculture are afflicted, though—a shocking increase. It’s perhaps no wonder that the people of Mehrgarh were willing to resort to a hand-cranked, stone-tipped drill—their teeth were literally rotting out of their mouths!

  The increase in cavities during the Neolithic period occurred because the proportion of carbohydrates—starches—in the diet increased dramatically. Paleolithic hunter-gatherers ate a very diverse diet consisting of a wide variety of unprocessed animal and vegetable material, which actually served to clean the teeth during chewing, while much of the Neolithic diet consisted of the processed, starchy seeds of cultivated grasses—separated from the husk, ground, and cooked (often with other ingredients) so that their original identities were entirely lost. I have visited remote villages in the lowlands of New Guinea where nearly all of the calories in the diet come from starchy “puddings” and “pancakes” extracted from the trunks of the sago palm. Not the most likely food source—imagine being the first person to suggest eating the macerated trunk of a rather spiky tropical tree—but a ready source of starch. And it’s much easier to make use of abundant starches than it is to gather a wide variety of foods, or to hunt and fish for protein, so people take the easiest route.

  FIGURE 18: THE INCREASE IN CAVITIES AFTER THE INTRODUCTION OF AGRICULTURE, SEEN AT SEVERAL SITES IN THE SOUTHEASTERN UNITED STATES. THE LATE WOODLAND PERIOD, WHEN AGRICULTURE FIRST APPEARS, DATES FROM A.D. 500-1000.

  FIGURE 19: PREPARING STARCH FROM THE SAGO PALM, KARAWARI RIVER, PAPUA NEW GUINEA.

 
