by Nathan Wolfe
When our ancestors began to cook extensively, in addition to the advantages that cooking offered them by making food more manageable and palatable, they also benefited from its remarkable ability to kill microbes. While some microbes can survive at incredible temperatures (such as the hot spring microbial hyperthermophiles that grow and reproduce at temperatures above the boiling point of water), the vast majority of microbes that make their living off of animals cannot survive the temperatures associated with cooking. As microbes are heated during cooking, their normally solid, densely packed proteins unfold and open, giving digestive enzymes quick and easy access and destroying the microbes’ capacity to function. As with the population bottlenecks that our ancestors swung through, the cooking that became their standard way of life served to again diminish their uptake of new microbes, helping limit their microbial diversity.
The earliest solid evidence that humans controlled fire comes from archaeological finds in northern Israel where burned stone flakes dating back almost eight hundred thousand years were found near fire pits. This is almost certainly an underestimate. African sites dating to over a million years ago contain burned bones that could be the remains of cooking, yet the lack of archaeological evidence makes these finds more ambiguous. In Wrangham’s analysis, the evidence of cooking goes back much further. By examining the remains of our ancient ancestors, paleontologists have found physiological clues indicating that they consumed cooked food. For example, Homo erectus, a human ancestor from 1.8 million years ago, had exactly the larger bodies and smaller digestive tracts and jaws that imply a higher-energy diet that was easy to chew and easy to digest—in other words, foods that had been cooked.
Whatever the exact date of our ancestors’ culinary dawn, cooking has certainly exploded since then. Cooked foods make up the vast majority of contemporary diets. In my work with hunters around the world, I’ve had a chance to sample from a vast range of these foods—from roasted porcupine and python in Cameroon to fried wood grubs in rural DRC. On one occasion, my “friendly” Kadazan collaborator in Borneo even gave me dog stew as a practical joke (I didn’t really see the humor). These meals took me far beyond the beef, lamb, and chicken staples that I grew up eating in America. Yet no matter what I’ve eaten, or where I’ve eaten it, one thing is certain: if the food has been cooked sufficiently, the likelihood that it will make me sick is small.
* * *
The dual factors of diminished population sizes and cooking were not the only things that served to decrease the microbial repertoires of our early ancestors. The transition from rain forest habitat to a savanna habitat meant different vegetation and climate but also an entirely different set of animals to interact with and hunt. And different animals meant different microbes.
While we still understand very little about the ecological factors that lead to microbial diversity, there are some key factors that certainly play a role. We know, for example, that the biodiversity of animals, plants, and fungi supported by tropical rain forest systems is higher than any other ecosystem on land. When our ancestors left the rain forest, they entered into regions with diminished biodiversity. The diversity of microbes would almost certainly have been reduced, as would the diversity of the host animals that they infected. So the savanna grassland habitats likely housed fewer animals and a lower diversity of microbes capable of infecting them, which in turn contributed to lower microbial repertoires for our ancestors.
The kinds of animals living in the savanna also differed in critical ways from those in the forests, including a marked contrast in the diversity of apes and other primates. Simply put—primates love forests. The king of the jungle is a primate, not a lion. While some primates, like baboons and vervet monkeys, live very successfully in savanna habitats, forest regions trump savanna regions in terms of primate diversity. When we consider the microbes that could most easily infect our ancestors, the diversity of primates in any given habitat plays an important role. They are certainly not the only species that contribute to our microbial repertoires—in my own studies, I focus not only on primates but also on bats and rodents—but they do play an important role.
* * *
Some years ago, I began considering what factors might improve or decrease the chances that a microbe would jump from one host to the next successfully enough to catch on and spread in the new host. It may seem that bats and snakes, for example, would provide similar sources for novel microbes. Yet there is a strong argument against this idea. Long evident to those doing work on microbes in laboratories is the fact that closely related animals have similar susceptibility to certain infectious agents. So a mammal, like a bat, would have many more microbes that could be successfully shared with a human than a snake would. If not for the logistics and ethics, chimpanzees would make the ideal models for studying just about every human infectious disease. As our closest living relatives, they have nearly identical susceptibility to the microbes that infect us. Over time, less and less laboratory research on human microbes has been conducted in chimpanzees, but this is largely because of the valid ethical concerns associated with conducting research on them and the difficulty of controlling these large and aggressive animals in captivity.
Closely related animal species will share similar immune systems, physiologies, cell types, and behaviors, making them vulnerable to the same groups of infectious agents. In fact, the taxonomic barriers that we place on species are constructs of our own scientific systems, not nature. Viruses don’t read field guides. If two different hosts share sufficiently similar bodies and immune systems, the bug will move between them irrespective of how a museum curator would separate them. I named this concept the academically accurate but unwieldy taxonomic transmission rule, and it holds up for chimpanzees and humans as it would for dogs and wolves.1 The idea is that the more closely related any two species are, the higher the probability that a microbe can successfully jump between them.
Most of the major diseases of humans originated at some point in animals, something I analyzed in a paper for Nature, written with colleagues in 2007. We found that among those for which we can easily trace an animal origin, virtually all came from warm-blooded vertebrates, primarily from our own group, the mammals, which includes the primary subjects of my own research, the primates, bats, and rodents. In the case of primates, while they constitute only 0.5 percent of all vertebrate species, they seeded nearly 20 percent of major infectious diseases in humans. When we divided the number of major human diseases contributed by each of the following groups by the number of animal species in that group, we obtained a ratio that expresses the importance of each group for seeding human disease. The numbers are striking: 0.2 for apes, 0.017 for the other nonhuman primates, 0.003 for mammals other than primates, and a number approaching 0 for animals other than vertebrates. So as our early ancestors left the primate-packed rain forests and spent more time with lower overall primate biodiversity in savanna habitats, they moved into regions that likely had a lower diversity of relevant microbes.
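The ratio described above is simply diseases contributed divided by species in the group. As a minimal sketch of that arithmetic, the species and disease counts below are hypothetical values chosen only to reproduce the approximate ratios quoted here, not figures from the Nature paper:

```python
# Hypothetical illustration of the disease-seeding ratio:
#   ratio = (major human diseases seeded by a group) / (species in that group)
# The counts below are assumptions for illustration, chosen to roughly
# match the ratios quoted in the text (0.2, 0.017, 0.003).
groups = {
    "apes": {"diseases": 1, "species": 5},
    "other nonhuman primates": {"diseases": 5, "species": 300},
    "non-primate mammals": {"diseases": 15, "species": 5000},
}

for name, g in groups.items():
    ratio = g["diseases"] / g["species"]
    print(f"{name}: {ratio:.3f}")
```

Note how the ratio rewards groups that contribute many diseases relative to how few species they contain: apes, with a handful of species, dominate the ranking.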
* * *
Multiple factors likely conspired to decrease the microbial repertoires of our early ancestors. As they spent more time in savanna habitats, our early ancestors interacted with fewer host species, and those hosts were on average more distantly related to them. The advent of cooking increased the safety of meat consumption and stopped many of the microbes that would have normally crossed over during the course of hunting, butchering, and ingesting raw meat. And the population bottlenecks that our ancestors went through further winnowed down the diversity of microbes that already infected them. All in all, the conditions associated with becoming human served to decrease the diversity of microbes present in our ancient relatives. Though many microbes undoubtedly remained in our early ancestors, there were likely far fewer than those that were retained in the separate lineages of our ape relatives.
During the time that our own ancestors went through their microbial cleansing, their ape cousins continued to hunt and accumulate novel microbes. They also maintained microbes that would have been lost in our own lineage. From a human perspective, the ape lineages served as a repository for the agents we’d lose—a microbial Noah’s ark of sorts, preserving the bugs that would disappear from our own bloodlines. These great ape2 repositories would collide with expanding human populations many centuries later, leading to the emergence of some of our most important human diseases.
* * *
Perhaps the single most devastating infectious disease that afflicts humans today is malaria.3 Spread by mosquitoes, it is estimated to kill a staggering two million people each year. Malaria has had such a profound impact on humanity that our own genes maintain its legacy in the form of sickle cell disease. Sickle cell, a genetic disease, exists because its carriers are protected from malaria. Protection was so important that natural selection maintained it despite the debilitating disease that appears in approximately 25 percent of the offspring of couples in which both partners carry the gene. People who are afflicted with sickle cell have their origins almost exclusively in one of the world’s most intensely malaria-affected areas—west central Africa.
My interest in malaria is both personal and professional. During my time working in malaria-infested areas of Southeast Asia and central Africa, I was infected by it on three separate occasions. On the last of those occasions, I almost died. The first two times I’d had malaria were both in regions where malaria was common. I’d exhibited all the typical symptoms—severe neck ache (similar to how you’d feel if you slept in a strained position) followed by intense fever and profuse sweating. On each of my first two bouts, I simply went to a local doctor and received a quick diagnosis and treatment. While the pain and illness were miserable, they both resolved reasonably quickly.
I was in complete denial at the time I had my third round with this deadly disease. I wasn’t in the tropics; I was in Baltimore! I had returned from Cameroon to do research at Johns Hopkins University, and I had very different symptoms, led by intense abdominal pain. I must have also had fever since I remember complaining to friends who were putting me up in their local bed and breakfast that my room was too cold. These new symptoms and the fact that I’d left Africa many weeks earlier fed my denial that this could possibly be malaria. I finally realized I needed urgent care while sitting half delirious in a tub of scalding water and watching the overflow hit the floor of my friends’ bathroom. Although I recovered after a few days in the hospital, the illness brought home for me the huge impact that this disease has on the millions of people who are regularly sickened by it.
My professional interest in malaria had started much earlier. As a doctoral student studying the malaria of orangutans in Borneo, I’d had the good fortune to spend a year working with some of the world’s foremost experts on malaria evolution at the CDC in Atlanta. There I had the luxury of spending afternoons with Bill Collins, perhaps the world’s greatest expert on the malaria parasites of primates, discussing how malaria might have originated. Among the prominent themes of our chats was the importance of wild apes.
At the time, we knew that wild apes had a number of seemingly distinct malaria parasites. One of them was particularly intriguing. Plasmodium reichenowi was named after a famous German parasitologist, Eduard Reichenow, who had first documented the parasites in chimpanzees and gorillas in central Africa. Reichenow and his contemporaries saw a number of these particular parasites, collector’s items for the German researcher, and correctly identified them through examination by microscope as closely related to our own Plasmodium falciparum. In the 1990s, during my time at the CDC, molecular techniques were paving the way to detailed examination of these parasites, allowing us to compare them accurately to our own parasites and providing much greater evolutionary resolution than a microscope could ever offer. Sadly, all of the parasites of Reichenow’s time had been lost, and all that remained was a single specimen.
Initial work with this lone P. reichenowi parasite showed that in fact it was the closest of the many primate malarias to our own deadly human malaria, P. falciparum. Yet with only a single specimen, it remained impossible to say much about the origins of these parasites. Perhaps, long ago, the common ancestor had a parasite that over millions of years had gradually evolved into distinct lineages of P. reichenowi and P. falciparum, a hypothesis favored by some at the time. Or perhaps the ape parasite simply resulted from the transmission of the common human parasite to wild apes at some point in fairly recent evolutionary history. A third possibility, neglected by most given the huge number of humans and the incredible proliferation of P. falciparum among them compared with only a few dozen known parasites in apes, was that P. falciparum was in fact an ape parasite that had moved over to human populations.
Bill and I understood that to truly address the evolutionary history of these parasites we’d need to get more samples from wild apes, ideally many. As a young doctoral student, I was ambitious yet still naïve about the difficulties associated with getting these kinds of samples. But I promised Bill I’d do it and set about planning ways to sample apes in the wild.
Unbeknownst to me at the time, I was about to be called away by my soon-to-be postdoctoral mentor Don Burke to conduct research in Cameroon. I was unaware then that I’d spend nearly five years establishing a long-term infectious-disease-monitoring site there. Eventually, though, I did follow through on my promise to Bill and got those ape samples. Ultimately, in collaboration with sanctuaries in Cameroon that helped to provide homes to orphan chimpanzees, we discovered that ape malaria parasites were not as uncommon as people had suspected. By teaming up with Fabian Leendertz, a veterinary virologist who had done similar work in the Ivory Coast, molecular parasitologist Steve Rich, and the legendary evolutionary biologist Francisco Ayala, we took an important step toward cracking the origin of this disease.
Together we were able to compare the genes in hundreds of human P. falciparum samples that already existed with around eight new P. reichenowi specimens from chimpanzees in locations throughout west Africa. The genetic comparison surprised us all. Amazingly, we found that the entire diversity of P. falciparum (the human malaria) was dwarfed by the diversity of the handful of P. reichenowi chimpanzee parasites we’d managed to uncover. This discovery told us that the most compelling explanation for P. falciparum was that it had been an ape parasite and only jumped over to humans through a bite by some confused mosquito, sometime after our split with the chimpanzee lineage. Human malaria had, in fact, originated in wild apes. In the years that followed our work, a number of researchers documented more and more of the parasites in wild apes.
Subsequent work by my collaborators Beatrice Hahn and Martine Peeters (the same scientists who have done work on SIV evolution) has shown that the malaria parasites infecting wild apes are even more diverse than our study indicated. They have shown that the ape parasites most closely related to human P. falciparum exist in wild gorillas, rather than chimpanzees. How these parasites have been maintained among wild apes and whether or not they’ve moved back and forth between chimpanzees and gorillas remain questions for future studies. Either way, there is no longer any doubt that human P. falciparum moved from wild apes into humans and not in the opposite direction.
* * *
That malaria crossed from a wild ape into humans makes great sense when viewed from the perspective of the evolution of our lineage. The microbial cleansing that resulted from habitat change, cooking, and population bottlenecks among our own ancestors had cleared our microbial slate, decreasing the diversity of microbes that were present before. Perhaps the many years with leaner microbial repertoires had also decreased selective pressure on the many innate mechanisms that we have to fight against infectious diseases, effectively robbing us of some of our protective disease-fighting tactics.
In more recent times, as our population sizes began to increase, wild ape diseases, some of which we’d lost millions of years earlier, had the potential to infect us again. When these diseases reentered humans, they acted on us like uniquely suited novel agents. Malaria was not the sole microbe to make the leap from apes to modern humans, and the stories of others, like HIV, tell a strikingly similar tale. The loss of microbial diversity in our early ancestors and the resulting decrease in their genetic defenses would make us susceptible to the microbial repositories that our ape cousins maintained during our own microbial cleansing. While we continued to change as a species, yet another part of the stage would be set for the brewing viral storm.
4
CHURN, CHURN, CHURN
The oysters were excellent, but the company was even more striking. As I sat in the small Parisian bistro with a tray of fresh shellfish, I savored the taste of the ocean. But the more powerful memory of that day was of another patron of the restaurant. At the table next to me sat an impeccably put together Frenchwoman. Her bag, skirt, and socks all matched—not exactly, but just enough to notice. Her dining companion sat to her right—a miniature poodle, sitting on the chair and drinking water from a bowl on the table. Pieces of his meal—chicken I think—fell over the side of his plate, mingling with the crumbs from his owner’s bread.
Dogs play an important role in the lives of many people around the world. I had stopped only briefly in Paris on the way home from a month-long trip conducting research in Asia and Africa. It might have been the jet lag, but my recollection of the event could only be described as surreal. During my trip I’d spent time in a part of Borneo where people eat dog, including on at least one occasion my unsuspecting self. I’d also visited Muslim areas of the Malay Peninsula, where devout people won’t even touch dogs because of religious beliefs. And I’d spent time in central Africa, where I’d seen local hunters work with their small, silent basenji hunting dogs—dogs that lived on their own but in exchange for scraps followed hunters into the forests, helping them catch their prey. In the United States, many people treat dogs as members of their families, paying large fees for medical expenses and mourning for them when they die. Sitting on the beach near my home in San Francisco, it would be hard for me to spend an hour without seeing someone kiss his or her pet dog on the mouth. Watching that woman in Paris sharing a meal with her dog solidified just how linked we are to these animals.