To add insult to injury, humans appear to be unique in the struggle to maintain healthy levels of iron. Neither rampant anemia nor iron deficiency has been documented in any successful species besides ours.
How, then, do other animals deal with the challenge of acquiring sufficient iron? After all, the need for this essential mineral is not uniquely human, and no other animals can produce it either. Surely evolution has devised solutions to this challenge somewhere along the line, even if not in humans?
The answer is complicated. First of all, aquatic animals, whether fish, amphibians, birds, mammals, or invertebrates, do not have the same challenge in getting iron because iron ions are plentiful in both seawater and fresh water. These animals still need to extract iron ions from the water, of course, but finding it is a nonissue. Similarly, iron is abundant in rocks and soil, so plants get it easily.
It appears that herbivorous or mostly vegetarian animals are either better at incorporating abundant iron sources into their diets or better at extracting it from their food than we are. When these species experience famine, displacement, or other stressors, sure, iron deficiency is common, but that is an effect, not a cause. Humans are the only animals that seem to suffer from iron deficiency even when they’re doing fine otherwise.
The frustrating part is that we don’t completely understand why it’s so hard for us to get enough iron. Why are humans so bad at extracting iron from plant sources? Why are we so sensitive to the inadvertent pairing of iron-rich foods with those that inhibit the ability to extract it? These do seem to be uniquely human problems. It is possible that humans suffered one or more mutations in the genes that are responsible for iron absorption, and it just didn’t matter at the time because our ancestors had rich sources of animal-derived iron in their diets, probably fish or big game. That’s a plausible hypothesis, although it has not yet been proven.
Deficiencies of other heavy metals are much rarer than iron deficiency, mostly because we need so little of these other minerals. We need only the tiniest amounts of copper, zinc, cobalt, nickel, manganese, molybdenum, and a few others. In some cases, we can go for months or years at a time without ingesting those metals, relying solely on our internal reserves.
Nevertheless, trace amounts of these heavy metals are crucial, and a diet completely devoid of them would eventually be lethal. Was it an evolutionary error that made it so hard for humans to absorb them, or was it just a failure to adapt to this challenge? Is there a difference? There are plenty of microorganisms that simply have no need for many of these elements. In fact, there is no single trace metal that is required by all organisms. Put another way, for every one of these elements, there are organisms that have engineered their own molecules to perform those elements’ jobs. Humans haven’t done much of that and so we require a broad variety of trace metal ions.
Coda: Gorged to Death
For decades, the United States and other developed countries have been flooded with diet books. This reflects an ominous trend. Starvation used to be a serious threat to all humans; now, obesity is replacing it as a scourge in many parts of the world.
This flows directly from the shortsighted way that evolution programmed our bodies. As many of these books note, we are hardwired for obesity. Yet most popular explanations of how and why things have gone wrong miss the evolutionary lesson at the heart of this growing problem.
Virtually every human being loves to eat. Most people are constantly craving food, whether they are really hungry or not, and cravings are usually for high-fat, high-sugar foods. But most of the foods that supply essential vitamins and minerals—from fruits to fish to leafy greens—are not high in sugar or fat. (When was the last time you had an intense craving for broccoli?) So why do our instincts drive us to high-calorie foods no matter how well fed we are?
Because obesity has been on a steep and steady rise and was not a major health concern until the past century or two, it is tempting to consider it a problem of modernity, not biology. But while it’s true that modern lifestyles and eating habits are to blame for the current rate of obesity in the developed world, that is putting the cart before the horse. People don’t eat too much just because they can. They eat too much because they were designed to. The question is, Why?
Humans are not unique in being gluttonous. If you have dogs or cats, you have undoubtedly noticed that their appetites seem to be insatiable. They always want more treats, more scraps, more food, and they will beg more persistently for rich and savory foods than for, say, salad. In fact, our companion animals are just as prone to obesity as we are. If we are not careful about how much food we give them, they will become overweight quickly.
Scientists know that this is true for laboratory animals as well. Whether they are fish, frogs, mice, rats, monkeys, or rabbits, their food has to be limited or they will become overweight. The same is true at zoos. Animal handlers and veterinarians constantly monitor the weight and food rations of the captive animals so their health won’t suffer due to overconsumption.
The take-home point here is that all animals, including and especially humans, will become morbidly obese if left purely to their own devices. This is in direct contrast to what we see with animals in the wild, where obesity is rare, to say the least. Animals living in their natural habitats are almost always tight and trim—skinny, even.
It was once thought that the artificial environments of zoos, laboratories, and human homes were to blame for animals’ obesity. After all, animals have spent millions of years adapting to their wild natural habitats, and the artificial ones are just no substitute. Perhaps the stress of captivity causes nervous overeating. Or maybe the comparatively sedentary lifestyle throws the metabolic dynamics out of balance.
While these are reasonable hypotheses, they’ve been tested over the years and don’t seem to be the main explanations for captive animals’ obesity. Captive animals that exercise still need their food rationed. They will still become obese if they’re provided with too much.
So why don’t we find obese animals in the wild? The answer—a rather disturbing one—is that most wild animals are teetering on the edge of starvation pretty much all the time. They live their lives in a state of constant hunger. Even animals that hibernate and spend half the year bingeing are still relentlessly hungry. Surviving in the wild is a brutal business and a perpetual struggle. Different species of animals are in constant competition with one another for scarce resources, and there is simply never enough food. This scarcity of food is a biological constant for all animals—except modern humans.
For much of the twentieth century, it was thought that modern lifestyles and conveniences were to blame for the emergence of obesity. Desk jobs had begun to replace manual labor, and radio and television were replacing sports and other forms of physical recreation. The thinking was that previous generations had been much more physically active in both their livelihoods and their entertainment. The increasingly sedentary lifestyle and the transition away from physical toil was to blame for bulging waistlines. By this reasoning, obesity is not the result of a design flaw; it’s the result of poor lifestyle.
Although this may seem to make sense, it is not the whole story. For one thing, people who do make their living through physical toil are not in any way immune to obesity. The opposite is true, in fact, as both obesity and physical labor correlate with lower income. Second, children who spend more time engaging in physical play than indoor play are no less likely to develop obesity as adults. Again, the opposite tends to be true: people who are active athletes throughout their childhood, adolescence, and even into adulthood are more likely to be obese in their thirties, forties, and fifties, particularly when their physical activity wanes. It isn't lifestyle; it's the overconsumption of calorie-rich foods that seems to be the main cause of obesity.
This explains why, unfortunately, exercise alone rarely leads to long-term weight loss. In fact, exercise may do more harm than good. Intense exercise leads to intense hunger, which in turn leads to poor diet choices and chips away at the mental resolve to lose weight. Each time someone slips on a diet, he gets closer to just giving up altogether.
The hard truth is that humans in the developed world are surrounded with high-calorie foods that they are ill equipped to resist. For most of our species’ history, this just hasn’t been something anyone needed to worry about. Until the past couple of hundred years, most people didn’t have access to diets rich in meat and sweets. It was the industrial revolution that began to bring rich diets to the masses. Before that, stoutness in a man and plumpness in a woman were signs of wealth, power, and privilege, and the commoners were, like animals in the wild, prone to constant hunger.
Overeating was a fine strategy when it wasn’t possible to do very often. But when people can pack it in three or four times a day, day after day, their feeble willpower doesn’t stand a chance of moderating intake to prevent unhealthy weight gain. Human psychology is no match for human physiology. This is why people too often treat every meal as if it were the last one before a long winter, as if they’re gorging themselves in anticipation of barely being able to find any food at all.
It gets worse. As recent studies have shown, our bodies adjust our metabolic rates so we gain weight easily and lose weight only with difficulty. Those who have struggled with their weight will tell you that weeks of dieting and exercise often result in negligible weight loss, while a weekend calorie binge can pile on a few pounds almost instantly. Thus, obesity and type 2 diabetes are the quintessential evolutionary mismatch diseases, conditions that directly result from humans living in a very different environment than the one they evolved in.*
Thanks to modern food supplies, people in the developed world will probably never need to worry about scurvy, beriberi, rickets, or pellagra. Obesity, however, will be a constant challenge to their willpower and habits. There is no quick fix. This fatalistic truth is reminiscent of the next categories of flaws we will explore—the flaws in our genomes.
3
Junk in the Genome
Why humans have almost as many broken, nonfunctional genes as functioning ones; why our DNA contains millions of virus carcasses from past infections; why a bizarre self-copying piece of DNA makes up over 10 percent of the genome; and more
You may have heard that humans use only 10 percent of their brains. This is a total myth; humans use every lobe, fold, and nook of their neural tissues. While some regions specialize in certain functions—speech, for instance, or movement—and rev up their activity when performing them, the whole brain is active pretty much all of the time. There is no part of the brain, no matter how tiny, that can be deactivated or removed without serious consequences.
Human DNA, however, is a different case altogether. There are vast expanses of our genomes—the entirety of the DNA we carry in each of our cells—that do not have any detectable function. This unused genetic material was once referred to as junk DNA to reflect its presumed uselessness, although this term has fallen out of favor with some scientists, as they have discovered functions for some parts of this “junk.” Indeed, it may very well turn out that a large portion of so-called junk DNA actually serves some purpose.
Regardless of how much junk our genomes contain, however, it is undeniable that we all carry around a whole lot of nonfunctional DNA. This chapter tells the story of that true genetic junk: the broken genes, viral byproducts, pointless copies, and worthless code that clutter our cells.
Before we proceed, it is worth pausing for a quick refresher about the basics of human genetics. Almost every one of your cells, whether it’s a skin cell, a muscle cell, a neuron, or any other type of cell, has within it a core structure called a nucleus that contains a copy of your entire genetic blueprint. This blueprint—much of which is illegible, as we’ll see—is your genome, and it is composed of a type of molecule called deoxyribonucleic acid, better known as DNA.
DNA is a double-stranded molecule that looks like a very long twisted ladder, and the genetic information it contains is written in pairings of other, smaller molecules called nucleotides. Think of these nucleotide pairs as the rungs of this metaphorical ladder. Every rung has two halves, each of which is a nucleotide molecule that's attached to one of the two sides of the ladder. These nucleotides come in four flavors, abbreviated A, C, G, and T; A can pair only with T, and C can pair only with G. These pairings are known as base pairs, and they are what make DNA such an incredibly effective carrier of genetic information.
If you look along one side of the ladder that is your DNA, you can see any combination of the four nucleotide letters. Let’s say you are looking at five rungs, and on one side you see the letters A, C, G, A, and T. Because the rung pairs can only put A with T and C with G, you can be confident that if you moved around to the opposite side of the ladder and looked at the other half of these same rungs, you would see a mirror image of the code on the other side (with the sequence reversed): A, T, C, G, and T.
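The pairing-and-reversing rule described above can be captured in a few lines of code. Here is a minimal sketch in Python (the dictionary and function names are my own, purely illustrative):

```python
# The A-T / C-G pairing rule: given the letters on one side of the
# ladder, reconstruct what you would see from the opposite side,
# reading in the opposite direction (the "reverse complement").
PARTNER = {"A": "T", "T": "A", "C": "G", "G": "C"}

def opposite_side(strand: str) -> str:
    """Pair every letter with its partner, then reverse the order,
    as if walking around to the other side of the ladder."""
    return "".join(PARTNER[letter] for letter in reversed(strand))

print(opposite_side("ACGAT"))  # prints ATCGT, matching the example above
```

Because the mapping is its own inverse, applying the function twice returns the original strand, which is exactly why each half of a split ladder carries the full information needed to rebuild the other.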
This is a simple but ingenious form of information coding, especially because it makes it very easy for genetic material to be copied again and again. After all, you could rip your entire, massively long ladder down the middle, splitting every rung in half, and each half would essentially contain the same information. This is precisely what the cell does in order to copy the DNA molecule prior to cell division, the basic process by which the body replaces old cells with new ones. So DNA’s ability to copy itself is not only a miraculous feat of evolutionary engineering but also the basis of our very existence.
So far, so good; DNA is a wonder of nature. But here's where it starts to look less wondrous. The ladder of DNA that makes up your genome has billions of rungs, roughly 3 billion in total, composed of 6 billion letters. And a lot of those rungs are, for lack of a better word, unusable. Some of them are pure repetitive nonsense, as if someone had been pounding on a computer keyboard for hours, while other bits were formerly useful but became damaged and were never repaired.
If you read along the entirety of either side of the ladder of your DNA, you'd notice something strange. Your genes, those sections of the code that can actually accomplish something (cause the irises of your eyes to take on a certain color, say, or direct you to develop a nervous system), are only about 9,000 letters long on average, and you have only around 23,000 of these genes in total. That might seem like a lot, but in truth it's only a few hundred million letters of DNA—a couple hundred million rungs out of the roughly 3 billion that make up your genome.
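The tally is easy to check. This back-of-envelope sketch uses the chapter's rough averages, which are approximations rather than exact counts:

```python
# Rough gene tally: both figures below are approximate averages.
gene_count = 23_000        # approximate number of human genes
avg_gene_length = 9_000    # approximate letters per gene

total_gene_letters = gene_count * avg_gene_length
print(f"{total_gene_letters:,}")  # 207,000,000 -- a couple hundred million
```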
What are all those other rungs doing if they’re not part of your genes? The short answer is: nothing.
To understand how this could be, let’s adopt a new analogy. Let’s call genes words, a string of letters of DNA that add up to something meaningful. In the “book” that is your genome, the spaces between these words are filled with incredibly long stretches of gibberish. All told, only 3 percent of the letters in your DNA are part of words; most of the remaining 97 percent is gobbledygook.
You don’t have just one long ladder of DNA. Each cell has forty-six of them, called chromosomes, and they can actually be seen under a regular microscope in the moments that a cell is dividing. (The exceptions are sperm and egg cells, which have only twenty-three chromosomes each.) When cells aren’t dividing, however, all of the chromosomes are relaxed and mushed in together like a big bowl of forty-six tangled spaghetti noodles. The chromosomes vary in length, from chromosome 1, with two hundred and fifty million of these rungs, to chromosome 21, with just forty-eight million rungs.
While some chromosomes exhibit a fairly high ratio of useful DNA to junk, others are littered with repetitive, unused DNA. For example, chromosome 19 is fairly compact, with over fourteen hundred genes packed into fifty-nine million letters. At the other extreme is chromosome 4, which is three times the size of chromosome 19 but has around half the number of genes. Its functional genes are few and far between, like small islands surrounded by vast, empty oceans.
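The contrast in gene density is easy to quantify. This quick sketch again uses the chapter's rough figures (approximate counts, not exact ones):

```python
# Approximate gene densities, in genes per million letters of DNA.
chr19_genes, chr19_letters = 1_400, 59_000_000
chr4_genes, chr4_letters = 700, 3 * 59_000_000  # half the genes, three times the space

density_19 = chr19_genes / (chr19_letters / 1_000_000)  # ~23.7 genes per million letters
density_4 = chr4_genes / (chr4_letters / 1_000_000)     # ~4.0 genes per million letters
print(round(density_19 / density_4))  # chromosome 19 is ~6 times denser
```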
In this regard, the human genome mirrors those of other mammals, and all mammals have around the same number of genes, roughly twenty-three thousand. Although some mammals have as few as twenty thousand and others have as many as twenty-five thousand, this is still a relatively tight range—and one that is particularly surprising, given that the mammalian lineage is around two hundred million years old. It is quite remarkable that, even though humans have been evolving separately from some mammals for nearly two hundred million years, we all have a similar number of functional genes. In fact, humans have roughly the same number of genes as microscopic roundworms, whose bodies are built from only about a thousand cells. Just saying.
While relatively sparse, operational genes do a lot of work. To express a gene, the cell unzips the DNA ladder along that stretch, exposing the letter nucleotides on either side. The gene's sequence of letters is then copied into something called mRNA (m for "messenger," and RNA for "ribonucleic acid"), which in turn directs the construction of a protein that can travel around cells and help everything happen—things like growing and staying alive.
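The first step of that copying process can be sketched as a simple letter substitution: in RNA, the letter U takes the place of T. This is a deliberate simplification that ignores the complementary-strand mechanics of real transcription; the function name is my own:

```python
# Simplified transcription: an mRNA copy carries the same letter
# sequence as the gene's coding side, with U substituted for T.
def transcribe(gene_letters: str) -> str:
    return gene_letters.replace("T", "U")

print(transcribe("ACGAT"))  # prints ACGAU
```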
These twenty-three thousand genes that collectively make up 3 percent of the genome are a wonder of nature. Most of the other 97 percent of human DNA is more of a blunder—it does not seem to do very much. Some of it, indeed, is actually harmful.
Human Errors Page 7