How did we lose the ability to make vitamin C? Well, it turns out that we do have all of the genes that are necessary for vitamin C synthesis, but one of them is broken, mutated to the point of being nonfunctional. The broken gene, known as GULO, codes for an enzyme that is responsible for a key step in the manufacture of vitamin C. Somewhere in the ancestors of primates, the GULO gene suffered a mutation, rendering it inoperable, and then random mutation continued, littering the gene with tiny errors. As if to mock the uselessness of pieces of DNA like this, scientists call them pseudogenes.
We can still easily recognize the GULO gene in the human genome. It’s there, and the vast majority of the code is the same as in other animals, but there are a few key parts that have been mutated. It’s as if you removed the spark plug from a car. It’s still a car. You can easily see that it is still a car. In fact, you would have to look very carefully to find anything wrong with it at all. But it cannot function as a car, not even slightly. It’s totally inoperable, even though most of it is still exactly the way it was before it was broken.
That’s what happened to the GULO gene, way back in prehistory. The spark plug was removed by a random mutation. Over the course of evolutionary time, random mutations like this occur constantly. Often, they are of no consequence, but sometimes they strike right in a gene. When that happens, it is almost always bad, because the mutation usually disrupts the functioning of the gene. In such cases, the individuals in whose genomes these mutations occurred are a little worse off—or a lot worse off, if the change brings on a deadly genetic condition like sickle cell anemia or cystic fibrosis.
Often, the deadliest of these mutations are eliminated from the population when the people who carry them die. This raises the question: Why wasn’t the GULO gene mutation eliminated? Scurvy is fatal. The consequences of this mutation ought to have been quick and harsh and should have prevented the harmful error from spreading throughout the species.
Well, maybe not. What if this disrupting mutation happened in a primate who, purely by chance, already had lots of vitamin C in her diet? For her, there would be no consequence of losing the ability to make vitamin C since she already ate foods that contained it. (What foods contain a lot of vitamin C? Citrus fruits. And where do citrus fruits mostly grow? Tropical rainforests. And where do most primates live? Bingo.)
The reason that the ancestors of primates could tolerate a mutation in the GULO gene was that, with plenty of vitamin C in their diets anyway, scurvy wasn’t an issue. Since that time, primates—with the exception of humans—have pretty much stuck to rainforest climates. This preferred habitat is both a cause and a consequence of their inability to make vitamin C. After all, while it’s easy to break a gene by mutation, it’s much more difficult to fix it. It’s like slamming the computer when it’s not working right. Sure, you might fix it, but more likely, you’ll harm it.
Primates aren’t unique in having a screwed-up GULO gene. A few other animals have one as well. Not surprisingly, the ones that tolerate the broken gene are the ones that get plenty of vitamin C in their diets. Take fruit bats, for example. They eat, um, fruit.
Interestingly, our bodies, like those of other animals that have lost the ability to make vitamin C, have attempted to compensate by increasing dietary absorption of it. Animals that make their own vitamin C are typically very poor at absorbing it from their food because they just don’t need it; humans, however, absorb dietary vitamin C at a much higher rate. But even though we have learned to eat food with ample vitamin C and even though our bodies are better at extracting this micronutrient from food, we have not managed to fully compensate for the malfunction. It’s still a very poor design. In the days before fresh food from faraway places was readily available to people, scurvy was a common and often deadly disease.
Other essential vitamins can give us just as much trouble as vitamin C. Take vitamin D. The commonly ingested form of vitamin D is not fully active, which means that we can’t use it until it’s processed in the liver and kidneys. The precursor of the vitamin is also generated in the skin, provided the individual gets enough sunlight, but it still needs to be processed into the active form. Without enough dietary vitamin D or enough sunlight, young humans can develop a disease called rickets, and older humans can develop osteomalacia, a softening of the bones. Rickets is extremely painful and leads to weak bones that break easily and heal slowly and, in severe cases, to stunted growth and skeletal deformities.
Both of these conditions involve brittle and deformed bones, which can be extremely painful. Humans need calcium to keep bones strong, and we need vitamin D to help absorb calcium from food. We could eat all the calcium in the world, and none of it would be absorbed without sufficient vitamin D. (This is why vitamin D is commonly added to milk: it helps our bodies absorb the calcium that the milk contains.)
[Figure: The effects of vitamin D deficiency on leg bones, a condition called rickets. Humans have trouble absorbing vitamin D from our diet; instead, our bodies require exposure to direct sunlight in order to synthesize it. If we fail to get enough vitamin D as children, the resulting skeletal deformities can last a lifetime.]
Rickets is a uniquely human disease for a variety of reasons. First of all, we’re the only species that wears clothing and usually lives indoors. Both of these factors reduce the amount of sunlight exposure of the skin, thus crippling the ability to make the precursor of vitamin D. It could be argued that this is not a problem of poor design, per se, but it’s certainly not good design. The complex multistep activation path for vitamin D is obnoxious enough, but requiring sunlight exposure to produce the precursor molecule adds another wrinkle—no pun intended—and is another way in which we can develop vitamin insufficiencies.
B Vitamins

| Vitamin | Alias | Food Source | Effects of Deficiency |
|---------|-------|-------------|-----------------------|
| B1 | Thiamine | Yeast, meat, cereals | Beriberi |
| B2 | Riboflavin | Dairy, eggs, liver, legumes, leafy greens, mushrooms | Ariboflavinosis |
| B3 | Niacin | Meat, fish, legumes, all grains except corn | Pellagra |
| B4 | Choline* | | |
| B5 | Pantothenic acid | Meat, dairy, legumes, whole grain cereals | Acne, paresthesia |
| B6 | Pyridoxine | Fish, organ meats, root vegetables, cereals | Dermal, neurological disturbances |
| B7 | Biotin | Most foods | Impaired neurological development |
| B8 | Inositol* | | |
| B9 | Folate | Leafy greens, fruits, nuts, seeds, beans, dairy, meat, seafood | Macrocytic anemia, birth defects |
| B10** | PABA | | |
| B11** | PHGA | | |
| B12 | Cobalamin | Most animal-derived foods | Macrocytic anemia |

*Incomplete consensus on naming/identity. No longer considered a vitamin.
**No longer considered a vitamin.

The B vitamins and their deficiencies. While few if any wild animals have to contend with these deficiencies, they have been a significant blight for humans, particularly since the advent of farming and food processing.
Second, due to modern lifestyles and diets, we don’t always consume enough vitamin D. While it is tempting to blame modern eating habits for dietary insufficiencies, modern habits are probably not the culprit in this case.
The innovations brought by civilization have reduced the incidence of rickets. To understand why, consider that, in order to get sufficient vitamin D in our diets, we need to eat at least some fish, meat, or eggs. Precivilization humans consumed few, if any, eggs. While meat and fish were staples, they were almost certainly not available on a steady basis. Prehistoric life was marked by periods of feast and famine, and we know from studying the bones of early humans that rickets and brittle bones were a constant problem. Not so for us modern humans in the developed world with our abundant sources of animal protein.
The domestication of animals for meat and eggs (roughly five thousand years ago in the Middle East and at different points elsewhere) mostly solved the problem of rickets. This is just one example of human ingenuity overcoming the design limitations of the human body—a theme we will encounter again and again in this book.
What about the many other vitamins listed on that bottle of multivitamins? Many of them fall into the family of B vitamins. There are eight different B vitamins, which often go by other names, such as niacin, biotin, riboflavin, and folic acid (or folate). Each of these vitamins is required for various chemical reactions throughout the body, and each has its own syndrome associated with insufficiency.
One of the most well-known B vitamin–deficiency syndromes results from not having enough vitamin B12, also called cobalamin. This vitamin is familiar to long-term vegans because B12 deficiency is a problem they invariably have to face; it leads to anemia. Humans cannot make their own vitamin B12, and, since plants have no need for this vitamin, they do not produce it, so the only dietary sources are meat, dairy, seafood, arthropods, other animal-derived foods, and vitamin supplements. Vegans, take note: you need these pills.
But what about vegetarian animals? There are many animals that eat only plants, but if plants don’t have any B12 and all animals need B12 to survive, how do cows, sheep, horses, and the thousands of other herbivorous animals avoid anemia? The answer is that they make it—or, rather, the bacteria in their large intestines make B12 for them.
You probably already know that the large intestines of mammals are chock-full of bacteria. Because bacteria are so much smaller than animal cells, there are more individual bacterial cells in your colon than there are human cells in your entire body. That’s right—the bacteria that live in your body outnumber your own cells! They also do some important things for you. Vitamin K, for example, is made by the bacteria in the gut, and we simply absorb it from there. You don’t need supplements or foods that contain it as long as you have the bacteria that produce it in your gut.
Just like vitamin K, vitamin B12 is made by our intestinal bacteria—yet we need to get more vitamin B12 in our diets. Why is that?
Here is the design flaw: Bacteria make B12 in the large intestine, the colon, but we can’t absorb it from there. We absorb B12 in the small intestine, which comes before the large intestine in the flow of traffic within the digestive system. So the wonderful bacteria of the human gut are nice enough to provide B12 for us, but the gut is so poorly designed that we send all of that B12 to the toilet. (And, yes, in case you are wondering, you could eat your feces to get the B12 you need, but I hope you will never be that desperate.) The bad plumbing job of our intestines has rendered B12 an essential dietary vitamin for humans while all the millions of herbivore animals are blissfully unbothered by any need to find and eat this molecule.
The next most famous B vitamin–deficiency syndrome is beriberi, caused by a lack of vitamin B1, also known as thiamine. Thiamine is required for a variety of chemical reactions in the body, the most important of which is converting carbohydrates and fats into usable energy. As a result of thiamine insufficiency, people can suffer nerve damage, muscle weakness, and heart failure.
Incredibly, despite this vitamin’s importance, we cannot make it ourselves. Like B12, vitamin B1 must come from our diets. Also like B12, B1 can’t be made by any animal. Only bacteria, most plants, and some fungi can make it, so at least we share this flaw with all our fellow animals. Except animals never get beriberi and humans have suffered massively from it. In fact, during the sixteenth and seventeenth centuries, it is estimated that beriberi was second only to smallpox in causing human death. Why only us?
The reason that other animals don’t suffer from beriberi is that B1 is abundant in a wide variety of plant foods found at the base of most food chains. In the oceans, many of the photosynthetic bacteria and protists found in plankton make B1, and it proceeds up the food chain from there. Filter-feeding plankton-eaters, like the massive blue whale, get it directly, but carnivorous fish and mammals often eat things that eat things that eat plankton. In any case, B1 makes the rounds. The same is true on land: many land plants are rich in B1, meeting the dietary needs of herbivores, which are in turn eaten by carnivores and then by apex predators, among them humans, though we also eat plants, of course.
So why do humans struggle with beriberi when no other animals do? The answer seems to lie in how we prepare our food.
As humans invented and refined agriculture, they began to process foods in various ways to make them taste better and last longer without spoiling or becoming unpalatable. Often, these methods stripped many nutrients out of food.
For reasons that are not always understood, nutrients are not spread evenly throughout a plant. For example, the skins of potatoes and apples are where most of their vitamins A and C are located, so peeling them can rob them of most of their nutrients.
This can be seen acutely in the removal of rice husks. Unrefined rice, or brown rice, is rich in B1. Refining raw rice, also called polishing, allows the rice to be dried and stored safely for years, and this agricultural innovation made a huge difference in preventing famine, especially in Asian populations, where rice is a staple. However, rice polishing removes essentially all the vitamin B1. This was not a problem for the wealthy elite in Asian cultures, since the B1-rich meat and vegetables that they ate supplemented the B1-poor rice. However, for the vast majority of Asian people, beriberi was an endemic condition for thousands of years. It is still a concern in poor remote villages.
The scourge of beriberi may not technically be an example of poor human design since it has plagued us only since the dawn of civilization and is due to our own innovations. However, it is an example of how our evolutionary limitations can be exacerbated—or ameliorated—as we continue to develop as a species. Were it not for human innovations in agriculture and horticulture, civilization would not be possible in the first place. The same technology that led to high rates of beriberi allowed our species to advance past the hunter-gatherer lifestyle. Civilization enabled humans to lead healthier lives in a variety of ways, as evidenced by the explosion of the human population. Beriberi was a tradeoff our ancestors made unknowingly, because they didn’t realize that their bodies could not produce a simple molecule required for the most basic chemical function: converting dietary calories into usable energy. So you could say that one cost of technology and civilization is beriberi.
To be sure, making our own vitamins is complicated and labor-intensive. Vitamins are complex organic molecules, many of them bearing striking and distinct structures not closely related to those of other molecules. To produce them, the body must have an elaborate pathway of enzyme-catalyzed chemical reactions. Each of those enzymes must be encoded by a gene. Those genes must be maintained, copied faithfully each time a cell divides, translated into proteins, and then regulated to match supply with demand. In the grand scheme of metabolism, the number of calories that an organism spends on synthesizing necessary vitamins is small, but it is not zero.
Given all of that, it is somewhat understandable why some organisms have forgone the making of their own vitamins and opted to obtain them from their diets instead. There is a certain logic to that; after all, why go to all of the trouble of making vitamin C when you already have it in your diet? Yet just because we don’t always need to make some essential vitamins doesn’t mean it’s a good idea to relinquish the ability to make them; doing so would be terribly shortsighted, since humans would be stuck with that dietary requirement forever. Once a gene is broken, it is hard to unbreak it.
That logic does not apply to the essential amino acids, whose simple structures are very easy for cells to fashion. And yet we still can’t make many of them either.
Acid Tests
Amino acids are about as different from vitamins as two types of organic molecules can be. All organisms use twenty different kinds of amino acids to build proteins. Humans have tens of thousands of different proteins in their bodies, all of which are made with the same twenty building blocks. All twenty amino acids are structurally similar, each being a slight variation of another. To build these twenty amino acids, therefore, we don’t need twenty separate pathways. Sometimes only a single chemical reaction is needed to change one amino acid into another. This is a far cry from the contortions the human body must go through to create different types of vitamins, and the uses for amino acids are much more varied than those for vitamins.
Nevertheless, we cannot make some of the amino acids for ourselves and must get them from our diets. In fact, nine of the twenty amino acids are called essential because we have lost the ability to manufacture them. I say that we have lost that ability because, as we look back through evolutionary time, we find ancestors who could make some or all of them. A wide swath of unrelated microorganism species (bacteria, archaea, fungi, and protists) can synthesize all twenty amino acids as well as the components needed for DNA, lipids, and complex carbohydrates. These extremely self-sufficient organisms can get by on just a simple carbon-based energy source, such as glucose, and a little nitrogen in the form of ammonia.
It’s not just microorganisms that can make all their own amino acids either. Most plant species are capable of synthesizing all twenty amino acids. As a matter of fact, plants are even more self-reliant than most microorganisms because they can synthesize their energy source themselves, too, using energy from the sun. Given a simple balanced soil that contains usable nitrogen, many plants can live without any form of supplementation whatsoever. Plants don’t eat anything; they make all of their own food internally. This remarkable self-sufficiency means that plants don’t really require any other organisms, at least not for short-term day-to-day needs. This helps to explain how plants thrived on dry land for a hundred million years, growing into impenetrably thick forests, before animals emerged from the oceans and began eating them.
Human Errors Page 5