Human Errors

by Nathan H. Lents


  Animals are the exact opposite of self-sufficient. They must constantly eat other living things in order to survive. They can eat plants, algae, or plankton, or they can eat other animals that eat those things. Either way, animals must get all of their energy from organic molecules made by other living things since they cannot harvest solar energy themselves.

  Since humans have to eat other living things anyway, we’ve gotten a bit lazy. While we eat plants and other animals mainly for their energy, consuming them also brings us all the proteins, fats, sugars, and even vitamins and minerals that those living things have in their bodies. We’re not getting just energy when we eat, in other words; we’re also getting various organic building blocks. This frees us from having to constantly make those molecules ourselves. If you are provided with a nice serving of the amino acid lysine, for example, every time you eat, why should you bother to spend energy making it?

  Of course, each plant and animal has different amounts and combinations of amino acids. If we stop making lysine ourselves, we might be okay on a diet of fish and crabs (which are high in lysine), but a diet of berries and insects (low-lysine foods) would hurt us. That’s the problem with discarding the ability to make certain nutrients. In order to save a few calories of energy, we lock ourselves into certain diets or lifestyles that we cannot change on pain of death. This is a dangerous game because the world is in a continuous state of flux. Every single geographic location and microenvironment has seen its share of upheavals, fluctuations, and catastrophes. The only constant in life is change.

  Yet evolution has made these shortsighted tradeoffs in humans again and again. Our species has lost the ability to make nine of the twenty amino acids. Each loss is the result of at least one mutational event, usually more. Mutations happen to individuals randomly, of course; they become fixed in the population either by pure chance or because they offer some sort of distinct advantage. In the case of the mutations that destroyed our ability to make amino acids, it was probably a matter of chance.

  When humans lost the ability to synthesize several amino acids, they gained nothing but the risk of debilitating, even deadly, dietary deficiencies—so how were these mutations not quickly eliminated when they occurred? Because our species’ diets compensated for this loss, just like we saw with vitamin C. A diet with at least occasional meat or dairy consumption usually provides enough of each essential amino acid. Plant-based diets, however, need to be planned more carefully because different kinds of plants provide different ratios of the twenty amino acids. As a result, variety is the easiest way for vegetarians and vegans to ensure that they get enough of all the amino acids they need.

  In the developed world, it is not difficult for a vegan to acquire all nine essential amino acids. A single serving of rice and beans can give a whole day’s supply, provided that the rice is unrefined and the beans are of the black, red, or kidney variety. Further, chickpeas, also called garbanzo beans, contain large quantities of all nine essential amino acids all by themselves, as do quinoa and a few other so-called superfoods.

  However, among the poor, especially in developing countries, a varied diet is not always an option. There are billions of humans who subsist on extremely simple diets consisting of just a few staples, and those staples often do not provide enough of some of the essential amino acids, especially lysine. In some remote Chinese villages, the poorest of the poor will live on nothing but rice and the occasional scrap of meat, egg, or bean curd. In the poorest parts of Africa, the most destitute subsist on diets composed almost entirely of wheat products, and even those become scarce during famines. Unsurprisingly, given examples like these, protein deficiency is the single most life-threatening dietary problem in the developing world. This problem stems directly from our species’ inability to make certain amino acids.

  The problem of amino acid deficiency is not unique to the modern world by any means. Preindustrial humanity probably dealt with protein and amino acid insufficiency on a regular basis. Sure, large hunted animals such as mammoths provided protein and amino acids aplenty. However, living off big game in the era before refrigeration meant humans had to endure alternating periods of feast and famine. Droughts, forest fires, superstorms, and ice ages led to long stretches of difficult conditions, and starvation was a constant threat. The human inability to synthesize such basic things as amino acids certainly exacerbated those crises and made surviving on whatever was available that much harder. During a famine, it’s not the lack of calories that is the ultimate cause of death; it’s the lack of proteins and the essential amino acids they provide.

  Amino acids are not the only basic biomolecules that humans and other animals have lost the ability to synthesize. Two other examples come from a group of molecules called fatty acids. These long hydrocarbons are the building blocks for fats and other lipids that the body needs, such as phospholipids, which help form the membranes that surround every single cell. It is hard to think of a more essential structure than the cell membrane. Yet one of the two fatty acids that we cannot produce (both of which have tongue-twister names) is linoleic acid, which forms part of the cell membrane. The other one, alpha-linolenic acid, is used to help regulate inflammation, another hugely important internal process.

  Luckily for us, modern human diets have provided these two essential fatty acids in sufficient quantities in the form of seeds, fish, and various vegetable oils. Fortunately, too, several studies have shown that frequent consumption of these fatty acids leads to improved cardiovascular health. But we weren’t always so lucky. In prehistory, especially in the era before agriculture, human diets tended to be much simpler. Roving bands ate what they could find, doing their best to follow the food. Most of the time, these fatty acids were probably available, but there can be little doubt that periods of deficiencies existed as well. Sometimes grass, bugs, leaves, and the occasional berry were all that could be found. Just as with essential amino acids that we cannot synthesize ourselves, losing the supply of two important fatty acids would make any food crisis that much worse.

  What is most maddening about these two fatty acids is that they can be made pretty easily. Our cells can synthesize a whole host of lipid molecules, many of them quite a bit more complex than linoleic acid and alpha-linolenic acid. In fact, we make many very elaborate lipids from these simple ones—yet we cannot make these two themselves. The enzymes necessary to produce these particular fatty acids exist in many organisms on earth, but humans aren’t one of them.

  The human body, like the bodies of all animals, takes in plant or animal tissue, mashes it up, absorbs the small constituents, and uses the little bits to build its own molecules, cells, and tissues. However, there are gaps in this scheme. There are several molecules that are crucial for human health that we are incapable of making, so we have no choice but to seek them out in our food. The fact that we need to find these essential nutrients places restrictions on where and how humans can live. And that’s just organic nutrients. The human body is also terrible at obtaining the inorganic kind—known as minerals—even when they are right there in what we eat.

  Heavy-Metal Machines

  For squishy, water-based creatures, we humans sure do need a lot of metal in our diets. There are all kinds of metals—known as essential minerals—that we have to eat. Metal ions are single atoms, not complex molecules, and they cannot be synthesized by any living thing. They must be ingested in food or water, and the list of ions that are essential for us includes cobalt, copper, iron, chromium, nickel, zinc, and molybdenum. Even magnesium, potassium, calcium, and sodium are technically metals, and we need substantial amounts of these minerals daily too.

  We don’t think of these minerals as metallic because we don’t consume or use them in their elemental forms. Instead, cells use metals in their water-soluble, ionized forms. To appreciate the stark difference, consider sodium.

  Sodium, as it appears on the periodic table in its elemental form, is a metal so reactive that it catches fire if it comes in contact with water. It’s highly toxic; a tiny amount could kill a large animal. However, when we remove a single electron from a sodium atom, turning it into an ion, it has completely different properties. Ionized sodium is more than simply harmless; it is essential for all living cells. It combines with chloride ions to form table salt. For all intents and purposes, elemental sodium (Na) is a completely different substance than ionized sodium (Na+).

  While sodium and potassium are inarguably among the most important of the metal ions (in the sense that no cells can function without them), humans almost never have a chronic lack of these minerals in their diets. All living things have these two ions in relative abundance, so whether you’re paleo, vegan, or something in between, you will get the sodium and potassium that you need. Acute deficiencies of sodium or potassium can be an urgent problem, but they are usually the result of physiological dysfunction, fasting, severe dehydration, or some other short-term insult.

  For other essential ions, the story is different. If you aren’t eating ’em, you aren’t getting enough of them, and you’ll suffer chronic illness as a result. Inadequate calcium intake, for example, is a problem throughout the world, affecting both rich and poor. Calcium insufficiency is one of the most frustrating dietary problems from a design standpoint because lack of calcium stems from our species’ poor ability to absorb it rather than from not having enough of it in our food. We all eat plenty of calcium; we just aren’t very good at extracting it from food. As already mentioned, vitamin D is required for calcium absorption, so if you’re deficient in vitamin D, all the dietary calcium in the world cannot help you because it will pass right through your gut unabsorbed.

  Even if we have plenty of vitamin D, we still aren’t very good at absorbing calcium, and we get worse and worse at it as we age. While infants can absorb a respectable 60 percent of the calcium they consume, adults can hope to absorb only around 20 percent, and by retirement age, that drops to 10 percent or even lower. Our intestines are so bad at extracting calcium from food that our bodies are forced to extract it from our bones instead—a strategy with devastating consequences. Without constant calcium and vitamin D supplementation, most people would develop the brittle bones of osteoporosis in their golden years.

  In prehistoric times, few humans lived beyond thirty or forty, so you might think that calcium deficiency was not such a problem for our ancestors. Yet even so, the majority of ancient skeletal remains show the telltale signs of calcium and vitamin D deficiency, and they appear more drastically—and in younger people—than we typically see today.

  So osteoporosis and the calcium shortages that can create it are definitely not new problems. Neither is the difficulty humans often encounter with getting enough of another vital mineral: iron.

  Iron is the most abundant transition metal (the class of metals occupying the huge center section of the periodic table and known for conducting electricity well) in our bodies and in the earth. As with the other metals, we make use of ionized iron atoms, not the elemental metal form. Most of that elemental stuff sank to the core of the earth shortly after it formed; what we have here on the surface is mostly the ions lacking two or three electrons. In fact, the ease with which iron can switch among these different ionized states is the secret behind its special utility in our cells.

  The most commonly known role of iron is in the functioning of hemoglobin, the protein that transports oxygen throughout our bodies. Red blood cells are absolutely packed with this protein, each molecule of which needs four iron atoms. In fact, the iron atoms in hemoglobin are what give it its characteristic red color (which means your blood and the surface of Mars have more in common than you might think). Iron is also vital for other crucial functions, including the harvesting of energy from food.

  Despite the fact that there is plenty of iron in our bodies, our environment, our earth, and our solar system, deficiencies in iron are among the most common diet-related ailments in humans. In fact, according to the Centers for Disease Control and Prevention (known as the CDC) and the World Health Organization (WHO), iron deficiency is the single most common nutritional deficiency in the United States and worldwide. That iron deficiency is pandemic in a world filled with iron is paradoxical, to say the least.

  The most acute problem caused by iron insufficiency is anemia, a word that loosely translates to “not enough blood.” Because iron is central to the hemoglobin molecule and hemoglobin is central to red blood cell structure and function, low iron levels impair the body’s ability to make functional red blood cells. The WHO estimates that 50 percent of pregnant women and 40 percent of preschool children are anemic due to iron deficiency. Current estimates are that two billion of the world’s seven billion people are at least mildly anemic. Millions die from the deficiency each year.

  Once again, poor design is mostly to blame for the body’s problems. To start with, the human gastrointestinal tract is terrible at extracting iron from plant sources.

  Plant- and animal-derived iron are structurally different things. In animals, iron is generally found in blood and muscle tissue, and it’s easy enough to process; humans usually have little trouble extracting iron from a nice hunk of steak. The iron in plants, however, is embedded in protein complexes that are much harder for the human gut to rip apart, and so they remain in the gastrointestinal tract and end up as waste, making iron consumption another concern for vegetarians. In this respect, humans are worse off than most animals. The majority of the creatures on earth are mostly or completely vegetarian, yet their intestines do just fine in processing iron.

  Additionally, there are many quirks of iron consumption that can further reduce its absorption. For instance, we absorb iron best when it comes together with something else we readily absorb—for example, vitamin C. Vegetarians use this trick to boost their iron absorption. By combining sources of iron with sources of vitamin C, they can ensure that their bodies are better able to absorb both. A large dose of vitamin C can increase iron absorption sixfold. Unfortunately, the opposite is also true; a diet poor in vitamin C makes iron absorption more difficult, often leading to the double whammy of scurvy and anemia. Just imagine that combination. It’s bad enough that you are pale and lethargic, but you could also lose muscle tone and begin bleeding internally. Vegetarians in developed countries avoid this lethal trap because they have access to many foods that are high in both iron and vitamin C, such as broccoli, spinach, and bok choy. Poor people in the developing world are usually less fortunate, however, as those key foods are often precious and strictly seasonal.

  As if getting enough iron weren’t hard enough already, there are several other food molecules that actually interfere with iron absorption, particularly the iron in plants. Foods such as legumes, nuts, and berries—which we’re told to eat plenty of—contain polyphenols, which can reduce our ability to extract and absorb iron. Similarly, whole grains, nuts, and seeds are high in phytic acid, which tends to prevent iron from being absorbed by the small intestine. These complications are especially problematic for the two billion people on the planet who are at risk of anemia due to poverty, those for whom meat, and the iron in it, is a rare treat. Their diets tend to be high in the very foods that make iron extraction from plant sources even more difficult. While eating a varied diet is a good strategy to acquire all the elements we need, including iron, it must be carefully varied, in such a way that iron-rich foods are not paired with those that prevent iron extraction.

  Another dietary component that interferes with iron absorption is calcium, which can reduce iron absorption by up to 60 percent. Thus, foods rich in calcium, such as dairy, leafy greens, and beans, should be consumed separately from foods rich in iron in order to maximize absorption, especially if the source of the precious iron in question is plant-based. If you go to the trouble of eating iron-rich foods but pair them with calcium-rich foods, you’ve negated your efforts. It’s not enough to eat the right foods to meet our exacting dietary needs; we must eat those foods in the correct combinations. It’s no wonder so many of us opt for a multivitamin instead.

  Iron deficiency is yet another example of how our species’ prehistoric diet was even more insufficient than our modern diet. Although meat and fish were likely staples of the early human diet, their availability waxed and waned both seasonally and through long periods of feast or famine, and iron was particularly hard to come by in landlocked communities that relied on meat alone. Before agriculture, the available food plants were nothing like the food people are accustomed to eating now. Fruits were tiny and bland, vegetables were bitter and mealy, nuts were hard and tasteless, and grains were tough and fibrous. Worse, plants that reduced iron absorption were more common than plants that provided iron.

  While it’s not all that hard to get enough iron from a vegetarian diet nowadays, it would have been nearly impossible during the Stone Age. Most prehistoric humans would have suffered from severe anemia whenever meat was scarce. This is at least part of the reason why migrations of preagricultural human communities largely followed coastlines or other bodies of water: fish were a more reliable source of iron than meat.

  You may be wondering how, if anemia is such a lethal and constant danger, humans survived at all. We almost didn’t. Our species teetered on the verge of extinction throughout much of prehistory. Over the past two million years, several species of hominids have come and gone, with all but one lineage ending in extinction. At certain points in our species’ long journey, our ancestors were so few in number that they surely would have been classified as endangered by today’s standards. What’s more, none of these branches of hominids was more cognitively advanced than any other until very recently, so modern humans can’t credit their big brains with surviving each and every brush with extinction; it was probably blind luck that saved our forebears in more than a few instances. These near demises had a variety of causes, but iron-deficiency anemia was almost certainly among them.

 
