The Secret History of Food


by Matt Siegel


  And the scary thing is, we can’t pump the brakes on any of this because we depend on corn as much as it depends on us—practically to the point at which we can’t reseed ourselves without it, either.

  Sure, most developed countries could probably swear off foods such as corn on the cob and canned corn and maybe even corn flakes and tortillas—but corn is a secret ingredient in almost everything we eat. So we’d have to swear off meat, fish, and poultry, too, as more than a third of our corn supply is used for animal feed, which means we’d also have to give up anything involving eggs or dairy products.

  Thanks to the widespread use of corn-based sweeteners such as high-fructose corn syrup (the use of which peaked at 49.1 pounds per person annually in 1999 but has since dropped to a no less apocalyptic 31 pounds per person in 2019), we’d also have to cut out soft drinks, candy, condiments, breads, breakfast cereals (and marshmallows), chewing gum, snack foods, and baby food—or at least start buying better brands. Then there’s cornstarch and corn flour, pervasively found in baking mixes, instant foods, fried foods (in the breading and batter), frozen foods, and certain pan coatings.

  We’d have to forget about Mexican food as a whole, as well as a lot of gluten-free products, which rely on corn to mimic the viscoelasticity of gluten, and a lot of beers, whiskeys, gins, and vodkas made from various fermented corn products. Not to mention any Chinese food that uses cornstarch as a thickener or nonfermented soy sauce made from hydrolyzed vegetable protein, corn syrup, and caramel color.

  Oh, and anything that contains baking powder, caramel, cellulose, citric acid, dextrin, dextrose, inositol, malt, maltodextrin, monosodium glutamate (MSG), semolina, sodium erythorbate, sorbitol, starch, vanilla extract, xanthan gum, and xylitol. Note that corn isn’t always present in this group, but it can be. Nor is this even close to an exhaustive list, so this doesn’t constitute medical advice if you’re allergic to corn—and if you are allergic, good freaking luck, as the legislation requiring food manufacturers to disclose the presence of potential food allergens like milk, eggs, fish, shellfish, tree nuts, peanuts, wheat, and soybeans doesn’t apply to corn. And even if it did—well, we’ll get to that in chapter 10. (See here to find out why food labeling is a meaningless sham.)

  In fact, the average American consumes about three pounds of food containing corn or corn products every day, often unknowingly. Eat an apple, and you could be eating corn in the layer of food-grade wax that’s applied to apples to make them look pretty and prevent them from drying out. Eat a cheeseburger, and there’s a good chance you’re eating corn feed in the beef and cheese; corn flour in the bun; corn syrup in the ketchup and pickles; corn-based ethylene gas in the tomatoes (used to make them ripen quicker); corn-based dextrose (as a stabilizer) in the salt; and, if the meat was frozen, cornstarch in the coating that protected it from freezer burn. Not to mention, probably, a slew of corn-based binders, emulsifiers, colorants, flavorings, sweeteners, preservatives, anticoagulants, and hair removal agents. (You might remember headlines from 2011, when Taco Bell admitted its beef was only 88 percent beef, with much of the remaining 12 percent coming from corn by-products like maltodextrin, citric acid, and modified cornstarch.)

  And even if products containing these ingredients are labeled “grass fed,” that doesn’t mean the animals they came from weren’t also corn fed. In 2018, consumers filed a lawsuit against the makers of Kerrygold butter for deceptively claiming it was produced from the milk of “grass-fed cows,” when really the cows were also fed corn, soybeans, and other foods; however, a California federal district court dismissed the complaint, ruling that it was unreasonable for consumers to expect cows to eat only grass.

  Still, human consumption accounts for only around 10 percent of the corn supply, as it’s also an industrial ingredient in basically everything: adhesives, antibiotics, aspirin, ceiling tiles, chalk, cork, cosmetics, crayons, disinfectants, dry-cell batteries, engine fuel, fireworks, inks, plastics, rubber tires, soap, wallpaper, wallpaper glue; it’s probably in the paper and the adhesive binding of this book.

  So if you ordered that cheeseburger to go, corn was probably in the packaging, too, as well as in the paper and ink of your receipt; and if you cooked it at home on a grill, corn was in the charcoal briquettes and the match you used to ignite them.

  And we haven’t even covered ethanol, a corn-based biofuel that accounts for another few billion bushels, or roughly a third of the US crop yield. The Energy Independence and Security Act of 2007 mandated that transportation fuels in the United States had to be blended with a certain amount of biofuel to reduce our dependence on foreign oil—phasing up from 9 billion gallons in 2008 to 36 billion gallons by 2022. Ethanol accounts for about 94 percent of this biofuel, and about 98 percent of that ethanol comes from corn, so approximately 10 percent of the fuel you buy from most gas stations (which typically sell E10, a blend of 10 percent ethanol and 90 percent gasoline) is corn based.

  Add all this up, and we’re talking about trillions of dollars in investments we can’t walk away from: a global economy of farmers, ranchers, exporters, refiners, food and beverage manufacturers, and . . . whatever you call workers in the fertilizer industry; 1.1 billion metric tons of raw material annually used by everything from food labs to furniture makers; and a primary food source for 230 million people and approximately 20 billion feed animals (not including the innumerable insects, worms, and rodents that feed on corn crops without permission).

  And even if we could somehow walk away from corn without having to worry about things like foreign dependence, jobs, and logistics, policy makers would never let it happen thanks to the agricultural lobbyists who’ve spent more than $2.5 billion in the past two decades protecting the interests of farmers (interests that include the $5 billion in corn subsidies they receive annually) and farming by-products, not to mention food and beverage manufacturers, industrial manufacturers, pharmaceutical companies, ethanol producers, and so on.

  Consumers wouldn’t let it happen, either. When Coca-Cola infamously changed its formula in 1985 with the introduction of New Coke, consumers lost their shit, publicly emptying cans into sewer drains and overwhelming the company’s switchboard with more than forty thousand complaints. Fans threatened class action lawsuits and founded support groups such as the Society for the Preservation of the Real Thing and Old Cola Drinkers of America, the latter of which reportedly received up to 4,200 calls a day from outraged sympathizers. When the company finally gave in and went back to its original formula after just seventy-nine days, it was such a big deal that television stations interrupted General Hospital to cover the announcement live. Senator David Pryor of Arkansas called it a “decision of historical significance . . . proof that certain American institutions can never change.” (Meanwhile, the company had been secretly changing its formula all along, having spent the several years prior gradually replacing cane sugar with corn syrup.)

  And that was before the internet and social media.

  How would people today react if virtually every soft drink and snack food they were familiar with instantly vanished from shelves? Or if suddenly they had to eat 100 percent beef instead of 88 percent?

  What makes all of this more remarkable is that corn really isn’t even that ideal a food source, as it’s deficient in essential amino acids such as lysine and tryptophan, which the human body can’t make on its own, and contains a biologically unavailable form of niacin, which the human body can make, but not without tryptophan.

  This was another reason the Iroquois planted corn together with beans and squash: neither corn, beans, nor squash contains sufficient amounts of all nine amino acids essential for human life, but when eaten together, they provide a proper balance, similar to the way they balance the nutrients in soil. The Iroquois also cooked their corn in a mixture of water and ashes, which created an alkaline solution that chemically unlocked corn’s protein-bound niacin to make it more digestible—a process called nixtamalization, from the Nahuatl nextli (“ashes”) and tamalli (“masa”), also the source of the Spanish tamale.

  Again, no one knows how they knew to do this—though the process also made corn kernels easier to grind; released pectin from the cell walls, which made it easier to create a pliable dough for tortillas; and gave corn an earthier flavor and aroma by catalyzing certain volatile compounds. So those may have been the intended outcomes, making balanced amino acids just a happy accident.

  Regardless of their intent, the Iroquois were still way ahead of the rest of the world, as these processes enabled them to consume corn as a staple while still maintaining a balanced diet; meanwhile, the unfortunate world cultures that adopted corn without nixtamalization (including much of Europe, Africa, and the American South) faced deadly epidemics of niacin deficiency in the form of pellagra, so named in 1771 after the Italian pelle (“skin”) and agra (“rough”)—as, in addition to symptoms such as death, diarrhea, dementia, insomnia, aggression, and sensitivity to light, the condition also causes hyperkeratosis, or rough, scaly skin. These outbreaks persisted for hundreds of years before anyone made the connection to corn; in fact, medical knowledge was so far off that niacin deficiency symptoms might have been responsible for the mythology of vampires, reports of which started around the same time and in the same places as pellagra—according to some sources, within a year of each other.

  As Jeffrey and William Hampl explain in the Journal of the Royal Society of Medicine, the symptoms of pellagra and vampire folklore were remarkably similar:

  Just as vampires must avoid sunlight to maintain their strength and keep from decay, pellagrins are hypersensitive to sunlight, with the margins of their dermatitis sharply demarcated. Sun-exposed areas at first become red and thick with hyperkeratosis and scaling. This is followed by inflammation and oedema, which eventually leads to depigmented, shiny skin alternating with rough, brown, scaly areas. With repeated episodes of erythema, a pellagrin’s skin becomes paper-thin and assumes a parchment-like texture.

  Other pellagra symptoms include cracked red tongues and lips, easily mistaken for blood, and a rash or string of lesions on the neck (known as Casal’s necklace after the physician Gaspar Casal, who first recorded the symptom in 1735 in Spain), which might explain the myth about vampires having visible neck bites. And the simultaneous symptoms of dementia, insomnia, and aggression would explain the correlation between pellagra epidemics and reports of nighttime “vampire” attacks in eighteenth-century Europe, concentrated in areas such as Poland, Russia, and Macedonia (i.e., the general vicinity of Transylvania).

  And this isn’t that far-fetched. Look up photographs of pellagra sufferers, and ask yourself whether you’d try to run a stake through their hearts, too, if they stumbled into your house late at night.

  In 2017, researchers also discovered a link between corn-induced niacin deficiency and cannibalism in hamsters, which developed black tongues and signs of dementia. Not coincidentally, the study specifically followed European hamsters, which are critically endangered largely due to the regional domination of corn and subsequent lack of alternative, niacin-rich food sources in their natural habitats; in other words, they’re facing now what humans faced in the eighteenth century.

  Ultimately, it took scientists roughly two hundred years to identify niacin deficiency as the real culprit behind pellagra when, in 1937, the American biochemist Conrad Elvehjem figured out that niacin supplementation cured the disease in dogs—and later in humans. Some twenty years earlier, a physician named Joseph Goldberger had come close when he narrowed down the cause to diet and was able to cure afflicted orphans and prisoners in the southern United States by feeding them fresh foods (as opposed to just cornmeal, molasses, and fatback). He wasn’t able to identify corn or niacin in particular, however, and faced ridicule from the rest of the scientific community, which insisted pellagra was a disease spread by things like sewage systems or drinking water and attempted to treat it with arsenic and electric shocks. This, despite the fact that Goldberger had injected himself, his wife, and several friends with the blood of pellagra patients and held “filth parties” during which guests consumed pellagric blood, scales, fecal matter, urine, and nasal secretions to prove it wasn’t infectious.

  All this for a plant that isn’t even that good for us.

  In fact, the history of corn reads a lot like Shakespeare’s The Taming of the Shrew, wherein a man agrees to marry a woman no one else wants because she’s too willful and independent—then proceeds to starve and torture her until she breaks and becomes obedient. (Or, some might argue, until she realizes it’s better to placate her new husband and feign obedience.) Except it’s not clear whether humanity or corn is the shrew here. Did we really domesticate corn—or did corn domesticate us? Did we tame a wild weed and use it to spread our roots all over the world—or was it the other way around?

  Then again, Shrew is more of a lighthearted comedy, so maybe corn’s history is more like Romeo and Juliet, wherein two teenagers enter a tragic relationship because each imagines the other to be a perfect match, when really they don’t know each other from Adam (or, in Romeo’s case, from Rosaline, the other girl he falls madly in “love” with right before Juliet) and a lot of people end up dead or miserable because of it.

  Or maybe it’s more like the plot of Invasion of the Body Snatchers, the black-and-white version from 1956, wherein the world is quietly invaded by alien seedpods that take over people’s bodies and assimilate with the population in order to colonize Earth and consume all of its resources, only nobody realizes until it’s too late and the spores are everywhere—and the seeds are already inside us. . . .

  Chapter 5

  Honey Laundering

  Instead of dirt and poison we have rather chosen to fill our hives with honey and wax; thus furnishing mankind with the two noblest of things, which are sweetness and light.

  —Jonathan Swift

  Haceos miel, y paparos han moscas. (“Make yourself honey, and the flies will devour you.”)

  —Miguel de Cervantes Saavedra

  “He that would eat the fruit must first climb the tree and get it”: but when that fruit is honey, he that wants it must first cut it down.

  —Robert Carlton (Baynard Rush Hall)

  The honey is sweet, but the bee has a sting.

  —Benjamin Franklin

  For thousands of years, people have heralded honey not just as a sweetener and an important food source but as a metaphor for purity, love, compassion, even godliness. We tell our children that they’ll attract more flies with honey than vinegar; call our loved ones things like “honey pie,” “honey bun,” or just “honey”; and celebrate the union of marriage with “honeymoons.”

  Ancient Babylonian and Sumerian priests used honey to exorcise evil spirits and poured it onto walls or foundations to consecrate temples; early Christians used it in baptisms; medieval Jews smeared it on tablets so children would lick them and associate learning (and scripture) with sweetness; the Greeks, Romans, and Chinese placed it next to corpses to bid them a sweet afterlife; and in traditional Hindu weddings, honey was rubbed on, um, several of a bride’s orifices to ensure a sweet marriage.

  Hitler gave honey to wounded soldiers with a sweet note that read “Ein Gruss des Führers an seine Verwundeten” (“Greetings from the Führer to his wounded”)—though, fittingly, it was really just cheap imitation honey made from beet syrup and yellow food coloring—while in ancient Germany, fathers were allowed to murder their own children, but only if they hadn’t yet tasted honey, which magically protected them from infanticide.

  Yet no one ever talks about the dark side of honey: how it kills babies and causes hallucinations—or how it’s the one food that never goes bad because it sucks the life out of everything it touches, making “honey” a much better nickname for toxic ex-lovers who shouldn’t be left around children.

  Look up the etymology of the term honeymoon, and you’ll find that it wasn’t coined to celebrate a couple’s living happily ever after but sort of the opposite. The moon, here, isn’t an allusion to enchantment, fairy-tale wishes, or all-night sex—but to fading, the point being that love, like the phasing moon, “is no sooner full than it begins to wane.”

  Honie-moone: applied to those that love well at the first, and not so well afterwards, but will change as doth the moone.

  —John Minsheu, Guide into Tongues, 1617

  Hony-moon: applied to those marryed persons that love well at first and decline in affection afterwards: it is hony now, but it will change as the moon.

  —Thomas Blount, Glossographia, 1656

  Look closer at the adage that honey attracts more flies than vinegar, and you’ll find it’s not that simple. The saying dates back to the seventeenth century, from the Italian “Il mele catta più mosche, che non fà l’aceto” (“Honey gets more flyes to it, than doth viniger”). Meanwhile, the practice goes back even further, to the Egyptian pharaoh Pepy II (2278–2184 BC), who allegedly slathered naked slaves in honey and made them stand around him like human flypaper, so flies would bother them instead of him.

  But the actual science of attracting flies, explains Sean O’Donnell, professor of BEES (Biodiversity, Earth & Environmental Science) at Drexel University, is complicated. Adult flies don’t tend to live very long, so they don’t need a lot of nutrients; from the moment they’re born, they’re on an immediate decline, burning out their bodies, so they mostly just need sugars to fuel their metabolically expensive flight apparatus and keep those fires burning—and from that perspective, honey is the more attractive food source.

 
