By now we have become so inured to fake foods that we forget what a difficult trail margarine had to blaze before it and other synthetic food products could win government and consumer acceptance. At least since the 1906 publication of Upton Sinclair’s The Jungle, the “adulteration” of common foods has been a serious concern of the eating public and the target of numerous federal laws and Food and Drug Administration regulations. Many consumers regarded “oleomargarine” as just such an adulteration, and in the late 1800s five states passed laws requiring that all butter imitations be dyed pink so no one would be fooled. The Supreme Court struck down the laws in 1898. In retrospect, had the practice survived, it might have saved some lives.
The 1938 Food, Drug and Cosmetic Act imposed strict rules requiring that the word “imitation” appear on any food product that was, well, an imitation. Read today, the official rationale behind the imitation rule seems at once commonsensical and quaint: “…there are certain traditional foods that everyone knows, such as bread, milk and cheese, and that when consumers buy these foods, they should get the foods they are expecting…[and] if a food resembles a standardized food but does not comply with the standard, that food must be labeled as an ‘imitation.’”
Hard to argue with that…but the food industry did, strenuously for decades, and in 1973 it finally succeeded in getting the imitation rule tossed out, a little-noticed but momentous step that helped speed America down the path to nutritionism.
Industry hated the imitation rule. There had been such a tawdry history of adulterated foods and related forms of snake oil in American commerce that slapping the word “imitation” on a food product was the kiss of death - an admission of adulteration and inferiority. By the 1960s and 1970s, the requirement that such a pejorative term appear on fake food packages stood in the way of innovation, indeed of the wholesale reformulation of the American food supply - a project that, in the wake of rising concerns about dietary fat and cholesterol, was coming to be seen as a good thing. What had been regarded as hucksterism and fraud in 1906 had begun to look like sound public health policy by 1973. The American Heart Association, eager to get Americans off saturated fats and onto vegetable oils (including hydrogenated vegetable oils), was actively encouraging the food industry to “modify” various foods to get the saturated fats and cholesterol out of them, and in the early seventies the association urged that “any existing regulatory barriers to the marketing of such foods be removed.”
And so they were when, in 1973, the FDA (not, note, the Congress that wrote the law) simply repealed the 1938 rule concerning imitation foods. It buried the change in a set of new, seemingly consumer-friendly rules about nutrient labeling so that news of the imitation rule’s repeal did not appear until the twenty-seventh paragraph of The New York Times’ account, published under the headline F.D.A. PROPOSES SWEEPING CHANGE IN FOOD LABELING: NEW RULES DESIGNED TO GIVE CONSUMERS A BETTER IDEA OF NUTRITIONAL VALUE. (The second deck of the headline gave away the game: PROCESSORS BACK MOVE.) The revised imitation rule held that as long as an imitation product was not “nutritionally inferior” to the natural food it sought to impersonate - as long as it had the same quantities of recognized nutrients - the imitation could be marketed without using the dreaded “i” word.
With that, the regulatory door was thrown open to all manner of faked low-fat products: Fats in things like sour cream and yogurt could now be replaced with hydrogenated oils or guar gum or carrageenan, bacon bits could be replaced with soy protein, the cream in “whipped cream” and “coffee creamer” could be replaced with corn starch, and the yolks of liquefied eggs could be replaced with, well, whatever the food scientists could dream up, because the sky was now the limit. As long as the new fake foods were engineered to be nutritionally equivalent to the real article, they could no longer be considered fake. Of course the operative nutritionist assumption here is that we know enough to determine nutritional equivalence - something that the checkered history of baby formula suggests has never been the case.
Nutritionism had become the official ideology of the Food and Drug Administration; for all practical purposes the government had redefined foods as nothing more than the sum of their recognized nutrients. Adulteration had been repositioned as food science. All it would take now was a push from McGovern’s Dietary Goals for hundreds of “traditional foods that everyone knows” to begin their long retreat from the supermarket shelves and for our eating to become more “scientific.”
FOUR - FOOD SCIENCE’S GOLDEN AGE
In the years following the 1977 Dietary Goals and the 1982 National Academy of Sciences report on diet and cancer, the food industry, armed with its regulatory absolution, set about reengineering thousands of popular food products to contain more of the nutrients that science and government had deemed the good ones and fewer of the bad. A golden age for food science dawned. Hyphens sprouted like dandelions in the supermarket aisles: low-fat, no-cholesterol, high-fiber. Ingredients labels on formerly two- or three-ingredient foods such as mayonnaise and bread and yogurt ballooned with lengthy lists of new additives - what in a more benighted age would have been called adulterants. The Year of Eating Oat Bran - also known as 1988 - served as a kind of coming-out party for the food scientists, who succeeded in getting the material into nearly every processed food sold in America. Oat bran’s moment on the dietary stage didn’t last long, but the pattern now was set, and every few years since then, a new oat bran has taken its star turn under the marketing lights. (Here come omega-3s!)
You would not think that common food animals could themselves be rejiggered to fit nutritionist fashion, but in fact some of them could be, and were, in response to the 1977 and 1982 dietary guidelines as animal scientists figured out how to breed leaner pigs and select for leaner beef. With widespread lipophobia taking hold of the human population, countless cattle lost their marbling and lean pork was repositioned as “the new white meat” - tasteless and tough as running shoes, perhaps, but now even a pork chop could compete with chicken as a way for eaters to “reduce saturated fat intake.” In the years since then, egg producers figured out a clever way to redeem even the disreputable egg: By feeding flaxseed to hens, they could elevate levels of omega-3 fatty acids in the yolks. Aiming to do the same thing for pork and beef fat, the animal scientists are now at work genetically engineering omega-3 fatty acids into pigs and persuading cattle to lunch on flaxseed in the hope of introducing the blessed fish fat where it had never gone before: into hot dogs and hamburgers.
But these whole foods are the exceptions. The typical whole food has much more trouble competing under the rules of nutritionism, if only because something like a banana or an avocado can’t quite as readily change its nutritional stripes. (Though rest assured the genetic engineers are hard at work on the problem.) To date, at least, they can’t put oat bran in a banana or omega-3s in a peach. So depending on the reigning nutritional orthodoxy, the avocado might either be a high-fat food to be assiduously avoided (Old Think) or a food high in monounsaturated fat to be embraced (New Think). The fate and supermarket sales of each whole food rise and fall with every change in the nutritional weather, while the processed foods simply get reformulated and differently supplemented. That’s why when the Atkins diet storm hit the food industry in 2003, bread and pasta got a quick redesign (dialing back the carbs; boosting the proteins) while poor unreconstructed potatoes and carrots were left out in the carbohydrate cold. (The low-carb indignities visited on bread and pasta, two formerly “traditional foods that everyone knows,” would never have been possible had the imitation rule not been tossed out in 1973. Who would ever buy imitation spaghetti? But of course that is precisely what low-carb pasta is.)
A handful of lucky whole foods have recently gotten the “good nutrient” marketing treatment: The antioxidants in the pomegranate (a fruit formerly more trouble to eat than it was worth) now protect against cancer and erectile dysfunction, apparently, and the omega-3 fatty acids in the (formerly just fattening) walnut ward off heart disease. A whole subcategory of nutritional science - funded by industry and, according to one recent analysis,* remarkably reliable in its ability to find a health benefit in whatever food it has been commissioned to study - has sprung up to give a nutritionist sheen (and FDA-approved health claim) to all sorts of foods, including some not ordinarily thought of as healthy. The Mars Corporation recently endowed a chair in chocolate science at the University of California at Davis, where research on the antioxidant properties of cacao is making breakthroughs, so it shouldn’t be long before we see chocolate bars bearing FDA-approved health claims. (When we do, nutritionism will surely have entered its baroque phase.) Fortunately for everyone playing this game, scientists can find an antioxidant in just about any plant-based food they choose to study.
Yet as a general rule it’s a whole lot easier to slap a health claim on a box of sugary cereal than on a raw potato or a carrot, with the perverse result that the most healthful foods in the supermarket sit there quietly in the produce section, silent as stroke victims, while a few aisles over in Cereal the Cocoa Puffs and Lucky Charms are screaming their newfound “whole-grain goodness” to the rafters.
Watch out for those health claims.
FIVE - THE MELTING OF THE LIPID HYPOTHESIS
Nutritionism is good for the food business. But is it good for us? You might think that a national fixation on nutrients would lead to measurable improvements in public health. For that to happen, however, the underlying nutritional science and the policy recommendations (not to mention the journalism) based on that science would both have to be sound. This has seldom been the case.
The most important such nutrition campaign has been the thirty-year effort to reform the food supply and our eating habits in light of the lipid hypothesis - the idea that dietary fat is responsible for chronic disease. At the behest of government panels, nutrition scientists, and public health officials, we have dramatically changed the way we eat and the way we think about food, in what stands as the biggest experiment in applied nutritionism in history. Thirty years later, we have good reason to believe that putting the nutritionists in charge of the menu and the kitchen has not only ruined an untold number of meals but has also done little for our health, except very possibly to make it worse.
These are strong words, I know. Here are a couple more: What the Soviet Union was to the ideology of Marxism, the Low-Fat Campaign is to the ideology of nutritionism - its supreme test and, as is now coming clear, its most abject failure. You can argue, as some diehards will do, that the problem was one of faulty execution, or you can accept that the underlying tenets of the ideology contained the seeds of the eventual disaster.
At this point you’re probably saying to yourself, Hold on just a minute. Are you really saying the whole low-fat deal was bogus? But my supermarket is still packed with low-fat this and no-cholesterol that! My doctor is still on me about my cholesterol and telling me to switch to low-fat everything. I was flabbergasted at the news too, because no one in charge - not in the government, not in the public health community - has dared to come out and announce: Um, you know everything we’ve been telling you for the last thirty years about the links between dietary fat and heart disease? And fat and cancer? And fat and fat? Well, this just in: It now appears that none of it was true. We sincerely regret the error.
No, the admissions of error have been muffled, and the mea culpas impossible to find. But read around in the recent scientific literature and you will find a great many scientists beating a quiet retreat from the main tenets of the lipid hypothesis. Let me offer just one example, an article from a group of prominent nutrition scientists at the Harvard School of Public Health. In a recent review of the relevant research called “Types of Dietary Fat and Risk of Coronary Heart Disease: A Critical Review,”* the authors proceed to calmly remove, one by one, just about every strut supporting the theory that dietary fat causes heart disease.
Hu and his colleagues begin with a brief, uninflected summary of the lipophobic era that is noteworthy mostly for casting the episode in the historical past:
During the past several decades, reduction in fat intake has been the main focus of national dietary recommendations. In the public’s mind, the words “dietary fat” have become synonymous with obesity and heart disease, whereas the words “low-fat” and “fat-free” have been synonymous with heart health.
We can only wonder how in the world such crazy ideas ever found their way into the “public’s mind.” Surely not from anyone associated with the Harvard School of Public Health, I would hope. Well, as it turns out, the selfsame group, formerly in thrall to the lipid hypothesis, was recommending until the early 1990s, when the evidence about the dangers of trans fats could no longer be ignored, that people reduce their saturated fat intake by switching from butter to margarine. (Though red flags about trans fats can be spotted as far back as 1956, when Ancel Keys, the father of the lipid hypothesis, suggested that rising consumption of hydrogenated vegetable oils might be responsible for the twentieth-century rise in coronary heart disease.)
But back to the critical review, which in its second paragraph drops this bombshell:
It is now increasingly recognized that the low-fat campaign has been based on little scientific evidence and may have caused unintended health consequences.
Say what?
The article then goes on blandly to survey the crumbling foundations of the lipid hypothesis, circa 2001: Only two studies have ever found “a significant positive association between saturated fat intake and risk of CHD [coronary heart disease]”; many more have failed to find an association. Only one study has ever found “a significant inverse association between polyunsaturated fat intake and CHD.” Let me translate: The amount of saturated fat in the diet may have little if any bearing on the risk of heart disease, and evidence that increasing polyunsaturated fats in the diet will reduce risk is slim to nil. As for the dangers of dietary cholesterol, the review found “a weak and nonsignificant positive association between dietary cholesterol and risk of CHD.” (Someone should tell the food processors, who continue to treat dietary cholesterol as a matter of life and death.) “Surprisingly,” the authors wrote, “there is little direct evidence linking higher egg consumption and increased risk of CHD” - surprising, because eggs are particularly high in cholesterol.