by Bee Wilson
The White House conference was primarily designed to address these problems. The government announced a huge expansion in the Food Stamp scheme and improved child nutrition programmes; better school lunches; and food education. Nixon launched a grand call to “put an end to hunger in America.”56 The conference also included a panel whose job it was to consider the question of “new foods” and what role they might play in saving the nation’s diet. (No one seems to have thought it politic to point out that the decline in America’s health had actually coincided with the proliferation of these foods.)
The New Foods Panel was chaired by the vice president of Monsanto (now most famous as a seller of GM technology, but then a producer of agricultural chemicals) and included among its members the vice president of Pillsbury (mass market baked goods) and the vice president of the Ralston-Purina Company (breakfast cereals and animal feed), as well as assorted food scientists and nutritionists. Unsurprisingly, given its industry component, the panel sagely concluded that novel foods were extremely valuable. It urged the complete modernization of food regulation, to permit “completely new foods” to be created.57 The panel noted that new foods were currently “often required by the Government regulatory agencies to be called ‘imitation products,’ even when the new product is superior to the old.” It went on: “The use of such over-simplified and inaccurate words is potentially misleading to consumers, and fails to inform the public about the actual characteristics and properties of the new product.” Thus the official position on “imitation foods” had come full circle. The term “imitation” had originally been required to stop consumers from being misled. But now, the White House panel argued, the word itself had become misleading.
The New Foods Panel recommended a modernization of all food standards “to encourage the development and marketing of variations of traditional foods, and of completely new foods, that can provide consumers a greater variety of acceptable, higher quality, and more nutritious food products at lower prices.” The old recipe-based approach to food standards was criticized as “deadening.” New Foods were the future, because they held out the alluring dream of solving the two great evils of the American diet: obesity and malnutrition. They would do this with two powerful tools: the invention of novel slimming foods, and the fortification of staple foods. All that was needed was the creativity of the free market, coupled with a bit of government help when needed, and everyone would be better off. Or so it was hoped.
Fortification and Slimming
The idea of enrichment or fortification goes back to the early 1830s, when goitre, a grotesque swelling of the neck caused by enlarged thyroid glands, was a common problem in many communities. This could lead in turn to cretinism, a form of mental illness. It was noticed that the places where goitre and cretinism were most common were those where the soil was deficient in iodine. Add iodine to people’s diets, and the goitre and cretinism never occurred. A French chemist, therefore, advocated adding iodine to all table salt. Routine enrichment of salt with iodine was introduced in Europe from the early 1900s; and, little by little, goitre became a forgotten disease in the affluent West, though in Pakistan, where not all table salt is iodized, millions are still at risk from iodine deficiency.
This was an early, and isolated, case. More general fortification began only in the 1940s, as wartime governments panicked that the public was not getting enough nutrients from the food it ate. Knowledge of vitamins, those micronutrients so crucial to health, had been growing ever since 1897 when Christian Eijkman, a Dutch pathologist, discovered that eating unpolished rice could prevent beriberi because it contained thiamine. Each decade of the twentieth century brought a new vitamin or mineral to worship. In the 1900s, it was fish oil to cure rickets. In the 1920s, it was calcium and vitamin A, which led experts to recommend drinking enormous quantities of milk and stuffing oneself with green vegetables.58 It also led to vitamin D being added to milk, to aid calcium absorption and prevent rickets. Next came vitamin C and vitamin G (later rechristened riboflavin). In 1940, in the United States, it was the turn of thiamine again, which became known as the “morale vitamin” in the fight against Hitler. Vice President Henry Wallace went so far as to say that adding thiamine and other B vitamins to the diet “makes life seem enormously worth living.”59
The only reason they needed to be added to the food, however, was that the basic American diet was so depleted. Take bread. Most of the vitamins and minerals in flour are in the outer layers of the wheat—the bran.60 The old method of milling white flour was to sift the crushed wheat through fine sieves or “bolting cloths”; this method retained many of the nutrients in the wheat. From the 1870s, however, a new, efficient system of roller-milling came in, which passed the flour between steel cylinders, stripping it of most of its vitamins on the way. Standard, “70% extraction” white flour will have lost 60 percent of its calcium, 77 percent of its thiamine, 76 percent of its iron, and 80 percent of its riboflavin. By 1940, the average American was eating 200 pounds a year of bread made from this nutrient-stripped flour. For the poor, this represented a very large percentage of their daily calories. Nutrition experts were terrified that America was suffering from “hungerless vitamin famine” as a result.61 The government did not like to tell people to return to more nutrient-dense whole wheat bread; it wouldn’t be popular. But during the 1930s, it had become commercially viable to produce vitamins on a large scale. So from 1940 onwards, inferior white flour was required to be enriched with thiamine, iron, and niacin.62 This was followed in 1943 by enriched cornmeal and grits, and in 1946 by enriched pasta.63 The same story unfolded in Britain, where flour was enriched with vitamin B and calcium from July 1940 on.64
Fortification had its sceptics. In Britain, Ernest Graham Little MP, a member of the Food Education Society, expressed doubts: “The universal scientific opinion is that the organic and natural supplies of vitamins are far superior to the synthetic kind.”65 In the United States, too, the American Medical Association cautioned against more extreme forms of “fortification.” The temptation was for manufacturers to add nutrients to foods as a selling point, nutrients that often had never been there in the first place. SunnyFranks frankfurters boasted that they were enriched with vitamin D—“the Sunshine Vitamin hard-playing youngsters and hard-working men need! This Vitamin D does not ‘cook out’!”66 It was a pure marketing gimmick, an attempt to put these fatty sausages on a nutritional par with cod liver oil. Normal beef frankfurters are low in vitamin D compared to leaner red meats; dosing them up with the vitamin gave the impression that they were healthy, when actually a single frankfurter contains 20 percent of the recommended daily intake of fat. Other manufacturers engaged in a kind of “fortification race” to become the most highly vitaminized. Carnation bragged that its wheat contained “Actually 50% more Vitamin B1 than in whole wheat.”67 To innocent consumers, topping up on vitamins looked like an obvious benefit. How could you ever have too much of a good thing? In fact, though, it was just another swindle, and a potentially dangerous one.
In 1957, the Ministry of Health in Scotland reported adverse symptoms in a number of infants and young children: a failure to thrive, vomiting, weakness, and, in several cases, death. The problem was traced to excess levels of vitamin D in the food supply, because so many children were being fed up on vitamin D-enriched dried milk, plus cod liver oil. The children were suffering from hypervitaminosis D, an excess of vitamin D in the body, which leads to an overload of calcium, causing damage to bones, soft tissues, and kidneys.68 The Ministry of Health intervened to reduce the levels of vitamin D consumed by children, and the problem abated.
Pillsbury Farina, a typical fortified food of the mid-twentieth century.
This was by no means the last case of vitamin poisoning, though. Illness from excess iron fortification has been on the rise for some time. As many as a million Americans suffer from a hereditary condition that means they absorb slightly more iron than they need. On a normal diet, without too much red meat, such people may be fine. If they consume too much iron-rich food, however, the iron can build up in their bodies to toxic levels, causing liver and heart problems and even death. From 1970 to 1994, iron in the American food supply increased by a third; in the same period, death by iron poisoning or haemochromatosis increased by 60 percent.69 In 2004, the Danish government banned the sale of Kellogg’s cereals on the grounds that consumers could overdose on the vitamins, especially pregnant women who could put their unborn babies at risk. This action was widely reported as “bizarre.”70 A spokesperson for Kellogg’s announced that the company was “mystified.” But the Danish decision was no more peculiar than the decision of earlier governments to endorse vitamin intake so wholeheartedly that consumers got the impression that it was impossible for added vitamins to do more harm than good.
Vitamins, like everything else, are a poison if taken to excess. In 2000, a study done by the U.S. Institute of Medicine reported that consuming large quantities of antioxidants (such as vitamins C and E and selenium) could lead to hair loss and internal bleeding.71 The following year, three thousand children in Assam in India fell severely ill after receiving too high a dose of vitamin A.72 Even when they are not toxic, high doses of single nutrients can create imbalances that affect the ability to metabolize other nutrients. They can also do damage by covering up the underlying flaws in the food to which they are added.
The New Foods Panel of 1969 had no reservations about what it saw as the advantages of food fortification. Its first recommendation was to urge “an immediate fortification programme to relieve malnutrition.” The thought behind this was that too many staples eaten by poor consumers were simply “not nutritionally adequate.” But the panel did not want to limit fortification to staple foods, insisting that “no one type of food should be preferred over another as a nutritional carrier.”73 This opened up the market for all manner of “enriched” foods, from doughnuts to candy. Sugary breakfast cereals could boast that they were good for bones or brains, or gave those who ate them amazing athleticism, because of their vitamin content. Instead of being a menace to public health, highly processed foods—if sufficiently fortified—could claim that they were improving it. In the trade press, the industry was a little more honest about who was really being enriched by enrichment. Hoffman-La Roche, a company selling bulk vitamins to manufacturers, urged an increase in voluntary fortification, arguing that “nutrition is good business . . . food fortification, technologically feasible at low costs, opens up new marketing possibilities for food manufacturers.”74
Unlike iodine in salt, which addressed a problem affecting entire communities, this later wave of fortification attempted to address the health problems facing particular groups (the poor, children, the elderly, pregnant women) by dosing en masse the consumers of certain processed foods. This approach has its obvious drawbacks. One is that there is no guarantee that the fortification will reach those who most need it. Almost everyone eats table salt; the same is not true of expensive breakfast cereals. Another problem is that fortification may reach those who don’t need it, who may actually suffer as a result. One example is folic acid in bread. Since the 1990s, it has been mandatory to add folic acid to bread and other grain products in the United States (and, at the time of writing, Britain looks set to follow, with folic acid added to sliced white bread). The reason is that folic acid consumed by pregnant women can prevent babies being born with neural tube defects, such as spina bifida. The downside is that universal fortification of popular foods with folic acid may harm older people, by masking a deficiency in vitamin B12. This deficiency affects up to 10 percent of those over sixty-five and can result in damage to the nervous system.75
There is another kind of deception going on too—a kind of collective self-deception. Fortification can disguise the fundamental inadequacies of the diet eaten by the general population. By bolstering the intake of certain select vitamins, fortification can give the impression that, in large industrial societies, the food of the poor or uneducated is not so much worse than the food of the rich or educated. This is an illusion. On grounds of both taste and nutrition, there is a great difference between eating a whole, tart, juicy orange, rich in fibre as well as natural flavour, and eating an orange-flavour drink fortified with vitamin C; or between eating a slice of real, malty wholegrain bread, naturally rich in B vitamins, and eating an industrially produced square of fortified white “bread.” In this sense, fortification is a social panacea. As Marion Nestle writes:
The fortification of cereals, milk and margarine . . . addresses vitamin and mineral deficiencies that are caused largely by poverty or other socioeconomic conditions that affect a relatively small proportion of the American population. In an ideal world, nutritional deficiencies among such groups would be corrected through education, jobs, or some form of income support—all better overall strategies than fortification.76
After 1969, fortification combined two contradictory views of the consumer. One saw consumers as children who need to be protected against nutritional harm without their knowledge; the other treated consumers as adults who can assume total responsibility for the food they buy. The basic premise of fortification programmes is that consumers are incapable of making sound judgements about what food would do them most good. If everyone ate iron-rich foods of their own accord, no one would suggest duping them into doing so. Fortification turns consumers into passive creatures, who swallow vitamins whether they choose to or not. Combined with this, however, was the idea of total consumer freedom. Nixon’s New Foods Panel stated that “the consumer should be free to select, in the marketplace, any fortified food of her choice, whether completely natural or completely synthetic in origin.”77 This assumes that the consumer has an expert knowledge of nutrition. But if the consumer did have an expert knowledge of nutrition, why would fortified foods be necessary?
The New Foods Panel was at least confident that the risks of over-dosage from enriched foods would not be anything to worry about. Maximum limits would be set on how much of any vitamin could be added to a given food, and the panel insisted that “overconsumption of any nutrient would be prevented by basing fortification on the calorie contribution of the food to the diet.”78 In other words, if you could work out how much bread or milk or cereal the average person ought to consume in the course of a day, you could make sure that he or she would not overdose on fortified foods. But there was a flaw in this reasoning too. Already in 1969, many Americans were not simply suffering from malnourishment but from overconsumption and obesity. It was all very well to set the amount of vitamins a given portion of food should contain. The trouble was that millions were not sticking to recommended portion sizes. If you binged on a whole box of vitamin-fortified cereal, you could very easily exceed the maximum limits on those “healthy” vitamins—never mind the damage to your health in other respects.
While much of the food industry was pressing hard to “nutrify” as many foods as possible, other manufacturers joined the drive to make new “nonnutritive” versions of traditional foods, in order to tackle the growing problem of obesity (and, of course, to turn a profit from it). Again, this development involved treating the consumer as someone simultaneously incapable of making choices and in need of an infinite array of them. In 1970, William F. Cody, a lawyer for a big food company, argued that, while it was true that “overweight can be largely eliminated through dietary adjustments,” it was very difficult for the “average man” to change his dietary habits. The answer? “To provide palatable modifications of the high-calorie, high saturated fat foods; these special foods should look, smell and taste like the traditional food—but should be reformulated so as to reduce or eliminate the objectionable characteristics.”79 Cody gave the examples of low-calorie margarine and low-cholesterol, low-fat dried egg. He complained that current law still obliged these foods to be labelled as “imitation” foods, which “conjures up the image of something highly synthetic or cheapened.” Cody pointed out that the old imitation foods were generally cheaper than those they were imitating. By contrast, these new “non-standard compositions” might actually cost more to manufacture, and therefore be more highly prized by the consumer. How could something be an imitation if it was the more valuable article? Low-calorie margarine was not the same as, say, watered-down milk, in Cody’s view, because the low-calorie product was a deliberate improvement on the real thing: more expensive to make, better to eat, and healthier.
Unfortunately, whether it was healthier was becoming a moot point. In 1969, a crisis had hit for the fast-expanding diet industry when Robert Finch, the U.S. secretary of health, announced that he was ordering the removal of the artificial sweetener cyclamate from the GRAS list.80 Food scientists had been thrilled when they first discovered cyclamate in 1937. Like saccharin, it had sweetness without the calories of sugar, but unlike saccharin, it did not have a bitter aftertaste. In 1951, the FDA approved cyclamate, and it crept into countless slimming products, from chewing gum to diet soda, from children’s vitamins to sugarless jam. When the Delaney Clause of 1958 prevented the FDA from clearing carcinogens as food additives, cyclamates—ironically—took over even more from saccharin, because of a 1951 report suggesting that the latter might cause cancer. By 1969, foods and drinks containing cyclamate were in as many as three quarters of American homes.81