Swindled


by Bee Wilson


  There was a gathering sense that without basic standards stipulating the quality of individual foods, the market would descend into anarchy. Hence, the 1938 Food, Drug and Cosmetic Act, signed into law by FDR on 25 June, was designed as a significant improvement on the 1906 act signed by Roosevelt’s fifth cousin Theodore.33 The new law was a triumph for honesty. As well as setting far more rigid regulations for drug companies (in the wake of a scandal in 1937 when more than a hundred people, many of them children, died after taking a new “sulfa” wonder drug dissolved in diethylene glycol, a chemical used in antifreeze), it clarified the federal government’s power to police the food supply. The 1938 act finally abolished the distinctive name proviso and began to establish basic food standards. Now, it was a requirement for the label of a food to bear “its common or usual name” alongside whatever fanciful brand name its maker might choose. Bred-Spred could no longer deny that it was purporting to be jam. Gradually, the FDA also established enforceable standards for some of the most important foods of the American larder. The first standards were set for tomato products, to be followed by milk and cream, fruit juices, canned tuna, canned vegetables, chocolate, flour and grains, cereals, mayonnaise, and macaroni.

  One of the earliest foods to be given a standard was jam or “jelly.” Essentially, the new food standards were like old-fashioned recipes, laying down exactly what ingredients should be in any given food, and how much of each; this way, any swindling or tampering would be easily apparent, and also easy to punish. In the case of jam, the FDA took evidence from family recipes and cookbooks and decided that the correct recipe was about fifty-fifty fruit to sugar, a generous ratio. The legal minimum fruit content for jam was then set at 45 percent (compare this with the current UK regulation of 35 percent for most fruits). Assuming it is well made, a jam containing 45 percent fruit is the proper article—luscious and fruity, as well as being far more nutritious than the Bred-Spred horrors of the Depression era. By setting the percentage so high, the U.S. government was ensuring that “jam” would become worthy of the name.

  For most of the 1940s, the new food standards worked well. They went some way to protecting the diet of a population hit by war and rationing (albeit on nothing like the same scale as in Britain). The government used the standards to protect consumers against swindles they could otherwise do nothing about. The recipe approach was one that everyone—consumers as well as lawmakers—could understand, and it was upheld by the courts. In 1944, there was a series of cases prohibiting substitute dairy products. The Supreme Court stated at the time that it was necessary to prohibit such products on occasion because labelling was not an adequate remedy for deception.34 In 1949, Quaker Oats was banned from marketing a farina cereal with vitamin D, because it did not meet the statutory requirements for such products. There were cracks around the edges of the system, though. In some areas, demanding standards could lead to bootlegging. A Philadelphia lawyer insisted that he had seen bootlegged consignments of inferior ice cream travelling the city in refrigerated vans.35 More significantly, the manufacturers of imitation foods also began to fight back.

  The landmark case was the Imitation Jam ruling of 1952. In 1951, the FDA seized sixty-two cases of “Delicious Brand Imitation Jam” in assorted flavours. The “jam” was seized in New Mexico, where it had been shipped from Denver, Colorado; under the law, “interstate commerce” in adulterated goods was forbidden. The FDA charged that this pseudo-jam was “misbranded” under the 1938 Food, Drug and Cosmetic Act, since it contained a scant 25 percent fruit (the rest being highly gelatinized sugary water). Under section 403(g), a food was misbranded if “it purports to be” a food for which a standard has been set but fails to conform to that standard. In the FDA’s view, this product was clearly purporting to be jam, and, equally clearly, it could not be jam because it fell short of the 45 percent fruit requirement by a full 20 percentage points. To the FDA, Delicious Brand Imitation Jam was a deception, despite the apparent honesty of its name. Even if the label admitted that it was not real jam as defined by the law, most consumers would still believe that it was.

  The FDA, however, was deploying this case not only to uphold the purity of jam against imitators, but also to reinforce its own powers to set food standards. The strategy backfired. After conflicting judgements in the trial court and the court of appeals, the case went to the Supreme Court, which ruled that Delicious Brand Imitation Jam had not been misbranded. Section 403(c) of the act stated that imitation foods were misbranded unless they were labelled as such; but since the Delicious Brand preserves were prominently labelled as “imitation,” it did not matter that their fruit content was so woefully low. The Supreme Court thus rejected the FDA’s view that products such as Delicious Brand Imitation Jam were inherently deceptive. As one supporter of the ruling put it: “There is nothing difficult or strange about the word imitation.”36 After 1952, it would still be an offence to pass off imitation jam as pure jam, but bakers would henceforth be free to sell “cake roll with imitation jelly,” and customers could buy an imitation jelly and peanut butter sandwich in a diner, probably for a lower price than if the jelly had been the real, full-fruit kind.

  The Imitation Jam case made a lot of people very worried. A lawyer for the national association of milk producers complained that it would give rise to an explosion of substandard foods—not just jams and jellies, but dairy products too. Consumers would become confused about what the true standards were. “Unscrupulous” sellers would take the opportunity “to deceive and cheat them.”37 What was the point of having food standards if decent producers couldn’t be protected against poor imitations? Another American lawyer complained that imitation foods were not even as cheap as they seemed. They pretended to offer good value to cash-strapped consumers, but actually they had “an unfair price bulge when competing with the genuine product”:38 in other words, they cost much more than they were worth. The price of an imitation food was largely determined by the price of the genuine article: the imitation version would be priced just a few cents below the real thing, even though its real value was often much lower still. The consumer was thus cheated twice over: once by buying the imitation food in the first place, and again by buying it at a far higher price than it warranted. Yet again, it all boiled down to consumer knowledge, or the lack of it. In 1894, the Supreme Court had stated that “The Constitution of the United States does not secure to anyone the privilege of defrauding the public.”39 But in the eyes of its critics, the Imitation Jam ruling had done just that.

  These were voices in the wilderness, trying to stem the tide of history. Soon, the idea that legislators could ban imitation foods would start to look quaint. In 1952, the year of the Imitation Jam ruling, America was poised to enter what has been called its “golden age of food processing.”40 Frozen orange juice, instant coffee, ready-made TV dinners of chicken à la king, boil-in-the-bag macaroni and cheese in glitzy foil pouches, dehydrated potato salad—all these wonder products and many more were at the disposal of the housewife in 1950s’ America. If you browse the commercial pages of America’s regional newspapers for 1952, you find job advertisements for frozen food salesmen, special offers on Miracle Whip, promotions for Campbell’s Tomato Soup and Hormel’s Chili in a can.41 The critical difference between these new processed foods and the older ones was that they had lost their inferiority complex in relation to unprocessed food. Paul Willis, a food industry boss, boasted in 1956 that “Today’s processed foods have a food value at least equal, and often superior to, raw produce, but many housewives are still spending countless hours preparing raw produce in the erroneous belief that they are feeding their families more ‘healthfully.’ ”42 This marked a crucial shift. Thanks to intensive marketing, substitute foods were no longer to be seen as poor relations of the foods they originally mimicked. They were new; and new was best.

  Additives, New Foods, and the White House Conference of 1969

  In 1953, Dwight “Ike” Eisenhower became president. Eisenhower would be remembered for building the interstate highway system, and for his doctrine of “dynamic conservatism.” As it applied to food, this meant embracing the new array of processed foods, and freeing food manufacturers from too much government regulation. It was farewell to the self-abnegation and wholesomeness of the war years and hello to fast cars on the open highway and as much “modern” food as you could buy. The new food industry had a lot in common with the ever-growing auto industry. Both were constantly seeking ways of adding “value” to their products and making them seem as exciting as possible; consumer safety was a secondary consideration. Soon after he assumed the presidency, Eisenhower attended a special “research” luncheon hosted by the U.S. Department of Agriculture in Beltsville, Maryland. The meal was designed to showcase the marvellous potential in all the new ways of processing foods. Eisenhower sampled “powdered orange juice, potato chip bars, a whey cheese spread, ‘dehydrofrozen peas,’ beef and pork raised on a new (hormone and antibiotic added) feeding method and lowfat milk.”43 Half a century later, if a president were to be fed such a miserable meal, he or she might feel entitled to send it back, with a rebuke to the chef. Eisenhower, however, seems to have been impressed.

  The constantly heard refrain was that the American diet had never been better; and that it was the best in the world.44 In 1952, the government’s Food Protection Committee announced that “the American people now enjoy the most abundant and varied diet of any nation in history.”45 This was only possible, it added, because of improvements in food production and technology, above all the proliferation of chemical additives. At the beginning of the twentieth century, there were only about fifty additives in common use, and, like the benzoates in ketchup that Harvey Wiley had battled against, they had a murky image. After the Second World War, chemists came up with hundreds of new additives, which now had the air of a magician’s box of tricks.

  Like Eisenhower’s dynamic conservatism, the postwar explosion of food additives represented a break with the past. There were countless new tools for extending shelf-life. New colourings gave processed foods a delightful illusion of freshness. A new breed of preservatives seemed to offer “virtual immortality for some kinds of baked goods.”46 As Ira Somers, an advocate for the food industry, explained, “in the United States a person buys a loaf of bread and it will keep for several days in the home without spoilage,” whereas in those countries unfortunate enough to make their bread without additives, “there is considerable loss due to mould growth.”47 The obvious reply to this is that there are other, better ways of dealing with this problem; that buying good fresh bread on a near-daily basis, as is the French and German custom, or baking it yourself, can be a far preferable existence to having a permanent loaf of mould-inhibited “bread” in the bread bin. But the additives evangelists would have none of it. “Additives are needed to retain the standard of living to which we are all accustomed,” claimed Somers. This was fast becoming the American Way.

  “What will the world taste like tomorrow?” This advertisement from flavour manufacturer Norda International expresses the mood of promise in the flavour industry of the 1970s.

  You would expect an industry spokesperson to talk like this. What was more surprising was how ready government was to share the industry view. Despite the frequent complaints by the food industry that the FDA was obstructing its activities, government agencies under Eisenhower generally acted more to “allay public concern” over additives than to investigate whether such concern was justified.48 The view of additives as both necessary and wonderful persists in the FDA to this day. On the FDA website, if you search for “additives,” you will be directed to this series of questions, which have the bright tone of a 1950s’ car salesman:

  Q. What keeps bread mould-free and salad dressings from separating?

  Q. What helps cake batters rise reliably during baking and keeps cured meats safe to eat?

  Q. What improves the nutritional value of biscuits and pasta and gives gingerbread its distinctive flavour?

  Q. What gives margarine its pleasing yellow colour and prevents salt from becoming lumpy in its shaker?

  Q. What allows many foods to be available year-round, in great quantity and the best quality?

  Answer: FOOD ADDITIVES49

  Official optimism about additives, however, has always been met with widespread uncertainty about their safety. Behind his bright patter, the 1950s’ car salesman was selling a highly desirable product that could nevertheless kill its owner. Was the same true of the 1950s’ purveyors of substitute foods?

  It was not long before this question came to the attention of lawmakers. In 1950, 1951, and 1952, Congressman James Delaney chaired a committee looking into the safety of chemicals in food. The committee reported that about 840 chemicals were currently in use in foods, of which only 420, or half, could be deemed “safe.”50 This worrying discovery led directly to the Food Additives Amendment of 1958, which was supposed to give the consumer greater protection against harmful additives. The new amendment was extremely complex, a reflection of the ambiguous status of additives. To start with, what was an additive, exactly? Weren’t all ingredients additives, in a way? During the drafting of the new law, one expert witness had suggested that cream might be considered a “chemical additive” to ice cream—an obviously absurd notion, but one that could be overturned only by a very precise definition of the word “additive.”51 The new law defined additives as substances that leave residues in food or otherwise affect its characteristics—implying that they were distinct from the food itself. It then went on to put chemicals used in food into one of three categories.

  First were those chemicals that must be excluded from food altogether—those that had been shown to cause cancer when ingested by either man or animals (the so-called Delaney Clause). Next came chemicals that must be subject to intensive testing by manufacturers and excluded from food by the FDA until proven safe. The third category of chemicals was very different. These were not prevented in any way from being added to the food supply, nor were they legally defined as “additives” because they had been “generally recognized as safe”—the so-called GRAS concept. A substance was placed on the GRAS list if it was deemed to have been “adequately shown through scientific procedures to be safe,” or if it had been in common use in food prior to 1958. Some of these substances were traditional condiments such as salt, pepper, sugar, and vinegar.52 Many more, however, were relatively modern chemical creations. The original list of GRAS chemicals numbered 182; by 1961, there were 718 chemicals on the GRAS list. If a manufacturer wanted to add a GRAS chemical to food, that was no one’s business but the manufacturer’s.

  Thus, while the Food Additives law was designed to offer consumer protection, it offered a still greater protection to the manufacturer who wished to create “innovative” processed food. By the 1960s, the old recipe-based food standards of the Roosevelt years were starting to look creaky. In a world of powdered soup mixes, what use were recipes? In 1961, the FDA issued its first non-recipe-based food standard—for frozen raw breaded shrimp (prawns).53 Instead of stipulating exactly which ingredients were permitted in the batter and breading, it simply required that “safe and suitable” ingredients be used, at the manufacturer’s discretion. These could include not just the ingredients most consumers would expect if they were breading their own shrimp at home—breadcrumbs, eggs, and milk—but “safe and suitable” emulsifiers, flavour enhancers, and preservatives. “Safe and suitable” was an all-encompassing category that could cover countless substances that would once have counted as adulterants.

  This new move was a sign that the FDA could no longer expect to maintain control over every new product that came into the marketplace. There were just too many of them, and most could not be measured by the old standards. By the 1970s, there were a thousand agricultural products in use in the United States, but twelve thousand natural and synthetic chemicals added either directly or indirectly during the processing of food.54 Coming up with the standard for jam had been pretty easy, as we have seen; all the FDA had to do was to consult a range of traditional cookbooks and calculate a reasonable percentage of fruit to sugar. But no old recipes existed for most of the newfangled processed foods being created. These confections existed in total opposition to home-cooked food. What cook would dream of making Razzles, Pop-Tarts, or Pringles (all junk foods launched in 1966–67)? Actually, there is a website now dedicated to people whose hobby is re-creating giant versions of junk food snacks (www.pimpthatsnack.com): crazed postmodern cooks spend days fashioning giant versions of Oreo cookies, or KitKats, or Jammy Dodgers. But the joke works only because cooking your own processed food is such an obviously perverse thing to do.

  By the late 1960s, the old pejorative category of “imitation foods” no longer seemed appropriate for many of the processed foods around. It had been easy for Accum to complain about “lemonade” that was really water mixed with tartaric acid, when everyone knew it should have been made with fresh lemons. It was much harder to say what “Tab,” the new diet drink introduced in 1963, was an imitation of. It might as well have appeared from space, so little resemblance did it have to any traditional beverage. Like so many of the new foods and drinks, it was a novel creation, sui generis.

  In December 1969, President Richard Nixon called a White House conference on food and nutrition. The political route from Eisenhower to Nixon (via Kennedy and Johnson) marked a shift in national mood from prosperous optimism to cynicism and despair; and so it was with food. The backdrop to the conference was the discovery that many in America—especially the poor—were struggling with hunger and malnutrition. Far from being the best-nourished people in the developed world, as the Eisenhower-era zealots had proclaimed, Americans were among the worst. National statistics from 1967 showed that twenty-year-old men in thirty-six other countries would live longer on average than those in the United States. A study published in Nutrition Education in November 1969 revealed that nearly all American children under the age of one were deficient in iron. Meanwhile, obesity was on the rise. The study concluded that “Dietary habits of the American public have become worse, especially since 1960.”55

 
