by Maureen Ogle
That legislation went nowhere, but by the early 1980s, Nebraskans were more receptive, thanks to a devastating agricultural crisis that roiled farming in the plains and Midwest. During the famine scares of the early 1970s, farmers had heeded the call to increase their output, and many did so by borrowing money to buy land and equipment. All too often, the bet did not pay off. Harvests outstripped global demand, leaving farmers stuck with debts they could not pay. Foreclosures, bankruptcies, and, most tragically, farmer suicides followed. (If this sounds familiar, that’s not surprising. The farm crisis of the eighties mirrored that of the 1920s.) Hunting for a scapegoat, farmers and their advocates latched on to a familiar culprit: corporate farms, an appealing bogeyman and as good an explanation as any for the crisis. Once again, opponents lobbied for laws to prevent corporations from owning or operating farms. In one important instance, Nebraskans campaigned for Initiative 300, a constitutional amendment designed to stop the spread of corporate farms. If ratified, it would allow existing corporate ventures to remain but prohibit them from buying any more land. The initiative’s supporters took direct aim at National Farms and Bill Haw, who had announced plans to expand the company’s Nebraska hog-farming operation. Should he succeed, warned an analyst with the Center for Rural Affairs, a farm advocacy group, the state’s hog prices would drop by a dollar or more per hundredweight and cost “small” farmers about $2,400 a year in income. “Do you get the feeling that we small producers—dumbly and blindly, like a sheep being led to slaughter—are being forced out by the greed of those high-rollers?” asked one farmer. “Have you stopped to figure out how many 100-sow farm units will be replaced by 24,000 sows?” A spokesman for Tyson, which supplied National Farms with pigs and would also be affected by the amendment, reacted with little sympathy and more than a bit of outrage: “If the people of Nebraska want Tyson to get the hell out, we will do that.”
The people did. Voters approved the amendment, and Bill Haw, unable to expand in Nebraska, moved on to South Dakota. Back in 1974, that state’s legislature had passed a law designed to protect family farmers from corporations. But the law exempted livestock feeders, and Haw assumed he would be able to build a new facility there. He was wrong. Opponents were ready and waiting. The South Dakota Pork Producers Council urged the legislature to amend the law to include hog farms and circulated petitions demanding a special election on the issue if the legislature did not act. The South Dakota Farm Alliance, a coalition that included such strange bedfellows as the National Farmers Organization, the South Dakota Meat Promoters, and the Catholic Rural Life Conference, launched a separate campaign to force National to abandon its proposed site near Pierre, the state’s capital and one of its largest cities. The alliance succeeded—Pierre’s residents were not happy about having a giant hog farm so close—and Haw homed in on an alternative location near the (tiny) burg of Doland. The town council voted unanimously to support Haw’s project, but other townspeople were less enamored. A few days after the vote, a resident told one councilman that he would no longer patronize the man’s hardware store if NF came to town. Doland avoided what could have turned into a local civil war: in late 1988, South Dakotans voted to ban corporate hog farms.
Identical scenarios unfolded in Iowa, Minnesota, and Missouri as networks of rural activists spread the gospel: corporate farming must be stopped. Those efforts, and the larger story of the farm crisis, garnered attention beyond the Midwest thanks to the dramatic expansion of the media universe. Cable television had only recently become a mainstay, as had the twenty-four-hour news cycle that came with it. The new medium provided ample opportunity to cover stories that broadcast networks could not always fit into conventional half-hour newscasts. The depth of the farm crisis also attracted journalists working for big-city newspapers, and in 1985, singer-songwriter Willie Nelson amplified the cause when he staged his first Farm Aid concert. The rural activists had been tutored in the ways of Nader, and they greeted journalists with facts and figures that made reporters’ work easier. Ironically, much of that data came from the land grant schools that Jim Hightower had condemned a decade earlier. The Hightower critique, as well as the political activism that demarcated the sixties and seventies, inspired a generation of university faculty to embrace research projects rooted in real-world problems, such as the impact of corporate farming on rural incomes. And, too, by the 1970s, scholars and political activists were able to draw on the work of land grant economists who had compiled extensive databases that tracked the changes in agriculture since the end of World War II.
Make no mistake: not everyone opposed corporate farming. Consider this: in Nebraska, National Farms employed 150 full-time people, plus extra hands during busy seasons, and bought most of its inputs from local businesses, the exception being fertilizer. That didn’t bother the man who managed the local fertilizer dealership. “They can get [it] cheaper elsewhere. If I was their size, I’d do it the same way. You have to be good businessmen,” he said. National’s employees weren’t complaining either. The company paid higher-than-average wages, and workers enjoyed health insurance and pension contributions. Jobs at NF offered another benefit many people valued: leisure time. One man told a reporter that he had begun farming right out of high school, but after he married and started a family, he resented the seven-day-a-week schedule that his farm demanded. He signed on with National so he could work regular hours and enjoy more time with his wife and children. Nor was everyone convinced that corporate farming represented an economic dead end. After hog producer Premium Standard Farms was denied a permit to build a confinement operation in central Iowa, the company moved to friendlier terrain across the border in Missouri. The director of the Missouri Rural Crisis Center denounced the state’s willingness to support PSF. Missouri’s leaders had demonstrated that they would “stoop to anything for economic development,” he complained, “and the family farmer be damned.” Others disagreed. “It’s going to be a big help here,” said the owner of a farm equipment dealership not far from PSF’s new location. “This area has just been devastated by the poor farm economy in the past four years.” Finally, many thoughtful people questioned the wisdom of anticorporate farm laws because they feared those bans would have unintended, and negative, consequences. They were right to worry. In 1990, John Morrell & Co. closed its Kansas packing plant, at the time the state’s biggest slaughtering operation, throwing some seven hundred people out of work. Why the closure? Because a 1981 state law had banned corporate farms, and the state’s “family” farmers couldn’t supply enough hogs to keep the plant operating at capacity. Morrell, already struggling to compete with behemoths IBP and Tyson, had to either import livestock from other states or close the plant. Morrell’s CEO opted for the latter and relocated to Colorado, where Bill Haw had already built a new hog farm. “We don’t need to do business in a populist, antibusiness environment,” Haw had said in explaining his own move. Colorado wanted his two hundred jobs and $3 million a year, so that’s where he went.
As these last two examples indicate, pro-family-farm laws provoked fundamental alterations in the geography of hog farming as Corn Belt states ceded dominance to Oklahoma, Colorado, North Carolina, and Utah. But by the 1980s, this too was true: the creature that rural activists identified as the family farm was more myth than reality. Family farmers had led the agricultural revolution of the 1950s, and many had embraced large size and scale. As they did, and over time, the agricultural infrastructure that supported farmers had changed, too. Two USDA analysts affirmed as much in a report published in 1985. Small-scale hog farmers faced an uphill climb to survival because the food-making infrastructure supported big farmers, not small ones. Meatpackers and food processors wanted to buy bigger lots of livestock than most small farmers could provide. Moreover, in hog production as in the rest of farming, life and death “rest[ed] largely with those who provide the capital,” and banks were no longer willing to fund small ventures. Even relatively big hog farmers—those marketing two to ten thousand head a year—could see the writing on the hog barn wall. When the editors of National Hog Farmer surveyed readers in 1987, 29 percent of the respondents—most of whom marketed at least two thousand hogs a year—identified “large corporate hog farms” as their biggest threat. Another 23 percent feared encroachment from vertical integrators who raised hogs for use in their packing plants and food factories. The owner of one of the nation’s largest independent hog farms predicted that “within 10 years, [his] operation probably [would] be considered very small.” Like the Tysons back in the 1950s, he recognized that he’d reached a fork in the road; he had to get bigger or get out.
But there was another, less obvious but significant reason why the family farm, whatever it may have been, faced extinction and why big farms, corporate or otherwise, had moved center stage. During the 1970s and 1980s, demographic and social changes transformed Americans’ relationships with food, changes that would ripple across the culinary landscape for decades to come and lead, eventually, to enthusiasm for organic and “local” foods and inspire a new generation of small farmers. But in the 1980s and 1990s, large-scale, corporate livestock production surged because that model of meat making was best suited to a fragmented and demanding nation of eaters.
There’s no better place to launch a survey of that new terrain than the American home. In the last quarter of the twentieth century, record numbers of households were headed by adults, married or single, who worked outside the home, among them members of the core grocery-shopping, food-preparation demographic, women in their thirties and forties. Cooking was low on many households’ agendas, and fewer Americans were interested in buying raw materials that had to be transformed into meals. They preferred foods that were ready to eat or, as economists phrase it, preferred to trade money for time. Back in the 1920s, shoppers regarded canned soups and quick biscuit mixes as gifts from the convenience gods. Fifty years later, Americans had moved beyond such basic processed foods, which, after all, still required some “making” in order to eat, to paying someone to fix their food for them. In 1960, Americans spent about 27 percent of their food dollars outside the home. By the early 1990s, that share had risen to nearly 50 percent, about half of which went to fast food.
Late-twentieth-century eating habits were also shaped by consumer consciousness, which matured to its logical conclusion: consumers reasoned, however unconsciously, that if their actions greased the economic engine, they were also the economy’s most important players, ones whose demands must be met. What had been a relatively homogeneous consumer market splintered into myriad fragments, or, as analysts labeled them, “niches,” defined by age, income, race, ethnicity, geography, and a mysterious but important inner drive for self-satisfaction. In clothing, for example, consumers expressed both desire and identity by wearing jeans adorned with “designer” labels or T-shirts emblazoned with logos or slogans that linked the wearer to a specific niche. So, too, food became a consumer good that conveyed image and status and provided (instant!) gratification. To name one example, in the 1980s, “yuppies,” a teensy demographic segment that briefly captured the admiration of economists and media, boosted sales of imported beer because drinking imports carried more cachet than drinking conventional American brands. At the same time, an even narrower demographic scorned imports in favor of “craft” beer whose niche appeal stemmed from both its artisanal source and its relative scarcity. Several years later, another cohort of consumers flipped that equation on its head, expressing a “hipster” image by scorning imports and craft brews in favor of mainstream beers like Pabst Blue Ribbon.
Such fragmentation played out in the food industry as a whole, as the hordes demanded foods and menus that satisfied individual tastes and whims. Some in the restaurant-going public, for example, wanted steak. Some wanted chicken. Some wanted chicken grilled with teriyaki sauce, and others wanted it on a pizza. The health-conscious demanded salads (perhaps to compensate for indulging in “all-natural” ice cream the night before), and the budget-conscious wanted all-you-can-eat buffets (a gastronomic free-for-all that required restaurants to seek rock-bottom prices on everything from lettuce and tomatoes to pickled beets and precooked meats). Cooking-averse consumers expected grocery stores to function as personal chefs capable of satisfying every craving. Aging baby boomers wanted low-salt foods. Busy parents wanted (cheap) food that could be combined with, say, hamburger or pasta and turned into a meal for four. Even better? A package that contained both burger and pasta. The diet-conscious demanded low-calorie, fat-free foods. (SnackWell’s, a line of low-fat crackers and cookies, was one of the biggest food success stories of the 1990s because it allowed Americans to eat out of both sides of their mouths, pronouncing themselves “healthy” eaters with one side while satisfying junk-food desires with the other.) Teenagers and twenty-somethings wanted anything and everything that could be microwaved, the teenagers because they were hungry after school or too busy working jobs to eat at home or because their working parents were too tired to cook, and the twenty-somethings because they didn’t know how to cook and didn’t realize that cooking from scratch was cheaper than toasting a Pop-Tart. “Nichification” fueled technological innovation that fueled more nichification. Consider the microwave oven, arguably the most important food preparation technology of the twentieth century. A convenience-crazed nation recognized its value immediately: it enabled Americans to zap foods to fork-ready condition in minutes. It was up to manufacturers, however, to supply zappable foods, from conventional TV dinners to pizza to chicken nuggets.
Fragmentation plus convenience fueled a self-perpetuating cycle: the easier it was to put dinner on the table without cooking, the less relevant cooking skills became. Kids who grew up in homes where no one cooked became adults who didn’t know how to cook and relied on manufacturers, grocery stores, and microwave ovens to do it for them. Nor, it’s worth noting, did economic upheaval derail the long-term trend. In the mid-1970s, even amid inflation and unemployment, a manufacturer of plastic packaging materials was delighted by soaring sales—delighted, but puzzled—and conducted a study to determine what drove its good fortune. The answer: grocery stores were installing “deli” departments to meet the demands of “young and leisure-oriented shoppers” (read: young adult baby boomers) who subsisted on prepared foods like fried chicken, macaroni and cheese, and presliced meats and cheeses. Hence the demand for take-it-home packaging. “The supermarkets are crying for anything new that will stop people from going out to eat,” mused a Tyson executive in 1979. The company embraced the new niches and dumped millions into the “precooked frozen” market, moving beyond conventional TV dinners with their tinfoil compartments of sliced chicken and pasty mashed potatoes into chicken-based hot dogs, corn dogs, and bologna; packaged, presliced chicken; chicken and turkey “ham”; boneless turkey breasts; chicken patties and steaks; and frozen, ready-to-cook chicken Kiev and prefried chicken that only needed to be heated before eating. “I think my mother could cook it better,” Don confided to a reporter who asked about the fried chicken, “but I’m not sure my wife could.” Nor did it matter in an era when convenience trumped taste: “People who eat precooked frozen today are not as fussy as the previous generation,” he added, and predicted that “succeeding generations” would prove even “less discriminating.”
What’s most remarkable is how little Americans spent to satisfy their desires. From 1960 to 1990, the cost of food fell by a third; even during the inflation-dogged 1970s, and the passion for eating away from home notwithstanding, Americans spent a minuscule amount on food. In the early 1990s, on average, consumers paid out just 11 percent of their disposable income to feed themselves. Obviously many households spent more. People earning less than $10,000 a year, for example, devoted about 35 percent of their income to food. But even those in what was then the lower range of the middle class—households with incomes of $20,000 to $30,000—spent only about 17 percent; the wealthiest spent less than 9 percent. And of course that was good for the economy: people had money left to buy other consumer goods.
But more than demographics roiled the culinary landscape. By the late 1970s, Keys’s fat-is-bad theory had become gospel, and the nation’s medical experts urged the public to cut back on fat and cholesterol, a message many Americans interpreted as “Don’t eat beef and pork.” That view got a federal stamp of approval in 1977 when a Senate committee chaired by George McGovern of South Dakota issued a report recommending that everyone eat more poultry and fish and reduce their intake of “meat,” by which it meant pork and beef. The report also documented the extent to which the political establishment had embraced consumer activism and Naderist ideas, complete, it must be said, with an establishment-like dollop of hypocrisy. Naderites had long criticized government agencies for relying on information and advice from industry insiders trying to protect their turf, but when it suited their cause, they did precisely the same. In this case, Nick Mottern, who wrote the McGovern report, was a Nader acolyte eager to challenge the powerful meat lobby. Mottern relied on expertise provided by D. M. Hegsted, a Harvard professor who endorsed and admired the work of Ancel Keys. The staffers who assisted Mottern gathered information primarily from newspaper and magazine coverage that affirmed the view the committee wanted to promote, namely, the Keysian version of the relation between diet, fat, and good health. (“We really were totally naive,” a staff member later conceded.) The press conference to introduce the committee’s final findings was a masterpiece of glib assertion. Senator McGovern summarized the document’s largely unsupported claims about the relationship between diet and health and then introduced experts who presented still more assertions as fact, all of which reporters dutifully recorded and passed on to the public.