The Hamlet Fire


by Bryant Simon


  The medical examiners listed in detail what the victims wore on their bodies at the time of their deaths.

  Brenda Gail Kelly had on a blue shirt, white jeans, and white shoes. She had a bow in her hair.

  Michael Allen Morris had on rubber boots and a black T-shirt with the words “I Survived Hugo”—a hurricane that blew through the Carolinas in 1989 killing dozens of people and causing billions of dollars in damages—written across the front.

  At the time of the fire, Donald Brice Rich had on blue denim overalls and a dark T-shirt proclaiming his allegiance to the holy trinity of “Women, Wine, and Overtime.” Underneath his clothes, he had a collection of tattoos on his arms and shoulders depicting a skull and dagger, a grim reaper, a lightning bolt, and a cobra riding on flaming wheels that said MARY.

  Scotland County’s Rose Lynette Wilkins had a rose tattoo and nothing in her pockets.

  Many of the dead still had on their blue smocks and white aprons. Some wore rings and watches, and several of the women had painted their finger- and toenails, bright red in one case.

  Medical examiners dug through the pockets of Josephine Barrington, whose son went back into the fire to save her from the smoke and flames only to perish with her. Emptying her pants, they found bifocals, red hair clips, nail clippers, scissors, a small tin of Tylenol, a coin purse with three five-dollar bills, three one-dollar bills, ninety-one cents in change, a ring with seven keys, and a few sticks of chewing gum.

  Doing their jobs with characteristic thoroughness, the medical personnel jotted down the color and material of the bras and underwear the victims wore. They recorded whether the male victims were circumcised and whether the dead still had their teeth fully intact or relied on dentures.

  Before returning them to undertakers in their hometowns of Ellerbe, Laurinburg, Rockingham, and Hamlet, the examiners measured the precise length and exact weight of the naked bodies. They wrote down the numbers in the designated boxes on the death certificates.

  If the height and weight data for the victims were plugged into a formula measuring body types, eighteen of the twenty-five who died on September 3, 1991, at the Imperial Food Products plant would be considered overweight. By this same rough gauge, five would be deemed “grossly overweight.” Several of the death certificates added brief descriptions about the body types and shapes of the dead. “Overweight,” said one. “Large woman,” recorded another.

  The bodies of the Imperial victims were not just personal stories of individual choices about what to wear on the outside and underneath and what messages to have written on their arms and shoulders. They were also facts and numbers. But these bodies, like all bodies, were at the same time products of history. They were reflections of the nation’s way of eating, of particular ways of measuring things and people, of the narrowing of American politics, of the triumph of business interests over everyday economic security, and of the growing indifference toward the lives of the poor and the vulnerable.

  Just like Hamlet’s local labor market and just like the chicken tenders made at the plant, the bodies of Imperial workers were also not exceptions. Rather, they represented the mainstreaming of cheap.1

  After holding steady for three decades after World War II—an era tagged by the food historian Harvey Levenstein as “the golden age of food processing” and marked by the surge in McDonald’s outlets, the triumph of french fries over baked potatoes as a side dish, and the widespread appearance of vending machines that spat out bottles of sugary Coke and Pepsi for pocket change—the nation’s obesity rate started to climb in the 1970s, slowly at first, but then faster and faster. In the 1960s, the Centers for Disease Control (CDC) estimated that 13.5 percent of Americans were obese, meaning they had a body mass index (BMI) above 30, the numeric line that, according to experts, separated healthy from unhealthy weight.2 (BMI is a widely used, and sometimes criticized, measure of obesity calculated by dividing a person’s weight in kilograms by the square of his or her height in meters.) This number started to jump in the 1970s. Twenty percent of Americans had a BMI of over 30 in 1980. The percentage rose to 30 percent a decade later. By 2000, nearly 60 percent of all Americans were either overweight or obese by this measure. Children were especially vulnerable, with the number of overweight kids doubling in the 1980s. In poorer, more rural communities, like Hamlet and Bennettsville, the numbers were even higher. By 1995, public health officials declared that weight gain had developed into a full-fledged epidemic in the United States.3
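  For readers who want the arithmetic behind these figures, the BMI formula is simple; the worked example below is purely illustrative and does not come from the Imperial records:

\[
\mathrm{BMI} = \frac{\text{weight (kg)}}{\left(\text{height (m)}\right)^{2}}
\qquad \text{e.g.} \qquad
\frac{88\ \text{kg}}{(1.70\ \text{m})^{2}} = \frac{88}{2.89} \approx 30.4
\]

  A hypothetical adult standing 1.70 meters (about 5 feet 7 inches) and weighing 88 kilograms (about 194 pounds) would thus land just over the line of 30 that the CDC used to mark obesity.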

  As the nation’s waistline grew, commentators began to add up the costs of a heftier republic. Researchers at the prestigious Children’s Hospital of Philadelphia blamed over 100,000 deaths each year across the country on obesity. According to health officials, overweight individuals faced a higher risk for heart disease, high blood pressure, diabetes—especially type 2 diabetes—joint problems, respiratory issues, and even breast, colon, and gallbladder cancer. Overweight adults experienced disrupted sleep and tended to get sick more often and underperform at work. One study even linked dementia to weight gain. More than 66 percent of people with arthritis, according to another study, could be considered overweight or obese. Like adults, heavier kids confronted problems with their hips, knees, and joints.4 They were more likely to get bullied, skip school, do worse in class, and, some contended, suffer from low self-esteem, body shaming, and discrimination. Several researchers even framed weight gain as a national security issue. “Between 1995 and 2008,” one scholar noted, “the military was forced to reject more than 140,000 recruits because they were overweight.”5

  “Obesity costs lots of money,” insisted Kelly Brownell, the one-time director of Yale’s Rudd Center for Food Policy and Obesity. “The costs,” he added, “are incurred by individuals, businesses, the country, and the world.” Overweight individuals spend twice as much on health care and medication as their less heavy neighbors do. Health economists estimated in 2003 that the direct and indirect costs of obesity, of early death, of weight-related illnesses, and of lost productivity reached between $117 billion and $239 billion annually. According to another assessment, health expenditures resulting from obesity cost every single family in the United States $1,470 per year.6

  As early as 1977, the Select Committee on Nutrition and Human Needs told the United States Senate that the country faced a looming public health and fiscal crisis. “The over-consumption of food in general has become a major public health problem,” testified Dr. Theodore Cooper, assistant secretary for health in the Department of Health, Education, and Welfare. He estimated that, at the time, about 20 percent of all adults in the United States were “overweight to a degree that may interfere with optimal health and longevity.” Cooper and others, including South Dakota Senator George McGovern, drew a comparison between overeating and smoking. “These dietary changes,” warned the 1972 Democratic presidential candidate, “represent as great a threat to public health as smoking.”7

  McGovern’s reference to smoking was a telling one. For much of the 1960s, smoking represented the nation’s number one public health threat and expense. After discovering a definite link between smoking and cancer in the 1950s, Dr. Alton Ochsner of Tulane University had declared an all-out war on smoking. He targeted smokers in his fight. In his popular book Smoking: Your Choice Between Life and Death, he tagged smokers again and again as “selfish” and called smoking the “most selfish habit anyone can have.” While Ochsner pushed for government warnings on cigarette packs and talked of lawsuits against the nation’s tobacco giants, he stressed the individual act of quitting as the initial line of defense in the battle against smoking. If smokers stopped smoking, the problem would go away. He recognized that it wouldn’t be easy, but he insisted that if smokers exercised greater self-control and made the right choices for themselves, for those around them, and for the entire nation, the problem would disappear.8

  As more and more experts and policy makers drew a parallel between smoking and overeating, personal choice started to frame the discussion around obesity. Applying Ochsner’s approach to cigarettes and public health to the issue of obesity, experts began to link weight gain to poor and selfish eating decisions. Americans grew heavier, they said, because they were too lazy to cook at home. They ate out too often and ate too much at fast food restaurants. Lisa Mancino, a food economist for the United States Department of Agriculture, calculated that eating just one meal a week away from home could translate into two extra pounds of weight gain a year for the average consumer.9 According to this math and this way of thinking, the solution was to eat somewhere else and make better choices at home. Researchers writing in the New England Journal of Medicine estimated that Americans gained on average 3.35 pounds per person per year between 1986 and 1990.10 Several researchers argued that too much television and not enough exercise were the causes of the jump. Yet a majority of commentators as well as scholars kept the analysis simple and pointed to the obese as the cause of obesity. They got big because they chose the wrong foods. They put chips, cookies, and beer in their shopping carts instead of peas, carrots, and Florida orange juice. On top of that, they drank too much soda. But, mostly, Americans were fat because they consumed too much of the wrong things. It was that simple. They needed to eat less and make better choices.11
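  Mancino’s figure is easy to sanity-check using the common rule of thumb, an assumption here rather than a number from her study, that a pound of body fat stores roughly 3,500 calories:

\[
\frac{2\ \text{lb/yr} \times 3{,}500\ \text{kcal/lb}}{52\ \text{meals/yr}} \approx 135\ \text{kcal per meal}
\]

  In other words, a restaurant meal needs to carry only about 135 more calories than its home-cooked counterpart for the two extra pounds a year to follow.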

  Early in the twenty-first century, President George W. Bush summed up mainstream thinking in the country and showed just how deeply ideas about choice were baked into discussions about weight gain. “Good foods and regular exercise,” the president insisted, would reverse the upward trend in BMI rates and “save our country a lot of money—but, more importantly, save lives.” A letter to the editor of the Atlanta Journal-Constitution echoed the president’s remarks: “Choice is what is to blame for obesity. Americans have the freedom to choose. That is our flaw, we are free to do what we want and most humans chose the easy way—fast food instead of a salad and liposuction instead of exercising.”12

  Just as in the battle over smoking, the political and social realities of class, race, and gender threaded through remarks about weight. Obesity rates were not evenly distributed across America. Because of their almost complete dependence on cars to get from place to place, rural folks walked less and tended to be heavier than urban dwellers. The poor, people of color, and single moms grew larger at double the rate of white middle- and upper-middle-class men and women.13 As reporters and commentators cited these numbers, they repeated the arguments of anti-smoking advocates, welfare opponents, and promoters of less government (for some) that dominated American politics in the 1980s. The poor were poor because they made bad choices, like “choosing” welfare over work and having more kids over thrift and abstinence. The overweight were overweight because they lacked willpower and compounded their bad decision making. They were the problem, and their bodies showed it. Rather than saving their money, they went out to McDonald’s or Shoney’s. They passed up spinach for french fries and grilled chicken breasts for fried tenders (in the late 1980s, anything chicken was falsely assumed to be automatically healthy). They cooked too little, used the microwave instead of the stovetop, and bypassed tap water in favor of oversize bottles of Coke, Pepsi, and Sprite and, worse still, twelve-packs of beer and jugs of fortified purple wine. They didn’t exercise enough either. Commentators talked about obese people skipping the gym in favor of sitting in front of the television snacking on a supersize bag of Doritos filled with shocking amounts of fat and salt. And they kept making these choices day after day. As George Bush suggested, their bad decisions cost taxpayers billions of dollars and weakened the nation.14

  This kind of thinking turned the obese not just into people with problem bodies but also into irresponsible citizens. As irresponsible citizens, they didn’t deserve the state’s largesse and support. This perception both reflected and shaped public policy. After heated debate, in 1978, the Democratic-led Congress approved the Humphrey-Hawkins Act, calling for full employment and stable consumer prices. The measure was more symbolic than anything else. It had few teeth and even less funding behind it. In retrospect, it was, it seems, a final gasp of the New Deal order, a tribute to the nation’s passing faith in the broad benefits of high wages and widespread buying.15 By the early 1980s, policy discussions in the United States had changed and changed dramatically. The ideas of cheap rose to the fore, influencing ideas about government and its role in the economy and everyday life, including school food policies, which quickly seemed as emblematic of their era as the Humphrey-Hawkins Act was of an earlier moment in time. Reduced tax revenues due to business losses and jumps in unemployment led to government cuts at the federal and state levels, which in turn reduced support for school nutrition and exercise programs. Faced with persistent shortfalls and narrow choices between an extra science or reading class and physical education or home economics, educational administrators often chose biology and English over gym and cooking courses. To plug the growing holes in their budgets, they cut down on visits from nurses and slashed funding for teachers’ aides. Often these were the people who oversaw non-academic activities, like going outside to play and taking time for lunch. According to a 2001 study by the Clearinghouse on Early Education and Parenting, almost 40 percent of the nation’s school districts had cut or eliminated recess because of a lack of funding. With money in short supply, more and more new school buildings went up in cities and towns without costly playgrounds or gyms.16 Some school districts cut back on their sports programs while others entered into dubious privatization deals. They let Coke and Pepsi line their halls with vending machines stocked with high-calorie foods and drinks in exchange for money for new scoreboards for the baseball field and helmets for the football team.17

  The Reagan administration, looking for more money with which to pay for the Contras and several colossal new aircraft carriers, cut support for the school lunch and food stamps programs. Famously, officials in Washington tried to reclassify ketchup as a vegetable on school lunch trays to save money as well.18 Distrustful of the decision making of the poor and the not thin, some lawmakers proposed limiting what recipients could purchase with federal and state funds. While the nation’s growing legions of the working poor were told they couldn’t buy wine or beer or some prepared foods with government-issued food stamps, they could still use them to purchase cookies, chips, and soda. Big food and big agriculture celebrated choice as well, it seems. When a number of lawmakers suggested restrictions on sugary drinks and salty snacks, industry representatives rolled out the flag and complained that any limits on consumer choice amounted to an alarming loss of cherished freedoms. Few congressional representatives wanted to vote against American rights or forgo campaign contributions from big donors, so it remained okay to buy chips and soda with government funds.19

  But policy makers didn’t cut food subsidies altogether; they just redistributed the benefits of government action and spending so that more of it flowed toward industry and industrial farmers. Between 1975 and 1990, the United States government wrote checks worth billions of dollars to wheat, soybean, and corn growers. As these policies drove down the price of those commodities, food scientists figured out ways to insert the inexpensive goods into more and more links in the food chain. Corn went into chicken feed, and it went into the batter that coated nuggets and tenders, into the filler, and into the high fructose corn syrup (HFCS) that sweetened the dipping sauces.20 Between 1997 and 2005, according to calculations done by researchers at the Global Development and Environment Institute at Tufts University, animal farms, slaughterhouses, and processors in the United States saved $3.9 billion annually because of government subsidies to corn and soybean growers. The broiler industry alone saved as much as $1.25 billion over this period as a result of the same subsidy programs. In terms of total operating expenses, this amounted to, according to the Tufts scholars, a 13 percent reduction in costs for broiler and egg producers, while nothing was put aside for growers of healthier fruits and vegetables. These policies, then, artificially made healthier food items look more expensive.21

  Corn subsidies, in particular, helped to make fat-filled calories cheaper and more accessible. “Humans aren’t the only ones who are fatter than they used to be,” the sociologist Michael Carolan remarked in his 2011 study of cheap food. During the 1960s, a 3.5-ounce piece of chicken, he explained, contained just under 4 grams of fat. By 1970, that number had risen to 8.6 grams, and, by 2004, industrial chickens fed a steady diet of cheap, corn-fortified feed arrived at stores and fast food outlets packed with more than 20 grams of fat in each serving. As Michael Crawford of London’s Institute of Brain Chemistry and Human Nutrition observed, “While chicken was at one time a lean, low-fat food, it is no longer.” He wondered, “Does eating obesity cause obesity in the consumer?”22

  Eating chicken nuggets and tenders made by Imperial and Cagle’s meant eating another dose or two of fat added from the further processing and cooking of the items. For most of the 1970s and 1980s, American eaters still chose chicken because the industry’s efficiencies lowered the price, but also because they thought poultry was healthier than red meat no matter what form it took. That’s what newscasters, reporters, and ad men told them. Based on the headlines and commercials, most consumers associated chicken with good, sensible food choices. The word chicken in chicken nuggets and chicken tenders implied that real chicken, and some measure of healthiness, could be found inside the crust of these items. This, of course, was in the years before the author Michael Pollan led a food movement urging the close and painstaking reading of labels.

  Of course, chicken nuggets were never a smart food choice. To begin with, there was never very much chicken in the nuggets, as most people understood the idea of “chicken.” Twenty-plus years after the fire, a pair of Mississippi-based food scientists decided to do an “autopsy” on a chicken nugget. The name chicken nugget was, they concluded, “a misnomer.” It implied that the meat inside was chicken meat, but the main components they discovered were chicken skin and chicken fat.23 In fact, six chicken nuggets, weighing about 3.4 ounces, contained almost twice as many grams of saturated fat as a regular 3.2-ounce McDonald’s hamburger.24 They contained as much salt as well. While salt itself isn’t fattening, it can lead to bloating and weight gain as well as to an increased risk of high blood pressure, strokes, kidney stones, and stomach cancer. Plus, it definitely makes you thirsty, and in many instances this means reaching for the nearest, tallest cup of Coke or Dr. Pepper and all of the liquid calories found inside such drinks.25

 
