Garbology: Our Dirty Love Affair With Trash
Packard’s argument made sense and plenty of people bought his books. Not so many bought into his central premise, however, as Americans showed no inclination to embrace a retreat from consumerism. In 1960, his pessimistic, anti-materialistic views and prescriptions just could not compete with Lippincott’s vision of endless abundance. People didn’t want to see the waste—even Packard conceded as much. The overflowing trash cans were just more evidence of America’s productivity. Indeed, fifty-one years later, in 2011, America’s leaders were still looking at the world through Lippincott’s eyes, even as they proved Packard correct by publicly stating that the best hope for pulling the country out of recession and unemployment—that is, the best way to get the tiger to stop biting us—would be for American consumers to shop the country out of trouble. Fifty years after Eisenhower, the message remained the same: Buy anything. Then throw it away and buy some more.
THIS RISE of consumerism and the new American Dream launched during television’s golden age was accompanied by another trash-boosting trend—the plasticization of America.
Municipal waste was only 0.4 percent plastic by weight in 1960. Our trash cans had almost no plastic inside them back then, but that began to change rapidly. By the end of the sixties, plastic trash had increased sevenfold. By the year 2000, American households were throwing away sixty-three times as much plastic as in 1960, and nearly 11 percent of the stuff in our bins each week was made of synthetic “miracle” polymers. Plastic had become the second-heaviest component of garbage after paper, which, as daily newspapers in America have declined, is a shrinking category of waste.
These figures, which come from the Environmental Protection Agency, actually underestimate the impact of plastic ubiquity, because plastics are the lightest trash we make. If the components of American trash were measured by volume or pieces rather than weight, plastic would rate as an even larger portion of what we throw away. Consider the plastic shopping bag, which didn’t exist in 1960. By the year 2000, Americans consumed 100 billion of them a year, at an estimated cost to retailers of $4 billion—costs passed on to consumers. Most of these bags land in landfills, though many go astray—plastic bag waste was the second most common item of trash found on beaches during 2009’s International Coastal Cleanup Day (with cigarette butts taking first-place honors). Their relatively small weight masks plastic bags’ enormous impact.
The new age of plastic went hand in hand with another trash-making trend of the era: the rising popularity of disposable products. These appeared on the scene in rapid succession: the invention of Styrofoam by Dow Chemical in 1944; the plastic-lined paper cup in 1950; the first TV dinner (turkey, mashed potatoes and peas on a one-serving, one-use plate and package) from Swanson Foods in 1953; the invention in 1957 of high-density polyethylene plastic, which would be used in the now-common gallon milk jugs that displaced reusable glass bottles; the 1958 introduction by Bic of the first disposable pen, followed by the 1960 marketing of the first disposable razors. There was a wave of such products, touching every part of the household and daily life, from plastic bread bags to non-refillable aerosol cans to the original paper Dixie cups. Dixies were initially despised for their wastefulness at the dawn of the twentieth century, but they were eventually accepted as tools for disease prevention in lieu of the traditional public glasses and ladles. (The product’s original name had been the “Health Kup.”) All of these new materials and products not only supplanted longer-lasting ones that could be reused many times and thereby remain outside of the waste stream, they also introduced a number of synthetic and potentially toxic waste materials into our refrigerators, medicine chests, cupboards, oceans, town dumps, natural habitats and our bodies.
One of the biggest shifts toward the modern throwaway economy came from the ultimate artificial necessity, a food product with no nutritional value and plenty of unhealthful effects, from obesity to dental decay: soda. In 1964, Coca-Cola decided it was finished with reusable glass soda bottles and introduced the “one-way” container, intended to be thrown out even though it was still glass and inherently reusable. Consumers embraced the convenience. The soda company was freed from the chains of local bottling plants (and local workforces) where returned bottles were cleaned and refilled. Three years later, the invention by DuPont scientist Nathaniel Wyeth of a plastic soda bottle that did not explode overnight in the refrigerator as its experimental predecessors had done—the polyethylene terephthalate (PET) plastic container—led to the marketing of the now ubiquitous two-liter disposable soda bottle. It and its smaller relatives have dominated the soft-drink industry ever since, ending a half-century age of reusable glass containers. Decades of recycling efforts have followed, yet they have not matched the soda sustainability of 1960.
At the time of this switch to plastic, the soda manufacturers cited the convenience, simplicity, lower weight and lower cost (to the manufacturers, at least) that made disposable plastic bottles the preferred and sensible choice over glass. It was progress. The market had “spoken.” The notion that one-use containers made of fossil fuels—containers that never decompose—would inevitably impose a substantial cost on taxpayers, ratepayers, local sanitation departments and the environment was not considered in the industry’s cost-benefit analysis favoring the plastic soda bottle. The reason for that omission was simple: The beverage companies did not have to pay those costs; such costs were, to use the corporate term, “external,” which is a nice way of saying someone else has to foot the bill for a company’s business plan. The new consumer economy, in effect, encouraged and subsidized the creation of new and greater volumes of trash—which, in the case of the soda industry, amounts to just under 10 billion cases of soft drinks a year in the U.S. alone. How much trash is that? Soft drinks are America’s favorite beverage, quaffed daily by half the U.S. population over two years old (70 percent of males age two to nineteen drink sugary beverages daily, with the same age group of females right behind at 60 percent).1 Annual consumption exceeded 50 gallons for every person in the nation beginning in 2002, more than twice the rate of soda drinking in 1960. (This is also twice the current per capita soda consumption in the number two soda-guzzling nation, Ireland; three times the soda drunk by Germans; five times the French; and ten times the Japanese.)
Fifty gallons a person is the equivalent of 160 billion 12-ounce soda cans, bottles and paper cups a year to be hauled to the curb.
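The arithmetic behind that figure is straightforward. The rough check sketched below in Python reproduces it under one assumption not stated in the text: a round U.S. population of about 300 million.

    # Rough check of the 160-billion-container figure (a sketch, not from the book).
    # Assumption not in the text: a round U.S. population of about 300 million.
    GALLONS_PER_PERSON = 50          # annual per capita soda consumption cited above
    OUNCES_PER_GALLON = 128          # U.S. fluid ounces in a gallon
    OUNCES_PER_SERVING = 12          # one can, bottle or cup
    POPULATION = 300_000_000         # assumed round figure

    servings_per_person = GALLONS_PER_PERSON * OUNCES_PER_GALLON / OUNCES_PER_SERVING
    total_servings = servings_per_person * POPULATION
    print(f"{total_servings / 1e9:.0f} billion containers a year")   # prints: 160 billion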
NEW PRODUCTS and trends alone did not fuel the growing mounds of trash in postwar America. There was also the fading of old methods of garbage disposal. Chief among these, in this same era, was the demise of the garbage piggeries, which boosted the trash flow and left sanitation officials in Los Angeles and many other communities scrambling for more dumps.
Until about 1960, piggeries still represented a major part of some cities’ waste disposal strategies, not to mention the pork-production industry. Garbage piggeries may have been around nearly as long as the Republic itself, but they came into their own during World War I, when Washington war planners suggested that feeding garbage to pigs would conserve food. A similar boost occurred during World War II, when this was one of the forms of conservation practiced as part of the war effort, along with Victory Gardens, rubber roundups and scrap-metal drives.
Good data is hard to find on just how common it was for cities and towns to feed their garbage to swine. Perhaps the most authoritative source on the subject was Willard H. Wright, chief of zoology for the National Institutes of Health and a prolific author and public health advocate. He estimated in 1943 that there were at minimum 1.25 million garbage-fed pigs in the U.S. Based on U.S. Department of Agriculture figures, Wright believed that figure to be “very conservative,” because it only counted hog farms that sent more than one hundred pigs to market every year. There was, according to Wright, a “very large number of persons who market less than 100 garbage-fed hogs per year.”
Wright’s own survey of municipal garbage disposal methods found that 53 percent of cities with populations of ten thousand or more fed part or all of their garbage to swine. Wright didn’t survey smaller cities, but he asserted that the practice of feeding garbage to pigs was thought to be similarly pervasive in small-town America.
Los Angeles had a network of garbage feeding ranches to handle the edible portions of its waste in those days. By far the largest was Fontana Farms, fifty miles east of the city, which billed itself as the biggest hog farm in the world. Fontana at its height had a herd of sixty thousand pigs supplying a quarter of Southern California’s ham and bacon. Feed consisted of daily shipments of 400 to 600 tons of Los Angeles garbage.
Though the war brought a resurgence in piggeries’ popularity as a means of keeping heavy, rotting, vermin-attracting organic waste out of town dumps, that popularity did not extend to the use of garbage-fed pigs as human food. First there was the taste. Garbage-fed pigs, according to Wright, were poor in quality compared to conventionally fed porkers. “Garbage produces a soft oily pork,” he wrote. “Cured pork and lard from such hogs are inferior in quality to that obtained from grain fed hogs … Most meat packers buy garbage fed hogs only at a discount.” Wright also found that the “fattening value” of garbage had declined over the years, making it less attractive financially for farmers to adopt a trash diet for their swine: A ton of Los Angeles garbage in the 1930s could generate sixty-eight pounds of pork; ten years later, the slop put only thirty-one pounds on the hogs.
After the war, Wright advocated against the practice of feeding pigs raw garbage, gathering evidence that showed the meat from garbage-fed pigs was associated with higher human infection rates of the potentially deadly parasitic disease trichinosis. Garbage feed also was linked to several epidemics of vesicular exanthema, a fatal infection in swine similar to the bovine ailment known as hoof-and-mouth disease. As a result, by 1960, most states required the garbage fed to pigs to be cooked first to sterilize it. This proved so inconvenient and expensive that the use of piggeries as landfill alternatives quickly fell from favor. In the old days, Los Angeles was paid by pig farmers for its garbage. By the end, the city was paying the piggeries to take the slop, but it still wasn’t enough, and they all but vanished nationwide by 1970.
Two trash-related inventions further contributed to this era’s rising dependence on landfills: the compacting trash truck, and the green-plastic trash bag, introduced to consumers in 1960 by Union Carbide. In the past, open trash cans and uncompacted loads of garbage were easily picked over by scavengers, who made a living scoping out trash and pulling out recyclables and other reusable materials. Scavenging had long been considered an honorable vocation, one that provided economic opportunities for newly arrived immigrants in the U.S. in particular; the trash business in San Francisco, for example, was dominated in the early decades of the twentieth century by Italian immigrants who formed a network incorporated as the Scavengers Protective Association. The advent of dark green, opaque polyethylene trash bags, billed as a convenience, and garbage trucks that ground and crushed the trash into compacted masses for ease of disposal, had the unintended consequence of making scavenging far more difficult. No more lifting up the lid for a quick assessment of a trash can’s contents—the bags hid everything. And once on the truck, everything was mixed and mashed, something that had not occurred with old-style wagons and open-bed trucks. Consequently, more material than ever ended up in the landfill instead of back in the manufacturing chain. Recyclables and food waste were hard to separate out. And the plastic bags themselves added to the waste stream as well.
The final factor driving the rise of more landfilling was the fall from favor of industrial-scale incinerators. Unlike the backyard burners, they had not been banned in many areas and there were quite a few still in use at that time, providing an alternative to landfills. But two new federal laws hastened their closure: the Solid Waste Disposal Act of 1965, and the Clean Air Act of 1970. The old incinerators were too polluting to pass muster under the new regulations, while upgrading them to cleaner technologies was often prohibitively expensive. It seemed for a time that waste-to-energy plants would provide the next and best garbage disposal solutions, but it was not to be. Only the New England states—the U.S. region with the least amount of land available for new landfills—clung to incineration as a major solution, making a substantial push toward converting trash to fuel rather than fill. But even there, landfills remained dominant. The rest of the country turned to burial rather than cremation, and the age of the modern landfill began in full force.
FACING FAR more garbage than in the past, coupled with considerably fewer options for disposing of it, sanitation officials across the country began hunting for more landfill space to accommodate what was predicted to be a tidal wave of trash. In Los Angeles, a property in the Valley of the Dumps caught the attention of county sanitation engineers: a large and, at the time, well-known family dairy farm that had been operated for generations by the heirs of one of L.A.’s early land barons.
The Pellissier family’s patriarch, Germain, had immigrated from France eighty years earlier, investing in sheep, cows and real estate. The land came cheap back then but would be worth a fortune for future generations of Pellissiers, particularly a rocky stretch on the edge of the city. This would eventually become the Miracle Mile portion of Wilshire Boulevard, in its mid-twentieth-century heyday nicknamed “the Fifth Avenue of the West.” The Pellissier name became part of the fabric of burgeoning Los Angeles, plastered on streets and buildings, the family money behind such landmarks as the historic art deco Wiltern Theater. Their dairy ranch remained an L.A. institution for decades (the ranch’s record-breaking milk producer, Hazel the Cow, achieved minor local fame). The dairy farm finally shut down in the early seventies, as the value of the land had grown too great to consign to Holsteins in an era of seemingly insatiable demand for suburban houses and shopping centers—and dumps. In 1971, the county sanitation district bought a large portion of the Pellissiers’ land along with the private dump that had been leasing space there. The county continued running the old dump as a small way station for garbage, while laying plans for an entirely new future for the place as the Puente Hills Landfill, the first big and thoroughly modern landfill in the region, with the sanitation district’s new headquarters built at its base.
Back then, three peaks dominated the property, part of the Puente Hills range on the southern border of the San Gabriel Valley. Canyons divided the hills and had been used for grazing the dairy cows. Residents in nearby Hacienda Heights could hear the cows coming home every day, a taste of country life in America’s second-biggest cityscape. But the peaks were high enough to pose an aviation hazard for L.A.’s increasingly busy skies. In 1952, a North Continent Airlines twin-engine Curtiss flying from New York to Burbank was diverted because of fog, and while approaching its alternative landing site, Los Angeles International Airport, it dipped ten feet too low. Its landing gear struck the fog-shrouded peak. The plane crashed and exploded, killing all twenty-nine passengers and crew members aboard.
A year later, the first landing of invaders from Mars took place in those same hills—in the 1953 film adaptation of the H. G. Wells classic The War of the Worlds.
Cold War fears of a more earthbound invasion were soon added to the mix of bucolic pasture and suburban foothill: In 1956, the tallest of the three hills overlooking Los Angeles was chosen by the Defense Department as a strategic high point for a new Nike missile installation, America’s first missile defense program. Batteries of radar-guided missiles designed to shoot down Soviet Union bombers flying as high as fifty thousand feet were positioned on the hilltop in the fifties and sixties. Similar outposts were set up across the country, continually upgraded to match faster and higher bombers. The Nikes were all decommissioned by the mid-seventies once the main threat of nuclear attack shifted from manned aircraft to much faster and more numerous automated nuclear missiles. Today the launch silos are covered over by a community college campus. The radar station at the high point has been repurposed as an aviation radar relay accompanied by a conglomeration of cell phone towers. Only a narrow wooden guard shack remains of the old Nike outpost to mark the spot, sporting a small memorial plaque.
But this former missile site no longer stands tall above a deep canyon. The Puente Hills landfill has filled in the spaces between the three hills, absorbing them into the larger footprint of a single mound of trash. The old high point that a missile launch complex once occupied, defending Los Angeles from above, is just a small bump on the big, broad plateau of Garbage Mountain.
That was not the original plan, never the bold vision—the landfill was supposed to be a minor part of a much bigger effort to remake America’s waste future while also weaning the country from foreign oil dependence.
This is, it turns out, a familiar theme in the history of trash—massive landfills as unintended consequence. Fresh Kills, which previously held the title of world’s largest active landfill, was originally presented to the public in 1947 as a brief, temporary solution to New York’s trash woes while new waste-to-energy plants were constructed. The plants did not come, and Fresh Kills continued operating until 2003. A decade later, the slow conversion of its polluted landscape into parkland is still under way. Philadelphia’s leaders likewise believed they could eliminate the city’s garbage with waste-to-energy plants, but a homeless, wandering barge of toxic ash from Philadelphia became an international pariah and scandal that killed that city’s vision, too.
Puente Hills, destined to become America’s biggest active landfill, has turned out to be as much cautionary tale as it is engineering achievement. It demonstrates once again how so many components of the modern waste-management system began as little more than a backup plan, an accident. The 102-ton legacy and the landfills that now constrain it are, bottom line, the unintended consequence of Lippincott’s magical economy of abundance, superimposed on the American Dream by brilliant marketing capable of persuading us to accept the patently unsustainable as common sense.