American Empire


by Joshua Freeman


  New Corporate Models

  Three industries—meatpacking, computing, and retailing—illustrate the transformations in business organization that took place after the post–World War II era of prosperity ground to a halt.

  People generally do not think much about meatpacking; who wants to know how the animal you are eating was killed, chopped up, and distributed? To the extent that in the post–World War II years Americans knew anything about meatpacking, most probably conceptualized it as a story of progress. Upton Sinclair’s 1906 novel The Jungle, a staple in classrooms, brought to generations of students stomach-churning images of conditions in turn-of-the-century Chicago meatpacking plants and the miserable lives of packinghouse workers and their families. A combination of federal regulation and unionization subsequently improved the sanitary and safety conditions in the packinghouses and the lives of their workers. But meatpacking was not a story of continual advance, at least from the point of view of the workers who killed livestock and processed it. Rather, from the late 1970s on, a structural transformation of the industry led to a deterioration of their working conditions and a decline in their standard of living.

  When Sinclair wrote about meatpacking, a few giant companies dominated the industry. At huge stockyards in big midwestern cities, they bought livestock that farmers shipped by train, slaughtered it, and sent dressed carcasses and processed meats to branch plants and warehouses across the country. From there, meat was delivered to restaurants, hotels, and retail stores, where butchers did the final cutting and packaging. By the 1950s, most of the workers in the main plants were unionized and earned above-average wages for manufacturing workers.

  Even before World War II, smaller independent firms began challenging the big companies by building packinghouses near where cattle and pigs were raised and buying them directly from farmers, rather than through stockyard auctions. Soon the big companies built “direct buying” plants of their own. As the industry dispersed, still concentrated in the Midwest but often in small and midsize cities, so did unionization. Most packinghouses continued to pay wages at or near the level set by union negotiations with the major firms.

  In the early 1960s, what amounted to a revolution in the industry began when a new wave of independents, led by Iowa Beef Packers, built technologically advanced plants in rural or small-town locations, taking advantage of improved roads and air-conditioning technology to ship their products by truck rather than by train. Instead of distributing dressed carcasses, Iowa Beef and its imitators cut animals into smaller pieces, which they vacuum-sealed in bags, boxed, and shipped directly to retailers and end users. For the buyers, this had the advantage of requiring fewer skilled workers to do the final cutting and packaging.

  Unlike the older packing companies, which had become largely resigned to unionization, Iowa Beef took a militantly antilabor stand. When it could, it kept unions out of its plants altogether. When workers did manage to organize, it insisted on paying lower wages than the major firms.

  During the economic downturn of the late 1970s and early 1980s, the growth of Iowa Beef and other firms with more efficient methods and lower costs presented a profound threat to the old-line packers. They reacted not by investing in modernization and new technology but by milking their operations for cash, shutting older plants, diversifying, and in some cases allowing themselves to be purchased. Corporate changes became dizzying. The plants once owned by Armour and Company, for instance, were bought in the 1970s by bus company turned conglomerate Greyhound, and then in 1983 sold to the food conglomerate ConAgra. The older companies also moved to break the national pattern of union-bargained wages and benefits, seeking concessions for individual plants and whole chains and threatening to sell or shutter facilities if they did not receive them.

  The United Food and Commercial Workers (UFCW) tried to resist the disintegration of pattern bargaining. Packinghouse workers repeatedly struck rather than accept lower wages and benefits, but with limited success. The prolonged walkout by Hormel employees in Austin, Minnesota, in 1985 and 1986 garnered broad community support and national attention but ultimately failed in the face of determined company resistance, the mobilization of the National Guard to protect scabs, and the opposition of national UFCW leaders, who, rather than protecting high wage rates like those the strikers received, sought to reestablish national wage standards at a somewhat lower level. That did not work either. By the end of the 1980s, national bargaining in the industry had all but ended, leaving workers at individual unionized plants to cut the best deals they could. The percentage of meatpacking workers covered by collective bargaining fell from 83 percent in 1963 to 71 percent in 1984.

  The weakening of unions had a dramatic material impact. The annual earnings of midwestern meatpacking workers peaked in 1977 and then fell by nearly one-half over the next two decades. By 1990, wages in the industry had slipped to 20 percent below the average for all manufacturing workers. New technology, including small, handheld circular saws (“whizards”), and the decreased ability of unions to enforce work rules allowed production to speed up. Injuries became more common, including repetitive stress disorders. Problems that unionization had once eliminated—like the inability of workers to leave production lines for bathroom breaks—again came to plague the industry.

  The deterioration of working conditions and the deurbanization of meatpacking led to a wholesale change in the makeup of the workforce. The old urban meatpacking plants had employed European immigrants and their children, African Americans, white transplants from the rural Midwest, and a small number of Mexican Americans. The new rural plants tapped a different labor market, former farmers and other hard-up rural residents (including a growing number of single mothers) who had few alternative sources of employment. But rural areas did not have enough workers to staff the new plants, and few native-born workers were willing to move to take such poor jobs. Employers solved their problem by turning to immigrant workers, particularly from Southeast Asia and, in much larger numbers, Mexico and Central and South America, who by the end of the twentieth century made up a majority of the workforce at most midwestern packinghouses. Turnover in the industry soared, as workers often quit after brief tenures, unwilling or unable to stand the pace of work, dangerous conditions, and low pay.

  As the twentieth century ended, animals were slaughtered and dismembered under work conditions not very different from the ones most Americans probably believed had disappeared forever in the years following the publication of The Jungle. From a source of stable jobs providing decent livings, meatpacking reverted to its past as a low-wage industry that offered little security or upward mobility. In other industries, like automobile, steel, garment, and airline, the deterioration of wages and working conditions in the 1980s was often attributed to either foreign competition or government deregulation. Neither played a substantial role in meatpacking. Rather, the industry showed how changed corporate attitudes, the rise of new competitors, the introduction of new technology, geographic relocation, and a political and economic climate in which strikes could be taken and defeated could lead to a radical transformation in the lives of working people and the reemergence of a sweatshop America that until then seemed to have become a distant memory.

  In the computer industry, the key innovations were neither in production methods nor labor relations but in the products themselves. The first electronic digital computers were built under military sponsorship during World War II to calculate ballistic trajectories. After the war, the military continued to play a large role in the development of computing, sponsoring research, commissioning advanced machines, and creating programming languages. By the mid-1950s, some large corporations had begun using computers for payroll, billing, accounting, inventory control, and other tasks. IBM emerged as the dominant computer maker, winning control of 70 percent of the commercial market. The computers IBM and its competitors produced were massive, expensive devices, requiring air-conditioned facilities. Programs were executed in batches, submitted to specially trained operators (usually on punch cards), with the results available only after what was sometimes a considerable wait.

  During the late 1950s and 1960s, defense spending and heavy investment in computing by the burgeoning space program stimulated a series of technological innovations that laid the basis for a radical transformation of the industry. The desire by the Air Force and NASA for smaller, more reliable, and less power-hungry electronics sped the development of integrated circuits. First produced in 1959, these small silicon “chips,” which had etched on them transistors, resistors, and other circuitry, made possible a drastic reduction in the size, power needs, and air-conditioning requirements for electronic equipment. Intel Corporation took the lead in producing integrated circuits for computer memory, one of a number of companies that made the Santa Clara Valley south of San Francisco a hub for computer industry innovation. Integrated circuits in turn made possible advances in minicomputers, machines smaller and cheaper than the “mainframe” computers that dominated commercial work.

  Minicomputers allowed computers to be used in new ways. Rather than submitting programs to operators and then waiting for results, students and faculty at MIT, Stanford, and other universities began directly controlling minicomputers through interactive terminals. The Pentagon’s Advanced Research Projects Agency (ARPA) funded much of their work, as well as the development of “time-sharing” systems that allowed multiple users to simultaneously access machines otherwise far too costly for individual use. With these developments, computer users could write and revise programs on the fly, unleashing a wave of innovation and unanticipated new uses for computers, from games to communications to text manipulation. ARPA also developed the first network that linked together computer centers in various parts of the country.

  The development in the early 1970s of microprocessors—chips that contained the basic processing functions of a general-purpose computer—made possible the next step in the miniaturization and personalization of computing, the development of small desktop machines that possessed computing power approaching that of much more expensive minicomputers. In 1975, Ed Roberts, the owner of a model rocket hobby shop in Albuquerque, New Mexico, began selling a small computer, the Altair, in kit form for less than $400. To provide an easy-to-use computer language for his machine, Roberts bought a version of BASIC, a computer language developed at Dartmouth as a teaching tool, from two young programmers, Paul Allen and Bill Gates, who soon set up their own software firm, Microsoft, aimed at the emerging PC market.

  The San Francisco Bay Area, including the Santa Clara Valley (by then dubbed Silicon Valley in recognition of its growing concentration of chip producers and computer companies), was a hotbed of engineers, academics, hobbyists, and students fascinated with the possibilities for building their own computers. They met in small groups, infused with the radical and countercultural sentiments then still common in the Bay Area. For many PC pioneers, cheap, individually operated computers and community-based computer networks represented liberation, a way to spread knowledge and power and undercut control by central authorities. In organizations like the Homebrew Computer Club they exchanged utopian visions, practical tips, hardware devices, and newly minted programs. The most successful of the computers being built in kitchens and garages around the area came from a young chip designer, Steve Wozniak. With his neighborhood friend Steve Jobs, a Reed College dropout, Wozniak set up a company to sell his creation, which Jobs dubbed the Apple. Wozniak soon created an improved version, the Apple II, introduced in 1977, which made home computing a possibility even for technically unsophisticated users. Two years later, a start-up software company released a spreadsheet program called VisiCalc for doing accounting calculations on the Apple II, which brought a flood of business customers to computer stores looking to buy what until then still had been largely a hobbyist’s machine.

  Unlike most of the tinkerers who created the first personal computers, Jobs, just twenty-one when he cofounded Apple, had a strong commercial orientation and the wisdom to hire professional managers to help guide his company as it explosively expanded. In 1980, it became a public corporation with an initial stock offering that made Jobs and Wozniak together worth over $300 million. Within five years of its founding, it grew into a billion-dollar corporation.

  In 1981, IBM came out with its own personal computer that had many of the characteristics of the Altair and Apple II. Like them, it used off-the-shelf parts and adopted an open architecture that allowed other developers to sell add-on components and peripheral devices. IBM offered three operating systems, including one licensed from Microsoft, which proved by far the most popular. The IBM PC became a huge commercial success, legitimated personal computers in the commercial as well as home markets, and set standards for future hardware and software. Ironically, companies that copied the IBM design ultimately eclipsed it in the personal computer market. The chips Intel produced and the software Microsoft wrote, rather than finished computers, became the defining force in the industry.

  By the end of the 1980s, some twenty million personal computers a year were being sold. The personal computer had become ubiquitous in offices, schools, and homes. Out of an odd brew of militarism, engineering ingenuity, academic and youth culture, post–civil rights movement antiauthoritarianism, questing for personal empowerment, and venture capital had come a technological and social revolution that interwove computer use into daily life.

  The companies that mushroomed up with the personal computer revolution prided themselves on creating work environments very different from prevailing buttoned-down, hierarchical corporate norms. At least on the surface, the California computer industry seemed to embody the spirit of 1960s university towns, with casual dress, all-night work sessions, company-sanctioned play breaks, informal corporate structures, and the extensive use of work teams. Below the surface, though, the labor practices of the emerging high-tech companies represented continuity as much as change.

  In the post–World War II years, large unionized companies set many of the standards for national employment practices, but an influential alternative could be found in sophisticated, nonunion companies that developed modernized versions of pre–New Deal welfare capitalism. Rejecting collective bargaining, companies like Sears, Roebuck and Co., Kodak, and IBM sought to maintain employee loyalty and high productivity through job security, good benefits, stock purchase plans, extensive use of teamwork, and worker participation in decision making. Many of the high-tech start-up firms of the 1970s and 1980s adopted this model, with an overlay of countercultural trappings. Given the importance of intellectual capital and innovation to their success, they made it a priority to keep top designers and programmers satisfied, tolerating their idiosyncrasies and enriching them through stock grants and options.

  Production workers fared nowhere near so well. In the factories that made silicon chips, disk drives, circuit boards, and other computer components, conditions could be harsh. Precise, repetitive work exhausted workers, who often were exposed to noxious chemicals. Many computer companies subcontracted production to firms that paid low wages and provided few benefits. While the managerial and professional ranks of the high-tech firms were heavily male and white, women, minorities, and illegal immigrants made up much of the production workforce. Once products and manufacturing techniques became standardized, companies frequently stopped making goods in Silicon Valley or elsewhere in the United States, subcontracting production to overseas firms in low-wage regions or setting up their own offshore factories. Highly efficient, automated production techniques helped to keep the cost of personal computers low, but, as in many other industries during the 1980s, profits and success in the computer industry rested in part on a downward spiral of wages and benefits for blue-collar jobs, whether within the country’s borders or beyond them.

  Low wages played a key role in the transformation of retailing, which began in the 1960s and accelerated in the 1970s and 1980s. The rise of giant discount chains, most notably Wal-Mart Stores, Inc., changed how Americans shopped and the economics of everyday life. Wal-Mart helped set labor standards across the economy, altered patterns of urban and suburban development, and even influenced cultural trends.

  Most of the major discount chains founded in the 1960s—including Kmart, Target, and Woolco (a Woolworth’s chain that went out of business in 1983)—were creations of existing urban-oriented mass merchandise companies. Wal-Mart had different roots. Sam Walton had owned variety stores in small Arkansas towns. He set up his first discount stores on the outskirts of such communities, where they drew customers from the surrounding regions by offering a wide variety of goods at low prices. Determined from the start to keep labor costs down, Walton fought to keep out unions, which in any case had little toehold in Arkansas and the neighboring states where he initially sited his stores. Walton found a ready workforce among local women, almost all white, willing to work for minimum or subminimum wages in the face of the mechanization of local agriculture and the paucity of alternative employment opportunities.

  Spreading out systematically from its original base, by 1975 Wal-Mart had 125 stores and seventy-five hundred employees. That year Congress made it illegal for manufacturers to fix retail prices, opening the door for an expansion of discounting chains. During the 1980s, Wal-Mart began adding stores in the suburbs of large cities and introduced “Supercenters” that combined grocery supermarkets and general merchandise discount stores under one roof. Many Supercenters stayed open around the clock, following a retail trend toward twenty-four-hour operation. (In 1963, 7-Eleven began keeping convenience stores open twenty-four hours; the midwestern Meijer grocery stores adopted the practice in 1984.) In 1990, Wal-Mart became the country’s largest retailer. The next year it began an international expansion by opening up a store in Mexico City. By the end of the century, it had become the world’s largest private employer, with over 1.1 million workers.

 
