American Empire
The business rehabilitation took time. But by the end of the 1980s, its complementary strands—economic, political, and cultural—had together effected what amounted to a reconfiguration of the corporation and a transformation in the political economy. A corporate revolution—or counterrevolution—had taken place that left the United States a very different place than it had been a decade and a half earlier.
Downsizing
In the mid-1960s, the average rate of profit for business began to fall, after having remained relatively steady since the end of World War II. It continued to drop until the early 1980s. The slide for manufacturing was steeper than for the economy as a whole. After averaging near 25 percent (before taxes) from 1948 to 1969, the manufacturing profit rate slipped below 17 percent from 1969 to 1973 and then fell to 14 percent from 1973 to 1979 and 13 percent from 1979 to 1990.
Various explanations were put forth for the drop. Some people pointed to a slowdown in the growth of labor productivity (though the profit slide preceded the productivity slump that began after 1973). Others pointed to a rise in real wages and the increased costs of doing business, including greater regulatory requirements. High energy and commodity prices and interest rates hurt too. Problematic in themselves, these developments were much harder to deal with than in the past because of growing international competition.
Worldwide, heavy investment and national competition led to overcapacity in many manufacturing sectors. Foreign companies often had lower labor costs, and sometimes more efficient production methods, than U.S. firms. Combined with a lowering of trade barriers, the result was downward pressure on the prices for finished goods. American companies that in the past could pass on higher labor and raw material costs to their customers found themselves constrained by competition.
The rising value of the dollar exacerbated the problems faced by American manufacturers. The very high interest rates that resulted from the Federal Reserve’s effort to fight inflation attracted foreign investors, driving up the cost of dollars relative to Asian and European currencies. The strong dollar made it cheaper for Americans to import goods and more expensive for people in other countries to purchase things made in the United States.
As profits fell and stagflation settled in, companies and investors grew reluctant to put capital into expanding or upgrading plant and equipment or into research and development. Investment in manufacturing fell by 8 percent between 1979 and 1982 and another 15 percent in 1983. That left companies in many industries slipping behind their European and Asian competitors in the quality of their products and the efficiency of their production processes. Many companies, faced with escalating difficulties and little prospect of robust future profits, shut down facilities, laid off workers, or went out of business entirely. In some cases, whole industries imploded.
Steel provides a case in point. U.S. steel production remained fairly steady from the mid-1950s through the 1970s. A few giant, integrated companies, which made their own iron and steel and fabricated finished products, dominated the industry. By avoiding price competition, they kept profits strong. Their failure to spend much money on research and development or to move quickly to adopt new technologies had few immediate adverse consequences.
Over time, though, new competition developed. Overseas, steel production grew dramatically, helped in many countries by cartelization and government financing, subsidies, and export promotion. Often foreign producers had cost advantages over U.S. companies as a result of more modern production techniques and lower wages. The boost U.S. producers once got from having rich domestic ore deposits diminished. Domestic mines began to be depleted, while new high-grade foreign mines opened up and shipping costs dropped, so that even countries without iron deposits, like Japan, could get ore cheaply. The United States began importing steel in the late 1950s, at first largely to meet demand that domestic producers could not cover, but soon price competition came into play. The major U.S. companies faced another challenge from mini-mills. These small, efficient domestic producers used scrap metal as raw material and generally employed nonunion workers, enabling them to sell simple products like bars and bolts at significantly lower prices than the integrated firms.
For the giant steel companies, competition went from being a nuisance to a serious threat once the domestic demand for steel began dropping in the early 1970s. The increasing use of alternative materials in manufacturing and construction—aluminum, plastics, and prestressed concrete—contributed to the decline. So did the growing flow of imported goods containing steel, especially automobiles, which had the effect of reducing the demand for domestically produced metal.
Though long-term trends underlay the problems the steel industry faced, its crisis came suddenly. The industry made money every year from World War II until 1982. Then the recession induced by the Federal Reserve brought disaster. Demand for steel plummeted just when the strong dollar made imports cheaper. Between 1982 and 1986, the industry lost more than $15 billion.
During the course of the 1980s, some two dozen steel mills were shut down and 230,000 jobs eliminated. Between 1979 and 1982 alone, steel companies discharged more than 150,000 production and maintenance workers. Employers used threats to shutter mills to win concessions from the United Steelworkers, which over a half century of battling had made steel jobs among the best-paid blue-collar work in the country. Companies also pressed local and state governments to ease pollution standards and give tax breaks as their price for continuing operations. But often they shut mills anyway, seeing the prospects for the industry too bleak to invest more money. U.S. Steel used its capital to buy an oil company and swapped stock to acquire another, changing its name to USX to symbolize its diversification away from its historic role as the world’s largest steel producer. By 1981, companies that made steel had nearly 40 percent of their assets in nonsteel operations.
Steelmaking communities in the Northeast and Midwest were devastated by the closures. Bethlehem Steel eliminated ten thousand jobs at its Sparrows Point complex outside Baltimore, ended steelmaking in Lackawanna, New York, and phased out operations in Johnstown, Pennsylvania, where the unemployment rate shot up to 25 percent. In Youngstown, Ohio, mills that had employed over nine thousand men and women shut down. The Pittsburgh area suffered through one closure after another, including the 1986 shuttering of the U.S. Steel Homestead works, once the center of Andrew Carnegie’s empire. On the Southeast Side of Chicago, nearly fifteen thousand millworkers lost their jobs as a result of closures or layoffs, and many nearby steel-related plants shut down or shriveled, including the historic Pullman railcar factory, which ceased production, putting three thousand people out of work.
The automobile industry underwent a similar, if not quite as severe, cataclysm as a result of recession, high interest rates (pushing up the cost of auto loans), and high gas prices. Total car sales fell from over eleven million in 1978 to fewer than eight million in 1982. Sales of domestically produced cars plunged even more steeply, from 9.3 to 5.8 million, as car buyers turned to foreign companies, particularly Japanese automakers, for vehicles that were less expensive and more fuel-efficient than domestic models. Nearly half the capacity of the auto industry lay idle (bad, but not as bad as steel, where two-thirds lay unused). In 1981, General Motors lost money for the first time since the early 1920s. Chrysler, the smallest of the “Big Three” automakers (but still the ninth-largest company in the country), survived only because the federal government, in a deal modeled on its help to New York City, guaranteed loans to the company on the condition that the United Automobile Workers (UAW) agree to cuts in wages and benefits and suppliers and creditors grant concessions.
To reduce overcapacity and restore profits, the auto industry undertook a massive program of plant closings and cost reduction. In Wayne County, Michigan, which includes Detroit, more than three dozen auto-related plants shut down between 1978 and 1981, from the gigantic but technologically outmoded Dodge Main complex to small parts suppliers. Nationally, employment in the industry fell by a third between 1978 and 1982. The unemployment rate in Michigan, the center of the industry, hit 17 percent in 1982. Most autoworkers who kept their jobs took reductions in compensation to do so, as GM, Ford, and smaller companies followed in the wake of the Chrysler bailout by demanding that wage increases be delayed or eliminated and benefits cut. When auto sales picked up after 1982, employment levels did not rise proportionally because companies preferred using extensive overtime to adding to their head counts and invested in automated equipment, like robot welders, to reduce manpower needs.
The plants shut down by steel and auto companies and in related industries, like tire making, tended to be clustered in industrial cities in the Northeast and Midwest. Because neighborhoods and sometimes whole cities had grown up around them, their closure had a wrenching impact that rippled far beyond the workers immediately affected. Supply, service, and shipping firms shut down. Businesses that catered to plant workers and their families—from auto dealers to grocery stores to bowling alleys and bars—suffered. Empty stores soon lined commercial streets. With tax bases shriveling, public services diminished and infrastructure eroded. In Gary, Indiana, where U.S. Steel invested heavily in its steelmaking operation but, largely as a result of automation, eliminated twenty thousand jobs, more than a third of the retail stores closed, including all five department stores. Even when new industrial employers set up shop in areas hit by shutdowns, the jobs they provided rarely matched in pay or benefits the ones that had disappeared.
The desperation for work in what had been the industrial heartland of the country was palpable. In early 1983, when a Milwaukee factory announced it had two hundred job openings, nearly twenty thousand people lined up to apply. Many laid-off workers found themselves forced to move to where jobs were more plentiful. Youngstown saw its population shrink by twenty-five thousand after its steel industry evaporated. The Fordist system of production, for seventy years a pillar of prosperity, especially in the Midwest, no longer could carry the weight it once did.
When factories and the jobs they provided disappeared, a whole way of life faced collapse. In industrial centers like Homestead, Flint, South Chicago, and Youngstown, work, family, and church provided moral and cultural poles for communities rich in solidarity and shared values, if inward-looking and still to a great degree structured by patriarchal authority. “Steel workers, especially men,” reported the head of a job counseling center at a U.S. Steel plant in South Chicago, “think of work as the most important part of their life. The job provided a structure; once that structure crumbles, people’s personality crumbles.”
African Americans were particularly hard-hit by the downsizing of heavy industry. The automobile and steel industries had provided black workers with opportunities for better-paid jobs, with more security and benefits, than they could get almost anywhere else. By the 1970s, most racially discriminatory employment practices had been eliminated. The shuttering of older factories—which were more likely to be near substantial black populations than newer plants, many of which were sited in suburbs or semirural locales—took a heavy toll on black workers. The number of African American steelworkers fell from 38,098 in 1974 to just 9,958 in 1988. Majority-black cities dependent on heavy industry, like East St. Louis, Illinois, and Gary, Indiana, tumbled into poverty, dilapidation, and abandonment. In Memphis, where decades of struggle had won African Americans greatly expanded job opportunities, the shutdown of RCA, Firestone, International Harvester, and Memphis Furniture Company factories threw thousands out of work. Nearly half the black community lived in poverty into the 1990s.
The trauma of the disappearance of so many well-paying jobs during a period of deep recession made it easy to forget the downsides of industrial employment and the kind of life it sustained. In a 1977 interview, Edward Sadlowski, an insurgent candidate for the presidency of the United Steelworkers union, remarked, “Working forty hours a week in a steel mill drains the lifeblood of a man.” There were millworkers “who are full of poems,” he said, who had the potential to be lawyers or doctors but lacked the opportunity to try. The goal of organized labor, he added, in a comment that hurt his unsuccessful election bid, should be to eliminate jobs as arduous as those in steel. Doris McKinney, laid off from the finishing department of a Buffalo steel mill, was delighted when she found a job as an occupational therapist, which she much preferred to burning metal. A sociological study of workers who accepted a buyout at a New Jersey automobile plant found that they generally later expressed few regrets at doing so, even when it meant, as it usually did, a drop in income. (African Americans, who because of discrimination found it harder to find decent new jobs than did whites, had more regrets.) Many of the former autoworkers were happy to have left behind the physical toll and autocratic atmosphere of life on the assembly line. Still, given the limited alternatives, few workers celebrated the contraction of heavy industry.
The speed and extent of the transformation of the automobile and steel industries disarmed the unions that represented their workforces. Mostly they fought rearguard battles to minimize layoffs and concessions. In the Pittsburgh area, local union activists, ministers, and community groups proposed the public takeover of steel mills facing closure, winning halfhearted support from their national union but failing to stop the shutdown wave. The UAW called for greater union involvement in managerial decisions and a national industrial policy, with government-coordinated planning and public redevelopment banks aimed at preserving and renovating manufacturing, but its proposal gained little traction. The union drew greater attention by calling on the public to “Buy American,” a campaign that at the grass roots sometimes took on a racist, anti-Japanese cast. An accompanying bid to require specified levels of domestic content in vehicles sold in the United States failed to make headway in Washington.
Lowering the Cost of Labor
Labor costs were but one factor—and often not the most important one—in declining corporate profits, but, unlike international competition, interest rates, and demand, they were something business executives felt they could address. To save money, companies slashed both production and managerial ranks. The changed economic and political climate of the late 1970s and 1980s allowed many unionized firms to do what they had long wanted to but had not dared: launch a full-blown attack on organized labor to weaken it and, where possible, eliminate it altogether.
Some companies used relocation as an anti-union strategy, moving their production out of the urban Northeast and Midwest to plants in the South and Southwest and in semirural settings, where they believed that workers had a stronger work ethic and unions would have a hard time getting a foothold. In 1967, nearly two-thirds of all manufacturing jobs were in the Northeast and Midwest; by 1992, only half. As unionized companies set up nonunion plants and conglomerates bought up manufacturing firms, executives felt better positioned to take on organized labor, knowing that they could maintain some production and draw on deep corporate pockets if their aggressive stands led to strikes.
Workers recognized that with unemployment high, odds were turning against them. Starting in the mid-1970s, the annual numbers of strikes began to fall. But when pushed hard enough, workers did fight back.
In 1977, coal miners struck in response to employers’ demands for union concessions, including restraints on wildcat strikes, which had become epidemic in the industry, and changes in the community health system that the United Mine Workers had established over years of collective bargaining. After the 160,000 strikers twice rejected settlement terms that top union leaders had accepted, Jimmy Carter invoked the Taft-Hartley Act to order them back to work. In an extraordinary defiance of federal power, the miners refused to end their walkout, forcing the Carter administration to effectively back down. After 110 days, the strike ended on terms more favorable to the workers than the initial proposals (but still requiring givebacks).
The coal strike proved exceptional. In the years that followed, there were other large strikes resisting employer demands for union concessions, but most failed. Meanwhile, the percentage of the private workforce that belonged to a union dropped. The weakness of organized labor, combined with bouts of high unemployment, allowed employers to reduce wages. Between 1979 and 1989, average hourly earnings for production and nonsupervisory workers (who made up more than four-fifths of the workforce) fell by an annual average of 0.7 percent (after adjusting for inflation). Their average weekly wages (expressed in 1982 dollars) fell from a 1972 high point of $315 to a 1993 low point of $255, a 19 percent drop. Workers lost benefits too; between 1979 and 1989, the percentage of workers with employer-provided health insurance fell from 69 percent to 62 percent and with employer-provided pensions from 51 percent to 44 percent.
Even as hourly wages dropped, per capita and family income rose. Several factors accounted for this seeming anomaly. For one thing, workers on average worked more hours a year than in the past. Many companies, when business picked up, preferred allowing or requiring employees to work overtime to adding new workers. That made it easier to quickly cut costs if business slowed and avoided benefit costs for additional employees. (For the same reason, companies increasingly hired temporary workers and ones they could categorize as independent contractors.) For another thing, the number of women who worked continued to grow. By 1985, well over half of married women with children under six held a job or were looking for one, up from fewer than a third in 1970. That a higher percentage of the population had to work, and work more hours than before, simply to keep family and per capita income rising was itself a measure of the success of the antilabor offensive.