Saving Capitalism


by Robert B. Reich


  A third driving force behind the declining power of the middle class has been the demise of unions. Fifty years ago, when General Motors was the largest employer in America, the typical GM worker earned $35.00 an hour in today’s dollars. By 2014, America’s largest employer was Walmart, and the average hourly wage of Walmart workers was $11.22. (Walmart will raise the pay of its lowest-paid workers to $10.00 an hour, starting in February 2016.) This does not mean the typical GM employee a half century ago was “worth” more than three times what the typical Walmart employee in 2014 was worth. The GM worker was not better educated or more motivated than the Walmart worker. The real difference was that GM workers a half century ago had a strong union behind them that summoned the collective bargaining power of all autoworkers to get a substantial share of company revenues for its members. And because more than a third of workers across America belonged to a labor union, the bargains those unions struck with employers raised the wages and benefits of nonunionized workers as well. Nonunion firms knew they would be unionized if they did not come close to matching the union contracts.

  Today’s Walmart workers do not have a union to negotiate a better deal. They are on their own. And because fewer than 7 percent of today’s private-sector workers are unionized, most employers across America do not have to match union contracts. This puts unionized firms at a competitive disadvantage. The result has been a race to the bottom.

  Some argue that the decline of American unions is simply the result of “market forces.” But other nations, such as Germany, have been subject to many of the same “market forces” and yet continue to have strong unions. And these unions continue to provide their middle classes sufficient bargaining power to command a significant share of economic growth—a much larger share than that received by the middle class in the United States. In contrast to decades of nearly stagnant wage growth for most Americans, real average hourly pay in Germany has risen by almost 30 percent since 1985. And as I said earlier, while the percentage of total income going to the top 1 percent in the United States grew from 10 percent in the 1960s to well over 20 percent by 2013, the richest 1 percent of German households continues to receive about 11 percent of total income there. That percentage has remained roughly the same for four decades.

  Why the difference? Look to politics and the allocation of power, especially when it comes to unions. Here it is useful to consider the building block of capitalism I refer to as market power, and the role of government in establishing its limits. In the first few decades that the Sherman Antitrust Act was in place, unions were among the primary targets. When railroad workers went on strike in 1894, the federal courts ruled the strike an “illegal restraint of trade” under the act. President Grover Cleveland dispatched two thousand troops to break it up, resulting in the death of a dozen strikers and the demise of the incipient American Railway Union. The union movement was thought by many business leaders to pose a fundamental threat to the nation, and its objectives to run counter to the principles of economics. The president of the National Association of Manufacturers warned in 1903 that “organized labor knows but one law and that is the law of physical force—the law of the Huns and the Vandals, the law of the savage….Composed as it is of the men of muscle rather than the men of intelligence, and commanded by leaders who are at heart disciples of revolution, it is not strange that organized labor stands for principles that are in direct conflict with the natural laws of economics.”

  But as average working Americans gained political power, they were able to legitimize unions and establish them as a critical part of the economy. In the Progressive Era, Congress passed the Clayton Antitrust Act of 1914, which appeared to exempt unions from the antitrust laws by enunciating the principle that “the labor of a human being is not a commodity or article of commerce.” After the Supreme Court in 1921 somewhat recalcitrantly interpreted the Clayton Antitrust Act to permit antitrust injunctions against unions (with Justices Holmes, Brandeis, and Clarke dissenting), Congress finally and forever legalized them in the Norris-LaGuardia Act of 1932. The National Labor Relations Act of 1935 went further, guaranteeing workers the right to organize into unions and imposing on employers the legal responsibility to bargain with them.

  As unions gained economic power in the late 1930s and 1940s, they gained further political power and wielded it to further enlarge the bargaining clout of American workers. After the legendary Treaty of Detroit in 1950, when Big Business and Big Labor agreed to share productivity gains in exchange for labor peace, the rate of unionization increased dramatically, as did wages and benefits. Which is why, by the mid-1950s, almost a third of all employees in the private sector of the economy belonged to a union, and why the median wage increased in tandem with productivity growth.

  Starting in the late 1970s, the process went into reverse. Union membership began to decline, as did the economic and political power of unions, along with the bargaining clout of most workers. The reasons for the decline involved the changes I’ve already noted—globalization combined with labor-replacing technologies, as well as the shift in corporate mission toward maximizing shareholder returns. But it was also the consequence of political and legal decisions that diminished the economic clout of unions and, with it, their political power to resist further erosion. Ronald Reagan’s notorious firing of the nation’s air traffic controllers for going on strike—something he had every right to do because they had no right to strike—also signaled to the nation’s large employers that America had embarked on a different era of labor relations. CEOs of corporations with high percentages of unionized workers insisted that those workers accept wage concessions as a condition for keeping their jobs. Many moved, or threatened to move, their facilities to “right-to-work” states, whose laws barred union membership and dues from being required as a condition of employment.

  As I have noted, the strategic use of bankruptcy to eliminate union contracts, utilized by American Airlines in 2013, was another practice begun in the 1980s. In what would become a repeating nightmare for unionized airline workers, Frank Lorenzo, who was CEO of Continental Airlines in 1983, took the cash-strapped carrier into bankruptcy, ripped up its labor contracts, laid off thousands of workers, and hired replacements for striking pilots and flight attendants. He then paid his new employees half what they had been paid under the former contracts and demanded they work longer hours. In 1993, Northwest Airlines threatened bankruptcy and insisted on wage concessions from flight attendants and mechanics. A decade later, when more than four thousand Northwest mechanics went on strike, the airline outsourced most of their jobs. In 2002, United Airlines entered bankruptcy—forcing its pilots and flight attendants to accept pay cuts ranging from 9.5 percent to 11.8 percent—and then emerged from bankruptcy in 2006, more profitable than ever.

  When the Taft-Hartley Act of 1947 allowed right-to-work laws, the practical consequence was to let workers who did not pay dues get a free ride off of those who did, thereby undermining the incentive for anyone to join a union in the first place. Yet until the 1980s, such laws had minimal effect because they were enacted in southern and western states, while most industries remained in the North and Midwest. But as corporations came under increasing pressure to show high returns and cut labor costs, many CEOs found right-to-work states more alluring. In 2012, even the old heartland industrial states of Indiana and Michigan enacted right-to-work laws. In 2015, Wisconsin joined in.

  Workers who inhabited the local service economy—retail, restaurants, custodial, hotels, elder and child care, hospitals, transportation—faced a different challenge from their counterparts in big industry. Their jobs were in less danger of disappearing because they couldn’t be outsourced abroad and most would not be automated. In fact, the number of local service jobs in America has continued to grow. The real problem is that these jobs have tended to pay very low wages, rarely include any benefits, and provide little chance of advancement. Significantly, most of them are not unionized. If they were, these workers would have more bargaining power with their employers.

  Walmart and major fast-food chains have been aggressively anti-union. Facing the possibility that their workers might seek to be unionized, the firms have erected procedural roadblocks to union votes, used delaying tactics, retaliated against workers who try to organize, and intimidated others into rejecting the union. Many of these tactics were illegal under the National Labor Relations Act, but in the 1980s, as noted, Congress cut appropriations for enforcing the act. As a result, the National Labor Relations Board, charged with protecting workers’ rights to form unions and bargain collectively, developed long backlogs of cases. Even when employers were found to have violated the law by firing workers illegally, the board imposed minuscule penalties on them, such as merely requiring them to pay the wrongly fired workers the wages lost since they were dismissed from their jobs. A succession of Democratic presidents promised legislation streamlining the process for forming unions and increasing penalties on employers who violated the law, but nothing came of these promises.

  The result has been a steady decline in the percentage of private-sector workers who are unionized. Not incidentally, that decline parallels the decline in the share of total income going to the middle class (see figure 7).

  The underlying problem, then, is not that average working Americans are “worth” less in the market than they once were, or that they have been living beyond their means. The problem is that they have steadily lost the bargaining power needed to receive as large a portion of the economy’s gains as they commanded in the first three decades after World War II, and their means have not kept up with what the economy could otherwise provide them. To attribute this to the impersonal workings of the “free market” is to ignore how the market has been reorganized since the 1980s, and by whom. It is to disregard the power of moneyed interests who have received a steadily larger share of economic gains as a result of that power. It is to fail to acknowledge that as their gains have continued to accumulate, so has their power to accumulate even more. And it is to overlook the marked decline of countervailing power in our political-economic system.

  FIGURE 7. AS UNION MEMBERSHIP DECLINES, THE SHARE OF INCOME GOING TO THE MIDDLE CLASS SHRINKS

  Source: Center for American Progress Action Fund analysis based on updated union membership rates from Barry T. Hirsch, David A. MacPherson, and Wayne G. Vroman, “Estimates of Union Density by State,” Monthly Labor Review 124, no. 7 (2001): 51–55, available at http://unionstats.gsu.edu/MonthlyLaborReviewArticle.htm. Middle-class share of total income is from Bureau of the Census, Table H-2: Share of Aggregate Income Received by Each Fifth and Top 5 Percent of Households (2013), available at http://www.census.gov/hhes/www/income/data/historical/household.

  14

  The Rise of the Working Poor

  The standard assumption that work determines worth—and validates one’s personal virtue and social responsibility—is further confounded by a substantial increase in the number of people working full-time who are still poor, and a simultaneous surge in the comparatively smaller ranks of people who do not work at all but are rich. It is difficult to hold firm to the belief that people are “worth” what they earn when more and more people who are working full-time do not earn enough to lift themselves and their families out of poverty, while another group of people at the opposite end of the income spectrum have so much wealth—much of it inherited—that they can live comfortably off the income it generates without ever breaking a sweat.

  Until quite recently, poverty was largely confined to those who did not work—widows and children, the elderly, the disabled and seriously ill, and those who had lost their jobs. Public safety nets and private charities were created to help them. It was rare for a full-time worker to be in poverty because, for the reasons I have noted, the economy generated a plethora of middle-class jobs that paid reasonably well and were inherently secure. This is no longer the case. Some politicians cling to the view expressed, for example, by Speaker of the House John Boehner in 2014, when he said the poor have “this idea” that “I really don’t have to work. I don’t really want to do this. I think I’d rather just sit around.” The reality is that America’s poor work diligently, often more than forty hours a week, sometimes in two or more jobs. Yet they and their families remain poor.

  There are several reasons for the growth of America’s working poor. First, wages at the bottom have continued to drop, adjusted for inflation. By 2013, the ranks of the working poor had swelled to forty-seven million people in the United States, one out of every seven Americans. One-fourth of all American workers were in jobs paying below what a full-time, full-year worker needed in order to support a family of four above the federally defined poverty line. The downward trend of low wages continued even in the so-called recovery following the Great Recession. Between 2010 and 2013, average incomes for the bottom fifth dropped 8 percent, and their average wealth declined 21 percent. According to a study by Oxfam America, more than half of America’s forty-six million users of food pantries and other charitable food programs in 2013 had jobs or were members of working families.

  It is doubtful that all these working people came to be “worth” that much less, except in the tautological sense that their pay dropped. In reality, the decline has had a great deal to do with their lack of economic and political power. CEOs seeking profits in a lackluster economy have continued to slash labor costs, often by outsourcing the work, substituting automated machines, or forcing workers to accept lower wages. This process has pushed many previously middle-class workers into local service jobs that pay less than the jobs they once had. Low-paying industries such as retail and fast food accounted for 22 percent of the jobs lost in the Great Recession. But they generated 44 percent of the jobs added between the end of the recession and 2013, according to a report from the National Employment Law Project. Employers in these industries tend to be virulently anti-union and have fought successfully against any efforts to organize their workers.

  Meanwhile, the real value of the federal minimum wage has been steadily eroded by inflation. Congress (to be more precise, Republicans in Congress) has chosen not to raise it to compensate for this decline. The National Restaurant Association and the National Retail Federation, along with the largest fast-food chains and retailers that support them, have lobbied against any increase in the federal minimum wage, which is tantamount to allowing it to erode even further. By 2014, its real value ($7.25 an hour) was below the level to which it had been raised in 1996, when, as secretary of labor, I had led the political fight to raise it. Had the minimum wage retained the value it had in 1968, it would be $10.86 an hour. And, of course, by 2014 the nation’s economy was far larger than it was then, and far more productive.
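
  Since the $10.86 figure is simply a price-level adjustment, the arithmetic can be checked directly. The sketch below is a minimal illustration, not an official calculation: the $1.60 minimum wage of 1968 is historical, but the CPI-U values are approximate annual averages assumed here for illustration, so the result lands a couple of cents away from the figure cited in the text.

```python
# A rough sketch of the inflation adjustment behind the $10.86 figure.
# The 1968 federal minimum wage was $1.60; the CPI-U values below are
# approximate annual averages, assumed here for illustration only.
MIN_WAGE_1968 = 1.60
CPI_1968 = 34.8    # approx. CPI-U annual average for 1968 (assumed)
CPI_2014 = 236.7   # approx. CPI-U annual average for 2014 (assumed)

real_value_2014 = MIN_WAGE_1968 * (CPI_2014 / CPI_1968)
print(f"1968 minimum wage in 2014 dollars: ${real_value_2014:.2f}")
# Prints roughly $10.88; small differences from the $10.86 in the text
# come down to which price index and which base month one uses.
```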

  Some have claimed, nonetheless, that any attempt to restore the real value of the minimum wage will cause employers to fire workers at the lowest rungs, because such workers would no longer be “worth” the cost. In June 2014, at a conference for the Republicans’ largest donors hosted by Charles and David Koch at the luxurious St. Regis Monarch Beach Resort in Dana Point, California, Richard Fink, the Kochs’ in-house economist, sounded off against the minimum wage. “The big danger of minimum wage isn’t the fact that some people are being paid more than their value-added,” he said. “It’s the five hundred thousand people that will not have a job because of minimum wage.” Fink warned that such a large group of disillusioned and unemployed people would become “the main recruiting ground for totalitarianism, for fascism.” The conference-goers presumably nodded in sober agreement before getting back to their foie gras.

  The mythology that a minimum-wage increase (or, in real terms, restoring it to its 1968 level) would cause employers to reduce employment is a common trope. A corollary is that getting rid of the minimum wage altogether and allowing employers to pay what employees are “worth” will reduce or even eliminate unemployment. As former congresswoman Michele Bachmann once put it, if the minimum wage were repealed “we could potentially virtually wipe out unemployment completely because we would be able to offer jobs at whatever level.” Theoretically, Bachmann is correct. But her point is irrelevant. It is no great feat for an economy to create a large number of very-low-wage jobs. Slavery, after all, was a full-employment system.

  In fact, evidence suggests that few if any jobs would be lost if the minimum wage were to be increased at least to its 1968 level, adjusted for inflation. Unlike industrial jobs, minimum-wage retail service jobs cannot be outsourced abroad. Nor are these workers likely to be replaced by automated machinery and computers, because the service they provide is personal and direct: Someone has to be on hand to help customers or dole out the food. In addition, and significantly, the gains from a higher minimum wage extend well beyond those who receive it directly. More money in the pockets of low-wage workers means more sales in the places where they live, which in turn creates faster growth and more jobs. Research by Arindrajit Dube, T. William Lester, and Michael Reich confirms this. They examined employment in several hundred pairs of adjacent counties lying on opposite sides of state borders, each with different minimum wages (one at the federal minimum, the other at a higher minimum enacted by a state) and found no statistically significant increase in unemployment in the higher-minimum-wage counties, even after four years. (Other researchers who found contrary results failed to control for counties where unemployment was already growing before the minimum wage had been increased.) Dube, Lester, and Reich also found that employee turnover was lower where the minimum wage was higher, presumably saving employers money on recruiting and training new workers.
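
  The logic of that county-pair design can be sketched with a toy calculation. What follows is not the researchers’ code or data; it simulates hypothetical adjacent-county pairs that share local economic conditions and then measures the average within-pair difference in employment growth, which is the comparison the study turns on.

```python
# A toy illustration (not the authors' code or data) of the paired
# border-county comparison described above. Employment growth numbers
# are simulated; the point is the within-pair difference.
import math
import random

random.seed(0)

# Each tuple: (employment growth in the higher-minimum-wage county,
#              employment growth in its neighbor at the federal minimum).
# Both sides of a pair share a simulated local economic shock.
pairs = []
for _ in range(300):  # "several hundred pairs of adjacent counties"
    local_shock = random.gauss(0.02, 0.03)      # shared regional trend
    higher_mw = local_shock + random.gauss(0, 0.01)
    federal_mw = local_shock + random.gauss(0, 0.01)
    pairs.append((higher_mw, federal_mw))

# Differencing within pairs nets out the shared local conditions --
# the reason the study compares counties across a state line.
diffs = [h - f for h, f in pairs]
mean_diff = sum(diffs) / len(diffs)
variance = sum((d - mean_diff) ** 2 for d in diffs) / (len(diffs) - 1)
std_err = math.sqrt(variance / len(diffs))

print(f"mean within-pair growth difference: {mean_diff:+.4f}")
print(f"standard error:                     {std_err:.4f}")
# A mean difference within about two standard errors of zero is what
# "no statistically significant effect" looks like in this setup.
```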

 
