Saving Capitalism

by Robert B. Reich


  Yet, while surely relevant, the standard explanation cannot account for much of what has happened. It does not explain why the transformation occurred so suddenly, over a relatively brief number of years, nor why other advanced economies facing similar forces did not succumb to them as readily. (By 2011, the median income in Germany, for example, was rising faster than it was in the United States, and Germany’s richest 1 percent took home about 11 percent of total income, before taxes, while America’s richest 1 percent took home more than 17 percent.) And the standard explanation doesn’t tell us why the average incomes of the bottom 90 percent actually dropped during the first six years of recovery from the Great Recession.

  Nor, finally, does the standard explanation account for the striking fact that the median wages of younger college graduates also stopped growing, adjusted for inflation. While recent college graduates continued to earn higher wages than young people without college degrees, their wages no longer rose. In fact, between 2000 and 2013, the real average hourly wages of young college graduates declined (see figure 5). By 2014, according to the Federal Reserve Bank of New York, the share of recent college graduates working in jobs that typically do not require a college degree was 46 percent, versus 35 percent for college graduates overall. The New York Times called this group “Generation Limbo”—highly educated young adults “whose careers are stuck in neutral, coping with dead-end jobs and listless prospects.”

  FIGURE 5. REAL AVERAGE HOURLY WAGES OF YOUNG COLLEGE GRADUATES, BY GENDER, 2000–2014*

  *Data for 2014 represent twelve-month average from April 2013 to March 2014.

  Note: Data are for college graduates age 21–24 who do not have an advanced degree and are not enrolled in further schooling. Shaded areas denote recessions.

  Source: Economic Policy Institute analysis of Current Population Survey Outgoing Rotation Group microdata (http://stateofworkingamerica.org/chart/swa-wages-table-4-18-hourly-wages-entry/)

  A fuller understanding of what has happened to the middle class requires an examination of changes in the organization of the market that increased the profitability of large corporations and Wall Street while reducing the middle class’s bargaining power and political clout. This transformation has amounted to a redistribution upward, but not as “redistribution” is normally defined. The government did not tax the middle class and poor and transfer a portion of their incomes to the rich. The government—and those with the most influence over it—undertook the upward redistribution less directly, by altering the rules of the game.

  First consider the fundamental change that occurred over property rights in corporations. Before the 1980s, as I have noted, large corporations were in effect owned by all their stakeholders, who were assumed to have legitimate claims on them. As early as 1914, the popular columnist and public philosopher Walter Lippmann called on America’s corporate executives to be stewards of America. “The men connected with [the large corporation] cannot escape the fact that they are expected to act increasingly like public officials….Big businessmen who are at all intelligent recognize this. They are talking more and more about their ‘responsibilities,’ and their ‘stewardship.’ ”

  Subsequently, in 1932, Adolf A. Berle and Gardiner C. Means, a lawyer and an economics professor, respectively, published The Modern Corporation and Private Property, a highly influential study showing that top executives of America’s giant companies were not even accountable to their own shareholders but operated the companies “in their own interest, and divert[ed] a portion of the asset fund to their own uses.” The solution, Berle and Means concluded, was to enlarge the power of all groups within the nation who were affected by the large corporation, including employees and consumers. They envisioned the corporate executive of the future as a professional administrator who dispassionately weighed the claims of investors, employees, consumers, and citizens and allocated benefits accordingly. “It seems almost essential,” Berle and Means wrote, “if the corporate system is to survive—that the ‘control’ of the great corporations should develop into a purely neutral technocracy, balancing a variety of claims by various groups in the community and assigning each a portion of the income stream on the basis of public policy rather than private cupidity.”

  This vision of corporate governance came to be widely accepted by the end of World War II. “The job of management,” proclaimed Frank Abrams, chairman of Standard Oil of New Jersey, in a 1951 address that typified what other chief executives were saying at the time, “is to maintain an equitable and working balance among the claims of the various directly affected interest groups: stockholders, employees, customers, and the public at large. Business managers are gaining professional status partly because they see in their work the basic responsibilities [to the public] that other professional men have long recognized as theirs.”

  In the early 1950s, Fortune magazine urged CEOs to become “industrial statesmen,” which in many respects they did—helping to pilot an economy generating broad-based prosperity. In November 1956, Time magazine noted that business leaders were willing to “judge their actions, not only from the standpoint of profit and loss” in their financial results “but of profit and loss to the community.” General Electric, noted the magazine, famously sought to serve the “balanced best interests” of all its stakeholders. Pulp and paper executive J. D. Zellerbach told Time that “the majority of Americans support private enterprise, not as a God-given right but as the best practical means of conducting business in a free society….They regard business management as a stewardship, and they expect it to operate the economy as a public trust for the benefit of all the people.”

  But a radically different vision of corporate ownership erupted in the late 1970s and early 1980s. It came with corporate raiders who mounted hostile takeovers, wielding high-yield junk bonds to tempt shareholders to sell their shares. They used leveraged buyouts and undertook proxy fights against the industrial statesmen who, in their view, were depriving shareholders of the wealth that properly belonged to them. The raiders assumed that shareholders were the only legitimate owners of the corporation and that the only valid purpose of the corporation was to maximize shareholder returns.

  This transformation did not happen by accident. It was a product of changes in the legal and institutional organization of corporations and of financial markets—changes that were promoted by corporate interests and Wall Street. In 1974, at the urging of pension funds, insurance companies, and the Street, Congress enacted the Employee Retirement Income Security Act. Before then, pension funds and insurance companies could only invest in high-grade corporate and government bonds—a fiduciary obligation under their contracts with beneficiaries of pensions and insurance policies. The 1974 act changed that, allowing pension funds and insurance companies to invest their portfolios in the stock market and thereby making a huge pool of capital available to Wall Street. In 1982, another large pool of capital became available when Congress gave savings and loan banks, the bedrocks of local home mortgage markets, permission to invest their deposits in a wide range of financial products, including junk bonds and other risky ventures promising high returns. The convenient fact that the government insured savings and loan deposits against losses made these investments all the more tempting (and ultimately cost taxpayers some $124 billion when many of the banks went bust). Meanwhile, the Reagan administration loosened other banking and financial regulations and simultaneously cut the enforcement staff at the Securities and Exchange Commission.

  All this made it possible for corporate raiders to get the capital and the regulatory approvals necessary to mount unfriendly takeovers. During the whole of the 1970s there had been only 13 hostile takeovers of companies valued at $1 billion or more. During the 1980s, there were 150. Between 1979 and 1989, financial entrepreneurs mounted more than 2,000 leveraged buyouts, each over $250 million. (The party was temporarily halted only when raider Ivan Boesky agreed to be a government informer as part of his plea bargain on charges of insider trading and market manipulation. Boesky implicated Michael Milken and Milken’s junk bond powerhouse, Drexel Burnham Lambert, in a scheme to manipulate stock prices and defraud clients. Drexel pleaded guilty. Milken was indicted on ninety-eight counts, including insider trading and racketeering, and went to jail.)

  Even where raids did not occur, CEOs nonetheless felt pressured to maximize shareholder returns for fear their firms might otherwise be targeted. Hence, they began to see their primary role as driving up share prices. Roberto Goizueta, CEO of Coca-Cola, articulated the new philosophy, which was in sharp contrast to that of the corporate statesmen of the earlier decades: “We have one job: to generate a fair return for our owners,” he said. Everyone understood that by “fair return” he meant the maximum possible.

  The easiest and most direct way for CEOs to accomplish this feat was to cut costs—especially payrolls, which constitute most firms’ largest single expense. Accordingly, the corporate statesmen of the 1950s and 1960s were replaced by the corporate butchers of the 1980s and 1990s, whose nearly exclusive focus was—in the meat-ax parlance of that era—to “cut out the fat” and “cut to the bone.” When Jack Welch took the helm of GE in 1981, the company was valued by the stock market at less than $14 billion. When he retired in 2001, it was worth about $400 billion. Welch accomplished this largely by cutting payrolls. Before his tenure, most GE employees had spent their entire careers with the company. But between 1981 and 1985, a quarter of them—one hundred thousand in all—lost their jobs, earning Welch the moniker “Neutron Jack.” Even when times were good, Welch encouraged his senior managers to replace 10 percent of their subordinates every year in order to keep GE competitive.

  Other CEOs tried to outdo even Welch. As CEO of Scott Paper, “Chainsaw” Al Dunlap laid off eleven thousand workers, including 71 percent of headquarters staff. Wall Street was impressed, and the company’s stock rose 225 percent. When Dunlap moved to Sunbeam in 1996, he promptly laid off half of Sunbeam’s twelve thousand employees. (Unfortunately for him, though, he was caught cooking Sunbeam’s books; the SEC sued him for fraud and he settled for $500,000, agreeing never again to serve as an officer or director of any publicly held company.) I have already made mention of IBM and Hewlett-Packard, both of which, before the transformation, had been known for their policies of lifetime employment and high wages. Afterward, both wielded the ax.

  In consequence, share prices soared, as did the compensation packages of CEOs, as I have noted (see figure 6).

  FIGURE 6. THE DOW JONES INDUSTRIAL AVERAGE

  Source: Courtesy e-wavecharts.com

  The results have been touted as efficient because resources theoretically have been shifted to higher and better uses. But the human costs of this transformation have been substantial. Ordinary workers have lost jobs and wages, and many communities have been abandoned. Nor have the efficiency benefits been widely shared. As corporations have steadily weakened their workers’ bargaining power, the link between productivity and workers’ income has been severed. Since 1979, the nation’s productivity has risen 65 percent, but workers’ median compensation has increased by just 8 percent. Almost all the gains from growth have gone to the top. As noted, the average worker today is no better off than his equivalent was thirty years ago, adjusted for inflation. Most are less economically secure. Not incidentally, few own any shares of stock.

  Wages have also been kept down by workers so worried about keeping their jobs that they have willingly accepted lower pay (and its equivalent in the form of paychecks that do not keep up with inflation). Here again, political decisions have played a significant role. Some of the prevailing job insecurity has been the direct consequence of trade agreements that have invited American companies to outsource jobs abroad. As I noted previously, the conventional view equating “free trade” with the “free market,” in contrast to government “protectionism,” is misguided. Since all nations’ markets reflect political decisions about how they should be organized, as a practical matter “free trade” agreements entail complex negotiations about how different market systems are to be integrated. The most important aspects of such negotiations concern items such as intellectual property, finance, and labor. Within these negotiations, the interests of large corporations and Wall Street—to fully protect the value of their intellectual property and financial assets—time and again trump the interests of average working Americans to protect the value of their labor. (A personal confession: When I was secretary of labor in the Clinton administration, I argued against the North American Free Trade Agreement within the confines of the administration but did not air my concerns publicly, believing I could do more good remaining inside than resigning in protest over this and related White House decisions I disagreed with, such as bringing China into the World Trade Organization. In subsequent years I have often wondered whether I made the right choice.)

  High levels of unemployment have also contributed to the willingness of workers to settle for lower wages. And here, too, government policies play a significant role. When the Federal Reserve raises interest rates and Congress opts for austerity—more interested in reducing budget deficits than stimulating the economy and reducing unemployment—the resulting joblessness undermines the bargaining power of average workers and translates into lower wages. When the Federal Reserve and Congress do the opposite, the result is more jobs and higher wages. During the Clinton administration the rate of unemployment became so low that hourly workers gained enough bargaining power to get higher wages—the first and only period of sustained wage growth among hourly workers since the late 1970s.

  But corporate executives and the denizens of Wall Street prefer most workers to have low wages, in order to generate higher corporate profits, which translate into higher returns for shareholders and, directly and indirectly, for themselves. This is not a winning strategy over the long term because higher returns ultimately depend on more sales, which requires that the vast middle class have enough purchasing power to buy what can be produced. But from the limited viewpoint of the CEO of a single large firm, or of an investment banker or fund manager on Wall Street, all operating globally and more concerned about the next quarter’s returns than about profits over the long term, low wages appear advantageous. Low wages are also thought to reduce the risk of inflation, which can erode the value of their assets.

  Reducing the bargaining power of the middle class has also entailed shifting the risks of economic change to them. Public policies that emerged during the New Deal and World War II had placed most of the risks squarely on large corporations by means of strong employment contracts, along with Social Security, workers’ compensation, forty-hour workweeks with time and a half for overtime, and employer-provided health benefits (wartime price controls encouraged such tax-free benefits as substitutes for wage increases). A majority of the employees of large companies remained with the companies for life, and their paychecks rose steadily with seniority, productivity, the cost of living, and corporate profits. By the 1950s this employment relationship was so commonplace that it was not an exaggeration to say that employees possessed, in effect, property rights in their jobs and their companies.

  But after the junk bond and takeover mania of the 1980s, that relationship broke down. Even full-time workers who have put in decades with a company can now find themselves without a job overnight—with no severance pay, no help finding another job, and no health insurance. Nearly one out of every five working Americans is in a part-time job. A growing number are temporary workers, freelancers, independent contractors, or consultants, whose incomes and work schedules vary from week to week or even day to day. By 2014, 66 percent of American workers were living paycheck to paycheck.

  The risk of getting old with no pension is also rising. In 1980, more than 80 percent of large and medium-sized firms gave their workers defined-benefit pensions that guaranteed them a fixed amount of money every month after they retired. Now, the share is below one-third. Instead, they offer defined-contribution plans, where the risk has been shifted to the workers. When the stock market tanks, as it did in 2008, their 401(k) plans tank along with it. Today, a third of all workers with matching defined-contribution plans contribute nothing because they don’t have the cash, which means their employers do not provide any matching funds. Among workers earning less than $50,000, the share that contributes to a defined-contribution plan is even lower. Overall, the portion of workers with any pension connected to their job has fallen from just over half in 1979 to under 35 percent today. In MetLife’s 2014 survey of employees, 40 percent anticipated that their employers would reduce benefits even further.

  Meanwhile, the risk of a sudden loss of income also continues to rise. Even before the crash of 2008, the Panel Study of Income Dynamics at the University of Michigan found that over any given two-year stretch, about half of all families experienced some decline in income. And those downturns were becoming progressively larger. In the 1970s, the typical drop was about 25 percent. By the late 1990s, it was 40 percent. By the mid-2000s, family incomes rose and fell twice as much as they did in the mid-1970s, on average.

  Workers who are economically insecure are not in a position to demand higher wages. They are driven more by fear than by opportunity. This is another central reality of American capitalism as organized by those with the political power to make it so.

 
