Who Stole the American Dream?


by Hedrick Smith


  Paradoxically, the second income stream wound up putting many middle-class families in an even tighter financial bind because of the sharply rising costs of housing from the 1980s through the 2000s. Middle-class parents wanted homes in good neighborhoods with good schools, but, as Elizabeth Warren and Amelia Warren Tyagi reported in their book, The Two-Income Trap, they found themselves in an ever-escalating bidding war, having to spend a bigger share of their combined incomes on housing. As the quality of education and safety on the streets deteriorated, people felt the need to pay premium prices for desirable housing. Even two paychecks did not seem enough to keep up with the rising cost of living. As Warren and Tyagi wrote: “The crisis in education is not only a crisis of reading and arithmetic, it is also a crisis in middle-class family economics.”

  Having a college education helped—but it didn’t generate as much of a gain as people imagine. The typical college graduate today makes only about $1,000 a year more than in 1980, adjusting for inflation. In the past decade, entry-level college graduate salaries actually went backward. Their annual pay in 2010 was about $2,000 below their pay in 2000. Young men were averaging $45,000 in 2010 and women were averaging about $38,000. Not bad for starters, but that means typical college graduates, like high school graduates, have been falling further and further behind the executive elite, such as Carol Bartz, CEO of Yahoo!, or Leslie Moonves, CEO of CBS, who were making about $150,000 a day.

  The enormity of the wealth gap between the top and the middle, Harvard economist Larry Summers said in late 2008, raises “a critical problem of legitimacy” for American capitalism.

  It Didn’t Have to Be This Way

  It didn’t have to be this way. Economists have calculated that if the laws and the social contract widely accepted by Corporate America during the middle-class boom of the 1960s and ’70s had continued, average Americans would be far better off today. Sharing the gains from America’s economic growth from 1979 to 2006 in the same way they were shared from 1945 to 1979 would have given the typical middle-class family $12,000 more per year. Overall, 80 percent of Americans, from the bottom through the entire middle class, would have earned $743 billion more a year; the richest 1 percent would have made $673 billion less; and the next 4 percent down from the top would have made $140 billion less.

  So it was the changes in our laws and in the way American business decided to divide its revenues that have cost average Americans roughly three-quarters of a trillion dollars a year since the late 1970s. All that money went to the richest 5 percent of Americans.

  Of course, hard times are not new to ordinary Americans. Cycles of boom and bust have periodically wreaked havoc with our economy and disrupted the lives of average families for many decades. Unemployment took its cyclical toll, and people tightened their belts. But after the downturns ended and the recovery came, people got their jobs back, the economy expanded, and the middle class got back on the up escalator.

  Today, mass layoffs are no longer a cyclical convulsion during hard times, but a permanent grinding reality even in good times. Firings and job cuts, antiseptically clothed in the corporate euphemisms of “restructuring” and “downsizing,” have become a chronic economic malignancy for average Americans in good times as well as bad. In a survey of one thousand companies, the American Management Association found rising numbers of employers reporting big job cuts during the boom years of the late 1990s. When times got tough, from 2001 to 2003, roughly 5.4 million people were thrown out of work, mostly for reasons unrelated to their work performance. When they were surveyed in 2004, one-third had failed to find new jobs, and more than half of those who had found work were making less than before—a pattern repeated in the latest recession.

  Overall, more than fifty-nine thousand factories and production facilities were shut down all across America over the last decade, and employment in the core manufacturing sector fell from 17.1 million to 11.8 million from January 2001 to December 2011, a punishing toll for what historically had been the best sector for steady, good-paying middle-class jobs. By pursuing a deliberate strategy of continual layoffs and by holding down wages, both of which yielded higher profits for investors, business leaders were not only squeezing their employees, they were slowly strangling the middle-class consumer demand that the nation needed for the next economic expansion.

  This trend has made it far harder for the private sector to pull the country out of a slump, and it has increased the need for government action to stimulate the economy. The evidence is clear. With each recession since 1990, it has taken longer and longer for the U.S. economy to dig out of the hole and to regain the jobs lost in recession. After the 1990 downturn, economists coined the term “jobless recovery” because it took much longer than usual—twenty-one months—to gain back the lost jobs. It was twice as bad after the 2001 recession. Getting the jobs back took forty-six months.

  Stephen Roach, as chief economist for Morgan Stanley, called the painfully slow 2002–03 recovery “the weakest hiring cycle in modern history.” Roach was especially alarmed that even when jobs did come back, they paid less, offered fewer benefits, and provided less security. As he said, 97 percent of the new hiring from the economic bottom in 2002 through mid-2004 was for part-time work. Millions of the better-paying, full-time jobs were gone for good—sent offshore to increase corporate profits. Unemployment was increasingly a long-term structural cancer rather than a cyclical headache from which the middle class could more readily recover.

  During the most recent recession, that highly profitable but job-crushing trend accelerated so that by early 2011, The Wall Street Journal ran a front-page story about Corporate America sitting on idle capital amid high unemployment. “No Rush to Hire Even as Profits Soar,” the headline read. Corporations were reporting year-end profits of more than $1 trillion—up 28 percent from a year earlier—and promising dividend increases to affluent shareholders. The Dow Jones Industrial Average ran up above 12,000, while roughly twenty-nine million Americans were either unemployed, involuntarily working part-time, or dropping out of the labor market in despair. The rich had recovered from the recession, the middle was wounded and in pain, and Corporate America was hoarding $1.9 trillion in cash and expanding its overseas operations. That put a crimp on America’s recovery.

  The numbers confirmed the pattern of the past three decades—the toll on average middle-class employees was heavy, while Corporate America was enjoying high profits. The old social contract had withered away.

  CHAPTER 7

  THE GREAT BURDEN SHIFT

  FUNDING YOUR OWN SAFETY NET; CRIPPLED BY DEBT

  The burden shift has turned the traditional definition of the American dream “on its ear.”

  —“THE METLIFE STUDY OF THE AMERICAN DREAM”

  More and more economic risk has been offloaded by government and corporations onto the increasingly fragile balance sheets of workers and their families. This … is at the root of Americans’ rising anxiety about their economic standing and future.

  —JACOB HACKER,

  The Great Risk Shift

  WHEN PAUL TAYLOR AND RICH MORIN of the Pew Research Center did a poll on how people were faring during the Great Recession, they put a face on America—actually, two faces. They described a revealing dichotomy in the public mood—a schizophrenia in which 55 percent of Americans reported they were in deep trouble, but 45 percent claimed to be holding their own.

  Taylor was so struck by these two different portraits that he titled their report “One Recession, Two Americas.” That dichotomy in attitudes—and experience—helps explain the nation’s sharp political divisions on such contentious issues as President Obama’s economic stimulus package and raising taxes on the wealthy.

  The “Two Americas” report explained the dissonance in people’s experience, such as my own puzzlement at reading newspaper accounts of 15 million Americans being unemployed and 6.7 million families being foreclosed out of their homes, then seeing suburban restaurants jammed with people on a night out, spending as if the economy were strong.

  We are literally Two Americas, remarkably out of touch with each other—the fortunate living the American Dream but lacking any practical comprehension of how the other half are suffering, month in and month out, unaware of the enervating toll of economic despair on the unfortunate half, many of whom just two or three years before had counted themselves among the fortunate.

  The Pew survey documented a class split in America. Among the losers, the picture was bleak: Two-thirds said their family’s overall financial condition had worsened; 60 percent said they had to dig into savings or retirement funds to take care of current costs; 42 percent had to borrow money from family and friends to pay their bills; 48 percent had trouble finding medical care or paying for it. The psychological toll was heavy. By contrast, the other half, the relative winners, admitted to some problems such as stock market losses but described their woes as modest and manageable.

  The fault lines dividing losers and winners were income and age. Nearly two-thirds of those earning $75,000 or more said they were holding their own, while nearly 70 percent of those making under $50,000 were losing ground. Most seniors over sixty-five, buttressed by Social Security and traditional lifetime pensions, were doing all right. But 60 percent of the people of working age, between eighteen and sixty-four, gloomily reported that they were falling behind.

  The Zero Decade: 2000–09

  The one thing the two groups had in common was their verdict that the ten-year period from 2000 through 2009 was the worst decade in more than half a century—the first one in half a century where people had more negative than positive feelings. “The single most common word or phrase used to characterize the past 10 years,” the Pew Center reported, “is downhill, and other bleak terms such as poor, decline, chaotic, disaster, scary, and depressing are common.”

  That language tells how average Americans feel. The numbers describe the damage. In just one three-month period, the final quarter of 2008, American households lost $5.1 trillion of their wealth through plunging home values and steep stock market losses—the most ever in a single quarter in the fifty-seven years that the Federal Reserve has kept records. During the full year of 2008, American households lost $11.1 trillion, close to one-fifth of their total accumulated private wealth.

  More and more trillions evaporated in 2009, 2010, and into 2011. With housing prices falling steadily for five straight years, unemployment stuck at stubbornly high levels, and the stock market bouncing up and down, periodically spooked by fear of a second dip into recession, those astronomical losses became permanently etched into the lives of millions of middle-class families. Their personal safety nets had been shredded.

  The Misery Index

  Translating cold numbers into a graphic picture of the hard economic realities in the lives of ordinary people is a challenge. In the 1990s, economist Edward Hyman of the ISI Group devised the Misery Index to capture the stress put on average families by costly, unavoidable items that take a big bite out of family budgets and crimp what families have left to live on. The Misery Index tracked four items—income taxes, Social Security taxes, medical costs, and interest payments. In 1960, these four items took 24 percent of family budgets; but by the 1990s, they were taking more than 42 percent. Income taxes were lower, but Social Security payroll taxes had risen along with medical costs and interest payments on mortgages and debt. In sum, necessities, not lavish spending habits, were eating up family income.
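  To make the bookkeeping behind Hyman’s index concrete, here is a minimal sketch in Python of the calculation described above: add up the four unavoidable items and express them as a share of the family budget. The dollar figures and the function name are illustrative assumptions, not ISI Group data.

```python
# Illustrative sketch of the Misery Index calculation described above:
# the share of a family's budget consumed by four unavoidable items.
# The figures below are made-up placeholders, not ISI Group data.

def misery_index(income_tax, payroll_tax, medical, interest, total_budget):
    """Return the percentage of the family budget taken by the four items."""
    unavoidable = income_tax + payroll_tax + medical + interest
    return 100 * unavoidable / total_budget

# Hypothetical annual family budget, in dollars
print(misery_index(income_tax=6_000, payroll_tax=4_600,
                   medical=9_000, interest=11_000,
                   total_budget=72_000))  # 42.5, in line with the 1990s figure
```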

  More recently, Yale University political economist Jacob Hacker and his research team developed the Economic Insecurity Index, which logs the harshest economic blows a family can face—an income loss of 25 percent or more in a single year; superheavy medical expenses; or the exhaustion of a family’s financial reserves. Using this index, Hacker found that in 1985, roughly 10 percent of all Americans had suffered an acute financial trauma that year. By July 2010, the proportion had jumped to 20 percent—one in five American families suffering from an economic tornado ripping through their lives.
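  The logic of Hacker’s index can likewise be sketched in a few lines: a family counts as insecure in a given year if it was hit by any one of the three shocks. The field names and the toy sample below are illustrative assumptions, not Hacker’s actual methodology or data; only the 25 percent income-loss threshold comes from his definition above.

```python
# Illustrative sketch of the Economic Insecurity Index logic described above.
# Field names and the sample data are assumptions for illustration only.

from dataclasses import dataclass

@dataclass
class FamilyYear:
    income_prev: float         # last year's income
    income_now: float          # this year's income
    heavy_medical_costs: bool  # "superheavy" out-of-pocket medical expenses
    reserves_exhausted: bool   # financial reserves wiped out

def is_insecure(f: FamilyYear) -> bool:
    """A family counts as insecure if it suffered any of the three shocks."""
    income_drop = (f.income_prev - f.income_now) / f.income_prev >= 0.25
    return income_drop or f.heavy_medical_costs or f.reserves_exhausted

families = [
    FamilyYear(60_000, 40_000, False, False),  # lost a third of its income
    FamilyYear(55_000, 56_000, False, False),  # held steady
]
share = sum(is_insecure(f) for f in families) / len(families)
print(f"{share:.0%} of families insecure")  # 50% in this toy sample
```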

  As that Pew Center poll discovered, even middle-class families who avoided the most acute distress have experienced rising economic anxiety in the past two decades.

  The “Great Risk Shift”

  There is good reason for pervasive middle-class angst. Financial insecurity has been written into the DNA of the New Economy. Not only has the New Economy been more volatile and the economic gains been distributed more unequally than during the era of middle-class prosperity, but Corporate America has rewritten the social contract that once underpinned the security of most average Americans. The company-provided welfare safety net that rank-and-file employees enjoyed from the 1940s into the 1970s has been sharply cut back, and a huge share of the cost burden has been shifted from companies to their employees.

  In 1980, for example, 70 percent of Americans who worked at companies with one hundred or more employees got health insurance coverage fully paid for by their employers. But from the 1980s onward, employers began requiring their employees to cover an increasing portion of the health costs. Other employers dropped company-financed health plans entirely, saying they could not afford them. Many small businesses made employees pay for all, or most, of the health insurance costs. As union membership declined in various industries, this trend gained momentum.

  So pervasive did this burden shift become that by the mid-2000s, only 18 percent of workers—one-quarter of the percentage in 1980—were getting full health benefits paid by their employers. Another 37 percent got partial help but had to pick up a large part of the tab themselves. The rest (45 percent) got no employer support. Some companies may have needed this change to survive, but many simply added the cost savings to their profit line.

  Wal-Mart, the nation’s largest employer, four of whose owners rank among America’s eleven richest individuals, decided in October 2011 to roll back health care coverage for its large part-time workforce and to sharply raise health premiums for many full-time staffers. In the early 2000s, Wal-Mart had touted the news that 90 percent of its employees had health coverage, though it neglected to reveal that at least half got coverage from other employers through their spouses. “The truth is more like 38, 39 percent” were covered by Wal-Mart, said Jon Lehman, a former manager of six different Wal-Mart stores. Very often, Lehman told me, he personally had to counsel and even drive Wal-Mart employees to nonprofit charities and organizations that provided indigent care because they had no Wal-Mart health coverage.

  Wal-Mart’s policies generated so much public controversy that in 2007, Wal-Mart took a more generous approach—picking up a larger share of the health premiums for its full-time employees and offering coverage for part-timers after a year of employment. In 2008, Wal-Mart reported that for the first time in its forty-six-year history, it was covering 50.2 percent of its employees.

  But in 2011, Wal-Mart’s management decided to back off the new benefits package. The company’s decision to deny health coverage to new part-time employees, according to Wal-Mart spokesman Gene Rossiter, was driven by rising health care premiums. “Over the last few years, we’ve all seen our health care rates increase,” Rossiter explained. “The decisions made were not easy, but they strike a balance between managing costs and providing quality care and coverage.” In reaction, some Wal-Mart employees said they could not afford the higher health premiums, and Dan Schlademan, director of Making Change at Wal-Mart, a union-backed campaign, protested that Wal-Mart’s move was “another example of corporations putting profits ahead of what’s good for everyday Americans.”

  In the same tough economic climate, Costco, a big-box retail rival of Wal-Mart, took the opposite tack. Costco has maintained health coverage for roughly 85 percent of its employees, while keeping wages steady and avoiding large layoffs. “We try to provide a very comprehensive health-care plan for our employees. Costs keep escalating, but we think that’s an obligation on our part,” explained Costco CEO Jim Sinegal. “We’re trying to build a company that’s going to be here 50 and 60 years from now. We owe that to the communities where we do business. We owe that to our employees, that they can count on us for security. We have 140,000 employees and their families … who count on us.”

  Costco is known for a high retention rate among its employees, while Wal-Mart has a reputation for high employee turnover. At Wal-Mart, CEO pay packages have run as high as $20 million in recent years, whereas Costco’s Sinegal consistently took a pay package of about $2.2 million. As The Wall Street Journal put it, Sinegal chose being kind to his own workers over making Wall Street happy. Morningstar, the investment rating service, reports that in recent years Costco has outperformed Wal-Mart and other retailers. Even so, the trend in business has moved away from the Costco model.

  The Shift from Pensions to 401(k)s

  In terms of the overall financial burden shift from corporations to employees, by far the largest change has come in retirement benefits. In 1980, 84 percent of the workers in companies with more than one hundred employees were in lifetime pension plans financed by their employers. By 2006, that number had plummeted—only 33 percent had company-financed pensions. The rest either got nothing or had been switched into funding their own 401(k) plans with a modest employer match.

  The switch offered big savings for employers. According to longtime pension expert Brooks Hamilton, the lifetime pension system cost companies from 6 to 7 percent of their total payroll, but they spent only 2 to 3 percent on matching contributions for 401(k) plans. Often those savings went directly into corporate profits and bigger stock-option bonuses for the CEO and other top executives.
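  A quick back-of-the-envelope sketch shows the scale of the savings Hamilton describes, using the midpoints of his two ranges. The $500 million payroll is a made-up figure for illustration, not from the book.

```python
# Back-of-the-envelope sketch of the employer savings described above:
# a lifetime pension costing 6-7% of payroll versus a 2-3% 401(k) match.
# The payroll figure is a hypothetical example.

payroll = 500_000_000  # hypothetical annual payroll, $500 million

pension_cost = 0.065 * payroll   # midpoint of the 6-7% range
match_cost   = 0.025 * payroll   # midpoint of the 2-3% range

savings = pension_cost - match_cost
print(f"Annual savings: ${savings:,.0f}")  # $20,000,000 on this payroll
```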

  The explanation from corporate chiefs and financial officers echoed Wal-Mart. Businesses said they could no longer afford lifetime pensions. As Jeffrey Immelt, CEO of General Electric, told his stockholders: “[The] pension has been a drag for a decade.”

 
