A Nation of Moochers

by Charles J. Sykes


  Working for the Man

  What all of this means is that private-sector workers work longer, for less pay, and with fewer benefits to support an increasingly affluent public sector. Americans for Tax Reform found that in 2010 taxpayers had to work 231 days out of the year “just to meet all costs imposed by government—8 days later than last year and a full 32 days longer than 2008.”45

  Put another way, government at all levels now consumes more than 63 percent of the national income. Workers have to work 104 days to cover the cost of the federal government, another 52 days to pay the freight for state and local government, and another 74 days just to cover the cost of complying with the regulations promulgated by government bureaucracies.
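
  A rough check of the arithmetic, assuming a 365-day year (the one-day gap from the 231-day figure above is rounding in the source numbers):

$$\frac{104 + 52 + 74}{365} = \frac{230}{365} \approx 63\%$$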

  It would be unfair to blame all of this on public employees, many of whom work hard at difficult and crucial jobs. The same cannot be said of the public-employee unions and their incestuous nexus with the politicians who have abetted their rise and bloat. The public-union experiment over the last half century has been a disaster: The unions have become an entrenched, obdurate, and grasping special interest devoted to expanding government spending and employment, even at the expense of public services. Ironically, the public-employee unions bear only a passing resemblance to their private counterparts. Because government is a monopoly, there is no competition to limit their appetites or against which their demands can be measured, as there is for private unions. Public employees also enjoy civil service protections that are largely unknown in the private sector.

  Even in times of economic distress, the unions remain a powerful barrier to reform and innovation, protecting the status quo as they protect their own powers, privileges, and perks no matter how unsustainable. The result is that public employees have become a new class of takers, increasingly mooching off taxpayers even as services are curtailed, especially at the local and state level.

  The war in Wisconsin demonstrated just how far they are willing to go to hold on to their claim on other people’s money.

  The Moocher Empire Strikes Back

  In early 2011, traditionally progressive Wisconsin became ground zero for the fight over public-employee privileges.

  When Wisconsin governor Scott Walker proposed curtailing public-employee-union power, the state’s capitol was besieged by hundreds of thousands of protestors and gripped by weeks of legislative gridlock and legal wrangling. Teachers staged illegal sick-outs, legislators received death threats, businesses were threatened with boycotts, and civil rights leaders descended on Madison, Wisconsin, trailing clouds of apocalyptic rhetoric unheard since the heady days of the sixties. Reverend Jesse Jackson Sr. likened the struggle to preserve union power not only to the civil rights movement, but also to the Exodus, comparing the mild-mannered and somewhat wonkish Walker to “a modern-day Pharaoh.”46 This was relatively benign compared with protestor signs that compared the governor to deposed Egyptian dictator Hosni Mubarak and to Adolf Hitler.

  Ironically, the protests had been launched by some of the most generously compensated public employees in the country, who were being asked to make relatively modest contributions to the state’s massive deficit. Wisconsin had lost more than 170,000 jobs in the Great Recession—almost all in the private sector—and the state’s per capita income had fallen below the national average. Even so, Wisconsin’s overall tax burden continued to be among the heaviest in the nation, in part because government workers had been shielded from the economic tribulations. State employees in Wisconsin, for instance, enjoyed one of the best pension systems in the country. Those pensions were funded by contributions from the state and from the employees themselves—except that the state also paid the “employee” portion of the pensions. As a result, while taxpayers contributed $1.37 billion a year into the state’s pension fund in 2009, most state employees paid precisely nothing toward their own retirements.

  Governor Walker’s proposal would have required them to contribute 5.8 percent of their salaries toward those pensions, as well as 12.6 percent of the cost of their increasingly expensive health care premiums. Even with the additional health care costs, government workers would be paying less than half of the national average for health insurance.* Taxpayers had been contributing $1 billion a year on employee health insurance, while Wisconsin’s pampered public employees paid a mere $64 million.47
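
  A back-of-the-envelope calculation from those two figures puts the employee share of health premiums, measured in millions of dollars, at roughly

$$\frac{64}{1{,}000 + 64} \approx 6\%$$

  of the total cost, before the proposed increase to 12.6 percent.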

  The unions’ dramatic backlash exposed not only the gap between the benefits of the public and private sectors, but also the depth of the sense of entitlement among well-heeled public employees who had drunk deep from the cup of victimism. In some respects, Madison may have represented the first Greece-like moocher rebellion.

  Break Out the Violins

  Among the many tales of woe that appeared in the media in the wake of the Wisconsin protests was the melancholy story of two public schoolteachers from Oshkosh, Wisconsin, who filed their retirement papers after the union reforms.

  “Not only am I losing salary and benefits and facing a bigger work load, but now they are taking away my rights,” a 56-year-old elementary schoolteacher named Mary Herricks told The Wall Street Journal. “Retirement was supposed to be something happy. I’m so sad.”48

  But a quick search of online databases takes some of the edge off the gloom. Ms. Herricks earned a salary of $68,423. The paper noted that even though she was retiring at 56, she would be able to collect “nearly her former salary” in pension benefits. It got better. Her husband, the local head of the teachers’ union, was also retiring from a position that paid him $75,916 a year; between the two of them, they made more than $140,000 a year. With generous fringe benefits added in, the couple earned more than $190,000 in salary and benefits.

  Nor would their decision to retire pinch very much at all. Both of them will receive taxpayer-funded health insurance until they turn 65, as well as a payment worth about $600 per year of service, which would amount to about $43,000 on top of their pensions. They will also be able to earn additional income by working as substitute teachers. “Given that pensions are off-limits to certain taxes,” noted the Journal, “Mr. Herricks says they will bring home close to what they did before.” Few, if any, private-sector employees in Wisconsin would be able to say the same.

  All About the “Rights”

  As the battle over the two Americas escalated in Wisconsin, union leaders executed a tactical pivot, insisting that the uproar was not really about money at all, but rather about “rights.” Walker’s proposal would have sharply curtailed the ability of government unions to bargain for anything other than wages and eliminated both mandatory union membership and the automatic deduction of union dues from public-employee paychecks. Some of the government unions said they would agree to the increased pension and health care contributions, but insisted on retaining all of their collective bargaining rights. Protestors claimed that those “rights” to bargain were fundamental civil rights. This was, of course, arrant nonsense.

  There is no “right” to collective bargaining. Most federal employees are not permitted to collectively bargain either for wages or for benefits—and have never been allowed to do so, under either Republican or Democratic presidents.

  In fact, the concept of government unions is of quite recent provenance. No less a giant of progressivism than Franklin Delano Roosevelt had opposed the idea of public-employee unions. “The process of collective bargaining, as usually understood, cannot be transplanted into the public service,” wrote FDR. “I want to emphasize my conviction that militant tactics have no place” in the public sector. “[A] strike of public employees manifests nothing less than an intent on their part to obstruct the operations of government until their demands are satisfied. Such action looking toward the paralysis of government by those who have sworn to support it is unthinkable and intolerable.” Even labor boss George Meany, the president of the AFL-CIO, turned a doubtful eye on the idea of public-employee unions. “The main function of American trade unions is collective bargaining,” he wrote in 1955. “It is impossible to bargain collectively with the government.”49

  Even today, government workers are allowed to collectively bargain in only roughly half the states, while others sharply limit who is allowed to come to the bargaining table. Indiana, Texas, and North Carolina do not permit public-employee bargaining at all.50 “At last report,” quipped columnist Jeff Jacoby, “democracy, fundamental rights, and freedom were doing just fine in all of them.”

  In practice, collective bargaining is less about rights than about power, as governments have over time ceded more and more authority and benefits to unions. In Madison, for example, the voluminous teachers’ contract subjected to collective bargaining everything from the size of bulletin boards to lighting, noise, chairs, footrests, adjustable terminals and keyboards, wall coverings and carpets, and room temperature. Even starting cars during cold weather was subject to collective bargaining. So great was the clout of the teachers’ union that it was able to insist that school districts throughout the state buy health insurance from the union’s own insurance company, even though that coverage often cost far more than comparable policies.51

  After years of collective bargaining and growing union power, state and local governments in Wisconsin were rife with stories of bloated salaries and benefits. The highest-paid municipal employee in Madison, for example, was a bus driver who pulled down a salary of $159,258 in 2009. That total included more than $109,000 in overtime that the city was required to pay him under the union contract. More than a dozen state prison guards also made more than $100,000 in 2009, using generous overtime provisions that their union had negotiated with the state. In a practice known as “sick leave stacking,” guards could call in sick for one shift, then show up for the next shift and be paid time and a half. As Walker’s office later noted, “This results in the officer receiving 2.5 times his or her rate of pay, while still only working 8 hours.”*
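
  The arithmetic behind that figure, assuming the missed shift is paid at straight time and the extra shift at time and a half:

$$\frac{(8 \text{ hrs} \times 1.0) + (8 \text{ hrs} \times 1.5)}{8 \text{ hrs actually worked}} = 2.5 \times \text{base pay}$$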

  Riding the collective bargaining gravy train, some Wisconsin teachers were able to get a full year’s salary for just thirty days of actual work. In Green Bay, for example, the teachers’ union contract created an “emeritus program,” under which teachers were paid a full year’s salary—in addition to their already generous pension—for showing up just thirty days over a three-year period.52 Madison’s teachers had an even sweeter deal, since they had collectively bargained for an “emeritus program” that paid retirees nearly $10,000 a year on top of their pensions. Unlike their Green Bay counterparts, Madison teachers were not required to show up for a single day of work to receive the benefit.

  In another school district, teachers enjoyed a staggering ninety paid sick days a year. Because the school year in Wisconsin was 180 days long, a teacher could be paid a full year’s salary for just ninety days of work.53 In Milwaukee, the teachers’ union was able to get the district to pay health care premiums for retirees, a benefit that in 2016 will cost $4.9 billion, four times the Milwaukee school system’s entire current annual budget.54

  Life’s Blood

  Ultimately, the ferocity of the union backlash can be explained simply: It was not about rights. It was not even about the bloated salaries and benefits. It was about power and the threat posed to that power by proposals to end forced union membership and the automatic collection of dues. The power of labor rests increasingly on the power of public-sector unions, and the mother’s milk of their power and political influence is their access to millions of dollars of mandatory, government-collected union dues.

  For the unions and their political allies there is no threat more dire than the prospect of letting government workers voluntarily choose whether they will fund the union’s coffers. In that respect the story of Indiana governor Mitch Daniels is telling. Daniels wasted little time in cutting off the power of the government unions, eliminating their right to collective bargaining on his second day in office.

  “On the second day, we discontinued it and I held my breath,” Daniels recalled. “And we didn’t have a Madison at all. I often say the only two things that happened were, one, we got the freedom to change things in a major way and, two, 95% of the employees, once it was their free choice, quit paying the dues to the union.”55 Since 2005, the number of state employees in Indiana has dropped from 35,000 to 28,700.

  When Daniels became governor, 16,408 government workers paid dues to the public-employee unions. Six years later, just 1,490 did.

  Part Four

  BAILOUT MADNESS

  Lessons in Moral Hazard

  Allard E. Dembe and Leslie I. Boden define moral hazard as “the prospect that a party insulated from risk may behave differently from the way they would if they were otherwise fully exposed to that risk. It arises when an individual or institution does not bear the full consequences of its actions, and therefore tends to act less carefully than they otherwise would, leaving a third party to bear the responsibility for the consequences of those actions.”1

  Here are some thought experiments to illustrate the principle:

  Why don’t you eat cake, ice cream, and steak every night?

  Why aren’t you driving your dream car? (Which for the sake of this illustration we assume is a Porsche or Maserati.)

  Why did you pay your mortgage this month instead of betting it on the ponies?

  Why don’t you invest your life savings in your brother-in-law’s start-up venture?

  What if:

  What if you ate all the ice cream you wanted and somebody else got fat?

  What if you could eat all of the steak you wanted and somebody else’s cholesterol exploded?

  What if you could buy any car you wanted … on credit, and somebody else had to make the payments?

  What if your sugar daddy would cover any of your losses at the track? Would that change your attitude toward gambling?

  What if the government would foot the bill for any of your losses?

  Think of it as the ice cream/cholesterol/hot car/racetrack/loser-in-law Bailout of 2011.

  Notice how the shift of consequences also shifts responsibility and influences behavior. If someone else bears the consequences of your choices, not every vestige of restraint is eliminated, but restraint is surely eroded. You’d make very different decisions.
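
  The racetrack case can be restated in expected-value terms, using assumed numbers: a $100 bet with even odds of total loss. Without a backstop, the expected private cost of the wager is

$$E[\text{loss}] = 0.5 \times \$100 = \$50,$$

  so only bets promising at least $50 in expected winnings are worth making. With a sugar daddy covering losses, the expected private cost falls to zero while any winnings are kept, so every bet, however reckless, becomes worth making. The risk has not vanished; it has merely been transferred to a third party.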

  This is moral hazard.

  Chapter 10

  * * *

  MORTGAGE MADNESS

  * * *

  The greatest bailout in history began when the financial world realized that betting trillions of dollars on unaffordable mortgages was a bad idea.

  Any account of the housing market before the deluge is necessarily a story of near madness: of ever-increasing risk and financial blindness and recklessness, mixed into a toxic stew of greed and arrogance. The early years of the twenty-first century saw a public-private partnership of financial irresponsibility.

  Behind the massive housing collapse was the idea that everyone has a right to a house, or at least that it would be a public good if housing were widely distributed to groups and individuals who would not normally be able to afford their own home. This was what was meant by “affordability”: that the path to home ownership should be eased for people who had no or almost no down payment; that mortgages should be made available to people with shaky credit histories; and that people should be encouraged to buy homes more expensive than their incomes would suggest they could afford. The mechanisms of affordability were many: subprime mortgages; adjustable rate mortgages (ARMs); interest-only mortgages; guarantees by taxpayer-backed Fannie Mae and Freddie Mac for loans with loosened credit standards. If that wasn’t sufficient, there were so-called liar loans, and even the classic NINJA mortgage: no income, no job, no assets.
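
  A stylized example, with figures assumed purely for illustration, shows how these products manufactured “affordability.” On a $300,000 loan at 6 percent for thirty years, a conventional amortizing mortgage requires a monthly payment of

$$M = P\,\frac{r(1+r)^n}{(1+r)^n - 1} = 300{,}000 \times \frac{0.005\,(1.005)^{360}}{(1.005)^{360} - 1} \approx \$1{,}799,$$

  where $r$ is the monthly interest rate and $n$ the number of payments. An interest-only loan on the same house costs just $P \times r = \$1{,}500$ a month, and a teaser-rate ARM could start lower still. The cheaper payment builds no equity, and it jumps sharply once principal comes due or the rate resets.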

  As the bubble grew, the home mortgage itself was turned on its head, transformed from a stable long-term investment and token of financial prudence and planning into a short-term casino game. Where traditional mortgage standards placed a premium on personal responsibility and financial probity, the new rules rewarded fecklessness and encouraged cutting corners.

  No down payment? No problem. Not enough income to service the loan? Easily handled with ARMs and interest-only loans. Lousy credit score … just sign here. You need to own this house and we need to make it affordable!

  All of that took a wrecking ball to the culture of deferral of gratification, of saving, investing, and working to be able to acquire a house as a visible symbol of responsibility. By short-circuiting the process with a rush to spread the housing wealth, politicians and investors alike ignored warning signs, fought off reform attempts, and pushed further into the murkiest waters of high-risk lending.

  The result: massive collapse. Irresponsible lenders, speculators, and borrowers took down with them the value of the homes of responsible homeowners and then applied for bailouts. This was mooching on a global scale.

  The Securitization Bubble

  In 2000, as the new century dawned, interest-only loans accounted for a mere 0.6 percent of new mortgage loans; by 2005, they had grown to 32.6 percent of the total. By 2004, adjustable rate mortgages accounted for 46 percent of new home loans by dollar value (33 percent by number). And in 2005, 43 percent of first-time buyers put no money down at all.1

  Many of these loans were sold by lending outfits anxious to write as many mortgages as possible, as fast as possible and at the largest possible dollar values, regardless of risk. How did that happen?

  The early 2000s saw both a boom in “securitization” (the resale of mortgages as investments) and an explosion of so-called nondepository mortgage originators, which sprang up, writes Barry Ritholtz, like “so many mushrooms in cow dung after a summer rain.”2 The new mortgage brokers aggressively hawked affordable and increasingly exotic mortgages designed to get people into homes they otherwise could not afford.

 
