Innovative State


by Aneesh Chopra


  More needed to be done to limit the unchecked economic power of the new breed of big businesses and banks. At the time, however, American government—of any kind—didn’t appear to be ideally equipped for such an assignment. On the local level, urban bosses dominated the politics of rapidly growing cities and were predominantly focused on divvying up the spoils of power, ignoring the needs of their residents in much the same way that businesses ignored those of their workers. Things weren’t operating much better at the federal level. In the early 1870s, the federal government employed just 51,020 civilians, of whom 36,696 were postal workers—and there, too, a spoils system rewarded political cronies and party hacks with the few government jobs available. Nor was the federal government on the cutting edge of technology, especially as compared to the private sector. Consider the typewriter. In 1887, a report published in the Penman’s Art Journal noted that, in less than five years, typewriter usage had proliferated in the private sector to such an extent that the instrument had gone from curiosity to critical infrastructure “in almost every well regulated business establishment.”5 In contrast, the federal government had failed to incorporate the productivity tool at all.

  It would take considerable time for the technology gap to close, but on the management front, pressure built for something to be done to enable the government to take a more constructive role in limiting unchecked economic power. Then came the thunderbolt event of 1881: the assassination of President James Garfield by Charles Guiteau, a political supporter angry about being denied a diplomatic post.6 The assassination inspired progressive reformers to fight for a regularized and bureaucratized government in which officeholders were insulated from politics. With this new “civil service” in place, government became more professionalized and ready to act.

  In 1887, the Interstate Commerce Commission was established, the federal government’s first major attempt to oversee the national economy. The bill creating the commission, signed by President Grover Cleveland, stemmed from the widespread belief, especially among western farmers, that railroads were systematically abusing their power in the rates they charged for shipments. Railroads now had to set “reasonable” rates and, in an effort to curtail corruption, were prohibited from giving preference to any person, company, location, city, or type of traffic. Three years later, with Benjamin Harrison in the White House, the federal government took another major regulatory action, passing the Sherman Antitrust Act to bust up the big monopolies.

  These were the first national accomplishments of the emerging progressive movement. And by “progressive,” don’t think of how the term is used in Washington today, as a euphemism for “liberal” and a description of the political philosophy of many in the Democratic Party. The progressive movement of the late nineteenth and early twentieth centuries was not a partisan effort—it flowed through both political parties. It carried along Republicans like Teddy Roosevelt and Democrats like Woodrow Wilson, and it had devotees in small towns and big cities. What united its adherents was a vision of a government brought up to date: a massive, hierarchical, bureaucratic enterprise to check the massive, hierarchical corporations. The progressive push set the stage for the twentieth century to be the bureaucratic century, in both the private and public sectors.

  But again, it didn’t all happen at once. While the first decades of the twentieth century saw the birth of new federal agencies (such as the Department of Labor and the Federal Trade Commission) and even the passage of important new laws to protect workers, regulate the quality of food and drugs sold, and oversee banks, it took the trauma of the Great Depression to create the building blocks of the federal government we know today.

  Faced with a society in which at least one in four Americans was without work, Franklin D. Roosevelt said, “It is common sense to take a method and try it: if it fails, admit it frankly and try another—but above all, try something.” He and his fellow New Dealers tried many things—some worked, some did not. But what they did do, without any argument, was remake the way in which Americans thought about the role of government. The New Deal doubled the percentage of the national economy devoted to government, and led to an alphabet soup of new government agencies: the SEC, FCC, CAB, NLRB, NRA, WPA, and CCC.7 World War II then accelerated the growth in the size and scope of government as the nation mobilized for war. Government spending jumped from $9 billion in 1940 to $98 billion in 1944—the year Americans landed at Normandy. Rationing of certain foods and other essential goods such as gasoline, metal, and rubber was instituted. While industries were not nationalized, the government took an active role in directing the economy through a range of organizations such as the War Production Board; the Supply Priorities Allocation Board; and the Office of Price Administration, which implemented new price controls on goods. Once victory was in hand, most of the direct controls on the economy abated; what remained was an enlarged defense establishment (what President Eisenhower would later characterize as the “military-industrial complex”), the New Deal agencies and programs, and the large bureaucracies to carry out their missions.

  Though the two parties found much to fight about in the postwar years, there was a basic consensus about the role of government for much of that time: economic security would be provided to people through rule-bound bureaucracies, which, whatever their faults, would be fair and equitable in their treatment of people. As present-day government reform gurus David Osborne and Ted Gaebler put it: “[The government] delivered the basic, no-frills, one-size-fits-all services people needed and expected in the industrial era: roads, highways, sewers, and schools.” Like their counterparts in the private sector, these large organizations had to collect and organize millions of records without the computing power that today sits on our desktops and bounces around in our pockets. The result was an apparatus that, for the most part, was a “government of clerks”—tens of thousands of them. At times, dealing with the red tape could be maddening, but in an era that moved more slowly and in which large, impersonal, hierarchical organizations were the norm, it was accepted: if you waited to go to your bank during the three hours it was open on Wednesday to make a deposit, you were not necessarily shocked that you had to do the same for a passport or marriage license. As the economy boomed and tax revenue continued coming in, this system worked. And from the GI Bill to the Interstate Highway System to federal home loans to the space program, the postwar government did amazing things that helped to create an American golden age.8

  But that was short-lived. As America encountered more turbulence in the 1960s and 1970s, trust in government plummeted. Historians have long debated the core causes of that decline, though there’s no denying it was a time of traumatic, transformative events that shook people’s belief in institutions of many kinds: from the struggle for racial equality to the assassinations of the Kennedy brothers and Martin Luther King Jr. to the stunning lack of transparency throughout the war in Vietnam and the investigation of the Watergate affair.9 Parents whose children were drafted into a conflict of unclear purpose and no defined end certainly had cause to lose some faith in the government. But so, too, did those who were simply sending their kids to school in hopes of a better future. There was a growing belief that government was no longer delivering the requisite results in the areas of education or employment. As the National Commission on Excellence in Education found in its seminal 1983 report, A Nation at Risk, SAT scores had been on a steady decline from 1963 to 1980; about 13 percent of all 17-year-olds were functionally illiterate, with the rate as high as 40 percent in some minority communities; and the number of students needing remedial math courses in colleges and universities jumped by 72 percent from 1975 to 1980.10

  Even high school graduates with passable skills found good-paying jobs in short supply. The factories, foundries, and plants that powered the postwar boom in the Northeast and Midwest started to chase cheaper labor and sunnier climes in order to keep up with global competitors—taking jobs with them. A quarter century that saw rising incomes across the board came to an end in the early 1970s and was replaced with stagnating incomes for households even as women left the home and entered the workforce. Oil shocks and inflation battered the economy, and modern-day muckrakers exposed the problems with the air we breathed, the water we drank, and even the toys we gave our kids. In the ensuing years, cities, from Newark to Cleveland, Baltimore to Detroit, started to become trouble zones, with violent crime more than doubling during the 1960s, and welfare rolls increasing by 43 percent during the 1970s.11

  So, as the 1960s of moptops and Beatlemania slowly turned into the 1970s of bellbottoms and Saturday Night Fever, something else changed: Americans’ view of government, as many citizens began to identify an inverse relationship between its size (expanding) and its effectiveness (diminishing). Much of that growth was on account of entitlement programs intended to support the populace, programs such as Social Security, Medicare, and Medicaid that were widely popular.12 According to data from the Congressional Research Service, such mandatory spending represented less than 30 percent of total federal spending in 1962 but had ballooned to 45 percent of total spending by 1980.13

  No matter. The American people, fed up with rising tax rates and a declining payoff, started to disengage from their government. In 1964, 76 percent of those polled said that they trusted the federal government to do the right thing just about always or most of the time. By 1972, 53 percent expressed that view. And by 1980, only 25 percent did. In the mid-1960s, less than half of Americans agreed with the notion that people in government waste a lot of taxpayers’ money. By 1978, that figure had grown to more than three-quarters of those polled.14

  Around that time, some of those people started to strike back.15 In June 1978, voters in California—a state that had invested millions of dollars in the previous decades to create world-class public universities, miles of new highways and roads, and a vast new water system—passed Proposition 13, which constrained the rate of tax increases on property. This marked the start of a taxpayers’ revolt that spread across the country, as 13 states passed measures to limit taxing and spending and 23 state legislatures called for a constitutional convention to consider a balanced-budget amendment to the U.S. Constitution. On Election Day of that year, the Republican Party—with a reinvigorated antitax, small-government faction—gained three Senate seats, twelve House seats, and six governorships. In 1980, a former actor named Ronald Reagan rode that rebellious wave into the White House, a win that would have been unthinkable a decade earlier.

  Addressing the country after being sworn in on a cold January day in 1981, Reagan said: “In this present crisis, government is not the solution to our problem; government is the problem.”16

  Reagan’s election was a victory for those on the right who challenged the New Deal orthodoxy. Resolutely free market and antigovernment, these conservatives believed that reducing the size and scope of government—from taxes to regulations—would unleash the “creative destruction” of the marketplace. In addition to the tax cuts and regulatory rollbacks the Reagan administration undertook, it also launched an effort to cut the fat out of Washington.

  In 1982, President Reagan appointed businessman J. Peter Grace to lead a commission to “work like tireless bloodhounds to root out government inefficiency and waste of tax dollars.” For two years, 161 corporate executives and community leaders led a group of 2,000 volunteers who combed through the catalogues and catacombs of the federal bureaucracy to find waste and mismanagement. The Grace Commission, formally the President’s Private Sector Survey on Cost Control, made 2,478 recommendations, which it predicted would save $424.4 billion over three years. Its report ran to 47 volumes totaling 21,000 pages—all of which did little more than gather dust.17

  Part of that was because the Democratic Congress was uninterested in cooperating with a Republican president’s plans to slash the size of government. But part of it was that the Grace Commission was not solving the right problem.

  When you hear about a revolutionary, you usually think of a South American guerrilla wearing a beret or an eighteenth-century Frenchman taking to the streets and storming the barricades. But in late-twentieth-century Washington, a revolutionary was more likely to come in the form of Jim Pinkerton—an ungainly, six-foot-nine-inch 32-year-old working in a midlevel job in President George H. W. Bush’s White House.

  Jim Pinkerton was a most unlikely choice to help lead an intellectual revolution. Graduating from college in 1979, he had rushed to volunteer for Ronald Reagan’s presidential campaign and became one of a cadre of young people swept into the federal government when Reagan defeated President Jimmy Carter the following fall. By 1988, Pinkerton had become the protégé of George H. W. Bush campaign manager Lee Atwater, a legend of smashmouth politics, who made Pinkerton the Bush campaign’s director of opposition research, its master of dark arts. He was the man who brought Willie Horton to the attention of the Bush campaign. A murderer who had raped a woman after Democratic presidential candidate Michael Dukakis gave him a weekend furlough, Horton became symbolic of Dukakis’s perceived weakness on crime. In a campaign better known for its brutally negative attacks than for real substance and ideas, Pinkerton was on the front lines, and he was indispensable.18

  Pinkerton was rewarded for his work on the campaign with a critical post in the incoming Bush administration in 1989—the position of deputy director of domestic policy at the White House. Occupying a domestic policy job in an administration fixated on the foreign policy challenges surrounding the end of the Cold War, the invasion of Panama, the crackdown at Tiananmen Square, and the first Gulf War was a little like being the famous Maytag repairman: you ended up with a lot of time on your hands. In between meetings on agricultural subsidies and disability policies, Jim Pinkerton came to a realization: twentieth-century government just wasn’t working.

  The problem was not only that government had grown overweight; it had become antiquated in a rapidly changing world. As factories were being padlocked and the personal computing revolution was taking shape, government was not simply resisting downsizing, it was resisting updating. The very paradigm was under question—and had to be rethought. “Even the most conservative President in recent American history, Ronald Reagan, couldn’t put a dent in the welfare state,” Pinkerton would say. “So the issue becomes, how do we make it work better?”

  At that moment in Washington, others from both parties were coming to the realization as well. One of them was Elaine Kamarck, a Democratic policy wonk with a PhD in political science from Berkeley. Together, she and Pinkerton launched what they called the New Paradigm Society, a highfalutin’ name for what amounted to a series of dinnertime bull sessions on topics such as education reform and housing policy. Pinkerton and Kamarck invited academics like Bill Galston from the University of Maryland and Amitai Etzioni of George Washington University as well as journalists such as Joe Klein of New York magazine and Paul Gigot of the Wall Street Journal; even politicians—HUD Secretary Jack Kemp and former Arizona governor Bruce Babbitt—occasionally joined them. Backbench Congressman Newt Gingrich was also a regular.

  Much of the Bush administration had little use for the theoretical musings of the New Paradigm Society. Richard Darman, the powerful director of the White House Office of Management and Budget, derided the focus on “empowerment” as “effete,” and claimed the term itself was too reminiscent of the “power to the people” calls of the radical Black Panthers. “In the real world,” he said in a public speech attacking the New Paradigm Society, observers “might simply dismiss it by picking up the refrain, ‘Hey, brother, can you paradigm?’”19

  But Pinkerton and his fellow travelers had hit upon some very real challenges with what government had become. They had looked back over the past decade and seen how the New Deal manner of governing—the centralized bureaucracy—no longer was working. In What Comes Next, a book outlining his views, Pinkerton identified five “bugs” with this “operating system”:20

  Parkinson’s Law: in the words of the late British author C. Northcote Parkinson, “work expands to fill the time available for its completion.”21 It is the opposite of a productive organization’s “more with less”: the bureaucracy does less with more, whether measured in number of employees or man-hours per employee. At the end of the nineteenth century, when about half of Americans lived on farms, the Department of Agriculture had just 2,019 employees. By the 1990s, when only 1 in 40 Americans lived on farms, the department had more than 100,000 employees.22

  Peterism or the Peter Principle: in which people who consistently fail nonetheless manage to keep their jobs, or even keep getting promoted. In 1990, out of 2.2 million federal civil servants, only 403 were fired for incompetence.23

  Oligarchism: the bureaucracy comes to make its self-preservation the overriding priority. Reams of studies support what I saw firsthand in government: no department—whether the National Institutes of Health or the Pentagon—relishes slashing its own budget. Extraordinary effort and leadership are required to make the slightest dent.

  Olsonism: named after the economist Mancur Olson, who observed that the rise of interest groups allows a small number of committed individuals to exert an outsized influence on decisions. Any individual earmark or slice of Congressional pork is too small to bother the entire country, but the company or community benefiting from it will fight like mad for its inclusion.

  Information Infarction: bureaucratic decision making fails because a bureaucracy cannot know all the relevant information. In top-down, hierarchical organizations, those on the front lines have little incentive to present information that threatens the status quo. And when they do learn new, relevant information, it often takes too long for it to travel up the chain, be considered, and trickle back down to have an impact.

 
