The Great Democracy

by Ganesh Sitaraman


  The neoliberal embrace of individualism and opposition to “the collective society,” as Thatcher put it, also had perverse consequences for social and political life. Humans are social animals. But neoliberalism rejects both the medieval approach of having fixed social classes based on wealth and power and the modern approach of having a single, shared civic identity based on participation in a democratic community. The problem is that amid neoliberalism’s individualistic rat race, people still need to find meaning somewhere in their lives. And so there has been a retreat to tribalism and identity groups, with civic associations replaced by religious, ethnic, or other cultural affiliations. To be sure, race, gender, culture, and other aspects of social life have always been important to politics. But neoliberalism’s radical individualism has increasingly raised two interlocking problems.

  First, when taken to an extreme, social fracturing into identity groups can be used to divide people and prevent the creation of a shared civic identity. Self-government requires uniting through our commonalities and aspiring to achieve a shared future. When individuals fall back onto clans, tribes, and us-versus-them identities, the political community gets fragmented. It becomes harder for people to see each other as part of that same shared future. Demagogues rely on this fracturing to inflame racial, nationalist, and religious antagonism, which only further fuels the divisions within society. Neoliberalism’s war on “society,” by pushing toward the privatization and marketization of everything, thus indirectly facilitates a retreat into tribalism that further undermines the preconditions for a free and democratic society.

  The second problem is that neoliberals on right and left sometimes use identity as a shield to protect neoliberal policies. As one commentator has argued, “Without the bedrock of class politics, identity politics has become an agenda of inclusionary neoliberalism in which individuals can be accommodated but addressing structural inequalities cannot.” What this means is that some neoliberals hold high the banner of inclusiveness on gender and race and thus claim to be progressive reformers, but they then turn a blind eye to systemic changes in politics and the economy. Critics argue that this is “neoliberal identity politics,” and it gives its proponents the space to perpetuate the policies of deregulation, privatization, liberalization, and austerity. Of course, the result is to leave in place a political and economic system that disproportionately harms the very groups that inclusionary neoliberals purport to support. On the left, this tactic usually focuses on racial and gender inclusion; on the right, it emphasizes nationalism of one sort or another. But in either case, this variant on identity politics can turn into a way to reinforce neoliberal policies.8

  The foreign policy adventures of the neoconservatives and liberal internationalists haven’t fared much better than economic policy or cultural politics. The United States and its coalition partners have been bogged down in the war in Afghanistan for eighteen years and counting. Neither Afghanistan nor Iraq is a liberal democracy, nor did the attempt to establish democracy in Iraq lead to a domino effect that swept the Middle East and reformed its governments for the better. Instead, power in Iraq has shifted from American occupiers, to sectarian militias, to the Iraqi government, to ISIS terrorists, and back to the Iraqi government. Far from being humanitarian, the war has left more than a hundred thousand Iraqis dead. Or take the liberal internationalist 2011 intervention in Libya. The result was not a peaceful transition to stable democracy but instead civil war and instability, with thousands dead as the country splintered and portions were overrun by terrorist groups. On the grounds of democracy promotion, it is hard to say these interventions were a success. And for those motivated to expand human rights around the world, on the civilian death count alone, it is hard to justify these wars as humanitarian victories.9

  Indeed, the central anchoring assumptions of the American foreign policy establishment have been proven wrong. Foreign policy makers largely assumed that all good things would go together—democracy, markets, and human rights—and so they thought opening China to trade would inexorably lead to it becoming a liberal democracy. They were wrong. They thought Russia would become liberal through swift democratization and privatization. They were wrong. They thought globalization was inevitable and that ever-expanding trade liberalization was desirable even if the political system never corrected for trade’s winners and losers. They were wrong. These aren’t minor mistakes. And to be clear, Donald Trump had nothing to do with them. All of these failures were evident prior to the 2016 election.10

  In spite of these failures, most policy makers did not have a new ideology or different worldview through which to view the problems of this time. So, by and large, the collective response was not to abandon neoliberalism. After the Great Crash of 2008, neoliberals chafed at attempts to push forward aggressive Keynesian spending programs to spark demand. President Barack Obama’s advisors shrank the size of the postcrash stimulus package for fear it would seem too large to the neoliberal consensus of the era, and on top of that, they compromised on its content. About one-third of the stimulus ended up being tax cuts, which have a less stimulative effect than direct spending. After Republicans took back the House in 2010, Obama was forced into sequestration, a multiyear austerity program that slashed budgets across government, even as the country was only beginning to emerge from the Great Recession. The British Labour Party’s Chancellor of the Exchequer said after the 2008 crash that Labour’s planned cuts to public spending would be “deeper and tougher” than Margaret Thatcher’s.11

  When it came to affirmative, forward-looking policy, the neoliberal framework also remained dominant. Take the Obamacare health care legislation. Democrats had wanted to pass a national health care program since at least Harry Truman’s presidency. But after Clinton’s failed attempt in the early 1990s, when Democrats took charge of the House, Senate, and presidency in 2009, they took a different approach. Obamacare was built on a market-based model that the conservative Heritage Foundation helped develop and that Mitt Romney, the Republican governor of Massachusetts, had adopted. It is worth emphasizing that Obamacare’s central feature is a private marketplace in which people can buy their own health care, with subsidies for individuals who are near the poverty line. There was no nationalization of health care through a single-payer system, and centrist Democrats like Senator Joe Lieberman blocked the creation of a public option that might coexist and compete with private options on the marketplaces. Fearful of losing their seats, centrists extracted these concessions from progressives. Little good it did them. The president’s party almost always loses seats in midterm elections, and this time was no different. For their caution, centrists both lost their seats and gave Americans fewer and worse health care choices. Perhaps the bigger shock was that courageous progressive politicians in red-leaning districts, like Virginia’s Tom Perriello, also lost but actually performed better than their cautious colleagues.12

  On the right, the response to the crash went beyond ostrich-like blindness to doubling down on the failed approaches of the past. The Republican Party platform in 2012, for example, called for weaker Wall Street, environmental, and worker safety regulations; lower taxes for corporations and wealthy individuals; and further liberalization of trade. It called for abolishing federal student loans, in addition to privatizing rail, western lands, airport security, and the post office. Republicans also continued their support for cutting health care and retirement security. After forty years moving in this direction—and with it failing at every turn—you might think they would change their views. But Republicans didn’t, and some still haven’t.13

  Although neoliberalism had little to offer, in the absence of a new ideological framework, it hung over the Obama presidency—but now in a new form. Many on the center-left adopted what we might call the technocratic ideology, a rebranded version of the policy minimalism of the 1990s that replaced minimalism’s tactical and pragmatic foundations with scientific ones. The term itself is somewhat oxymoronic, as technocrats seem like the opposite of ideologues. But an ideology is simply a system of ideas and beliefs, like liberalism, neoliberalism, or socialism, that shapes how people view their role in the world, society, and politics. As an ideology, technocracy holds that the problems in the world are technical problems that require technical solutions. It is worth pointing out what this implies: First, it means that the structure of the current system isn’t broken or flawed. Any problems are relatively minor and can be fixed by making small tweaks in the system. Second, the problems are not a function of deep moral conflicts that require persuading people on a religious, emotional, or moral level. Instead, they are problems of science and fact, in which we can know “right” answers and figure out what works because there is consensus about what the end goals are. The result is that the technocratic ideology largely accepts the status quo as acceptable.

  The technocratic ideology preserves the status quo with a variety of tactics. We might call the first the complexity canard. Technocrats like to say that entire sectors of public policy are very complicated and therefore no one can propose reforms or even understand the sector without entry into the priesthood of the technocracy. The most frequent uses of this tactic are in sectors that economists have come to dominate—international trade, antitrust, and financial regulation, for example. The result of this mindset is that bold, structural reforms are pushed aside and highly technical changes adopted instead. Financial regulation provides a particularly good case, given the 2008 crash and the Great Recession. When it came time to establish a new regulatory regime for the financial sector, there wasn’t a massive restructuring, despite the biggest crash in seventy years.

  Instead, for the most part, the Dodd-Frank Act was classically technocratic. It kept the sector basically the same, with a few tweaks here and there. There was no attempt to restructure the financial sector completely. Efforts to break up the banks went nowhere. No senior executives went to jail. With the exception of creating the Consumer Financial Protection Bureau, most reforms were relatively minor: greater capital requirements for banks, more reporting requirements. Where proponents claimed they were doing something bold, Dodd-Frank still fell prey to the technocratic ideology. The Volcker Rule, for example, sought to ban banks from proprietary trading. But instead of doing that through a simple, clean breakup rule (like the old Glass-Steagall regime), the Volcker Rule was subject to exceptions and carve outs—which federal regulators were required to explain and implement with hundreds of pages of technical regulations.14

  Dodd-Frank also illustrates a second tenet of the technocratic ideology: the failures of technocracy can be solved by more technocracy. Whenever technocratic solutions fail, the answer is rarely to question the structure of the system as a whole. Instead it is to demand more and better technocrats. Those who acknowledged that voting for the Iraq War was a mistake regretted not having better intelligence and postwar planning. Rare was the person who questioned the endeavor of policing vast regions of the world simultaneously with little knowledge of the local people, customs, or culture. All that was needed, on this view, was better postwar planning. It was a technical, bureaucratic problem. Dodd-Frank created the Financial Stability Oversight Council, a government body tasked with what is called macroprudential regulation. What this means is that government regulators are supposed to monitor the entire economy and turn the dials of regulation up and down a little bit to keep the economy from another crash. But ask yourself this: Why would we ever believe they could do such a thing? We know those very same regulators failed to identify, warn about, or act on the 2008 crisis. We know markets are dynamic and diverse and that regulators can’t have full information about them. And we know regulators are just as likely as anyone else to be caught up in irrational exuberance or captured by industry. Instead of establishing structural rules for permissible financial activities, even if they are a little bit overbroad or underinclusive, Dodd-Frank, once again, put its faith and our fates in the hands of technocrats.

  Perhaps the most prominent and celebrated technocratic approach to liberal public policy in recent years is nudging. If there was a court intellectual in the Obama administration, it was surely Cass Sunstein—one of the most prolific, brilliant, and influential legal scholars of his generation and the leading proponent of nudging. Behavioral psychologists have shown that small changes in how choices are framed can have meaningful effects. For example, whether desserts or fruit are served first or last in a school cafeteria can radically change what students will eat. In their book Nudge, Richard Thaler and Sunstein argue that it makes sense to design the structure of choices in a way that helps people become healthier, wealthier, and happier. Nudges have become so popular that the Obama administration created a Social and Behavioral Sciences Team to apply the lessons of nudging across the federal government, and UK prime minister David Cameron created a Behavioural Insights Team, nicknamed the Nudge Unit.15

  Nudges assume that policy makers should work within the existing structures and frameworks of public policy. Many proponents of the behavioral approach to public policy see this as a feature, not a bug. As the heirs to the minimalism of the 1990s, Thaler and Sunstein declare that in an age of political polarization and partisan gridlock, nudging is the “real third way.” But as with minimalism more generally, the result is a comparatively unambitious approach to public policy. The dominance of nudges has severely limited our ability to see the full range of policy options and to tackle some of the biggest problems facing our society.16

  Consider retirement security as an example. Nudgers suggest that employers adopt automatic enrollment in retirement plans for their employees at a relatively low rate of 3 percent. The policy, however, has had a variety of problems. First, it only changes the behavior of people who have employers that offer retirement plans—and who make enough money that they can put away savings on a regular basis. It does nothing for those who either don’t have access to an employer-based retirement plan or are too poor to save. Indeed, one recent study suggests that not having enough money to make ends meet—not psychological biases—is why people fail to save. If you can’t afford to save, nudging isn’t going to help.17

  A second problem is that nudgers often don’t think about the broader trade-offs between the programs that they try to improve and other policy options. In their critique of behavioral policy, professors Ryan Bubb and Richard Pildes point out that the United States spends billions on tax incentives for people to save through private retirement plans, like 401(k)s and IRAs. But studies suggest that for every dollar spent on these tax incentives, there is an increase in savings of only one penny—a single penny. As a result, they argue that we are spending billions of dollars on policies that have almost no impact. And yet, instead of arguing for fixing that problem or the broader retirement savings crisis, nudgers have put their efforts into making minor changes to the existing regime. Why not, instead, spend that money to expand Social Security?18

  Economist George Loewenstein has said that “the single biggest contribution of behavioral economics to public policy is taking this flawed approach to retirement savings and making it a little bit more viable.” To be fair, the leading proponents of nudging do not claim that their approach can solve every policy problem or that it is an all-encompassing vision for the future. But the fact that this is considered one of the cutting-edge and most celebrated paradigms for public policy shows that technocracy cannot possibly rise to meet the challenges of today. And the fact that the biggest brains and brightest minds spend their time on tweaking the system shows how limited the ambitions of policy makers had become in the late neoliberal era.19

  We should not be surprised by these dynamics. The arc of neoliberalism followed a pattern common in history. In the first stage, neoliberalism gained traction in response to the crises of the 1970s. It is easy to think of Thatcherism and Reaganism as emerging fully formed, springing from Zeus’s head like the goddess Athena. But it is worth remembering that Thatcher was a gradualist. Rhetorically, she would champion the causes of the right wing. But practically, her policies would often fall short of the grand vision. For example, she refused to allow any attempt to privatize the Royal Mail and the railways. She even preferred the term denationalization to privatization, thinking the latter unpatriotic and far too radical. The central problem, as she noted in her memoirs, was that “there was a revolution still to be made, but too few revolutionaries.”20

  A similar story can be told of Ronald Reagan. Partly because he faced a Democratic House of Representatives, conservative radicals were occasionally disappointed with the extent to which the Reagan administration pushed its goals. Under Ronald Reagan, William Niskanen writes, “no major federal programs… and no agencies were abolished.” The Intergovernmental Panel on Climate Change (IPCC) was created during the Reagan administration, and President Reagan signed a variety of environmental laws. Early leaders were not as ideologically bold as later mythmakers think.21

  In the second stage, neoliberalism became normalized. It persisted beyond the founding personalities and, partly because of its longevity in power, grew so dominant that the other side adopted it. Thus, when the Tories ousted Thatcher and replaced her with John Major, they unwittingly made Thatcherism permanent. Major objected to the term Majorism because he wanted to be Thatcher’s heir, “to dust himself with the gold of greatness, to render Thatcherism safe for perpetuity.” Major wasn’t the bare-knuckled fighter that Thatcher was; he was a lower-key prime minister. He wanted to offer Britain “Thatcherism with a human face,” and he set himself to smoothing out the rough edges. The result was to consolidate and advance the neoliberal project in Britain. When Major was elected in his own right in 1992, he got more votes than Thatcher ever had—and more than Tony Blair received in 1997. As Major himself noted, “1992 killed socialism in Britain… Our win meant that between 1992 and 1997 Labour had to change.”22

 
