Fault Lines


by Kevin M. Kruse


  Despite the signals that his presidency would depart from past practices of Republicans, some of Bush’s earliest efforts were quite familiar. Though he was the son of former Republican president George H. W. Bush, George W. Bush looked instead to Ronald Reagan as a role model, so much so that an early account of his presidency was titled Reagan’s Disciple. Returning to President Reagan’s resistance to environmental regulation, for instance, the Bush administration made early moves to push back against the growing campaign to curb carbon emissions. At the end of his term, President Clinton had used executive orders to bypass a resistant Republican Congress, but his successor quickly worked to undo the measures. Most notably, in March 2001, Bush withdrew US support for the Kyoto Protocol, a landmark international agreement adopted in 1997 to curb the release of greenhouse gases into the atmosphere. An outspoken advocate of action, Vice President Al Gore had signed the measure on behalf of the Clinton administration, but the Senate never approved it. In sharp contrast, the Bush White House announced it had “no interest” in implementing the Kyoto Protocol. Rejecting the scientific consensus on the relationship between greenhouse gases and climate change, the administration and its allies insisted there was no need to act. At the same time, the Bush White House worked to reduce the oversight and enforcement role of the EPA, relegating inspection duties to the states. “The president’s cuts take the environmental cop off the beat,” warned Representative Robert Menendez, “and it creates a devastating blow to EPA’s ability to enforce clean air, clean water and hazardous waste laws.” 2

  Much as Bush stuck to the Reagan script with its drive to roll back the regulatory state, he likewise tried to copy his predecessor’s playbook with a massive supply-side tax cut. Notably, the Bush tax cut—signed into law on June 7, 2001—was the first major reduction in twenty years and, at an estimated cost of $1.35 trillion, the largest single tax cut in American history. While Reagan had claimed that supply-side tax cuts were specifically suited to spur the sluggish conditions of the early 1980s, Bush argued that the budget surpluses from the Clinton-era boom meant that the government could, and indeed should, cut taxes again. “The surplus is not the government’s money,” Bush had said in his 2000 acceptance speech. “The surplus is the people’s money.” Now in office, he promised to “give the people their money back.” Republicans saw the tax cut and the larger Bush budget as a means of restoring their glory years. “It would, they believe, revive and retrofit the Reagan Revolution,” a reporter noted in March 2001, “by putting new spending constraints on the federal government with the tax cut; by ceding more responsibilities to state and local governments and religious institutions; and by transforming two great monuments of the New Deal and Great Society, with partial privatizing of Social Security and Medicare.” Democrats, of course, had different memories of the reckoning wrought by Reaganomics and worried that the country would suffer again. “I just know that at some point that reality is going to come crashing down on all of us,” Senate Minority Leader Tom Daschle noted, “and we’re going to have to deal with it.” 3

  Not everything, however, was about re-creating the Reagan Revolution. Early in Bush’s term as president, the White House had pointed to three areas in which his administration was seeking to implement its new philosophy. “The President’s vision of compassionate conservatism,” an April 2002 press release noted, “effectively tackles some of society’s toughest assignments—educating our children, fighting poverty at home, and aiding poor countries around the world.” In these three realms, the Bush administration tried to transform its vision of compassionate conservatism into concrete policy. Specifically, the policies were known as the faith-based initiative, the No Child Left Behind Act, and the global initiative on HIV/AIDS.4

  More than any other policy, George W. Bush’s “compassionate conservatism” was embodied in what he called the “faith-based initiative.” At heart, this promised to empower private religious and community organizations and enlist their charitable arms in the provision of social services, especially to the poor. This had been a major issue for Bush during his presidential campaign, one he used to distance himself from the Republican Party’s traditional stances on economic and fiscal conservatism. “I know that economic growth is not the solution to every problem,” he announced at one campaign event. “The invisible hand of the free market works many miracles. But it cannot touch the human heart.” As a result, Bush proposed a new program of government spending to support religious bodies engaged in social welfare. “Without more support and resources, both public and private,” he insisted, “we are asking them to make bricks without straw.” Laying out the details of his vision, the Republican candidate proposed a combination of tax credits for charitable donations and direct government financing that would put an estimated $8 billion annually into the cause.5

  Upon taking office, President Bush continued to emphasize the faith-based initiative as the centerpiece of his domestic agenda. Notably, the longest section of his inaugural address had been devoted to describing the program. “America, at its best, is compassionate,” the new president observed. “Church and charity, synagogue and mosque lend our communities their humanity, and they will have an honored place in our plans and in our laws.” Despite his high rhetoric, the faith-based initiative still remained unformed. Indeed, it soon became clear that for all the emphasis the new president had given to the issue in his speeches, nothing at all had been done to turn those words into deeds. Many members of the administration saw the promise as a political ploy, rather than serious policy. Late in the transition period, a young aide named Don Willett had stepped forward to craft plans for the program after he realized no one else had taken charge. To his surprise, he found requests for funding and staff rebuffed at every turn. In an abrupt change, just three days after the inaugural, political advisor Karl Rove told Willett that the full faith-based initiative would be unveiled to reporters in six days. The thirty-four-year-old asked how that could be accomplished when the initiative still lacked a director, a staff, an office, or even a general plan of action. “I don’t know,” Rove responded wearily. “Just get me a fucking faith-based thing. Got it?” 6

  The following Monday, January 29, 2001, Bush issued his first two executive orders as president. The first created a new executive branch agency, the White House Office of Faith-Based and Community Initiatives (WHOFBCI), charged with setting priorities, coordinating public education campaigns, and monitoring faith-based initiatives across the federal government. The second executive order, meanwhile, created faith-based programs for five cabinet-level departments, which would conduct internal audits to help private-sector community organizations and religious charities provide social services.7 Despite the president’s enthusiasm for the executive branch programs, the faith-based initiatives never fully formed there. Frustrated with the White House’s lack of leadership, House Republicans embraced more socially conservative proposals championed by the Religious Right. These proposals were marked by little or no government oversight for grant recipients; complete freedom for them to discriminate in hiring, especially in the realm of gender and sexuality; and a license to use public funds for private religious work. As WHOFBCI Director John DiIulio later observed, the House bill “bore few marks of compassionate conservatism and was, as anybody could tell, an absolute political nonstarter.” Its already slim chances of passing the Senate became even slimmer in May 2001, when Senator Jim Jeffords of Vermont left the Republican Party to become an independent who caucused with the Democrats, taking control of the Senate with him.8

  Without legislation to shore it up, the core of the president’s original plan—the provision to secure a massive boost in private charitable giving through tax incentives—was lost as well. By mid-2001, the faith-based initiative had effectively ended. The legislative struggle, according to one study, had transformed Bush’s original “bipartisan rallying cry for the armies of compassion” into little more than “a throwback to partisan blitzkriegs of the Newt Gingrich era.” The WHOFBCI soon became an empty shell. DiIulio had originally planned to stay only six months, but his departure at the end of the summer was seen by many as a sign that the office had failed. His replacement, Jim Towey, essentially turned the office into a political arm of the Republicans’ midterm campaign in 2002 and the president’s reelection effort in 2004.9

  Bush’s plans for a new era of “compassionate conservatism” were crippled by the failure of the faith-based initiative. Other programs, however, still sought to fill the void, with some success. Chief among these was an education initiative, the No Child Left Behind Act of 2001. The law solidified the federal commitment to funding public education, representing a stark reversal of decades of conservative calls for the dismantling of the Department of Education. Reaffirming the federal role in education and even expanding it, No Child Left Behind imposed rigorous national standards in subjects like math and English. The standards served a double political purpose in that they constituted a direct attack on the authority of the teachers’ unions, which had traditionally retained strong control over hiring decisions. Conservatives had long criticized public schools for allowing “social promotion”—advancing unqualified students when they should have been held back, a practice Bush had decried on the campaign trail as “the soft bigotry of low expectations.” Under the new law, every public school in the nation would now be required to meet or surpass government-mandated achievement levels in core subjects or lose federal funds. Students in these failing schools would then be able to transfer to better-performing ones. Entire schools could be closed if the problems were not solved.10

  No Child Left Behind initially had broad bipartisan support, a sign of the importance that education had for the suburban voters both parties coveted. Future Republican Speaker John Boehner was one of two sponsors in the House, while Senator Ted Kennedy, a noted liberal, was one of two in the Senate. Though the standards at the heart of the legislation were strongly opposed by teachers’ unions, a key Democratic constituency, many members of the party joined Kennedy in backing the bill. In a sign of bipartisan support, the measure passed the House by a wide margin of 384 to 45, and then passed in the Senate 91 to 8. President Bush then signed it into law in January 2002. Though No Child Left Behind seemed to be a bipartisan success, partisan divisions over the program quickly took hold. Democrats were furious when, one month after signing the bill into law, Bush drastically cut its funding. Democrats and many teachers warned that NCLB was an unfunded mandate, imposing federal requirements on local schools without giving districts any serious funding to meet them. The policy did little to advance the standing of “compassionate conservatism” at large.11

  Although Bush’s compassionate conservatism demonstrated that he was willing to challenge the traditional stances of the Republican Party on key issues such as funding of social services and the federal commitment to public education, these proposals were poorly formed at the outset and then were quickly obliterated by events in the fall.

  9/11

  The front page of the New York Times, printed in the early hours of the morning of Tuesday, September 11, 2001, featured a typical range of stories. One explained how New York State public schools were requiring dress codes, in an effort to curb the fishnet stockings and see-through clothing that were becoming more common in classrooms. Another recounted the last day of the Democratic primaries for the candidates seeking to replace New York’s Mayor Rudy Giuliani. In national news, a panel of scientific experts urged President Bush to rethink his restrictions on federally financed stem cell lines, while leaders in both parties, another piece explained, wanted Bush to make more aggressive moves to boost a now-struggling economy. Media critic Bill Carter, meanwhile, reported that the networks were competing to gain a bigger share of the booming audience for morning television. Amid these front-page stories, there was one article about terrorism: an account of a teacher in Westchester County, who was being charged in a 1971 airline hijacking, after his role in it had been discovered thirty years later in an internet search.12

  The events that would later define September 11, 2001, seemingly came out of nowhere. Early that day, nineteen members of the Islamic terrorist organization al-Qaeda boarded four commercial airplanes in American cities. Armed with box cutters they had snuck through airport security, the attackers killed the pilots and took control of the planes. They then proceeded to fly three of them into major buildings: two into the twin towers of the World Trade Center in New York and a third into the Pentagon outside Washington, DC. A fourth plane, United Flight 93, was believed to be headed for the Capitol or the White House, but passengers on board, hearing reports of the other hijackings over their personal cell phones, were able to seize control from the terrorists and crash the plane in a rural field in Pennsylvania. Despite those heroics, the elaborate plan still killed roughly 3,000 people.

  Americans had witnessed acts of terrorism before, but rarely on their own shores and never on a scale like this. When the first plane hit the tower, many assumed it was a tragic accident, a pilot who had somehow been taken off course. When the second plane hit, however, and the buildings then started to crumble before the television cameras, it was suddenly clear that this had been something far more noxious—an intentional attack against the United States. As the streets of lower Manhattan filled with smoke and debris, news channels broadcast scenes of confusion to the nation. Normally cheerful morning shows switched abruptly from soft celebrity stories and weather reports to breaking news. NBC’s Today Show segued from a conversation on a book about “America’s first billionaire,” Howard Hughes, to live shots of the initial crash; ABC’s Good Morning America cut away from an interview with a member of the British royal family to cover the carnage. After the sudden shift, the networks and cable news channels alike remained focused nonstop on the ongoing developments in New York and Washington, with images of the twin towers—first smoking, then collapsing—replaying over and over again. According to a later study, Americans watched an average of 8.1 hours of coverage that day, transfixed in horror.

  For those in New York, the attacks upended everything. The first responders, who had helped evacuate the buildings as they fell, found themselves unsure of what to do next. “Like dazed and bloodied soldiers,” one reporter recounted, “thousands of firefighters and police officers wandered helplessly throughout the afternoon and evening on the West Side Highway, blocked by the danger of further catastrophe from attempting to enter the scene.” Meanwhile, residents searched for missing friends and family members, blanketing subway stations and telephone poles with homemade posters and fliers, looking in vain for the lost.13 “The sense of security and self-confidence that Americans take as their birthright suffered a grievous blow, from which recovery will be slow,” a reporter noted in the next day’s edition of the New York Times. “The aftershocks will be nearly as bad, as hundreds and possibly thousands of people discover that friends or relatives died awful, fiery deaths.” 14

  While the nation reeled from the assault, its political leaders mounted a joint response, hastily scrambling to put a nation that had largely been at peace for a decade back on a wartime footing. The night after the attacks, President George W. Bush offered a brief but earnest declaration of the nation’s resolve and noted “a quiet, unyielding anger” that had welled up across the country. Leaders of both parties offered their support, joining together to sing “God Bless America.” Indeed, national unity in the face of the crisis became their predominant concern. “We have just seen the first war of the 21st century,” Bush declared in a conference call with reporters two days after the attack. “There is universal approval of the statements I have made, and I am confident there will be universal approval of the actions this government takes.” Democrats echoed the president, both in his condemnation of the attacks and his commitment to national unity. Senate Majority Leader Tom Daschle denounced the “despicable acts” that were “an assault on our people and on our freedom.” “I know that the most important thing now,” said Hillary Clinton, who had been elected in 2000 to serve as US senator for New York, “is for us to be united, united behind our president and our government, sending a very clear message that this is something that transcends any political consideration or partisanship.” 15

  Americans’ impulse to unite came easily, but the question of just what “unified action” might entail was much more difficult to determine. For one thing, the threat itself remained unclear. Many suspected al-Qaeda would soon wage additional attacks across the nation, perhaps against vulnerable “soft targets,” such as shopping malls or movie theaters. Others speculated that hostile nations might try to take advantage of the chaos by launching a more conventional attack of their own. At the same time, the nature of the American response was equally murky. “As Washington struggled to regain a sense of equilibrium, with warplanes and heavily armed helicopters crossing overhead,” R. W. Apple Jr. noted in the New York Times, “past and present national security officials earnestly debated the possibility of a Congressional declaration of war—but against precisely whom, and in what exact circumstances?” Ultimately, President Bush did decide to treat the attacks as an act of full-blown war rather than an isolated criminal act, an important departure from how Clinton had handled the 1993 bombing of the World Trade Center. Putting the nation on a formal war footing, his aides noted, would give the president more expansive executive power with fewer legislative restraints.16

 
