These Truths


by Jill Lepore


  The Iraq War mired U.S. soldiers in counterinsurgency campaigns. The war on terror had its dissenters: among them were those who fought it. A 2011 Pew study reported that half of veterans of Afghanistan and Iraq thought the war in Afghanistan wasn’t worth fighting, nearly 60 percent thought the war in Iraq wasn’t worth it, and a third thought neither war was worth what it cost.68 One of the war on terror’s severest critics was Andrew J. Bacevich, a West Point graduate and career army officer who, after fighting in Vietnam in 1970 and 1971, had risen to the rank of colonel and become a history professor. Bacevich’s only son was killed in Iraq. A Catholic and a conservative, Bacevich argued that while few Americans served in the military, Americans and the American government had “fallen prey to militarism, manifesting itself in a romanticized view of soldiers, a tendency to see military power as the truest measure of national greatness, and outsized expectations regarding the efficacy of force.” Somehow, Bacevich wrote, Americans accepted that it was the fate of the United States to engage in permanent war, without dissent: “The citizens of the United States have essentially forfeited any capacity to ask first-order questions about the fundamentals of national security policy.”69

  By no means had the wars in Afghanistan and Iraq gone unquestioned, but one reason there had been relatively little debate had to do not only with a widening gap between the civilian and the military populations but also with the consequences of disruptive innovation. In many parts of the country, the daily paper, with its side-by-side op-ed essays, had vanished. Voters had been sorted into parties, the parties had been sorted, ideologically, and a new political establishment, the conservative media, having labeled and derided the “mainstream media” as biased, abdicated dispassionate debate. Rigorous studies of newspapers had not, up to that point, been able to discern a partisan bias. Nevertheless, the conservative establishment insisted that such bias existed, warned their audiences away from nonconservative media outlets, and insulated their audience from the possibility of persuasion by nonconservative outlets by insisting that anything except the conservative media was the “liberal media.”70 This critique applied not only to the news but to all manner of knowledge. “Science has been corrupted,” Rush Limbaugh said on the radio in 2009. “We know the media has been corrupted for a long time. Academia has been corrupted. None of what they do is real. It’s all lies!”71

  Limbaugh, who came of age during the Vietnam War but did not serve in the military (apparently due to a cyst), strenuously supported the war on terror.72 Roger Ailes, who, like Limbaugh, had neither seen combat in Vietnam nor served in the military (Ailes suffered from hemophilia), strongly supported U.S. military action in both Afghanistan and Iraq. And his network, Fox News, did more than report the wars; it promoted them. After 9/11, when Fox News anchors and reporters began wearing flag pins, some journalists, including CBS’s Morley Safer, condemned the practice. Ailes brushed him off: “I’m a little bit squishy on killing babies, but when it comes to flag pins I’m pro-choice.” When the United States invaded Iraq, Fox News adopted an on-air chyron: “The War on Terror.” John Moody, Fox’s vice president for news, circulated morning memos with directives for the day’s coverage. On June 3, 2003, he wrote, “The president is doing something that few of his predecessors dared undertake: putting the US case for Mideast peace to an Arab summit. It’s a distinctly skeptical crowd that Bush faces. His political courage and tactical cunning are worth noting in our reporting through the day.” On March 23, 2004, following early reports that the 9/11 commission was investigating the degree of the Bush administration’s negligence in the months leading up to the attacks, Moody wrote: “Do not turn this into Watergate. Remember the fleeting sense of national unity that emerged from this tragedy. Let’s not desecrate that.” Moody’s editorial directives included prohibitions on certain words. On April 28, 2004, he wrote: “Let’s refer to the US marines we see in the foreground as ‘sharpshooters,’ not snipers, which carries a negative connotation.” Walter Cronkite said of the memos, after they were leaked: “I’ve never heard of any other network nor any other legitimate news organization doing that, newspaper or broadcast.”73

  The conservative media establishment broadcast from a bunker, garrisoned against dissenters. Those who listened to Rush Limbaugh, and who only years before had also gotten news from their local newspapers and from network television, were now far more likely to watch only Fox News and, if they read a newspaper, to read only the Wall Street Journal, which, like Fox, was owned, as of 2007, by Rupert Murdoch. The conservative websites to which search engines directed listeners of Limbaugh, watchers of Fox News, and readers of the Wall Street Journal only reinforced this view. “It’s a great way to have your cake and eat it too,” wrote Matt Labash in the Weekly Standard in 2003. “Criticize other people for not being objective. Be as subjective as you want. It’s a great little racket. I’m glad we found it actually.”74

  Other administrations, of course, had lied, as the Pentagon Papers had abundantly demonstrated. But in pursuing regime change in the Middle East, the Bush administration dismissed the advice of experts and took the radically postmodern view that all knowledge is relative, a matter of dueling political claims rather than of objective truth. That view had characterized not only its decision to go to war in Iraq but also the campaign’s argument against the recount in 2000, and Bush’s withdrawal from the Kyoto Protocol, a climate change agreement, in 2001.75 In 2002, a senior Bush adviser told a reporter for the New York Times that journalists “believe that solutions emerge from your judicious study of discernible reality” but that “that’s not the way the world works anymore. We’re an empire now, and when we act, we create our own reality.”76 The culture and structure of the Internet made it possible for citizens to live in their own realities, too.

  Jaundiced journalists began to found online political fact-checking sites like PolitiFact, which rated the statements of politicians on a Truth-O-Meter. “I’m no fan of dictionaries or reference books: they’re elitist,” the satirist Stephen Colbert said in 2005, when he coined “truthiness” while lampooning George W. Bush. “I don’t trust books. They’re all fact, no heart. And that’s exactly what’s pulling our country apart today.”77 But eventually liberals would respond to the conservative media by imitating them—two squirrels, chasing each other down a tree.

  WHAT DID HE know and when did he know it? had been the pressing question of the Watergate investigation. What does anyone know anymore, and what is knowledge, anyway? became the question of the Bush era.

  The United States’ position as the leader of a liberal world order based on the rule of law entered a period of crisis when, pursuing its war on terror, the country defied its founding principles and flouted the Geneva Conventions, international law, and human rights through the torture of suspected terrorists and their imprisonment without trial.

  On October 26, 2001, Bush signed the Patriot Act, granting the federal government new powers to conduct surveillance and collect intelligence to prevent and investigate terrorist acts. It passed both houses less than two months after the 9/11 attacks, in a frenzied climate in which legislators who dared to break ranks were labeled unpatriotic. Outside the Capitol, the ACLU and the Electronic Frontier Foundation were among the many vocal opponents of the act, citing violations of civil liberties, especially as established under the Fourth Amendment, and of civil rights, especially the due process provision of the Fourteenth Amendment. John Ashcroft, Bush’s attorney general, defended the Patriot Act, citing the war on drugs as a precedent for the war on terror. “Most Americans expect that law enforcement tools used for decades to fight organized crime and drugs be available to protect lives and liberties from terrorists,” Ashcroft said.78

  In November 2001, Bush signed a military order concerning the “Detention, Treatment, and Trial of Certain Non-Citizens in the War Against Terrorism.” Suspected terrorists who were not citizens of the United States were to be “detained at an appropriate location designated by the Secretary of Defense.” If brought to trial, they were to be tried and sentenced by military commissions. The ordinary rules of military law would not apply. Nor would the laws of war, nor the laws of the United States.79

  The conduct of war will always challenge a nation founded on a commitment to justice. It will call back the nation’s history, its earlier struggles, its triumphs and failures. There were shades, during the war on terror, of the Alien and Sedition Acts passed in 1798 during the Quasi-War with France, of the Espionage Act of the First World War, and of FDR’s Japanese internment order during the Second World War. But with Bush’s November 2001 military order, the war on terror became, itself, like another airplane, attacking the edifice of American law, down to its very footings, the ancient, medieval foundations of trial by jury and the battle for truth.

  “You’ve got to be kidding me,” Ashcroft said when he read a draft of the order. He’d expected the prosecution of people involved in planning the attacks on 9/11 to be handled criminally, by his department—as had been done successfully with earlier terrorism cases, with due process. National security adviser Condoleezza Rice and Secretary of State Colin Powell only learned that Bush had signed the order when they saw it on television. In the final draft, the Department of Justice was left out of the prosecutions altogether: suspected terrorists were to be imprisoned without charge, denied knowledge of the evidence against them, and, if tried, sentenced by courts following no established rules. The order deemed “the principles of law and the rules of evidence generally recognized in the trial of criminal cases in the United States district courts” to be impractical. The means by which truth was to be established and justice secured, traditions established and refined over centuries, were deemed inconvenient. “Now, some people say, ‘Well, gee, that’s a dramatic departure from traditional jurisprudence in the United States,’” Vice President Cheney said, but “we think it guarantees that we’ll have the kind of treatment of these individuals that we believe they deserve.”80

  The Bush administration’s course of action with the wars in Afghanistan and Iraq and with the military tribunals and with the Patriot Act rested on an expansive theory of presidential power. The party in control of the White House tends to like presidential power, only to change its mind when it loses the White House. From Woodrow Wilson through FDR and Lyndon Johnson, Democrats had liked presidential power, and had tried to extend it, while Republicans had tried to limit it. Beginning with the presidency of Richard Nixon, Democrats and Republicans switched places, Republicans extending presidential power with Nixon and Reagan. But the conservative effort to expand the powers of the presidency reached a height in the George W. Bush administration, in powers seized while the nation reeled from an unprecedented attack.81

  Beginning in the fall of 2001, the U.S. military dropped flyers over Afghanistan offering bounties of between $5,000 and $25,000 for the names of men with ties to al Qaeda and the Taliban. “This is enough money to take care of your family, your village, your tribe, for the rest of your life,” one flyer read. (The average annual income in Afghanistan at the time was less than $300.) The flyers fell, Secretary of Defense Donald Rumsfeld said, “like snowflakes in December in Chicago.” (Unlike many in Bush’s inner circle, Rumsfeld was a veteran; he served as a navy pilot in the 1950s.)82 As hundreds of men were rounded up abroad, the Bush administration considered where to put them. Taking over the federal penitentiary at Leavenworth, Kansas, and reopening Alcatraz, closed since 1963, were both considered but rejected because, from Kansas or California, suspected terrorists would be able to appeal to American courts under U.S. state and federal law. Diego Garcia, an island in the Indian Ocean, was rejected because it happened to be a British territory, and therefore subject to British law. In the end, the administration chose Guantánamo, a U.S. naval base on the southeastern end of Cuba. No part of either the United States or of Cuba, Guantánamo was one of the known world’s last no-man’s-lands. Bush administration lawyer John Yoo called it the “legal equivalent of outer space.”83

  On January 9, 2002, Yoo and a colleague submitted to the Department of Defense the first of what came to be called the torture memos, in which they concluded that international treaties, including the Geneva Conventions, “do not apply to the Taliban militia” because, although Afghanistan had been part of the Geneva Conventions since 1956, it was a “failed state.” International treaties, the memo maintained, “do not protect members of the al Qaeda organization, which as a non-State actor cannot be a party to the international agreements governing war.” Two days later, the first twenty prisoners, shackled, hooded, and blindfolded, arrived at Guantánamo. More camps were soon built to house more prisoners, eventually 779, from 48 countries. They weren’t called criminals, because criminals have to be charged with a crime; they weren’t called prisoners, because prisoners of war have rights. They were “unlawful combatants” who were being “detained” in what White House counsel Alberto Gonzales called “a new kind of war,” although it was as ancient as torture itself.84

  The White House answered terrorism, an abandonment of the law of war, with torture, an abandonment of the rule of law. Aside from the weight of history, centuries of political philosophy and of international law, and, not least, its futility as a means for obtaining evidence, another obstacle to torture remained: the Convention against Torture and other Cruel, Inhuman or Degrading Treatment or Punishment, a treaty the United States had signed in 1988. This objection was addressed in a fifty-page August 2002 memo to Gonzales that attempted to codify a distinction between acts that are “cruel, inhuman, or degrading” and acts that constitute torture. “Severe pain,” for instance, was defined as pain that caused “death, organ failure, or permanent damage resulting in the loss of significant bodily functions.” (“If the detainee dies, you’re doing it wrong,” the chief counsel for the CIA’s counterterrorism center advised, according to meeting minutes later released by the Senate Armed Services Committee.) Methods described in the torture memos included stripping, shackling, exposure to extremes of temperature and light, sexual humiliation, threats to family members, near-drowning, and the use of dogs. Many of these forms of torment, including sleep deprivation and semi-starvation, came from a 1957 U.S. Air Force study called “Communist Attempts to Elicit False Confessions From Air Force Prisoners of War,” an investigation of methods used by the Chinese Communists who tortured American prisoners during the Korean War. Top security advisers, including Colin Powell, objected to what the White House called “enhanced interrogation techniques.” Others, including Ashcroft, urged discretion. “Why are we talking about this in the White House?” he is said to have asked at one meeting, warning, “History will not judge this kindly.” But the position of the secretary of defense prevailed. On a list of interrogation techniques approved for the use of U.S. military, Rumsfeld wrote: “I stand for 8–10 hours a day. Why is standing limited to 4 hours? D.R.”85

  Torture wasn’t confined to Guantánamo. In Iraq, American forces inflicted torture at Abu Ghraib, and in Afghanistan, in a CIA prison in Kabul and at Bagram Air Base, where, in 2002, two men died while chained to the ceiling of their cells. Within the legal academy and among civil liberties organizations, opposition both to provisions of the Patriot Act and to the treatment of suspected terrorists had been ongoing. During his 2003 Senate bid, Barack Obama called the Patriot Act “a good example of fundamental principles being violated,” and objected to the lack of due process in the arrest and trials of suspected terrorists. Glimpses of what was happening only reached the American public in 2004, after The New Yorker and 60 Minutes reported on abuses at Abu Ghraib and the ACLU published the torture memos. In June 2006, in Hamdan v. Rumsfeld, the Supreme Court ruled that, without congressional authorization, the president lacked the power to establish the military commissions. Six months later, Congress authorized the commissions, but in 2008, the court found this act unconstitutional as well.86 Still, something crucial about the fundamental institutions on which the nation had been founded had been very badly shaken.

  The Supreme Court’s ruling had neither righted the Republic nor healed its divisions. During Bush’s two terms in office, income inequality widened and polarization worsened, as they had during the Clinton years and the Reagan years, and as they would under Obama and Trump. A Bush-era tax cut granted 45 percent of its savings to the top 1 percent of income earners, and 13 percent to the poorest 60 percent. In 2004 and again in 2008, the percentage of voters who did things like post campaign yard signs in front of their houses or paste bumper stickers onto their cars was higher than it had been at any time since people had been counting those things, in 1952. Members of Congress no longer regretted hyperpartisanship but instead celebrated it, outgoing Republican House majority leader Tom DeLay insisting in his 2006 farewell address that “the common lament over the recent rise in political partisanship is often nothing more than a veiled complaint instead about the recent rise of political conservatism.”87

  DeLay had been indicted for money laundering and had also been tied to all manner of other political grubbiness in connection with the Russian government and with lobbyists. Political insiders like DeLay had a financial stake in heightened partisanship: the more partisan the country, the more money they could raise for reelection, and the more money they could make after they left office. Before the 1990s, “change elections,” when a new party took over Congress or the White House or both, meant that politicians who were thrown out of office left town, along with their staff. That stopped happening. Instead, politicians stayed in Washington and became pundits, or political consultants, or management consultants, or, most likely, lobbyists, or—for those with the least scruples—all of the above. They made gargantuan sums of money, through speaking fees, or selling their memoirs, or hawking their connections, or appearing on television: the cable stations, compelled to fill twenty-four hours of airtime, needed talking heads at all hours of every day, the angrier and more adversarial the talk, the higher the ratings. “Insiders have always been here,” the New York Times’s Mark Leibovich observed in 2013. “But they are more of a swarm now: bigger, shinier, online, and working it all that much harder.”88

 
