Idiot America: How Stupidity Became a Virtue in the Land of the Free

by Charles P. Pierce


  In 2002, Paul Pillar was working for the Central Intelligence Agency as its national intelligence officer for the Near East and South Asia. One of Pillar’s duties was to assess and evaluate intelligence regarding, among other places, Iraq. In October 2002, the CIA produced two documents in which Pillar had had a hand. The first was a National Intelligence Estimate that the agency presented to Congress regarding what the Bush administration argued was the overwhelming evidence that Iraq had stockpiled a vast number of dangerous weapons. There were mobile biological laboratories: a captured spy codenamed Curveball said so. There was a deal to buy uranium from the African nation of Niger. There were documents that said so, produced by the Italian government and vetted by British intelligence. There were aluminum tubes that could only be used for building centrifuges for the production of nuclear bombs.

  The NIE also contained within its fine print the information that a number of government agencies thought the whole case was a farrago of stovepiped intelligence, cherry-picked data, wishful thinking, and utter bullshit. For example, the State Department’s Bureau of Intelligence and Research thought the tale of the aluminum tubes was a bunch of hooey. The Air Force pooh-poohed the threat of Iraqi “drone aircraft” that could zoom in through U.S. air defenses, spraying anthrax. The whole Niger episode read to the people who knew the most about Niger, uranium, Iraq, or all three like comic-opera Graham Greene. “There were so many ridiculous aspects to that story,” says one source familiar with the evidence. “Iraq already had five hundred tons of uranium. So why would they bother buying five hundred tons from a country in remote Africa? That would raise the profile in such a high way.”

  According to the investigative journalists David Corn and Michael Isikoff, one staffer for the Senate Foreign Relations Committee read the NIE for the first time and determined, “If anyone actually takes time to read this, they can’t believe there actually are major WMD programs.” The staffer needn’t have worried. Hardly anyone in the Congress read the NIE.

  Instead, two days later, the CIA released a white paper on the same subject. The white paper was produced with congressional lassitude in mind. It was easy to read. It had color maps and charts and it was printed on glossy paper. All the troublesome caveats in the NIE were gone. In their place were scary skull-and-crossbones logos indicating where the scary weapons were. The thing looked like a pesticide catalog. Seven months after the release of the NIE and the white paper, the United States launched the invasion and occupation of Iraq.

  Ever since, Pillar has written and spoken about the climate within which those two documents were produced—an environment in which expertise was devalued within government for the purpose of depriving expertise of a constituency outside government. “The global question of whether to do it at all,” he muses. “There never was a process that addressed that question. There was no meeting in the White House or in the Situation Room. There was no policy options paper that said, ‘Here are the pros and cons of invading. Here are the pros and cons of not invading.’ That never happened. Even today, with all the books that have been written, and with some great investigative reporting, you still can’t say, ‘Ah, this is where the decision was made.’”

  In the years since the war began, more than a few people have said that the invasion of Iraq was a foregone conclusion the moment that the Supreme Court ruled on the case of Bush v. Gore in 2000. The incoming administration was stacked to the gunwales with people who’d been agitating for over a decade to “finish” what they believed had been left undone at the end of the first Gulf War. Lost in the now endless postmortems of how the country got into Iraq as a response to the attacks of September 11 is the fact that the country had been set on automatic pilot years earlier.

  Madison warned at the outset how dangerous the war powers could be in the hands of an unleashed executive. “War,” he wrote in 1793, “is in fact the true nurse of executive aggrandizement.” As the years went by, and the power of the presidency grew within both government and popular culture, Madison looked even more prescient. Writing in the aftermath of the Lyndon Johnson presidency, which collapsed like a dead star from the pressure of an ill-conceived war, Johnson’s former press secretary George Reedy limned the perfect trap that a president can set for himself:

  “The environment of deference, approaching sycophancy, helps to foster an insidious belief: that the president and a few of his trusted advisors are possessed of a special knowledge that must be closely held within a small group lest the plans and designs of the United States be anticipated and frustrated by enemies.”

  Reedy cites the decisions that were made regarding the bombing of North Vietnam. As he concedes, “it is doubtful that a higher degree of intelligence could have been brought to bear on the problem;” the flaw lay in the fact that “none of these men [Johnson’s pro-bombing advisers] were put to the test of defending their position in public debate.” Even then, President Johnson solicited the opinion of Under Secretary of State George Ball, who thought the whole Vietnam adventure a bloody and futile waste. Johnson wanted Ball’s opinions even when those opinions sent him into paroxysms of rage. Pillar sees inevitable, if imperfect, parallels in the meetings he sat through during the period between September 11, 2001, and the invasion of Iraq. He recalls walking face-first into a foregone conclusion.

  “The only meetings and discussion were either, How do we go about this? or, most importantly, How do we sell this, and how do we get support for this?” he says. “It was a combination of a particular bunch of people who were really determined to do this thing, with a president who seized on the post-9/11 environment and thought, ‘Oh, I’m going to be the war president.’ That’s my thing, after sort of drifting theme-less for a while. There was a synergy there. They came into office with even greater contempt for the bureaucracy and for all the sources of expertise beyond what they considered their own.

  “I look at the work of the people who would have been my counterparts … during the Vietnam era, and I admire the courage of some of them. On the other hand, they had it a lot easier than people like me in Iraq, because they were asked and their opinions were welcome. In Iraq, our opinions were never asked. Your opinion clearly wasn’t welcome.”

  Elsewhere, the country had become accustomed to confronting self-government through what the historian Daniel Boorstin called “a world of pseudo-events and quasi-information, in which the air is saturated with statements that are neither true nor false, but merely credible.” The effect on the country’s leaders was that they began to believe their own nonsense. The effect on the country was that citizens recognized it as nonsense and believed it anyway. A culture of cynical innocence was born, aggrieved and noisy, nurtured by a media that put a premium on empty argument and Kabuki debate. Citizens were encouraged to deplore their government, ridicule its good intentions, and hold themselves proudly ignorant of its functions and its purposes. Having done so, they then insisted on an absolute right to wash their hands of the consequences.

  Cynics bore even themselves eventually. However, as a land of perpetual reinvention and of many frontiers, and founded on ideas and imagination, America had a solution within its genome. It could create fictions to replace the things from which cynicism had drained its faith. It could become a novelized nation.

  Novelizations are so preposterous an idea that they only could have been hatched as an art form here. They are based on the assumption that people will read a book that fills in the gaps of the screenplay of a movie they’ve already seen. A novelization is pure commerce, a salesman’s delight. Few writers brag about writing them; one online critic referred to them memorably as “flipping burgers in someone else’s universe.”

  The very first one, written by Russell Holman in 1928, was aimed at promoting a Clara Bow film called Follow the Fleet. Since then, science fiction fans have come to dote on them as treasure troves of previously unknown arcana, the movies themselves having spent little time on Han Solo’s childhood bout with Rigellian ringworm. The most successful of the genre was William Kotzwinkle’s best-selling rendition of Steven Spielberg’s E.T.: The Extra-Terrestrial, which sold more than a million copies on top of the tens of millions of people who saw the actual movie, but in which, as the film writer Grady Hendrix pointed out in a piece for Slate, Kotzwinkle grafted onto the story a genuinely creepy obsession on the part of the lovable little alien with the mother of the children who take him in. Some gaps are best left unfilled.

  As art, novelizations are almost completely worthless. As commerce, they make perfect sense. They are creatures of the First Great Premise, by which anything has value if it moves units. And their principles are ready to be applied to almost every endeavor in a country dedicated to using whatever raw material is at hand to create vast vistas of abject hooey.

  Once, when there were still actual frontiers, novelizing the country helped explain the new parts of the country to the old. Now, though, all frontiers in America are metaphorical, and the novelization of the country serves to give the national cynicism an America it can believe in. In this, the presidency came to represent a comforting counterfeit. If you sold a presidency well—and it was all about selling—the easy cynicism about “government” could be abandoned with respect to the president, who was the one part of “the government” over whom citizens seemed proud to claim common ownership.

  All the way back to Parson Weems, presidents have been in some way fictionalized, but the modern presidency now takes place in the place where art is defined almost solely by commerce, and a place where the president is the only fungible product. In a way that would have shaken Madison down to the buckles on his shoes, the presidency became the government’s great gravitational source, around which every other part of the political culture orbited, and it became the face of government in the popular culture.

  Actual presidents—and people who wanted to be the actual president—caught on quickly. The pursuit of the presidency is now a contest of narratives. Create your own and get it on the market fast, before someone—possibly your opponent, but probably the media—creates one for you. Poor Al Gore learned this lesson far too late. The successful narrative is judged only by how well it sells. Its essential truth becomes merely a byproduct. The Third Great Premise now dominates the marketplace of narratives, which is not necessarily the same as the marketplace of ideas. If enough people believe that Gore said he’d invented the Internet, or that George Bush is a cowboy, then those are facts, even though Gore never said it and Bush is afraid of horses. If people devoutly hate Gore for saying what he never said, or profoundly like Bush for being what he isn’t, then that becomes the truth.

  In 1960, Nixon had lost to the first thoroughly novelized presidency, that of John F. Kennedy. The New Frontier was a fairly conventional political narrative; nothing sells in America like the notion that we have to pick up ourselves and start anew. But, like William Kotzwinkle cobbling together E.T.’s libido, historians and journalists and other scribbling hangers-on fell all over themselves to fill in the elided details of the television photoplay. The idea of the cool and ironic Jack Kennedy, who used to run with the Rat Pack in Vegas, turning mushy over a piece of treacle like Camelot is on its face preposterous. But it sold well enough to define, in shorthand, everything from Pablo Casals playing in the East Room to the Cuban missile crisis, which was decidedly not a time for happy-ever-aftering.

  The apotheosis of the modern novelized presidency was that of Ronald Reagan. He and his people created a remarkable and invulnerable narrative around him, so complete and whole that it managed to survive, relatively intact, until Reagan’s death in 2004, when what was celebrated in lachrymose detail was not his actual biography but what had been created out of it over the previous forty years. To mention his first marriage, to Jane Wyman, during the obsequies was not merely in bad taste, but seemed irrelevant, as though it had happened to someone else besides the deceased.

  Reagan’s people maintained their basic story line even through the perilous comic opera of the Iran-Contra scandal. The country learned that Reagan had arranged to sell missiles to the people who sponsored anti-American terrorism in the Middle East, in order to finance pro-American terrorism in Latin America, and that on one occasion, he sent an important official to Teheran with a Bible and a cake. The country learned this without laughing its beloved, befuddled chief executive out of office. Ol’ Dutch, what a card.

  When Karl Rove (or whoever) talked to Ron Suskind about the contempt he felt for the “reality-based community,” and how his administration would create its own reality for the rest of us to study, he wasn’t saying anything groundbreaking. People in his job had been doing that for years. What he had was a monumental event to act upon. When September 11 happened, and it was clear that events moved whether people wanted them to or not, the country swung radically behind a president who, somehow, was not a part of “the government,” but a quasi-official king and father. It was said that irony died on September 11; but cynicism was what fell most loudly.

  Suddenly, “the government” was us again. Of course, “the government” largely was defined as the president, whom we were accustomed to treat as our common property. Dan Rather told David Letterman that he would “line up” wherever George Bush told him to line up. This attitude of wounded deference obtained for nearly three years. The Iraq war happened because the people who’d wanted it all along were uniquely positioned to create a narrative about why it should happen, and seized the right moment for its release date.

  In short, all the outside checks on what Paul Pillar saw within the government were gone. Events were becoming novelized, and the wrong people were filling in the elided details; the relationship between Al Qaeda and Iraq, which didn’t exist in fact, existed within the prevailing narrative. The Iraqi nuclear program was an established threat, as real as Jack Kennedy’s love for the scores of Lerner and Loewe. Public opinion, which Madison said “sets bounds for every government,” was in no condition to set any limits whatsoever. It needed a narrative, and the people who were selling the war gave the country what screenwriters call a through-line, from Ground Zero through Kabul to Baghdad.

  “We are talking here about national moods,” says Paul Pillar. “And, of course, 9/11 was the big event here in suddenly bringing about a change in the national mood. It became far more belligerent, far more inclined to strike out somewhere, and so it was the perfect environment for something like going to war on automatic pilot in Iraq to work. Politically, it wouldn’t have been possible without 9/11 at all.

  “We are talking about people who had a basis for thinking they were smarter than just about anyone else they met. So, sure, if we’re going to manipulate an issue like the weapons thing, or even distort things about the terrorist connections, if it helps bring about a result I believe with all my intellectual firepower is right for the country, then so be it. If there are a few misrepresentations along the line, that doesn’t matter.”

  “THE fact is,” Carl Ford, Jr., says, leaning across a round, cluttered table in an office in another part of Washington, “there were all kinds of opportunities to speak up. The fact was that—those people in CIA and in the DIA [Defense Intelligence Agency]? I didn’t hear them, and there were plenty of opportunities for my analysts who were out there among them, just among the leaders like I was, and they didn’t hear [the CIA and DIA people speak up], either. The fact was we felt like we were just spitting into the wind.

  “There were those like Paul Pillar who, if he talks about weapons of mass destruction, I said, ‘Paul, where were you?’ Because he was in a position where he could have spoken up. It’s not a case in which the intelligence community has a significant impact and they want to cry about the fact that policymakers don’t listen to them. They want to say, ‘Well, I tried and they wouldn’t let me do it.’ Bullshit. The fact is that they couldn’t convince anyone that they were right simply because they were smart. Because that’s not the way the world works.”

  Ford is an intelligence lifer. He’d worked at CIA and in the Pentagon. He was a good friend and a longtime admirer of Vice President Dick Cheney. As events moved toward war, Ford was working at the State Department as the director of its Bureau of Intelligence and Research (INR). A short and fiery ex-Marine who served in Vietnam, Ford had a reputation of being a very hard sell, the kind of person who flourished in cold-shower briefings like the ones Richard Clarke recalled from his time at State, in which people who brought in badly researched reports or half-baked proposals found themselves leaving the meeting room through a meat grinder. Ford felt this intellectual rigor reverse itself when it came to Iraq. The facts were whatever was malleable enough to fit into a salable narrative. The truth was sent through the meat grinder.

  “On the case of the internal Iraq issues,” recalls Ford, “the policymakers really didn’t listen at all. My point is that, if we had said, ‘There are no weapons of mass destruction,’ it might have slowed them down, but I don’t think it would have had any impact on most of the people who were deciding on the war. I think that it would have made it more difficult for them to sell that war. In fact, one of the things that disturbed me the most, that eventually led to my leaving, was the sort of view that, ‘Well, okay, but if we tell the people that, if we don’t focus on weapons of mass destruction, we might not be able to sell the war.’

  “That’s what a democracy is all about. You haven’t got the evidence, even if you passionately believe that they [the Iraqis] have them [WMDs], then it’s up to you to make that case. But there was a sense that they were so certain that it didn’t really matter.”

  In his job at INR, Ford was intimately involved with one of the crucial elided details in the narrative that was concocted to justify the invasion of Iraq. And because this detail fit so perfectly into the story that was being developed, all the people developing the story believed it—or so effectively pretended they did that the difference hardly mattered. It became the source of a series of the noisiest subplots of the ongoing narrative, and it was a moment of utter fiction, a passage of the purest novelization.

 
