Decision at Sea
The new template of warfare suggested by Gulf War I was that of a very precise, highly technical, and relatively bloodless (for Americans) application of force on behalf of a world consortium directed at aggressors and rogue nations. Indeed, the outcome surprised even some Pentagon planners. As one expert noted, the results of the war proved that “even the most ‘optimistic’ estimates turned out to be overly ‘pessimistic.’”9 Moreover, U.S. consultation and cooperation with the United Nations, as well as support from the nations most concerned, cast an international legitimacy over the effort that allowed the United States to emerge from the war as the world’s white knight. The very success of the war, however, provoked concern in some quarters that, if it wished, the United States could act unilaterally, and that if it chose to act alone, there was little the rest of the world could do about it.
Military supremacy proved to be no guarantor of security, as became evident on October 12, 2000. On that date, the Arleigh Burke–class guided-missile destroyer USS Cole (DDG-67) was in the port of Aden on the coast of Yemen making a routine refueling stop en route to the Persian Gulf. Seizing a target of opportunity, a handful of anti-American terrorists in a small rubber boat, not unlike those used by the Iranians in the Gulf, maneuvered their little craft alongside the Cole and detonated a bomb. The resulting explosion blew a giant hole in the side of the Cole, killed seventeen Americans, and wounded thirty-nine more. This was no accident like the attack on the Stark, nor was it random like the mining of the Roberts; it was a deliberate act of sabotage and terrorism conducted by men who were willing to give up their lives to hurt the United States. U.S. intelligence analysts attributed it to continuing resentment by Arabs, and especially Islamic fundamentalists, of America’s pro-Israel stance. President Bill Clinton called it “a despicable and cowardly act” and redoubled efforts both to find the presumed leader of the terrorists, the multimillionaire Osama bin Laden, and to broker a solution to the Israeli-Palestinian struggle.10
Much worse was to come. Less than a year later, in a coordinated assault on September 11, 2001, nineteen fanatical followers of bin Laden hijacked four commercial airliners, flew two of them into the twin towers of the World Trade Center in New York City, and smashed a third into the Pentagon. A fourth plane, very likely intended for the Capitol, crashed instead into the Pennsylvania countryside when the passengers attempted to retake the plane from the hijackers. Almost as much as the collapse of the Soviet Union, the terror attacks of 9/11 (as it came to be called) marked a historic milestone in American and world history.
The USS Cole in the port of Aden in Yemen shows the results of the suicide attack in October 2000 that killed 17 and wounded 39. (U.S. Navy)
The attack galvanized Americans. In many ways 9/11 was essentially an escalation, albeit a dramatic one, of the several previous attacks that had been aimed at American icons worldwide, from the 1983 bombing of the Marine barracks in Lebanon and the 1988 destruction of Pan American flight 103, which exploded over Lockerbie, Scotland, to the USS Cole. Nevertheless, for many Americans 9/11 marked a watershed. In part this was because of the sheer horror of the televised images of that day: the second plane crashing into the south tower of the Trade Center, the desperate men and women jumping from the upper floors, the dedication and heroism of the rescue teams, and the searing memory of the moment the buildings themselves collapsed in a giant cloud of dust and debris that roiled through the streets of Manhattan. It was a horror epic become real. But another factor in the American reaction was that unlike any of those other events, this terrorist act occurred on American soil. Americans perceived at once that their insulating oceans no longer protected them from a dangerous world.
Americans reacted as they always had to attacks on their country. Much like the response to the destruction of the USS Maine one hundred years earlier, or the Japanese attack on Pearl Harbor forty-three years after that, Americans after 9/11 were united in their desire for justice and revenge. The events of 9/11 provoked an outpouring of patriotic anger. American flags, often accompanied by the declaration “United We Stand,” blossomed on front porches, lapels, and car bumpers.
The news was a shock overseas as well. The immediate international response to the disaster was an outpouring of international sympathy for the United States. Le Monde, the leading French daily newspaper, carried a headline on September 13 that read “Nous Sommes Tous Américains” (we are all Americans).* President George W. Bush delivered a moving address to a joint session of Congress and rode the wave of American anger by promising retaliation against terrorists and a final victory in what soon came to be called a “war on terror.”11
The first American strike in that war was directed at the fundamentalist Islamic Taliban government in Afghanistan, a country that had provided terrorists with a safe haven and which served as a base of operations for bin Laden's followers.* Though the Soviets had struggled unsuccessfully for ten years to suppress the Afghans in the 1980s, the U.S. war effort there in 2001-2002 was a marvel of efficiency. As in Gulf War I, a principal reason for America's success was its ability to apply precision ordnance from the sky, and in fact the war in Afghanistan demonstrated a dramatic improvement, even a revolution, in the effectiveness of such ordnance. Whereas during Gulf War I the Navy had devoted ten aircraft to each target in order to ensure target destruction, in Afghanistan each aircraft was assigned two targets. The Navy's EA-6B Prowlers jammed Taliban communications, F-14 Tomcats passed global positioning satellite information to the attack aircraft, and Navy and Air Force attack planes so dominated the skies above the battlefield that even though Americans and their Afghan allies on the ground were outnumbered by as much as two to one, they soon overran the country. As Vice Admiral Mike Mullen noted, "For thousands of years the conventional wisdom has required a five to one advantage of offense to defense, [but] we rewrote the rule book in Afghanistan." The campaign ended so quickly that it was over before the United States had time to negotiate basing rights in neighboring countries, rights it ended up not needing.12
Having disposed of the Taliban in Afghanistan, the George W. Bush administration then turned its attention to Iraq.
More than Gulf War I, which was a response to a specific provocation, the American invasion of Iraq in 2003 marked another sea change for the United States and its military. Earnest Will, Praying Mantis, and Desert Storm had all demonstrated America's willingness to act as a global policeman, but Gulf War II (officially Operation Iraqi Freedom) was a dramatic escalation, even a redefinition, of that role. Despite attempts by administration spokesmen to imply a connection between the events of 9/11 and the government of Iraq, the only suggestions of such a link were Iraq's status as one of only three countries (the others being China and Libya) that refused to lower its flag to half mast after the 9/11 disaster, and the unconfirmed testimony of the expatriate Iraqi Ahmed Chalabi (who had motives of his own and whose claims later proved unreliable) that there were terrorist training camps in Iraq. Nevertheless, the events of 9/11 created a new public climate in the United States that allowed the George W. Bush administration to propose and execute a dramatically new doctrine for the exercise of American military power abroad.*
Though Gulf War I had been a model of international cooperation, a segment of the American defense establishment had argued, even at the time, that international cooperation was vastly overrated. Some members of the Reagan and first Bush administrations were uncomfortable with cooperative efforts under the aegis of either the United Nations or NATO, preferring when possible to go it alone so as not to have to convince reluctant or skeptical allies to accept the American vision of what needed to be done. Long before the 9/11 disaster, and even before the events of Praying Mantis, some members of the Reagan administration were expressing the view that “foreign alliances may hinder America’s global might.”13
More than any other single individual, Paul D. Wolfowitz was the architect of this view. The number three man in the Department of Defense at the end of the first Gulf War, Wolfowitz was a Ph.D. out of the University of Chicago who had taught briefly at both Yale and Johns Hopkins. Some administration officials called him the "resident egghead." In 1992, in the wake of Gulf War I, Wolfowitz offered a position paper in which he addressed the role the United States should play in the "new world order." Wolfowitz discounted the value of collective security through the United Nations and argued that as the world's only superpower, the United States should assume unilateral responsibility for the maintenance of world stability. In September 2002, ten years after Wolfowitz first proposed it, the United States publicly announced a new national security policy that embraced his views: "Our forces will be strong enough to dissuade potential adversaries from pursuing a military build-up in hopes of surpassing, or equaling, the power of the United States." The United States thus not only acknowledged its global military supremacy but also asserted its determination to maintain that supremacy. By implication, at least, this meant that it was no longer necessary for the United States to compromise, or for that matter even to negotiate, with its rivals or its allies.* If another power demonstrated either the willingness or the capability to challenge American supremacy, the United States would be justified in striking first in order to maintain its global domination. It was, in short, a doctrine of preemptive or preventive war.14
Liberal critics of the second Bush administration pointed out that the United States had never before espoused such a policy. "This goes much further than the notion of America as the policeman of the world," Hendrik Hertzberg wrote in the New Yorker. "It's the notion of America as both the policeman and the legislator of the world." The historian Arthur Schlesinger, Jr. noted that the United States was espousing a policy of "anticipatory self-defense" that was disturbingly similar to the one advanced by Japan as a justification for its attack on Pearl Harbor. And the liberal gadfly Noam Chomsky stated bluntly that the new policy was an "imperial strategy" that suggested a "quest for global dominance."15
Dismissing such criticism, the George W. Bush administration determined soon after the 9/11 disaster to go to war against Iraq for the second time in a dozen years. Though the administration tried hard to connect Iraq and Saddam Hussein to the events of 9/11, three other factors actually led the United States to war in 2003. The first was that Saddam Hussein represented unfinished business. Despite the lopsided American victory in Gulf War I, the first President Bush had ended hostilities after the headlong retreat of the Iraqi army from Kuwait in 1991. Fulfilling a pledge he had made to the Arab states participating in the coalition, the elder Bush announced that Kuwait was liberated, and since that had been the original object of the war, he declared hostilities at an end. He very likely expected that Hussein’s humiliation would lead to his collapse in any case, and in pursuit of that, American operatives encouraged the Shiite Muslims in southern Iraq and the Kurds in the north to rise against him. But when Hussein’s armed forces ruthlessly crushed both uprisings, the United States did not interfere, and Saddam Hussein survived. Consequently, some members of the second Bush administration were urging a renewal of military operations in order to bring about a “regime change” in Iraq even before the events of 9/11.16
The second reason for an American invasion of Iraq, and the one that commanded most of the prewar (and postwar) dialogue, was American insistence that Saddam Hussein possessed "weapons of mass destruction"—defined as nuclear, chemical, or biological weapons. If so, not only would this justify an attack under the Wolfowitz Doctrine of preempting any challenge to American supremacy, it would also constitute a breach of the agreement Hussein had been forced to accept at the end of the 1991 war; his possession of such weapons would have amounted to a kind of international parole violation. Though United Nations inspection teams had found no such weapons, Hussein was spectacularly (and stupidly) uncooperative with the inspectors, and Bush administration spokesmen declared repeatedly and categorically that the United States had proof that such weapons existed. The remarkable efficiency of U.S. electronic gadgetry demonstrated in Gulf War I and during the Afghan War led most Americans to conclude that the United States had special capabilities for detecting such weapons—capabilities that the United Nations inspectors lacked—and they were willing to accord their president the benefit of any doubt. Bush made a connection between Iraq's purported weapons and the "war on terror" by raising the specter that Hussein might make some of these weapons available to terrorists.
Months after the war was officially declared over and it became evident that Saddam Hussein had no connection with the 9/11 attacks and that Iraq did not, in fact, have any weapons of mass destruction, a third justification was advanced for the war. Regime change in Iraq had been necessary, the Bush administration argued, because Saddam Hussein was an evil man. He had terrorized his own people, killed thousands of Iraqi Kurds during the war with Iran, and was a ruthless dictator. This fact alone, the United States now declared, was sufficient justification for the war to depose Saddam Hussein. It was a claim reminiscent of the U.S. argument in 1898 that even if the Spanish had not blown up the Maine, their misrule in Cuba was sufficient grounds for a war to liberate the Cuban people. Whatever the motives, the Bush administration swept aside opposition from NATO allies and expressions of concern from the United Nations and took the United States to war in Iraq in the spring of 2003.
Once again, the star of the war was the suite of sophisticated weapons delivered primarily by air in a blitz described by Secretary of Defense Donald Rumsfeld as a campaign of “shock and awe.” Inspired, perhaps, by the ease of American success in Afghanistan, the administration planned a more streamlined war in 2003 than the one in 1991. Some hoped that the application of American air-launched weapons would cause the Iraqi regime to collapse almost at once. Impressed by the assurances of expatriate Iraqis, Vice President Cheney and others in the administration insisted that all the United States had to do was kick in the door and the whole rotten structure would come crashing down.17
It was not quite as easy as that. Though badly overmatched—and completely unable to respond to American air superiority—the Iraqis demonstrated surprising tenacity in the ground fighting, especially in southern Iraq around Basra. Nevertheless, U.S. and British forces soon overwhelmed the Iraqi army, and U.S. forces entered Baghdad on April 7, 2003.
But just as Dewey’s apparently easy victory in Manila Bay had been followed by a long and bloody war in the Philippines, the swift march to Baghdad was followed by a long and wasting guerilla campaign, especially in the so-called Sunni triangle in central Iraq. In this second Gulf War, the United States abandoned the Powell Doctrine of overwhelming force and instead sought to win the war by a surgical application of precise munitions and a minimum number of ground troops in accordance with Secretary Rumsfeld’s view that the United States could do more with less. In the aftermath of the initial victory, however, it soon became evident that U.S. planners were completely unprepared to fill the vacuum caused by the collapse of Iraq’s government, and much of the country descended into chaos. Resistance continued after Hussein himself was captured, pulled ignominiously from a “spider hole” near Baghdad in December 2003; even after the handover of official sovereignty to Iraqis in June 2004, sporadic and deadly attacks on American forces continued.18
The cost of the war multiplied. As in the Philippines a hundred years earlier, the United States became involved not only in an ugly guerilla-style war but also in the burdens of a lengthy "nation-building" campaign. After the fall of Baghdad and the formal declaration that "major combat operations" had ended, Bush asked Congress for a budget supplement of $87 billion to rebuild the Iraqi economy, and another $82 billion in February 2005. In addition, the continued violence strained the capacity of the American armed forces, compelling the United States to call up an unprecedented number of reservists and, in what was called a "stop-loss" order, require soldiers whose terms of service had expired to stay in Iraq beyond that term. Finally, American unilateralism in Iraq cost it the goodwill of much of the rest of the world. By choosing to prosecute the war despite widespread world opposition, the United States demonstrated its resolve, but it also weakened the Cold War–era alliances—NATO in particular—that had helped secure victory over the Soviet Union. Public opinion polls outside the United States showed widespread hostility toward the United States. When photos were published depicting the mistreatment of Iraqi prisoners by American forces in the notorious Abu Ghraib prison in Baghdad, some Iraqis declared angrily, if hyperbolically, that U.S. occupation was no better than Saddam's tyranny. It was evident that superpower status and a policy of preemptive war came with a price, both tangible and intangible.
It remains to be seen which of the two Gulf Wars will serve as a model for future U.S. policy: the international cooperation of Gulf War I or the unilateralism of Gulf War II. A “war on terror” has no geographical boundaries or chronological limits, and the long-term cost of sustaining an open-ended and unilateral commitment may in the end prove unsustainable. Indeed, almost as soon as the active phase of the war was officially over, the George W. Bush administration began efforts to persuade as many countries as possible to share in the burden of the occupation, both financially and with ground troops, in part perhaps to demonstrate its willingness to revive a sense of international partnership, but also in belated recognition of the enormous costs—financial and political—of sustaining a lengthy American occupation.