
Twilight of the American Century


by Andrew J. Bacevich


  During the 1980s and 1990s, this combustible mix produced a shift in the US strategic center of gravity, overturning geopolitical priorities that had long appeared sacrosanct. A set of revised strategic priorities emerged, centered geographically in the energy-rich Persian Gulf but linked inextricably to the assumed prerequisites for sustaining American freedom at home. A succession of administrations, Republican and Democratic, opted for armed force as the preferred means to satisfy those new priorities. In other words, a new set of strategic imperatives, seemingly conducive to a military solution, and a predisposition toward militarism together produced the full-blown militarization of US policy so much in evidence since 9/11.

  The convergence between preconditions and interests suggests an altogether different definition of World War IV—a war that did not begin on 9/11, does not have as its founding purpose the elimination of terror, and does not cast the United States as an innocent party. This alternative conception of a fourth world war constitutes not a persuasive rationale for the exercise of US military power in the manner pursued by the administration of George W. Bush, but the definitive expression of the dangers posed by the new American militarism. Waiting in the wings are World Wars V and VI, to be justified, inevitably, by the ostensible demands of freedom.

  Providing a true account of World War IV requires that it first be placed in its correct relationship to World War III, the Cold War. As the great competition between the United States and the Soviet Union slips further into the past, scholars work their way toward an ever more fine-grained interpretation of its origins, conduct, and implications. Yet as far as public perceptions of the Cold War are concerned, these scholars’ diligence goes largely unrewarded. When it comes to making sense of recent history, the American people, encouraged by their political leaders, have shown a demonstrable preference for clarity rather than nuance. Even as the central events of the Cold War recede into the distance, the popular image of the larger drama in which these events figured paradoxically sharpens.

  “Cold War” serves as a sort of self-explanatory, all-purpose label, encompassing the entire period from the mid-1940s through the late 1980s. And since what is past is prologue, this self-contained, internally coherent, authoritative rendering of the recent past is ideally suited to serve as a template for making sense of events unfolding before our eyes.

  From a vantage point midway through the first decade of the twenty-first century, the commonly accepted metanarrative of our time consists of three distinct chapters. The first, beginning where World War II leaves off, recounts a period of trial and tribulation lasting several decades but ending in an unambiguous triumph for the United States. The next describes a short-lived “post–Cold War era,” a brief, dreamy interlude abruptly terminated by 9/11. The second chapter gives way to a third, still in the process of being written but expected to replicate in broad outlines the first—if only the United States will once again rise to the occasion. This three-part narrative possesses the virtues of simplicity and neatness, but it is fundamentally flawed. Perhaps worst of all, it does not alert Americans to the full dimensions of their present-day predicament. Instead, the narrative deceives them. It would be far more useful to admit to a different and messier parsing of the recent past.

  For starters, we should recognize that, far from being a unitary event, the Cold War occurred in two distinct phases. The first, defined as the period of Soviet-American competition that could have produced an actual World War III, essentially ended by 1963. In 1961, by acquiescing in the erection of the Berlin Wall, Washington affirmed its acceptance of a divided Europe. In 1962, during the Cuban Missile Crisis, Washington and Moscow contemplated the real prospect of mutual annihilation, blinked more or less simultaneously, and tacitly agreed to preclude any recurrence of that frightening moment. A more predictable, more stable relationship ensued, incorporating a certain amount of ritualistic saber rattling but characterized by careful adherence to a well-established set of routines and procedures.

  Out of stability came opportunities for massive stupidity. During the Cold War’s second phase, from 1963 to 1989, both the major protagonists availed themselves of these opportunities by pursuing inane adventures on the periphery. In the 1960s, of course, Americans plunged into Vietnam, with catastrophic results. Beginning in 1979, the Soviets impaled themselves on Afghanistan, with results that proved altogether fatal. Whereas the inherent resilience of democratic capitalism enabled the United States to repair the wounds it had inflicted on itself, the Soviet political economy lacked recuperative powers. During the course of the 1980s, an already ailing Soviet empire became sick unto death.

  The crucial developments hastening the demise of the Soviet empire emerged from within. When the whole ramshackle structure came tumbling down, Andrei Sakharov, Václav Havel, and Karol Wojtyla, the Polish prelate who became Pope John Paul II, could claim as much credit for the result as Ronald Reagan, if not more. The most persuasive explanation for the final outcome of the Cold War is to be found in Soviet ineptitude, in the internal contradictions of the Soviet system, and in the courage of the dissidents who dared to challenge Soviet authority.

  In this telling of the tale, the Cold War remains a drama of compelling moral significance. But shorn of its triumphal trappings, the tale has next to nothing to say about the present-day state of world affairs. In a post-9/11 world, it possesses little capacity either to illuminate or to instruct. To find in the recent past an explanation of use to the present requires an altogether different narrative, one that resurrects the largely forgotten or ignored story of America’s use of military power for purposes unrelated to the Soviet-American rivalry.

  The fact is that, even as the Cold War was slowly reaching its denouement, World War IV was already under way—indeed, had begun two full decades before September 2001. So World Wars III and IV consist of parallel rather than sequential episodes. They evolved more or less in tandem, with the former overlaid on, and therefore obscuring, the latter.

  The real World War IV began in 1980, and Jimmy Carter, of all people, declared it. To be sure, Carter acted only under extreme duress, prompted by the irrevocable collapse of a policy to which he and his seven immediate predecessors had adhered—specifically, the arrangements designed to guarantee the United States a privileged position in the Persian Gulf. For Cold War–era US policymakers, preoccupied with Europe and East Asia as the main theaters of action, the Persian Gulf had figured as something of a sideshow before 1980. Jimmy Carter changed all that, thrusting it into the uppermost tier of US geopolitical priorities.

  From 1945 through 1979, the aim of US policy in the gulf region had been to ensure stability and American access, but to do so in a way that minimized overt US military involvement. Franklin Roosevelt had laid down the basic lines of this policy in February 1945 at a now-famous meeting with King Abd al-Aziz Ibn Saud of Saudi Arabia. Henceforth, Saudi Arabia could count on the United States to guarantee its security, and the United States could count on Saudi Arabia to provide it preferential treatment in exploiting the kingdom’s vast, untapped reserves of oil.

  From the 1940s through the 1970s, US strategy in the Middle East adhered to the military principle known as economy of force. Rather than establish a large presence in the region, Roosevelt’s successors sought to achieve their objectives in ways that entailed a minimal expenditure of American resources and, especially, US military power. From time to time, when absolutely necessary, Washington might organize a brief show of force—in 1946, for example, when Harry Truman ordered the USS Missouri to the eastern Mediterranean to warn the Soviets to cease meddling in Turkey, or in 1958, when Dwight Eisenhower sent US Marines into Lebanon for a short-lived, bloodless occupation—but these modest gestures proved the exception rather than the rule.

  The clear preference was for a low profile and a hidden hand. Although by no means averse to engineering “regime change” when necessary, the United States preferred covert action to the direct use of force. To police the region, Washington looked to surrogates—British imperial forces through the 1960s, and, once Britain withdrew from “east of Suez,” the shah of Iran. To build up the indigenous self-defense (or regime defense) capabilities of select nations, it arranged for private contractors to provide weapons, training, and advice. The Vinnell Corporation’s ongoing “modernization” of the Saudi Arabian National Guard (SANG), a project now well over a quarter-century old, remains a prime example.

  By the end of 1979, however, two events had left this approach in a shambles. The first was the Iranian Revolution, which sent the shah into exile and installed in Tehran an Islamist regime adamantly hostile to the United States. The second was the Soviet invasion of Afghanistan, which put the Red Army in a position where it appeared to pose a direct threat to the entire Persian Gulf—and hence to the West’s oil supply.

  Faced with these twin crises, Jimmy Carter concluded that treating the Middle East as a secondary theater, ancillary to the Cold War, no longer made sense. A great contest for control of the region had been joined. Rejecting out of hand any possibility that the United States might accommodate itself to the changes afoot in the Persian Gulf, Carter claimed for the United States a central role in determining exactly what those changes would be. In January 1980, to forestall any further deterioration of the US position in the gulf, he threw the weight of American military power into the balance. In his State of the Union address, the president enunciated what became known as the Carter Doctrine. “An attempt by any outside force to gain control of the Persian Gulf region,” he declared, “will be regarded as an assault on the vital interests of the United States of America, and such an assault will be repelled by any means necessary, including military force.”

  From Carter’s time down to the present day, the doctrine bearing his name has remained sacrosanct. As a consequence, each of Carter’s successors has expanded the level of US military involvement and operations in the region. Even today, American political leaders cling to the belief that skillful application of military power will enable the United States to decide the fate not simply of the Persian Gulf proper but of the entire greater Middle East. This gigantic project, begun in 1980 and now well into its third decade, is the true World War IV.

  What prompted Jimmy Carter, the least warlike of all recent US presidents, to take this portentous step? The Pentagon’s first Persian Gulf commander, Lieutenant General Robert Kingston, offered a simple answer when he said that his basic mission was “to assure the unimpeded flow of oil from the Arabian Gulf.” But General Kingston was selling his president and his country short. What was true of the three other presidents who had committed the United States to world wars—Woodrow Wilson, FDR, and Truman—remained true in the case of Carter and World War IV as well. The overarching motive for action was preservation of the American way of life.

  By the beginning of 1980, a chastened Jimmy Carter had learned a hard lesson: It was not the prospect of making do with less that sustained American-style liberal democracy, but the promise of more. Carter had come to realize that what Americans demanded from their government was freedom, defined as more choice, more opportunity, and, above all, greater abundance, measured in material terms. That abundance depended on assured access to cheap oil—and lots of it.

  In enunciating the Carter Doctrine, the president was reversing course, effectively renouncing his prior vision of a less materialistic, more self-reliant democracy. Just six months earlier, this vision had been the theme of a prescient, but politically misconceived, address to the nation, the “Crisis of Confidence” speech, which pundits instantly dubbed the “malaise” speech, though, in retrospect, it might better be called “The Road Not Taken.”

  Carter’s short-lived vision emerged from a troubled context. By the third year of his presidency, economic conditions as measured by postwar standards had become dire. The rates of inflation and unemployment were both high. The prime lending rate was 15 percent and rising. Trends in both the federal deficit and the trade balance were sharply negative. Conventional analysis attributed US economic woes to the nation’s growing dependence on increasingly expensive foreign oil.

  In July 1979, Carter already anticipated that a continuing and unchecked thirst for imported oil was sure to distort US strategic priorities, with unforeseen but adverse consequences. (When Carter spoke, the United States was importing approximately 43 percent of its annual oil requirement; today it imports 56 percent.) He feared the impact of that distortion on an American democracy still reeling from the effects of the 1960s. So on July 15 he summoned his fellow citizens to change course, to choose self-sufficiency and self-reliance—and therefore true independence. But the independence was to come at the cost of collective sacrifice and lowered expectations.

  Carter spoke that night of a nation facing problems “deeper than gasoline lines or energy shortages, deeper even than inflation or depression.” The fundamental issue, in Carter’s view, was that Americans had turned away from all that really mattered. In a nation once proud of hard work among strong, religious families and close-knit communities, too many Americans had come to worship self-indulgence and consumption. What you owned rather than what you did had come to define human identity. But according to Carter, owning things and consuming things did not satisfy our longing for meaning. Americans were learning that piling up goods could not fill the emptiness of lives devoid of real purpose.

  This moral crisis had brought the United States to a historic turning point. Either Americans could persist in pursuing “a mistaken idea of freedom” based on “fragmentation and self-interest” and inevitably “ending in chaos and immobility,” or they could opt for “true freedom,” which Carter described as “the path of common purpose and the restoration of American values.”

  How the United States chose to deal with its growing reliance on foreign oil would determine which of the two paths it followed. Energy dependence, according to the president, posed “a clear and present danger” to the nation, threatening its security as well as its economic well-being. Dealing with this threat was “the standard around which we can rally.” “On the battlefield of energy,” declared Carter, “we can seize control again of our common destiny.”

  How to achieve this aim? In part, by restricting oil imports, investing in alternative sources, limiting the use of oil by the nation’s utilities, and promoting public transportation. But Carter placed the larger burden squarely in the lap of the American people. The hollowing out of American democracy required a genuinely democratic response. “There is simply no way to avoid sacrifice,” he insisted, calling on citizens as “an act of patriotism” to lower thermostats, observe the highway speed limit, use carpools, and “park your car one extra day per week.”

  Although Carter’s stance was relentlessly inward-looking, his analysis had important strategic implications. To the extent that “foreign oil” refers implicitly to the Persian Gulf—as it did then and does today—Carter was in essence proposing to annul the growing strategic importance attributed to that region. He sensed intuitively that a failure to reverse the nation’s energy dependence was sure to draw the United States ever more deeply into the vortex of Persian Gulf politics, which, at best, would distract attention from the internal crisis that was his central concern, and was even more likely to exacerbate it.

  But if Carter was prophetic when it came to the strategic implications of growing US energy dependence, his policy prescription reflected a fundamental misreading of his fellow countrymen. Indeed, as Garry Wills has observed, given the country’s propensity to define itself in terms of growth, Carter’s prescription triggered “a subtle panic [and] claustrophobia” that his political adversaries wasted no time in exploiting. By January 1980, it had become evident that any program summoning Americans to make do with less was a political nonstarter. The president accepted this verdict. The promulgation of the Carter Doctrine signaled his capitulation.

  Carter’s about-face did not achieve its intended political purpose of preserving his hold on the White House—Ronald Reagan had already tagged Carter as a pessimist, whose temperament was at odds with that of the rest of the country—but it did set in motion a huge shift in US military policy, the implications of which gradually appeared over the course of the next two decades. Critics might cavil that the militarization of US policy in the Persian Gulf amounted to a devil’s bargain, trading blood for oil. Carter saw things differently. On the surface the exchange might entail blood for oil, but beneath the surface the aim was to guarantee the ever-increasing affluence that underwrites the modern American conception of liberty. Without exception, every one of Carter’s successors has tacitly endorsed this formulation. Although the result was not fully apparent until the 1990s, changes in US military posture and priorities gradually converted the Persian Gulf into the epicenter of American grand strategy and World War IV’s principal theater of operations.

  “Even if there were no Soviet Union,” wrote the authors of NSC-68, the spring 1950 US National Security Council document that became the definitive statement of America’s Cold War grand strategy, “we would face the great problem of the free society, accentuated many fold in this industrial age, of reconciling order, security, the need for participation, with the requirement of freedom. We would face the fact that in a shrinking world the absence of order among nations is becoming less and less tolerable.” Some three decades later, with the Soviet Union headed toward oblivion, the great problem of the free society to which NSC-68 alluded had become, if anything, more acute. But conceiving the principles to guide US policy turned out to be a more daunting proposition in World War IV than it had been during any of the three previous world wars. Throughout the 1980s and 1990s, US policymakers grappled with this challenge, reacting to crises as they occurred and then insisting after the fact that their actions conformed to some larger design. In fact, only after 9/11 did a fully articulated grand strategy take shape. George W. Bush saw the antidote to intolerable disorder as the transformation of the greater Middle East through the sustained use of military power.

 
