Twilight of the American Century

by Andrew J. Bacevich


  The thicket of unreality that is American politics has now become all-enveloping. The problem is not Trump and Clinton, per se. It’s an identifiable set of arrangements—laws, habits, cultural predispositions—that have evolved over time and promoted the rot that now pervades American politics. As a direct consequence, the very concept of self-government is increasingly a fantasy, even if surprisingly few Americans seem to mind.

  At an earlier juncture back in 1956, out of a population of 168 million, we got Ike and Adlai. Today, with almost double the population, we get—well, we get what we’ve got. This does not represent progress. And don’t kid yourself that things really can’t get much worse. Unless Americans rouse themselves to act, count on it, they will.

  45

  War and Culture, American Style

  (2016)

  In the dispiriting summer of 1979, a beleaguered President Jimmy Carter tried to sell his fellow citizens on a radical proposition: having strayed from the path of righteousness, the nation was in dire need of moral and cultural repair.

  Carter’s pitch had a specific context: an “oil shock”—this one a product of the Iranian Revolution—had once more reminded Americans that their prevailing definition of the good life depended on the indulgence of others. The United States was running out of oil and was anxiously counting on others to provide it.

  Yet the problem at hand, Carter insisted, went far beyond “gasoline lines or energy shortages.” A “mistaken idea of freedom” had led too many Americans “to worship self-indulgence and consumption.” The nation therefore faced a fundamental choice. Down one path lay “fragmentation and self-interest,” pointing toward “constant conflict” and “ending in chaos and immobility.” Down the other lay a “path of common purpose and the restoration of American values.” By choosing rectitude over profligacy, the nation could save itself. Making the sacrifices needed to end their dependence on foreign oil would enable Americans to “seize control again of our common destiny.”

  Alas, members of the congregation weren’t buying what Pastor Carter was selling. They had no interest in getting by with less. Ronald Reagan, sunny where Carter was dour and widely expected to challenge the president for reelection, was offering an alternative view: for Americans, there is always more. Besides, austerity didn’t sound like much fun. Soon enough Carter himself got the message. In January 1980, he capitulated, declaring Persian Gulf oil a cause worth fighting for.

  The implications of the Carter Doctrine were not immediately apparent. Yet what unfolded over the course of subsequent decades was a vast military enterprise that today finds US forces engaged in something approximating permanent war, not only in the Persian Gulf but across large parts of the Islamic world.

  At odd intervals during this very long conflict, Carter’s theme of moral and cultural restoration resurfaced, albeit with a twist. Observers expressed hopes that war itself might somehow provide the instrument of national redemption.

  George W. Bush was eloquent on this point. In his 2002 State of the Union address, Bush depicted 9/11 itself as an occasion for cultural transformation. “This time of adversity,” he announced, offers “a moment we must seize to change our culture.” Indeed, the change was already happening. “After America was attacked, it was as if our entire country looked into a mirror and saw our better selves. We were reminded that we are citizens with obligations. . . . We began to think less of the goods we can accumulate and more about the good we can do.” For too long, Americans had adhered to the dictum “if it feels good, do it.” Now, even as Bush was urging his fellow citizens to shop and take vacations, he commended them for embracing “a new culture of responsibility.”

  This was mostly nonsense, of course. Just as we had ignored Carter’s critique of our “mistaken idea of freedom,” so too we passed on Bush’s “culture of responsibility.” To avoid getting sucked into the Middle East, Carter had admonished Americans to change their ways. The response: piss off. With the United States now wading into a Middle Eastern quagmire, Bush revived Carter’s call for a cultural Great Awakening. Although 9/11 briefly induced a mood of “United We Stand,” the invasion of Iraq, with all the mournful consequences that ensued, terminated that feel-good moment and demolished Bush’s standing as moral arbiter.

  In truth, the inclinations, habits, and mores that Carter bemoaned and that Bush fancied war might banish are immune to presidential authority. Presidents don’t control the culture; they cope with it. In times of war, they abide by what the culture permits and adhere to what it requires. Simply put, culture shapes the American way of war.

  Certainly this was the case during earlier conflicts in US history such as the Civil War and World War II. In each, a widely shared (if imperfect) collective culture lent the war effort an effectiveness that contributed directly to victory. From the very outset of the war that the United States has for decades waged in various parts of the Islamic world, just the reverse has been true. An absence of cultural solidarity has undermined military effectiveness.

  These days, American culture posits a minimalist definition of citizenship. It emphasizes choice rather than duty and self-gratification over sacrifice—except where sacrifice happens to accord with personal preference. Individuals enjoy wide latitude in defining the terms of their relationship to the state. Pay your taxes and obey the law; civic obligation extends that far and no further.

  So in conducting military campaigns in the Islamic world, presidents from Carter’s day to our own have asked little—indeed, next to nothing—from the vast majority of citizens. They are spectators rather than participants.

  The people find this arrangement amenable. On occasion, some exceptionally egregious calamity such as the Beirut bombing of 1983, the “Black Hawk down” debacle of 1993, or the botched occupation of Iraq following the invasion of 2003 may briefly command their attention. But in general, they tune out what they view as not their affair.

  As recently as the 1960s, antipathy toward a misguided and failing war generated mass protest. Today, instead of protest there is accommodation, with Americans remarkably untroubled by the inability of those presiding over the ebb and flow of military actions across the Greater Middle East to explain when, how, or even whether they will end.

  To be sure, even today we retain a residual capacity for outrage, as the Occupy Wall Street and Black Lives Matter movements have demonstrated. When the issue is inequality or discrimination based on race, gender, or sexuality, we still take to the streets. When it comes to war, however, not so much. The “peace movement,” to the extent that it can be said to exist, is anemic and almost entirely devoid of clout. Our politics allows no room for anything approximating an antiwar party. Instead, the tacit acceptance of war has become a distinguishing feature of the contemporary American scene.

  With The People opting out, the burden of actually conducting the various campaigns launched pursuant to the Carter Doctrine falls to those who willingly make themselves available to fight. We may compare these volunteers to fighter pilots during the Battle of Britain: they are the Few. The many have other options and act accordingly.

  Given the choice between a job in finance and the chance to carry an assault rifle, Harvard grads opt for Wall Street, destination of roughly one-third of graduating seniors in recent years. In 2015, by comparison, participants in Harvard’s annual military commissioning ceremony numbered exactly four. Offered the opportunity to sign with the pros or the Army, top athletes opt for the playing field rather than the battlefield. The countercultural Tillman Exception awaits replication.

  So a central task for field commanders has been to figure out how to fight wars that the political class deems necessary but to which the rest of us are largely indifferent. In 2007, Admiral Mike Mullen, the Joint Chiefs of Staff chairman, neatly summarized the problem. “In Afghanistan, we do what we can,” he remarked. “In Iraq, we do what we must.” Implicit in Mullen’s can/must formulation was the fact that in neither Afghanistan nor Iraq were commanders able to do what they wished. The constraint they labored under was not money or equipment, which were available and expended in prodigious quantities, but troops.

  We tend to rank those two conflicts among this nation’s “big wars.” In reality, except as measured by duration, both qualify as puny. The total number of troops committed to Operation Iraqi Freedom and Operation Enduring Freedom combined peaked at one-third the number of Americans serving in Vietnam in 1968. However much commanders in Iraq and Afghanistan might have wanted more troops—and they did—more were not forthcoming.

  For an explanation, look not to need but to availability. In truth, the pool of the willing is not deep. Sustaining what depth there is requires incentives. Since 9/11, new recruit pay has jumped by 50 percent. Reenlistment bonuses can run as high as $150,000. In a material culture, appeals to patriotism don’t suffice to elicit and retain volunteers. Inducing people to put their lives on the line requires upfront compensation. Even then, the Few remain few.

  In practice, roughly 1 percent of the population bears the burden of actually fighting our wars. A country that styles itself a democracy ought to find this troubling. Yet unlike the inequitable distribution of income, which generates considerable controversy, this inequitable distribution of sacrifice generates almost none. Even in a presidential election year, it finds no place on the nation’s political agenda. In the prevailing culture of choice, those choosing to remain on the sidelines are not to be held accountable for the fate that befalls those choosing to go fight.

  Yet while the availability of warriors may be limited, money is another matter. Ours is not a pay-as-you-go culture. It’s go now and worry about the bills later. So it has been with the funding of recent military operations. Rather than defraying war costs through increased taxation—thereby drawing public attention to the war’s progress or lack thereof—the government borrows, with the sums involved hardly trivial. Since 9/11 alone, the national debt has nearly quadrupled. By and large, Americans are OK with sloughing off onto future generations the responsibility of paying for wars presumably undertaken on their own behalf—a bit like buying a pricey car and sticking your grandkids with the payments.

  Granted, money partially offsets the shortage of troops. In Iraq and Afghanistan, the Pentagon found it expedient to contract out functions traditionally performed by soldiers. When each of those wars was at its height, contractors in the employ of profit-minded security firms outnumbered G.I.s. Privatizing war provides a workaround for a state with a large appetite for war and a people nursing appetites of a different sort.

  In some circles, as a sort of hangover from Vietnam, the belief persists that American culture, at least where professors and artsy types congregate, is intrinsically anti-military. Nothing could be further from the truth. American culture is decidedly pro-military. All it asks is that military institutions get in step with the culture’s core requirements.

  And so they have, becoming open to all as venues for individual self-actualization. The Pentagon has embraced diversity—the very signature of contemporary culture. In today’s military, overt racism has ceased to exist. Women wear four stars, fly fighter jets, graduate from Ranger school, and enlist as combat Marines. Barriers preventing members of the LGBT community from serving openly? On the way out.

  This is to the good. Yet where it really counts, our culturally compliant military falls short. Thrust into a series of wars to make good on the people’s expectations of more, while respecting their aversion to sacrifice, the Few—admirable in so many ways—find themselves unable either to win or to get out.

  Whether Carter’s “restoration of American values” or Bush’s “culture of responsibility” would find us today in a different place is a moot point. Our culture remains on a fixed trajectory.

  When it comes to making war, that culture hinders rather than helps. Rather than fighting the problem, policy makers should consider turning it to US advantage. Instead of promoting American-style freedom at the point of a bayonet, they should pursue alternatives to war. Instead of coercion, perhaps it’s time to try seduction.

  46

  Under God

  (2015)

  Whether Americans today sin more frequently or less frequently than they did when I was a boy growing up in the 1950s is difficult to say. Yet this much is certain: back then, on matters related to sex and family, a rough congruity existed between prevailing American cultural norms and the traditional teachings of the Catholic Church. Today that is no longer the case. If any doubts persisted on that point, the Supreme Court decision in the case of Obergefell v. Hodges, legalizing gay marriage, demolished them once and for all.

  During the early days of the Cold War, when it came to marriage, divorce, abortion, and sex out of wedlock—not to mention a shared antipathy for communism—Washington and Rome may not have been in lockstep, but they marched to pretty much the same tune. As for homosexuality, well, it ranked among those subjects consigned to the category of unmentionables.

  No doubt the hypocrisy quotient among those upholding various prohibitions on what Americans could do in the privacy of their own bedroom or the backseat of their father’s car was considerable. But for a nation locked in an existential struggle against a godless adversary, paying lip service to such norms was part of what it meant to be “under God.”

  Cold War–era sexual mores had implications for US foreign policy. Even if honored only in the breach, the prevailing code—sex consigned to monogamous heterosexual relationships sanctified by marriage—imparted legitimacy to the exercise of American power. In measured doses, self-restraint and self-denial offered indicators of collective moral fiber. By professing respect for God’s law, we positioned ourselves on his side. It followed that he was on ours. Here was American chosenness affirmed. Certainty that the United States enjoyed divine favor made it possible to excuse a multitude of transgressions committed in the name of defending a conception of freedom ostensibly mindful of God’s own strictures.

  The justices who voted in favor of gay marriage don’t care a lick about whether the United States is “under God” or not. On that score, however dubious their reading of the Constitution, they have accurately gauged the signs of the times. The people of “thou shall not” have long since become the people of “whatever,” with obligations deriving from moral tradition subordinated to claims of individual autonomy. That’s the way we like it. August members of the Supreme Court have now given their seal of approval.

  Their decision highlights just how attenuated the putative link between God’s law and American freedom has become. As a force in American politics, religion is in retreat. True, even today politicians adhere to rituals that retain religious overtones. For example, the National Prayer Breakfast that President Eisenhower instituted in 1953 remains a fixture on Washington’s calendar. Yet this is an exercise in nostalgia, devoid of substance. Some symbols matter—like displaying the Confederate battle flag, for example. Others—like emblazoning your currency with “In God We Trust”—don’t mean squat.

  On balance, we may judge this a clarifying moment. The irreversibility of the cultural revolution that has unfolded across the past half-century now becomes apparent to all. The trajectory of change that this revolution mandates will continue, even if its ultimate destination remains hidden. For some the transformation underway may be a source of regret, for others cause for celebration, but there is no denying its profound significance, reaching into every corner of American existence.

  A nation once purportedly “under God” has decisively rejected the hierarchical relationship that phrase implies. Those who interpret the nation’s laws have dropped all pretense of deferring to guidance from above. From here on out, we’ve got the green light to chart our own course.

  Again, however, there are likely to be foreign policy implications. For Americans, as the Obergefell case vividly testifies, freedom is not a fixed proposition. It evolves, expands, and becomes more inclusive, bringing freedom’s prevailing definition (at least in American eyes) ever closer to perfection.

  But a nation founded on universal claims—boldly enumerating rights with which “all men” are endowed—finds intolerable any conception of freedom that differs from its own. The ongoing evolution of American freedom creates expectations with which others are expected to comply.

  Recall that for the first two centuries of this nation’s existence, American diplomats were indifferent to discrimination against women. Then gender equality found a place on the American political agenda. Now the State Department maintains an Office of Global Women’s Issues devoted to “empowering women politically, socially, and economically around the world.”

  We should anticipate something similar occurring in relation to LGBT communities worldwide. Their plight, which is real, will necessarily emerge as a matter of official US concern. Today the United States condemns the racism, sexism, and anti-Semitism that Americans once found eminently tolerable. Tomorrow standing in principled opposition to anti-LGBT discrimination wherever it exists will become a moral imperative, Americans declaring themselves rid of sins they themselves had committed just yesterday.

  Whether or not US support for LGBT rights goes beyond the rhetorical, societies still viewing themselves as “under God” will bridle at this sudden turnabout. Especially in the Islamic world, demands to conform to the latest revision of American (and therefore universal) freedom will strike many as not only unwelcome but also unholy encroachments. Whether the upshot will contribute to the collective well-being of humankind or sow the seeds of further conflict remains to be seen. At least in the near term, the latter seems more likely than the former.
