At the time of writing, the shortage of military personnel in Iraq was acknowledged by nearly every informed observer outside the Office of the Secretary of Defense. Of the army’s thirty-three front-line brigades, sixteen were in Iraq in September 2003; by the end of the year, active duty force levels had been increased by 33,000, and 165,000 members of the National Guard and Reserve had been called up, a substantial number of whom went to Iraq. Even with the support of other countries, however, a total U.S. presence of around 120,000 was not sufficient to impose order on the country.30 The crisis was such that the administration was forced to swallow its pride and seek foreign reinforcements—even from the very countries that had opposed the war at the outset.31 This can be seen as a direct consequence of the sustained contraction in the size of the American armed forces since the early 1970s (when the total number of active service personnel peaked at 3 million, compared with under 1.4 million today). True, the United States in 2002 had around the same number of service personnel overseas as the United Kingdom did back in 1881, just over a quarter of a million in each case.32 But there the resemblance ends. In those days, less than a third of Britain’s total armed forces were stationed in the United Kingdom itself. By contrast, more than four-fifths—82 percent—of Americans on active military duty are based in the United States.33 Even the B-2 stealth bombers that pounded Serbia into quitting Kosovo in 1999 were flying out of Knob Noster, Missouri. It is also striking that when American service personnel are posted abroad they generally do not stay for very long. The introduction of yearlong tours of duty in Iraq marks a break with the system of minimal overseas stints introduced thirty years ago after Vietnam.
Twelve months, to be sure, is longer than the average duration of a foreign trip by a Wall Street investment banker, which can be measured in days, but it is scarcely long enough to acquire much local knowledge. In any case, it is worth remembering that more than half of America’s seventy-three major overseas bases are in Western Europe, and no fewer than twenty-five of them in Germany, near towns like Heidelberg and Kaiserslautern, where living standards are higher than in some American states.34 Unlike the British, who built barracks in hostile territories precisely in order to subjugate them, the Americans today locate a quarter of their overseas troops in what is one of the most prosperous and arguably one of the most pacifist countries in the world. (Significantly, when the Pentagon detects serious local hostility to one of its overseas outposts, as in the case of Subic Bay in the Philippines, the base is hastily shut down.)
The problem of manpower is not purely military, however. Unlike the United Kingdom a century ago, the United States is an importer of people, with a net immigration rate of 3 per 1,000 and a total foreign-born population of 32 million (nearly 1 in 9 U.S. residents).35 Moreover, when Americans do opt to reside abroad, they tend to stick to the developed world. There are an estimated 3.8 million Americans currently resident abroad. That sounds like a great many, but it is just one eighth of the number of foreign-born residents of the United States. And of the expatriate Americans, more than three-quarters live in the two next-door countries (1 million in Mexico, 687,000 in Canada) or in Europe (just over 1 million). Of the 290,000 who live in the Middle East, nearly two-thirds are to be found in Israel. A mere 37,500 live in Africa.36 This, in other words, is an empire without settlers, or rather the settlers come to the metropolis rather than leave it for distant lands. How far it is possible to exert power outside a country’s borders by drawing foreigners inside those borders is debatable, to say the least. It can be argued that luring foreign elites to study at America’s universities is a kind of indirect rule, in the sense that it involves a form of collaboration and cooptation, not to say acculturation, of indigenous elites. Much, however, depends on how long these foreign students stay in the United States. Since quite a large proportion of them never return to their native lands, it is not clear how much influence is in fact thereby exerted.37
A further important contrast with the British experience is that the products of America’s elite educational institutions seem especially reluctant to go overseas, other than on flying visits and holidays. The Americans who serve the longest tours of duty are the volunteer soldiers, a substantial proportion of whom are African-Americans (12.7 percent of the U.S. population, 28.9 percent of Army enlisted personnel).38 Hence Timothy Garton Ash’s pun on Kipling when he visited Kosovo after the 1999 war: here (as in Vietnam) “the white man’s burden” was visibly being borne by a disproportionate number of black men.39 It is of course just possible that the African-Americans will turn out to be the Celts of the American empire, driven to overseas adventure by comparatively poor opportunities at home, just as the Irish and the Scots were in the nineteenth century. Indeed, if the occupation of Iraq is to be continued for any length of time, it can hardly fail to create career opportunities for the growing number of African-American officers in the army. The Central Command’s most effective press spokesman during the war, General Vincent K. Brooks, exemplifies the type.
The British, however, were always wary about giving the military too much power in their imperial administration. Parliamentarians at Westminster had read enough Roman history to want to keep generals subordinate to civilian governors. The “brass hats” were there to inflict the Victorian equivalent of “shock and awe” whenever the natives grew restive; otherwise, colonial government was a matter for Oxbridge-educated mandarins. It would be interesting to know, by way of comparison, how many members of Harvard’s or Yale’s class of 2004 are seriously considering careers in the postwar administration of Iraq. The number is likely to be small. In 1998–99 there were 43,683 undergraduate course registrations at Yale, of which just 335 (less than 1 percent) were for courses in Near Eastern languages and civilization. There was just one, lone undergraduate majoring in the subject (compared with 17 doing film studies).40 After graduation too the members of America’s academic elite generally subscribe to the Wizard of Oz principle: “There’s no place like home.” According to a 1998 survey, there are currently 134,798 registered Yale alumni. Of these, little more than 5 percent live outside the United States. Scarcely any, just over 50, live in Arab countries.41 At Oxford and Cambridge a hundred years ago ambitious students dreamed of passing the ICS exam and embarking on careers as imperial proconsuls. Today the elite products of the Ivy League set their sights on law school or business school; their dream is by definition an American dream. This, then, is not only an empire without settlers, but also an empire without administrators. Though L. Paul Bremer was himself an experienced diplomat whose past postings ranged from Afghanistan to Malawi, he and his staff were manifestly short of Middle Eastern expertise. It is a sobering statistic that just 3 of his initial team of officials were fluent in Arabic.42
It may be that the bolder products of Harvard’s Kennedy School are eager to advise the Iraqi Governing Council on its constitutional options. And a few of the country’s star economists may yearn to do for Iraq what they did for post–Soviet Russia back in the early 1990s. But we may be fairly certain that their engagement will take the form of a series of weeklong trips rather than long-term residence: consultancy, not colonization. As far as the Ivy League nation builders are concerned, you can set up an independent central bank, reform the tax code, liberalize prices and privatize the major utilities—and be home in time for your first class reunion.
It can of course be argued that the American tendency to pay flying visits to their putative imperium—rather than settle there—is just a function of technology. Back in the 1870s, by which time the British had largely completed their global network of railways and steamships, it still took a minimum of eighty days to circumnavigate the world, as Jules Verne celebrated in the story of Phileas Fogg. Today it can be done in less than three. The problem is that along with the undoubted advantages of modern technology comes the disadvantage of disconnection. During the diplomatic crisis over Iraq in early 2003, Secretary of State Colin Powell was criticized for conducting his foreign policy by telephone. Powell retorted that he had traveled abroad twice that year already, but the destinations and durations of these trips were revealing: one was to Davos, Switzerland, for the World Economic Forum (January 25–26) and the other was to the Far East (February 21–25).43 We can only guess at what these trips achieved—and what Secretary Powell might have achieved if instead he had paid visits to Paris and Ankara.
It is not just the most senior American officials who prefer the comforts of home. Shortly before the terrorist attacks of September 2001, a former CIA man admitted that the agency “probably doesn’t have a single truly qualified Arabic-speaking officer of Middle Eastern background who can play a believable Muslim fundamentalist who would volunteer to spend years of his life with shitty food and no women in the mountains of Afghanistan.” “For Christ’s sake,” he went on, “most case officers live in the suburbs of Virginia. We don’t do that kind of thing.” In the immortal words of one such case officer, “Operations that include diarrhea as a way of life don’t happen.”44 This was precisely the attitude that another CIA officer sought to counter in the wake of the terrorist attacks when he hung a sign outside his office that read as follows: “Officers wanted for hazardous journey. Small wages. Bitter cold. Long months of complete darkness. Constant danger. Safe return doubtful. Honour and recognition in case of success.” Significantly, this was the recruiting poster used by the British explorer Ernest Shackleton before his 1914 expedition to the Antarctic.45 At the time of the invasion of Iraq, the short-lived Office for Reconstruction and Humanitarian Assistance also sought British imperial inspiration: it relied on retired British army Gurkhas from Nepal to provide the security around its Kuwait base.46
What, then, about the much-vaunted role of the voluntary sector, the governmental and nongovernmental aid agencies? Might they provide the Americans on the ground who are so conspicuously hard to find in government service? The institution that, since the 1960s, has done most to channel the idealism of young Americans into what we now call nation building is of course the Peace Corps. Since 1961 more than 168,000 Americans have joined it, serving in a variety of civilian capacities in no fewer than 136 countries. Today there are some 6,678 Peace Corps volunteers, an improvement on the low point of 5,380 in 1982, and they can be found in 69 countries.47 The Peace Corps certainly attracts the right type of person: among the universities that have sent the most volunteers are Berkeley and Harvard; disproportionate numbers also come from the exclusive liberal arts colleges like Dartmouth, Tufts and Middlebury.48 Yet the total number of volunteers remains just two-thirds of the target of 10,000 set by Congress in 1985, a target that was supposed to be attained by 1992.
We should not, in any case, pin too much hope on agencies like the Peace Corps. Civilian aid agencies can, like the missionaries of old, be as much an irritant as a help to those trying to run a country like Iraq. It is one of the unspoken truths of the new “imperialism of human rights” that around every international crisis there soon swarms a cloud of aid workers, whose efforts are not always entirely complementary. If the United States successfully imposes law and order in Iraq, economic life will swiftly revive and much aid will simply be superfluous. If it fails to impose order, on the other hand, aid workers will simply get themselves killed.
After Kipling, John Buchan was perhaps the most readable writer produced by British imperialism. In his thriller Greenmantle (1916) he memorably personifies imperial Britain in the person of Sandy Arbuthnot, an Orientalist so wily that he can pass for a Moroccan in Mecca or a Pathan in Peshawar. Arbuthnot’s antithesis is the dyspeptic American millionaire John Scantlebury Blenkiron, “a big fellow with a fat, sallow, clean-shaven face [with] a pair of full sleepy eyes, like a ruminating ox.” “These eyes have seen nothing gorier than a Presidential election,” he tells Buchan’s hero, Richard Hannay. The symbolism is a little crude, but it has something to it.
Since September 2001 the Blenkirons have certainly been seeing something gorier than an election. But will it whet their appetites for an empire in the British mode? Only, it would seem, if Americans radically rethink their attitude to the world beyond their borders. Until there are more U.S. citizens not just willing but eager to shoulder the “nation builder’s burden,” ventures like the occupation of Iraq will lack a vital ingredient. For the lesson of Britain’s imperial experience is clear: you simply cannot have an empire without imperialists—out there, on the spot—to run it.
Could Blenkiron somehow mutate into Arbuthnot? Could the United States work out how to produce men like John Buchan himself, whose career led him from the obscurity of a Scottish manse, by way of Oxford, to the post of Governor-General of Canada? Perhaps. After all, it has happened before. In the years after the Second World War the generation that had just missed out on fighting left Harvard and Yale with something like Buchan’s zeal for global rule. Many of them joined the Central Intelligence Agency and devoted their lives to fighting communism in far-flung lands from Cuba to Cambodia. Yet as Graham Greene foresaw in The Quiet American, their efforts at what the British would have called indirect rule were vitiated by the low quality of the local potentates they backed and constrained by the need to shore them up more or less covertly. Today the same fiction that underpinned American strategy in Vietnam—that America was not attempting to resurrect French colonial rule in Indochina—is being peddled in Washington to rationalize what is going on in Iraq. It may look like the resurrection of British colonial rule. But all Americans want to do is give the Iraqi people democracy and then go home.
THE INCENTIVE TO COLLABORATE
It is perhaps inherent in the nature of a democratic empire that it should operate with a short time horizon. The constraints imposed on the executive by the election cycle are tight, and there is strong evidence from previous conflicts—not only Korea but Vietnam—of a negative correlation between the level of American casualties and the popularity of an executive at war. There are those who insist that the Vietnam syndrome was finally “kicked” in the 1990s. In reality, however, the sensitivity of the American electorate to casualties seems to have grown more acute since the cold war. Between April and October 2003, there was a 29 percent drop in the popularity of the war in Iraq, yet little more than 350 U.S. service personnel lost their lives in that period, and only two-thirds of those were killed as a result of hostile action (see figure 11). Compare that with Vietnam, where it took around three years and more than thirty thousand “killed in action” to reduce popular support for the war by a comparable amount. Small wonder American politicians have a tendency to start looking for an exit some time before the drama has been concluded.
Unfortunately, there is a fatal flaw to the project of short-term nation building, and that is the extreme difficulty of securing local support when an American pledge to depart imminently has been announced and—more important—is believed by the inhabitants of the occupied country in question. Perhaps more than anything else, the British Empire was an empire based on local collaboration; how else could fewer than a thousand ICS men have governed a population of four hundred million Indians? But why should any Iraqi have risked collaborating with a fly-by-night occupier like L. Paul Bremer? No sooner had he created a Governing Council for Iraq than he began talking of packing his bags. What is especially striking is that this desire for an American withdrawal was not at first shared by a majority of the Iraqi population. In a poll conducted in Baghdad in July 2003, people were asked: “Right now, would you prefer to see the U.S. (and Britain) stay in Iraq or pull out?” Only 13 percent favored immediate withdrawal. Nearly a third—31 percent—answered that the coalition “should stay for a few years”; a further 25 percent said “for about a year.”49
FIGURE 11
The War Against Iraq, 2003: Casualties and Popularity
Source: Poll data from the Gallup Organization; casualty data from http://lunaville.org/warcasualties/Summary.aspx.
This brings us to a critical point. It is simply that the time frame is the key to successful nation building.50 It is no coincidence that the countries where American military intervention has been most successful have been those in which the United States has maintained a prolonged military presence. As we have seen, President Bush is fond of citing Japan and West Germany after 1945 as examples of what successful American intervention can achieve. “America has made and kept this kind of commitment before,” he argued in February 2003, drawing an implicit parallel with 1945. “After defeating enemies we did not leave behind occupying armies, we left constitutions and parliaments.”51 This overlooks the awkward fact that the formal occupation regimes lasted seven years in the Japanese case and ten in the West German, and that—even to this day—the deployments of American troops in those two countries remain among the largest anywhere in the world. It is also worth remembering a third success story, South Korea, which took until the late 1980s to become a genuine democracy, after nearly forty years of an American military presence.52 By contrast, relatively little good, and probably a good deal of ill, came of the numerous short-term American interventions in Central America and the Caribbean, which began in 1898. Unfortunately, the time frames contemplated for Iraq (not to mention Afghanistan) are closer to these dismal episodes than to the post-1945 success stories. Baghdad simply cannot be turned into the capital of a Western-style democracy in the space of two years. The goal in itself is not wholly unrealistic, despite the very obvious social and cultural differences between Iraq in 2003 and West Germany in 1945.* In September 2003 nearly two-fifths (39 percent) of those polled by Gallup in Baghdad picked multiparty parliamentary democracy as the form of government they would most like to see established in Iraq.
Slightly more—42 percent—thought this was the system their country was most likely to have in five years’ time. However, more than half—51 percent—believed the outcome would be the result of direct American influence.53 That seems to suggest that many Iraqis expected the Americans to stay longer than the Americans themselves were planning to and that they anticipated political benefits from an ongoing American presence. Unfortunately, if the United States does walk away from Iraq in the course of 2005, those Iraqi hopes will almost certainly be dashed. Premature elections, held before order has been restored and economic life resumed, would almost certainly fail to produce a stable government. They would be much more likely to accentuate the ethnic and religious divisions within Iraqi society.54