
The World Turned Inside Out


by James Livingston


  The turn-of-the-century moment when Mitteleuropa squared off against the Open Door was the culmination of the second great stage of globalization (that’s right, globalization didn’t begin in 1989). The first stage began in the sixteenth century, with the rise of joint-stock companies and the colonization of the Western Hemisphere by European powers. Trade rather than direct investment—rather than the transfer of technology—was the medium of globalization in this period. The second stage began in the 1870s, when France and Germany started acquiring new colonial possessions in the hope of challenging Great Britain for control of the world’s resources; the United States became a participant in 1898 with the Spanish-American War. In this second stage, surplus capital, that is, direct investment, became the medium of globalization, but it flowed mainly into extractive industries (mining, for example) or plantation agriculture. The extraordinary economic growth it fueled came to an abrupt halt in 1914 with the start of World War I; the volume of world trade and investment did not recover until the 1960s.

  In the third stage, our own time, the medium of globalization is the financial integration of world markets enabled by the collapse of the Soviet Union and the new purchase of “free trade” on the imaginations of policymakers everywhere. It is driven by massive transfers of technology (“outsourcing”) that export manufacturing plants—thus jobs—from the United States to less developed countries. This of course is the kind of direct investment that downsized Dad in the late twentieth century. From his standpoint, globalization is not such a good thing: the North American Free Trade Agreement may be dangerous to his health.

  At the height of the second stage of globalization, the collision of two realities—the new importance of world markets for American enterprise and the aggressive new colonialism of the Great Powers—happened in China, and thus became the occasion for Hay’s Open Door Notes of 1899 to 1900. The so-called Boxer Rebellion, the 1899 uprising of the Chinese people against the Great Powers and their colonial ambitions (it was the prelude to the revolution that officially began in 1911, something like what started in Russia twelve years before the Bolsheviks seized power in 1917), looked like a real crisis from the standpoint of the State Department. For it confirmed that colonized peoples—impoverished peoples held in subjection by superior military force—will just keep rebelling.

  More important, their rebellions will disrupt the world’s markets by creating political upheaval, thus impairing the confidence of traders and investors. And so that disruption will probably lead, in turn, to war between the Great Powers over access to diminished markets. As Woodrow Wilson asked on September 5, 1919, in a speech in St. Louis, “Why, my fellow citizens, is there any man here or any woman, let me say is there any child here, who does not know that the seed of war in the modern world is industrial and commercial rivalry?”

  At any rate this was the economic logic at work in Hay’s notes. The Open Door world he proposed in them was the outline of an anticolonial imperialism. Like other contributors to the doctrine and citizens of the world it would turn inside out, Hay did not shrink from the idea or the enactment of imperialism; he knew the United States itself was originally conceived as a “mighty empire” to be realized by the acquisition of an entire continent. Charles Conant, the most important theorist of American empire, explained in 1898 that the labels didn’t matter very much because the development of “decadent nations” by means of imperialism was not a matter of choice; the countries with a surplus of capital had to export it—the alternatives he posed were foreign wars, outright socialism, or state spending for welfare—and the countries with a shortage of capitalism as well as capital had to import it.

  New markets and new opportunities for investment must be found if surplus capital is to be profitably employed. In pointing out the necessity that the U.S. shall enter upon a broad policy, it need not be determined in just what manner that policy shall be worked out. . . . Whether this policy carries with it the direct government of groups of half-savage islands may be a subject of argument, but upon the economic side of the question there is but one choice—either to enter by some means upon the competition for the employment of American capital and enterprise in these countries, or to continue the needless duplication of existing means of production and communication.

  In other words, if the world’s markets were crucial to the future of the American experiment, as Conant, Vanderlip, and almost everyone else believed, the United States had to intervene in the ongoing contest among advanced nations for access to those markets; only the form of that intervention was in question. But Hay did propose to dismantle the exclusive spheres of influence the Great Powers had created, and in doing so he was redefining imperialism as such. This redefinition would shape U.S. foreign policy until the very end of the twentieth century.

  The Open Door World

  The assumptions and imperatives built into the Open Door world Hay outlined went approximately as follows. First, the growth of world income was the key to world peace. If the volume of world income grew consistently, no nation would have to increase its share at the expense of others by military means, by conquering peoples, creating colonies, or going to war over access to resources (as happened regularly in the seventeenth, eighteenth, and nineteenth centuries). Other things being equal, the best way to grow the world’s income was to allow for the free flow of capital and finished goods across international borders—tariffs and exclusive spheres of influence (colonies) were the principal obstacles to economic growth so conceived. An Open Door world, “a fair field and no favor,” was, then, the necessary condition of world peace.

  Second, national sovereignty or self-determination, even for the most backward of peoples, had to be acknowledged and enforced. China, which had been systematically dismembered by the Great Powers, was Exhibit A, but there were many other parts of the world ruled by the lawless whims of imperial, exploitative purpose. The Belgian Congo was just the worst case; French Indochina was almost as bad. The regulative assumption here cut two ways, both determined by close study of American historical experience. On the one hand, the United States had been able to absorb enormous amounts of foreign capital in the nineteenth century without becoming a colonial appendage of a Great Power because its political integrity, its national sovereignty, had been secured by the Revolution and the Constitution. On the other hand, the rebelliousness and backwardness of the American South after the Civil War had been reanimated by military conquest and occupation. The tentative conclusions to be drawn from the study of this historical experience were simple but profound—maybe military conquest is not the prerequisite of imperial success, and maybe world power need not wear a uniform.

  Third, trade was less important than investment, or, to put it in contemporary parlance, the “transfer of technology” from more advanced to less developed countries was the path to a future in which all nations might share the benefits of growth. Trade between countries can, in fact, contribute to the underdevelopment of the more backward party to the bargain—just as “dependency theory” and liberation theology in the late twentieth century claimed—because the goods being exchanged cost the less developed country more of its current, available labor time; by contrast, the value of the goods exported by the more advanced country includes the past labor embodied in its fixed capital, its technology, so more of its population can be doing something other than producing raw materials for export. Adam Smith, one of the founding fathers of modern political economy, clearly understood this asymmetry as a result of free trade. Here is how he explained it in The Wealth of Nations (1776):

  The revenue of a trading and manufacturing country must, other things being equal, always be much greater than that of one without trade or manufactures.

  . . . A small quantity of manufactured produce purchases a great quantity of rude produce. A trading and manufacturing country, therefore, naturally purchases with a small part of its manufactured produce a great part of the rude produce of other countries; while, on the contrary, a country without trade and manufactures is generally obliged to purchase, at the expense of a great part of its rude produce, a very small part of the manufactured produce of other countries.

  The American architects of anticolonial imperialism believed that direct investment—transfers of technology—could change this dynamic by equipping everyone with the tools of industrial development. At any rate, they were convinced that changing the rules of trade in raw materials or finished goods could not address the real question, the real problem—that is, how to invest surplus capital in the less developed countries. Here is how Conant characterized this problem in 1901: “The benefit to the old countries in the control of the underdeveloped countries does not lie chiefly in the outlet for consumable goods. It is precisely to escape the necessity for the reduplication of the plants which produce the goods, by finding a field elsewhere for the creation of new plants, that the savings of the capitalistic countries are seeking an outlet beyond their own limits.” That Conant worked for every administration from William McKinley to Woodrow Wilson, from 1899 to 1915, that he put the Philippines on the gold standard—the banknotes circulated thereafter were called “Conants”—and that he meanwhile published five books on modern banking and modern imperialism might suggest how widely shared such an idea was in the policymaking circles of the time.

  But if you assume that investment is more important than trade and that transfers of technology compose the path to a peaceful future, you must believe that development is the property of all people. You must believe that there is no disqualification residing in race, or in religion, or in any other cultural artifact. You must believe that every person, every civilization, is capable of turning toward modernity and finding its possibilities compelling. You must also believe that the continuum of civilization is mostly a matter of economic development, and you will act accordingly, as if racial differences and religious disputes are real but artificial, as if they can be smoothed, if not erased, by the growth of world income an Open Door policy permits and requires. “Civilization follows material development,” as William Howard Taft, the future president and former civil governor of the Philippines, explained in 1901 to a congressional committee.

  But go ahead, put the proposition more pragmatically—even if you are a racist (as Taft was according to our standards), you know you cannot conduct foreign policy in racial terms because if you do, you are validating European colonialism and re-creating the conditions of war. A “clash of civilizations,” as Samuel P. Huntington tried to frame the prospects of American diplomacy in a 1996 book, would have sounded merely bizarre from this standpoint. For again, civilization was a continuum: all human beings want and need both the development and the recognition of their natural talents, learned skills, and past efforts.

  Fourth, power and standing in the world were not functions of military capacity or accomplishment; they were instead results of what Henry Luce called, with emphatic irony, “all the trivial ways”—“intellectual, scientific, and artistic”—all the ways in which the black aesthetic changed the United States, all the ways in which the United States changed the world, all the ways in which we citizens of the late twentieth century changed ourselves. Perhaps the most important achievement of the Open Door Notes was to announce to the world that from now on, power in the world is to be understood as a function of economic capacity, political pluralism, and cultural attainment, rather than the result of a huge military and a grimly determined officer corps.

  The corollary of this announcement, an achievement in its own right, was the idea that multilateral institutions and cooperation would have to be the political infrastructure of peaceful economic development. The U.S. Congress rejected the League of Nations in 1919 on exactly these grounds—Article 10 of the League’s covenant committed American military power, such as it was, to the preservation and protection of European spheres of influence. The United Nations was approved in 1945 on the same grounds—Article 51 of the UN charter outlaws unilateral use of armed force.

  The original architects of an anticolonial imperialism were not naïve. They knew that the United States would have to build and eventually deploy significant military power to defend its growing overseas interests. “Nations with large interests abroad must necessarily encounter many difficulties,” as The Bankers’ Magazine explained in 1900, “which frequently can only be overcome by force of arms.” But like the American people, these original architects had a deep suspicion of standing armies and an abiding aversion to the big government required to sustain them. Until the 1930s, their goal, often expressed in treaties, was to limit the spread of military technology (battleships in the 1920s, for example) in the hope of blunting the “balance-of-power” approach taken by the victors in the First World War.

  Vietnam Syndrome

  The Great Depression and the Second World War changed everything except the assumptions and imperatives at work in the original Open Door Notes. Yes, the military component of the federal budget did increase almost exponentially in the 1950s, in accordance with the Cold War agenda of “containing” Communism, and, yes, the United States did sink, slowly and surely, into the quagmire of Vietnam, a proxy war with China very much like the previous disaster in Korea. But the rationale for war in Southeast Asia was perfectly consistent with the principles of American foreign policy in the twentieth century as summarized above. The goal was to make sure that Japan did not revert to its traditional economic relationship with China, the recipient of most of its exports and surplus capital until 1949, when the Communists seized power. The goal was to make sure that Japan had another outlet for its enterprise and would, as a result, remain within the orbit of the “free world” now dominated by the United States. Here is how a study group sponsored by the Woodrow Wilson Foundation and the National Planning Association, which included two State Department alumni, a National City Bank economist, and the chair of Harvard’s Department of Government—the mentor of both Samuel P. Huntington and Henry Kissinger—framed that goal in 1955: “The history of the 1930s should be a warning to the West, and especially to the United States, that failure to make sufficient economic opportunity for the expansion of Japan’s exports and for Japanese economic growth can be disastrous for the security of the West and the peace of the world. The logical way to open this opportunity would be to make possible greater Japanese participation in the development of Southern Asia.”

  The social and intellectual results of the debacle in Vietnam did, however, make a big difference toward the end of the twentieth century: they did change everything in both interesting and dangerous ways. The social result was the All-Volunteer Army, which presupposed the end of the draft that had conscripted so many young men into a military still befuddled by the idea of civil rights (that is, the idea of racial equality). In the aftermath of Vietnam, the four major military branches rebuilt themselves from scratch, and they did it with the ideals of the 1960s in mind. The Marine Corps went further than the others, but by the 1990s the American military was the most egalitarian social program in the United States. It fulfilled the goals of Lyndon Johnson’s “Great Society” by becoming the most active, even strident, adherent of affirmative action, and in doing so it became the nation’s least racist institution. It fulfilled, in this sense, the goals of the civil rights movement, but meanwhile it provided a portal to education for working-class kids of every kind, of every color.

  The intellectual results were more ambiguous, of course. On the Left, the critique of American imperialism became much more forceful and apocalyptic. Jimmy Carter was elected president in 1976 in part because the Congress and the American people were so disgusted by the atrocities perpetrated in their name, from Saigon to Santiago. U.S. complicity in the 1973 overthrow of Salvador Allende, the democratically elected president of Chile, was especially galling and led directly to Senator Frank Church’s indictment of the CIA in 1976. Carter’s notion of “human rights” as the core principle of foreign policy has, by now, reshaped the very idea of diplomacy; in its absence, it is improbable that Augusto Pinochet—the general who led the coup against Allende—or Slobodan Milošević would have been arrested or tried in international courts of law. But in the late 1970s, it was an astonishing departure to make human rights the centerpiece of foreign policy. “Realism” had been the rule in the White House since 1968 because Henry Kissinger had been the presiding officer of American diplomacy.

  On the Right, meanwhile, the critique of American foreign policy became much more pointed and persuasive. Ronald Reagan was elected president in 1980 in part because the Congress and the American people were so disgusted by the atrocities perpetrated by foreigners, from Luzon to Tehran—remember the hostages held for over a year by the Iranian Revolutionary Guard, starting in 1979? The simple fact that Carter had relinquished U.S. rights to the Panama Canal (Teddy Roosevelt’s bully project) signified the twilight of American power, according to Reagan, but he promised to turn dusk to dawn.

  The debate that ensued on U.S. foreign policy cannot be characterized, however, as a Left/Right divide that corresponds somehow to a liberal/conservative opposition. The Vietnam Syndrome took over the body politic in the late 1970s, when everybody acknowledged that we lost the war, and then asked why, and then wondered how the United States would regain what Henry Luce had called its “indefinable, unmistakable sign of leadership: prestige.” This unfortunate syndrome had many symptoms, many manifestations, and many unintended consequences.

  Then as now, the United States was the preeminent military power in the world. Why did it lose the Vietnam War to a guerrilla movement that had one-tenth of the logistical capacity at the disposal of American forces—and no airplanes? There are four available explanations these days. First, the U.S. military was fighting the wrong war. It was using strategy and tactics suited to a war of maneuver, in which the object is to defeat another army on a field of battle, when it should have been fighting a war of position—engaging in counterinsurgency—and trying to win the “hearts and minds” of the local populations by (re)building the material foundations of their daily lives. Sound familiar? Second, the politicians in the Congress wouldn’t permit the president to do what was necessary to win: they kept reining in the war powers of the executive branch, and in doing so they betrayed the American military and cancelled the American Century. By this account, mind you, a military resolution of the war was both possible and imminent. It would have happened, and soon, except that the Congress got in the way. Sound familiar?

 
