The Collapse of Western Civilization

by Naomi Oreskes and Erik M. Conway


  The crux of this second point was the belief that marketplaces represented distributed power. Various individuals making free choices held power in their hands, preventing its undue concentration in centralized government. Conversely, centrally planned economies entailed not just the concentration of economic power, but of political power as well. Thus, to protect personal liberty—political, civic, religious, artistic—economic liberty had to be preserved. This aspect of the philosophy, called neoliberalism, hearkened back to the liberalism of the eighteenth- and nineteenth-century Enlightenment, particularly the works of Adam Smith, David Hume, John Locke, and, later, John Stuart Mill. Reacting to the dominant form of Western governance in their time—that is, monarchy—these thinkers lionized personal liberty in contrast to control by unjust, arbitrary, and often incompetent despots. At a time when some political leaders were imagining alternatives to despotic monarchy, many viewed the elevation of individual rights as a necessary response. In the late eighteenth century, these views influenced the architects of the American Republic and the early, “liberal” phase of the French Revolution. Even then, however, such views were more idealistic than realistic. The late-eighteenth-century U.S. Constitution preserved and validated race-based slavery; even when that nation abolished slavery in the mid-nineteenth century, it preserved economic and social apartheid for more than a century thereafter. In Europe, the French Revolution collapsed in a wave of violence and the restoration of autocratic rule under Napoleon Bonaparte.

  In the nineteenth century, power became concentrated in the hands of industrialists (the “robber barons,” monopolies, and “trusts” of the United States and elsewhere), challenging liberal conceptions of the desirability of weak political governance. In Europe, the German philosopher Karl Marx argued that an inherent feature of the capitalist system was the concentration of wealth and power in a ruling class that siphoned off the surplus value produced by workers. Industrialists not only employed workers under brutal and tyrannical conditions (the nineteenth-century “satanic mills”), they also corrupted democratic processes through bribery and extortion and distorted the marketplace through a variety of practices. A powerful example is the development and expansion of American railroads. Supply of these “roads to nowhere” was heavily subsidized, and the demand for them was manufactured at the expense of displaced native peoples and the natural environment of the American West.5

  Marx’s analysis inspired popular leaders in many nation-states then in existence—for example, Russia, China, Vietnam, Ghana, and Cuba—to turn to communism as an alternative economic and social system. Meanwhile, the capitalist United States abolished slavery and made adjustments to remedy power imbalances and losses of liberty due to the concentration of wealth. Among other reforms, the federal government introduced antitrust laws to prevent monopolistic practices, established worker protections such as limits on the length of the working day and prohibitions on child labor, and developed a progressive income tax. By the early twentieth century, it was clear that capitalism in its pure, theoretical form did not exist, and few could argue for its desirability: the failures were too obvious. Intellectuals came to see the invisible hand, akin to the hand of God, as the quasi-religious notion that it was. The Great Depression of the 1930s—from which Europe and the United States emerged only through the centralized mobilization of World War II—led scholars and political leaders to view the idea of self-regulating markets as a myth. After WWII, most non-communist states became “mixed” economies with a large degree of both individual and corporate freedom and significant government involvement in markets, including extensive systems of taxes, tariffs, subsidies, regulation of banks and exchanges, and immigration control.6

  Meanwhile communism, which had spread throughout Eurasia and to some parts of Africa and Latin America, was revealing even worse failures than capitalism. Communist economies proved grossly inefficient at delivering goods and services; politically, early ideas of mass empowerment gave way to tyrannical and brutal dictatorship. In the Soviet Union under Joseph Stalin, tens of millions died in purges, forced collectivization of agriculture, and other forms of internal violence. Tens of millions died in China as well during the “Great Leap Forward”—the attempt by Mao Zedong (毛泽东) to force rapid industrialization—and many more in the so-called Cultural Revolution of the First People’s Republic of China (PRC).7

  Following World War II, the specter of Russian communism’s spread into Eastern (and possibly even Western) Europe—thus affecting U.S. access to markets and stoking fears that the West could sink back into economic depression—led the United States to take a strong position against Soviet expansion. Conversely, the Soviet Union’s desire to control its western flanks in light of historic vulnerability led to the political occupation and control of Eastern Europe. The resulting Cold War (1945–1989) fostered a harshly polarized view of economic systems, with “communists” decrying the corruption of the capitalist system and “capitalists” condemning the tyranny and violence in communist regimes (acknowledging here that in practice neither system was purely communist nor purely capitalist). Perhaps because of the horrible violence in the East, many Western intellectuals came to see everything associated with communism as evil, even—and crucially for our story—modest or necessary forms of intervention in the marketplace such as progressive taxation and environmental regulation, and humanitarian interventions such as effective and affordable regimes of health care and birth control.

  Neoliberalism was developed by a group of thinkers—most notably the Austrian Friedrich von Hayek and the American Milton Friedman—who were particularly sensitive to the issue of repressive centralized government. In two key works, von Hayek’s The Road to Serfdom and Friedman’s Capitalism and Freedom, they developed the crucial “neo-” component of neoliberalism: the idea that free market systems were the only economic systems that did not threaten individual liberty.

  Neoliberalism was initially a minority view. In the 1950s and 1960s, the West experienced high overall prosperity, and individual nations developed mixed economies that suited their own national cultures and contexts. Things began to shift in the late 1970s and 1980s, when Western economies stalled and neoliberal ideas attracted world leaders searching for answers to their countries’ declining economic performance, such as Margaret Thatcher in the United Kingdom and Ronald Reagan in the United States. Friedman became an advisor to President Reagan; in 1991, von Hayek received the Presidential Medal of Freedom from President George H. W. Bush.8

  An ironic aspect of this element of our story is that Friedrich von Hayek had great respect and admiration for the scientific enterprise, seeing it as the historical companion to enterprise capitalism. By fostering commerce, von Hayek suggested, science and industry were closely linked to the rise of capitalism and the growth of political freedom; this view was shared by mid-twentieth-century advocates for an expanded role of government in promoting scientific investigations.9 However, when environmental science showed that government action was needed to protect citizens and the natural environment from unintended harms, the carbon-combustion complex began to treat science as an enemy to be fought by whatever means necessary. The very science that had led to U.S. victory in World War II and dominance in the Cold War became the target of skepticism, scrutiny, and attack. (Science was also attacked in communist nations for the same basic reason—it came into conflict with political ideology.)10 The end of the Cold War (1989–1991) was a source of celebration for citizens who had lived under the yoke of oppressive Soviet-style governance; it also ignited a slow process of overdue reforms in the First People’s Republic of China. But for many observers in the West, the Soviet Union’s collapse gave rise to an uncritical triumphalism, proof of the absolute superiority of the capitalist system. Some went further, arguing that if capitalism was a superior system, then the best form of capitalism lay in its purest form. While it is possible that some academic economists and intellectuals genuinely held this view, it was the industrialists and financiers, who perceived large opportunities in less regulated markets, who did the most to spread and promote it. As a result, the 1990s and 2000s featured a wave of deregulation that weakened consumer, worker, and environmental protections. A second Gilded Age reproduced concentrations of power and capital not seen since the nineteenth century, with some of the accumulated capital used to finance think tanks that further promoted neoliberal views. (And some of the rest was reinvested in fossil fuel production.) Most important for our purposes, neoliberal thinking led to a refusal to admit the most important limit of capitalism: market failure.

  When scientists discovered the limits of planetary sinks, they also discovered market failure. The toxic effects of DDT, acid rain, the depletion of the ozone layer, and climate change were serious problems for which markets did not provide a spontaneous remedy. Rather, government intervention was required: to raise the market price of harmful products, to prohibit those products, or to finance the development of their replacements. But because neoliberals were so hostile to centralized government, they had, as Americans used to say, “painted themselves into a corner.” The American people had been persuaded, in the words of U.S. President Ronald Reagan (r. 1981–1989), that government was “the problem, not the solution.” Citizens slid into passive denial, accepting the contrarian arguments that the science was unsettled. Lacking widespread support, government leaders were unable to shift the world economy to a net carbon-neutral energy base. As the world climate began to spin out of control and the implications of market failure became indisputable, scientists came under attack, blamed for problems they had not caused, but had documented.

  Physical scientists were chief among the individuals and groups who tried to warn the world of climate change, both before and as it happened. (In recognition of what they tried to achieve, millions of survivors have taken their names as middle names.) Artists noted the tendency to ignore warning signs; the mid-twentieth-century Canadian songwriter Leonard Cohen, for example, sang “We asked for signs. The signs were sent.” Social scientists introduced the concept of “late lessons from early warnings” to describe a growing tendency to neglect information. As a remedy, they promoted a precautionary principle, whereby early action would prevent later damage. The precautionary principle was a formal instantiation of what had previously been thought of as common sense, reflected in the nineteenth-century European and American adages “A stitch in time saves nine” and “An ounce of prevention is worth a pound of cure.” Yet this traditional wisdom was swept away by neoliberal hostility toward planning and an overconfident belief in the power of markets to respond to social problems as they arose. (Indeed, neoliberals believed markets so powerful that they could “price in” futures that had not happened yet—pre-solving problems, as it were, in a brilliant case of wishful fantasy that obviated the need for hateful planning.) Another of the many ironies of the Penumbral Period is that the discipline of economics—rooted in the ancient Greek concept of household management (oikos, “house,” and nomos, “laws” or “rules”)—failed to speak to the imperative of a managed transition to a new energy system. The idea of managing energy use and controlling greenhouse gas emissions was anathema to the neoliberal economists whose thinking dominated at this crucial juncture. Thus, no planning was done, no precautions were taken, and the only management that finally ensued was disaster management.

  Discerning neoliberals acknowledged that the free market was not really free; interventions were everywhere. Some advocated eliminating subsidies for fossil fuels and creating “carbon” markets.11 Others recognized that certain interventions could be justified. Von Hayek himself was not opposed to government intervention per se; indeed, as early as 1944, he rejected the term laissez-faire as misleading because he recognized legitimate realms of government intervention: “The successful use of competition as the principle of social organization precludes certain types of coercive interference with economic life, but it admits of … and even requires [others],” he wrote. In his view, legitimate interventions included paying for signposts on roads, preventing “harmful effects of deforestation, of some methods of farming, or of the noise and smoke of factories,” prohibiting the use of poisonous substances, limiting working hours, enforcing sanitary conditions in workplaces, controlling weights and measures, and preventing violent strikes.12 Von Hayek simply (and reasonably) believed that if the government were to carry out such functions, and particularly if doing so selectively limited the freedom of particular groups or individuals, then the justification for the intervention should be clear and compelling. Given the events recounted here, it is hard to imagine why anyone in the twentieth century would have argued against government protection of the natural environment on which human life depends. Yet such arguments were not just made, they dominated the public sphere.

  The ultimate paradox was that neoliberalism, meant to ensure individual freedom above all, led eventually to a situation that necessitated large-scale government intervention. Classical liberalism was centered on the idea of individual liberty, and in the eighteenth century most individuals had precious little of it—economic or otherwise. But by the mid-twentieth century this situation had changed dramatically: slavery was formally outlawed in the nineteenth century, and monarchies and other forms of empire were increasingly replaced by various forms of “liberal” democracy. In the West, individual freedoms—both formal and informal—probably peaked around the time von Hayek was writing, or shortly thereafter. By the end of the twentieth century, Western citizens still held the formal rights of voting, various forms of free thought and expression, and freedom of employment and travel. But actionable freedom was decreasing, first as economic power was increasingly concentrated in a tiny elite, who came to be known as the “1 percent,” and then in a political elite propelled to power as the climate crisis forced dramatic interventions to relocate citizens displaced by sea level rise and desertification, to contain contagion, and to prevent mass famine. And so the development that the neoliberals most dreaded—centralized government and loss of personal choice—was rendered essential by the very policies that they had put in place.

  Epilogue

  The former state of Florida (part of the former United States)

  In one of the many paradoxes of history, the inhabitants of late-twentieth-century Florida were engaged in a grand project to save an enormous sea-level wetlands region known as the Everglades from urban growth and the diversion of freshwater to urban and agricultural use. Yet even the low-end estimates of twenty-first-century sea level rise rendered this effort pointless due to inundation; what actually happened cost Floridians both the Everglades and many of their major cities.

  As the devastating effects of the Great Collapse began to appear, the nation-states with democratic governments—both parliamentary and republican—were at first unwilling and then unable to deal with the unfolding crisis. As food shortages and disease outbreaks spread and sea level rose, these governments found themselves without the infrastructure and organizational ability to quarantine and relocate people.

  In China, the situation was somewhat different. Like other post-communist nations, China had taken steps toward liberalization but still retained a powerful centralized government. When sea level rise began to threaten coastal areas, China rapidly built new inland cities and villages and relocated more than 250 million people to higher, safer ground.1 The relocation was not easy; many older citizens, as well as infants and young children, could not manage the transition. Nonetheless, survival rates exceeded 80 percent. To many survivors—in what might be viewed as a final irony of our story—China’s ability to weather disastrous climate change vindicated the necessity of centralized government, leading to the establishment of the Second People’s Republic of China (SPRC) (also sometimes referred to as Neocommunist China) and inspiring similar structures in other, reformulated nations. By blocking anticipatory action, neoliberals did more than expose the tragic flaws in their own system: they fostered expansion of the forms of governance they most abhorred.

  Today, we remain engaged in a vigorous intellectual discussion of whether, now that the climate system has finally stabilized, decentralization and redemocratization may be considered. Many academics, in the spirit of history’s great thinkers, hope that such matters may be freely debated. Others consider that outcome wishful, in light of the dreadful events of the past, and reject the reappraisal that we wish to invite here. Evidently, the Penumbra falls even today—and likely will continue to fall for years, decades, and perhaps even centuries to come.

  Lexicon of Archaic Terms

  Anthropocene: The geological period, beginning in approximately 1750 with the start of the Industrial Revolution, when humans have become geological agents whose activities effectively compete with, and begin to overwhelm, geophysical, geochemical, and biological processes.

 
