
The Collapse of Western Civilization


by Naomi Oreskes


  The net result? Fossil fuel production escalated, greenhouse gas emissions increased, and climate disruption accelerated. In 2001, the IPCC had predicted that atmospheric CO2 would double by 2050.24 In fact, that benchmark was met by 2042. Scientists had expected a mean global warming of 2 to 3 degrees Celsius; the actual figure was 3.9 degrees. Though originally merely a benchmark for discussion with no particular physical meaning, the doubling of atmospheric CO2 turned out to be quite significant: once the corresponding temperature rise reached 4 degrees, rapid changes began to ensue.

  By 2040, heat waves and droughts were the norm. Control measures—such as water and food rationing and Malthusian “one-child” policies—were widely implemented. In wealthy countries, the most hurricane- and tornado-prone regions were gradually but steadily depopulated, putting increased social pressure on areas less subject to those hazards. In poor nations, conditions were predictably worse: rural portions of Africa and Asia began experiencing significant depopulation from out-migration, malnutrition-induced disease and infertility, and starvation. Still, sea level had risen only 9 to 15 centimeters around the globe, and coastal populations were mainly intact.

  Then, in the Northern Hemisphere summer of 2041, unprecedented heat waves scorched the planet, destroying food crops around the globe. Panic ensued, with food riots in virtually every major city. Mass migration of undernourished and dehydrated individuals, coupled with explosive increases in insect populations, led to widespread outbreaks of typhus, cholera, dengue fever, yellow fever, and viral and retroviral agents never before seen. Surging insect populations also destroyed huge swaths of forests in Canada, Indonesia, and Brazil. As social order began to break down in the 2050s, governments were overthrown, particularly in Africa, but also in many parts of Asia and Europe, further decreasing social capacity to deal with increasingly desperate populations. As the Great North American Desert surged north and east, consuming the High Plains and destroying some of the world’s most productive farmland, the U.S. government declared martial law to prevent food riots and looting. A few years later, the United States announced plans with Canada for the two nations to begin negotiations toward the creation of the United States of North America, to develop an orderly plan for resource-sharing and northward population relocation. The European Union announced similar plans for voluntary northward relocation of eligible citizens from its southernmost regions to Scandinavia and the United Kingdom.

  While governments were straining to maintain order and provide for their people, leaders in Switzerland and India—two countries that were rapidly losing substantial portions of their glacially sourced water resources—convened the First International Emergency Summit on Climate Change, organized under the rubric of Unified Nations for Climate Protection (the former United Nations having been discredited and disbanded over the failure of the UNFCCC). Political, business, and religious leaders met in Geneva and Chandigarh to discuss emergency action. Many said that the time had come to make the switch to zero-carbon energy sources. Others argued that the world could not wait the ten to fifty years required to alter the global energy infrastructure, much less the one hundred years it would take for atmospheric CO2 to diminish. In response, participants hastily wrote and signed the Unified Nations Convention on Climate Engineering and Protection (UNCCEP), and began preparing blueprints for the International Climate Cooling Engineering Project (ICCEP).


  As a first step, ICCEP launched the International Aerosol Injection Climate Engineering Project (IAICEP, pronounced ay-yi-yi-sep) in 2052.25 Sometimes called the Crutzen Project after the scientist who first suggested the idea in 2006, the project had engendered heated public opposition when schemes like it were first proposed in the early twenty-first century, but by mid-century it enjoyed widespread support—from wealthy nations anxious to preserve some semblance of order, from poor nations desperate to see the world do something to address their plight, and from frantic low-lying Pacific Island nations at risk of being submerged by rising sea levels.26

  IAICEP began to inject submicrometer-size sulfate particles into the stratosphere at a rate of approximately 2.0 teragrams per year, expecting to reduce mean global temperature by 0.1 degrees Celsius annually from 2059 to 2079. (In the meantime, it was hoped, a substantial infrastructural conversion to renewable energy could be achieved.) Initial results were encouraging: during the first three years of implementation, temperature decreased as expected and the phase-out of fossil fuel production commenced. However, in the project’s fourth year, an anticipated but discounted side effect occurred: the shutdown of the Indian monsoon. (By decreasing incoming solar radiation, IAICEP also decreased evaporation over the Indian Ocean, and hence the moisture that sustains the monsoon rains.) As crop failures and famine swept across India, that nation, previously one of IAICEP’s most aggressive promoters, now called for the project’s immediate cessation.

  IAICEP was halted in 2063, but a fatal chain of events had already been set in motion. It began with termination shock—that is, the abrupt increase in global temperatures following the sudden cessation of the project. Once again, this phenomenon had been predicted, but IAICEP advocates had successfully argued that, given the emergency conditions, the world had no choice but to take the risk.27 In the following eighteen months, temperature rapidly rebounded, regaining not just the 0.4 degrees Celsius that had been reduced during the project but an additional 0.6 degrees. This rebound effect pushed the mean global temperature increase to nearly 5 degrees Celsius.

  Whether it was caused by this sudden additional heating or was already imminent is not known, but the greenhouse effect then reached a global tipping point. By 2060, Arctic summer ice was completely gone. Scores of species perished, including the iconic polar bear—the dodo of the twenty-first century. While the world focused on these highly visible losses, warming had meanwhile accelerated the less visible but widespread thawing of Arctic permafrost. Scientists monitoring the phenomenon observed a sudden increase in permafrost thaw and CH4 release. Exact figures are not available, but the estimated total carbon released as Arctic CH4 during the next decade may have exceeded 1,000 gigatonnes, effectively doubling the total atmospheric carbon load.28 This massive addition of carbon led to what is known as the Sagan effect (sometimes more dramatically called the Venusian death): a strong positive feedback loop between warming and CH4 release. Planetary temperature increased by an additional 6 degrees Celsius over the 5-degree rise that had already occurred.

  The ultimate blow for Western civilization came in a development that, like so many others, had long been discussed but rarely fully assimilated as a realistic threat: the collapse of the West Antarctic Ice Sheet. Technically, what happened in West Antarctica was not a collapse; the ice sheet did not fall in on itself, and it did not happen all at once. It was more of a rapid disintegration. Post hoc failure analysis shows that extreme heat in the Northern Hemisphere disrupted normal patterns of ocean circulation, sending exceptionally warm surface waters into the Southern Ocean that destabilized the ice sheet from below. As large pieces of ice shelf began to separate from the main ice sheet, removing the bulwark that had kept the sheet on the Antarctic Peninsula, sea level began to rise rapidly.

  Social disruption hampered scientific data-gathering, but some dedicated individuals—realizing the damage could not be stopped—sought, at least, to chronicle it. Over the course of the next two decades (from 2073 to 2093), approximately 90 percent of the ice sheet broke apart, disintegrated, and melted, driving up sea level approximately five meters across most of the globe. Meanwhile, the Greenland Ice Sheet, long thought to be less stable than the Antarctic Ice Sheet, began its own disintegration. As summer melting reached the center of the Greenland Ice Sheet, the east side began to separate from the west. Massive ice breakup ensued, adding another two meters to mean global sea level rise.29 These cryogenic events were soon referred to as the Great Collapse, although some scholars now use the term more broadly to include the interconnected social, economic, political, and demographic collapse that ensued.


  Analysts had predicted that an eight-meter sea level rise would dislocate 10 percent of the global population. Alas, their estimates proved low: the reality was closer to 20 percent. Although records for this period are incomplete, it is likely that during the Mass Migration 1.5 billion people were displaced around the globe, either directly from the impacts of sea level rise or indirectly from other impacts of climate change, including the secondary dislocation of inland peoples whose towns and villages were overrun by eustatic refugees. Dislocation contributed to the Second Black Death, as a new strain of the bacterium Yersinia pestis emerged in Europe and spread to Asia and North America. In the Middle Ages, the Black Death killed as much as half the population of some parts of Europe; this second Black Death had similar effects.30 Disease also spread among stressed nonhuman populations. Although accurate statistics are scant because twentieth-century scientists did not have an inventory of total global species, it is not unrealistic to estimate that 60 to 70 percent of species were driven to extinction. (Five previous mass extinctions were known to scientists of the Penumbral Period, each of which was correlated with rapid changes in greenhouse gas levels, and each of which destroyed more than 60 percent of identifiable species—the worst reached 95 percent. Thus, 60–70 percent is a conservative estimate insofar as most of these earlier mass extinctions happened more slowly than the anthropogenic mass extinction of the late Penumbral Period.)31


  There is no need to rehearse the details of the human tragedy that occurred; every schoolchild knows of the terrible suffering. Suffice it to say that total losses—social, cultural, economic, and demographic—were greater than any in recorded human history. Survivors’ accounts make clear that many thought the end of the human race was near. Indeed, had the Sagan effect continued, warming would not have stopped at 11 degrees, and a runaway greenhouse effect would have followed.

  However, around 2090 (the date cannot be determined from extant records), something occurred whose exact character remains in dispute. Japanese genetic engineer Akari Ishikawa developed a form of lichenized fungus in which the photosynthetic partner consumed atmospheric CO2 much more efficiently than existing forms, and was able to grow in a wide diversity of environmental conditions. This pitch-black lichen, dubbed Pannaria ishikawa, was deliberately released from Ishikawa’s laboratory, spreading rapidly throughout Japan and then across most of the globe. Within two decades, it had visibly altered the landscape and measurably lowered atmospheric CO2, starting the globe on the road to atmospheric recovery and the world on the road to social, political, and economic recovery.

  In public pronouncements, the Japanese government has maintained that Ishikawa acted alone, and cast her as a criminal renegade. Yet many Japanese citizens have seen her as a hero who did what their government could not, or would not, do. Most Chinese scholars reject both positions, contending that the Japanese government, having struggled and failed to reduce Japan’s own carbon emissions, provided Ishikawa with the necessary resources and then turned a blind eye to the dangerous and uncertain character of her work. Others blame (or credit) the United States, Russia, India, or Brazil, as well as an international consortium of financiers based in Zurich. Whatever the truth of this matter, Ishikawa’s actions slowed the increase of atmospheric CO2 dramatically.

  Humanity was also fortunate in that a so-called “Grand Solar Minimum” reduced incoming solar radiation during the twenty-second century by 0.5 percent, offsetting some of the warming driven by the excess CO2 that had accumulated and slowing the rise of surface and oceanic temperatures for nearly a century. During that time, survivors in northern inland regions of Europe, Asia, and North America, as well as inland and high-altitude regions of South America, were able to begin to regroup and rebuild. The human populations of Australia and Africa, of course, were wiped out.

  3

  Market Failure

  New York City in the twenty-fourth century. Once the financial capital of the world, New York began in the early twenty-first century to attempt to defend its elaborate and expensive infrastructure against the sea. But that infrastructure had been designed and built with an expectation of constant sea levels and was not easily adapted to continuous, rapid rise. Like the Netherlands, New York City gradually lost its struggle. Ultimately, it proved less expensive to retreat to higher ground, abandoning centuries’ worth of capital investments.

  To the historian studying this tragic period of human history, the most astounding fact is that the victims knew what was happening and why. Indeed, they chronicled it in detail precisely because they knew that fossil fuel combustion was to blame. Historical analysis also shows that Western civilization had the technological know-how and capability to effect an orderly transition to renewable energy, yet the available technologies were not implemented in time.1 As with all great historical events, there is no easy answer to the question of why this catastrophe occurred, but key factors stand out. The thesis of this analysis is that Western civilization became trapped in the grip of two inhibiting ideologies: positivism and market fundamentalism.

  Twentieth-century scientists saw themselves as the descendants of an empirical tradition often referred to as positivism—after the nineteenth-century French philosopher Auguste Comte, who developed the concept of “positive” knowledge (as in, “absolutely, positively true”)—but the overall philosophy is more accurately known as Baconianism. This philosophy held that through experience, observation, and experiment, one could gather reliable knowledge about the natural world, and that this knowledge would empower its holder. Experience justified the first part of the philosophy (we have recounted how twentieth-century scientists anticipated the consequences of climate change), but the second part—that this knowledge would translate into power—proved less accurate. Although billions of dollars were spent on climate research in the late twentieth and early twenty-first centuries, the resulting knowledge had little impact on the crucial economic and technological policies that drove the continued use of fossil fuels.

  A key attribute of the period was that power did not reside in the hands of those who understood the climate system, but rather in political, economic, and social institutions that had a strong interest in maintaining the use of fossil fuels. Historians have labeled this system the carbon-combustion complex: a network of powerful industries comprising fossil fuel producers, industries that served energy companies (such as drilling and oil field service companies and large construction firms), manufacturers whose products relied on inexpensive energy (especially automobiles and aviation, but also aluminum and other forms of smelting and mineral processing), financial institutions that serviced their capital demands, and advertising, public relations, and marketing firms that promoted their products. Maintaining the carbon-combustion complex was clearly in the self-interest of these groups, so they cloaked this fact behind a network of “think tanks” that issued challenges to scientific knowledge they found threatening.2 Newspapers often quoted think tank employees as if they were climate researchers, juxtaposing their views against those of epistemologically independent university or government scientists. This practice gave the public the impression that the science was still uncertain, thus undermining the sense that it was time to act.3 Meanwhile, scientists continued to do science, believing, on the one hand, that it was inappropriate for them to speak on political questions (or to speak in the emotional register required to convey urgency) and, on the other hand, that if they produced abundant and compelling scientific information (and explained it calmly and clearly), the world would take steps to avert disaster.

  Many scientists, to their credit, recognized the difficulties they were facing, and grappled with how to communicate their knowledge effectively.4 Some tried to create institutional structures to support less reductionist modes of inquiry that analyzed broad patterns and the interactions between natural and social systems. While they were making some headway, a large part of Western society was rejecting that knowledge in favor of an empirically inadequate yet powerful ideological system. Even at the time, some recognized this system as a quasi-religious faith, hence the label market fundamentalism.

  Market fundamentalism—and its various strands and interpretations known as free market fundamentalism, neoliberalism, laissez-faire economics, and laissez-faire capitalism—was a two-pronged ideological system. The first prong held that societal needs were served most efficiently in a free market economic system. Guided by the “invisible hand” of the marketplace, individuals would freely respond to each other’s needs, establishing a net balance between solutions (“supply”) and needs (“demand”). The second prong of the philosophy maintained that free markets were not merely a good or even the best manner of satisfying material wants: they were the only manner of doing so that did not threaten personal freedom.

 
