THE COLLAPSE OF WESTERN CIVILIZATION
Choice manifests itself in society in small
increments and moment-to-moment
decisions as well as in loud
dramatic struggles.
—Lewis Mumford,
Technics and Civilization (1934)
THE COLLAPSE OF WESTERN CIVILIZATION
A View from the Future
Naomi Oreskes and Erik M. Conway
COLUMBIA UNIVERSITY PRESS
NEW YORK
Columbia University Press
Publishers Since 1893
New York Chichester, West Sussex
cup.columbia.edu
Copyright © 2014 Naomi Oreskes and Erik M. Conway
All rights reserved
E-ISBN 978-0-231-53795-7
Library of Congress Cataloging-in-Publication Data
Oreskes, Naomi.
The collapse of western civilization : a view from the future / Naomi Oreskes and Erik M. Conway.
pages cm
Includes bibliographical references.
ISBN 978-0-231-16954-7 (pbk. : alk. paper) — ISBN 978-0-231-53795-7 (ebook)
1. Civilization, Western—Forecasting. 2. Civilization, Western—21st century. 3. Science and civilization. 4. Progress—Forecasting. 5. Twenty-first century—Forecasts. I. Conway, Erik M., 1965– II. Title.
CB158.O64 2014
909'.09821—dc23
2013048899
A Columbia University Press E-book.
COVER DESIGN: Milenda Nan Ok Lee
COVER ART: Colin Anderson © Getty Images
References to websites (URLs) were accurate at the time of writing. Neither the authors nor Columbia University Press are responsible for URLs that may have expired or changed since the manuscript was prepared.
This book is based on the essay of the same name that was originally published in Daedalus (Winter 2013), the journal of The American Academy of Arts and Sciences. That essay has been slightly expanded and modified from its original publication, and the lexicon and interview are new to this book.
Contents
ACKNOWLEDGMENTS
INTRODUCTION
1. The Coming of the Penumbral Age
2. The Frenzy of Fossil Fuels
3. Market Failure
Epilogue
Lexicon of Archaic Terms
Interview with the Authors
NOTES
ABOUT THE AUTHORS
MAPS
Amsterdam
Bangladesh
New York City
Florida
Acknowledgments
We are grateful to Robert Fri, Stephen Ansolabehere, and the staff at the American Academy of Arts and Sciences for commissioning the original version of this work; to the Institute of Advanced Studies at the University of Western Australia where that version was first written; and to Patrick Fitzgerald, Roy Thomas, Milenda Lee, and the diligent and creative team at Columbia University Press for turning it into a book.
We also thank our agent, Ayesha Pande, without whom our work would be written but not necessarily read; Kim Stanley Robinson for inspiration; and the audience member at the Sydney Writers’ Festival who asked one of us: “Will you write fiction next?”
Introduction
Science fiction writers construct an imaginary future; historians attempt to reconstruct the past. Ultimately, both are seeking to understand the present. In this essay, we blend the two genres to imagine a future historian looking back on a past that is our present and (possible) future. The occasion is the tercentenary of the end of Western culture (1540–2093); the dilemma being addressed is how we—the children of the Enlightenment—failed to act on robust information about climate change and knowledge of the damaging events that were about to unfold. Our historian concludes that a second Dark Age had fallen on Western civilization, in which denial and self-deception, rooted in an ideological fixation on “free” markets, disabled the world’s powerful nations in the face of tragedy. Moreover, the scientists who best understood the problem were hamstrung by their own cultural practices, which demanded an excessively stringent standard for accepting claims of any kind—even those involving imminent threats. Here, our future historian, living in the Second People’s Republic of China, recounts the events of the Period of the Penumbra (1988–2093) that led to the Great Collapse and Mass Migration (2073–2093).
1
The Coming of the Penumbral Age
The nation formerly known as the Netherlands. Once referred to as the “Low Countries” of Europe, much of this nation’s land area had been reclaimed from the sea by extensive human effort from the sixteenth through the twentieth centuries. The unexpectedly rapid rise of the seas during the Great Collapse overwhelmed the Dutch. The descendants of the survivors largely reside in the Nordo-Scandinavian Union, while the rusting skyscrapers of their drowned cities are a ghostly reminder of a glorious past.
In the prehistory of “civilization,” many societies rose and fell, but few left as clear and extensive an account of what happened to them and why as the twenty-first-century nation-states that referred to themselves as Western civilization. Even today, two millennia after the collapse of the Roman and Mayan empires and one millennium after the end of the Byzantine and Inca empires, historians, archaeologists, and synthetic-failure paleoanalysts have been unable to agree on the primary causes of those societies’ loss of population, power, stability, and identity. The case of Western civilization is different because the consequences of its actions were not only predictable, but predicted. Moreover, this technologically transitional society left extensive records both in twentieth-century-style paper and in twenty-first-century electronic formats, permitting us to reconstruct what happened in extraordinarily clear detail. While analysts differ on the exact circumstances, virtually all agree that the people of Western civilization knew what was happening to them but were unable to stop it. Indeed, the most startling aspect of this story is just how much these people knew, and how unable they were to act upon what they knew. Knowledge did not translate into power.
For more than one hundred years before its fall, the Western world knew that carbon dioxide (CO2) and water vapor absorbed heat in the planetary atmosphere. A three-phase Industrial Revolution led to massive release of additional CO2, initially in the United Kingdom (1750–1850); then in Germany, the United States, the rest of Europe, and Japan (1850–1980); and finally in China, India, and Brazil (1980–2050). (Throughout this essay, I will use the nation-state terms of the era; for the reader not familiar with the political geography of Earth prior to the Great Collapse, the remains of the United Kingdom can be found in present-day Cambria; Germany in the Nordo-Scandinavian Union; and the United States and Canada in the United States of North America.) At the start of the final phase, in the mid-twentieth century, some physical scientists—named as such due to the archaic Western convention of studying the physical world in isolation from social systems—recognized that the anthropogenic increment of CO2 could theoretically warm the planet. Few were concerned; total emissions were still quite low, and in any case, most scientists viewed the atmosphere as an essentially unlimited sink. Through the 1960s, it was often said that “the solution to pollution is dilution.”
Things began to change as planetary sinks approached saturation and “dilution” was shown to be insufficient. Some chemical agents had extremely powerful effects even at very low concentrations, such as organochlorine insecticides (most famously the pesticide dichlorodiphenyltrichloroethane, or DDT) and chlorinated fluorocarbons (CFCs). The former were shown in the 1960s to disrupt reproductive function in fish, birds, and mammals; scientists correctly predicted in the 1970s that the latter would deplete the stratospheric ozone layer. Other saturation effects occurred because of the huge volume of materials being released into the planetary environment. These materials included sulfates from coal combustion, as well as CO2 and methane (CH4) from a host of sources including fossil fuel combustion, concrete manufacture, deforestation, and then-prevalent agricultural techniques, such as growing rice in paddy fields and producing cattle as a primary protein source.
In the 1970s, scientists began to recognize that human activities were changing the physical and biological functions of the planet in consequential ways—giving rise to the Anthropocene Period of geological history.1 None of the scientists who made these early discoveries was particularly visionary: many of the relevant studies were by-products of nuclear weapons testing and development.2 It was the rare man—in those days, sex discrimination was still widespread—who understood that he was in fact studying the limits of planetary sinks. A notable exception was the futurist Paul Ehrlich, whose book The Population Bomb was widely read in the late 1960s but was considered to have been discredited by the 1990s.3
Nonetheless, enough research accumulated to provoke some response. Major research programs were launched and new institutions created to acknowledge and investigate the issue. Culturally, celebrating the planet was encouraged on an annual Earth Day (as if every day were not an Earth day!), and in the United States, the establishment of the Environmental Protection Agency formalized the concept of environmental protection. By the late 1980s, scientists had recognized that concentrations of CO2 and other greenhouse gases were having discernible effects on planetary climate, ocean chemistry, and biological systems, threatening grave consequences if not rapidly controlled. Various groups and individuals began to argue for the need to limit greenhouse gas emissions and begin a transition to a non-carbon-based energy system.
Historians view 1988 as the start of the Penumbral Period. In that year, world scientific and political leaders created a new, hybrid scientific-governmental organization, the Intergovernmental Panel on Climate Change (IPCC), to communicate relevant science and form the foundation for international governance to protect the planet and its denizens. A year later, the Montreal Protocol on Substances that Deplete the Ozone Layer became a model for international governance to protect the atmosphere, and in 1992, based on that model, world nations signed the United Nations Framework Convention on Climate Change (UNFCCC) to prevent “dangerous anthropogenic interference” in the climate system. The world seemed to recognize the crisis at hand, and was taking steps to negotiate and implement a solution.
But before the movement to change could really take hold, there was backlash. Critics claimed that the scientific uncertainties were too great to justify the expense and inconvenience of eliminating greenhouse gas emissions, and that any attempt to solve the problem would cost more than it was worth. At first, just a handful of people made this argument, almost all of them from the United States. (In hindsight, the self-justificatory aspects of the U.S. position are obvious, but they were not apparent to many at the time.) Some countries tried but failed to force the United States into international cooperation. Other nations used inertia in the United States to excuse their own patterns of destructive development.
By the end of the millennium, climate change denial had spread widely. In the United States, political leaders—including the president, members of Congress, and members of state legislatures—took denialist positions. In Europe, Australia, and Canada, the message of “uncertainty” was promoted by industrialists, bankers, and some political leaders. Meanwhile, a different version of denial emerged in non-industrialized nations, which argued that the threat of climate change was being used to prevent their development. (These claims had much less environmental impact, though, because these countries produced few greenhouse gas emissions and generally had little international clout.)
There were notable exceptions. China, for instance, took steps to control its population and convert its economy to non-carbon-based energy sources. These efforts were little noticed and less emulated in the West, in part because Westerners viewed Chinese population control efforts as immoral, and in part because the country’s exceptionally fast economic expansion led to a dramatic increase in greenhouse gas emissions, masking the impact of renewable energy. By 2050, this impact became clear as China’s emissions began to fall rapidly. Had other nations followed China’s lead, the history recounted here might have been very different.4
But as it was, by the early 2000s, dangerous anthropogenic interference in the climate system was under way. Fires, floods, hurricanes, and heat waves began to intensify. Still, these effects were discounted. Those in what we might call active denial insisted that the extreme weather events reflected natural variability, despite a lack of evidence to support that claim. Those in passive denial continued life as they had been living it, unconvinced that a compelling justification existed for broad changes in industry and infrastructure. The physical scientists studying these steadily increasing disasters did not help quell this denial, and instead became entangled in arcane arguments about the “attribution” of singular events. Of course the threat to civilization inhered not in any individual flood, heat wave, or hurricane, but in the overall shifting climate pattern, its impact on the cryosphere, and the increasing acidification of the world ocean. But scientists, trained as specialists focused on specific aspects of the atmosphere, hydrosphere, cryosphere, or biosphere, found it difficult to articulate and convey this broad pattern.
The year 2009 is viewed as the “last best chance” the Western world had to save itself, as leaders met in Copenhagen, Denmark, to try, for the fifteenth time since the UNFCCC was written, to agree on a binding, international law to prevent disruptive climate change. Two years before, scientists involved in the IPCC had declared anthropogenic warming to be “unequivocal,” and public opinion polls showed that a majority of people—even in the recalcitrant United States—believed that action was warranted. But shortly before the meeting, a massive campaign was launched to discredit the scientists whose research underpinned the IPCC’s conclusion. This campaign was funded primarily by fossil fuel corporations, whose annual profits at that time exceeded the GDPs of most countries.5 (At the time, most countries still used the archaic concept of a gross domestic product, a measure of consumption, rather than the Bhutanian concept of gross domestic happiness to evaluate well-being in a state.) Public support for action evaporated; even the president of the United States felt unable to move his nation forward.
Meanwhile, climate change was intensifying. In 2010, record-breaking summer heat and fires killed more than 50,000 people in Russia and resulted in more than $15 billion (in 2009 USD) in damages. The following year, massive floods in Australia affected more than 250,000 people. In 2012, which became known in the United States as the “year without a winter,” winter temperature records, including for the highest overnight lows, were shattered—something that should have been an obvious cause for concern. A summer of unprecedented heat waves and loss of livestock and agriculture followed. The “year without a winter” moniker was misleading, as the warm winter was largely restricted to the United States, but in 2023, the infamous “year of perpetual summer” lived up to its name, taking 500,000 lives worldwide and costing nearly $500 billion in losses due to fires, crop failure, and the deaths of livestock and companion animals.
The loss of pet cats and dogs garnered particular attention among wealthy Westerners, but what was anomalous in 2023 soon became the new normal. Even then, political, business, and religious leaders refused to accept that what lay behind the increasing destructiveness of these disasters was the burning of fossil fuels. More heat in the atmosphere meant more energy had to be dissipated, manifesting as more powerful storms, bigger deluges, deeper droughts. It was that simple. But a shadow of ignorance and denial had fallen over people who considered themselves children of the Enlightenment. It is for this reason that we now know this era as the Period of the Penumbra.
It is clear that in the early twenty-first century, immediate steps should have been taken to begin a transition to a zero-net-carbon world. Staggeringly, the opposite occurred. At the very time that the urgent need for an energy transition became palpable, world production of greenhouse gases increased. This fact is so hard to understand that it calls for a closer look at what we know about this crucial juncture.
2
The Frenzy of Fossil Fuels
Bangladesh. Among North Americans, Bangladesh—one of the poorest nations of the world—served as an ideological battleground. Self-described “Climate Hawks” used it to press moral demands for greenhouse gas reductions so that it would not suffer inundation, while so-called “Climate Realists” insisted that only economic growth powered by cheap fossil fuels would make Bangladeshis wealthy enough to save themselves. In reality, “unfettered economic growth” made a handful of Bangladeshis wealthy enough to flee. The poor were left to the floods.