
End Times: A Brief Guide to the End of the World


by Bryan Walsh


  That it didn’t was thanks in part to his efforts, and those of Paul Crutzen and Mario Molina, who shared the 1995 Nobel Prize in Chemistry with Rowland for their research connecting CFCs and ozone depletion. But science alone wasn’t enough. In the 1980s governments began to take steps to ban CFCs from use, efforts that accelerated after a shocking hole in the ozone over Antarctica was discovered in 1985. By 1987 the international community had adopted the Montreal Protocol, a global agreement to phase out ozone-depleting substances, including CFCs.43

  The Montreal Protocol was the first global treaty to be ratified by every country in the world, and it has been called “the single most successful international agreement to date” by former UN secretary-general Kofi Annan. Thanks to the protocol, depletion of the ozone layer was eventually halted. By 2050 the ozone layer should be mostly restored, preventing more than 1.6 million skin cancer deaths in the United States alone.

  The Montreal Protocol represents the ideal response to man-made existential risks: recognize the science, come up with a technical and political solution, implement it. And do it quickly—a little more than a decade passed between Rowland, Crutzen, and Molina’s work on the ozone-depleting effects of CFCs and the adoption of the Montreal Protocol.

  This is exactly what environmentalists wanted to do with climate change at Copenhagen—and they failed. Now all but the most optimistic environmental activists have given up hope that we could ever adopt a binding global treaty that would limit CO2 emissions as effectively as the Montreal Protocol did for CFCs. And the primary reasons lie not in political failures or public apathy, but in the “super-wicked” nature—an actual scientific term—of climate change.

  While CFCs were largely limited to refrigerants and aerosol sprays—a small part of the economy44—fossil fuels are everywhere. While the effects of ozone depletion could be seen immediately and undeniably—most of all in a giant ozone hole that had opened up over the South Pole—the effects of climate change are time delayed and muddled, easily lost amid the normal swings of extreme weather. While many of the major companies manufacturing CFCs had begun work on effective substitutes years before the Montreal Protocol was adopted45—which helped ensure that it cost the United States just $21 billion to comply with the treaty46—there is no simple and inexpensive technical substitution for oil, coal, and natural gas.

  The political leaders of 1987 weren’t wiser than those of 2009—it was under President Reagan, no one’s idea of an environmentalist, that the United States negotiated the Montreal Protocol. And it wasn’t that the companies that made billions off CFCs necessarily cared about the planet more than oil and coal companies—DuPont, the main manufacturer of CFCs, resisted the science linking the chemicals to ozone depletion for years. It was simply much, much easier to phase out CFCs than it has been to eliminate fossil fuels. And that will not change anytime soon.

  You may never have heard of Vaclav Smil, a professor emeritus at the University of Manitoba, but no less than Bill Gates believes he’s one of the smartest people in the world. The Microsoft cofounder once wrote: “I wait for new Smil books the way some people wait for the next Star Wars movie.”47 Smil has written dozens of books, but he is best known for his work on energy, especially the social and technological transition from one source of energy to another. That is essentially what the campaign against climate change amounts to—accelerating the transition from carbon-intensive fossil fuels to carbon-free sources like solar.

  Many environmentalists believe that the only thing preventing that transition is a lack of political will. In 2008, former vice president Al Gore called on the United States to make the transition to a carbon-free electricity system within ten years. More recently the Green New Deal championed by progressives like Democratic representative Alexandria Ocasio-Cortez called for moving the United States to 100 percent renewable electricity by 2035.48 “This goal is achievable, affordable, and transformative,” Gore said in 2008. “The future of human civilization is at stake.”

  Smil might agree with the latter statement—but not the former. And the reason why boils down to a simple factor: energy density. Fossil fuels have it—renewable energy sources, for the most part, do not.

  From our time as hunter-gatherers to our current lives as office workers, humans have repeatedly transitioned from one source of energy to another. Every time that transition took place, we shifted to an energy source that has greater density, meaning it can release more energy per unit of fuel. From human muscle power to harnessing animals, from burning wood to burning coal, from refining oil to splitting the atom through nuclear fission—with each transition we earned more bang for our energy buck. Energy transitions take a long time, much longer than you might expect—almost the entire nineteenth century passed before ancient biofuels like wood were replaced as the major global energy source by fossil fuels, mainly coal.49 That’s largely because when an energy source becomes dominant, an entire infrastructure is built to produce it and service it, and that infrastructure takes time to tear down and replace with something new—even if that something new is superior.
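
  To make that progression concrete, here is a minimal sketch in Python comparing the approximate specific energy of the fuels behind each transition. The figures are round, order-of-magnitude numbers from standard references, not values taken from Smil's work:

```python
# Approximate specific energy of common fuels, in megajoules per kilogram.
# Round, order-of-magnitude figures; real values vary with grade, moisture,
# and (for uranium) the reactor technology used.
SPECIFIC_ENERGY_MJ_PER_KG = {
    "air-dried wood": 16,
    "bituminous coal": 27,
    "crude oil": 42,
    "natural gas": 54,
    "uranium-235, fully fissioned": 80_000_000,
}

wood = SPECIFIC_ENERGY_MJ_PER_KG["air-dried wood"]
for fuel, energy in SPECIFIC_ENERGY_MJ_PER_KG.items():
    print(f"{fuel:<30} {energy:>12,} MJ/kg  ({energy / wood:,.0f}x wood)")
```

  Each step up that ladder is what "more bang for our energy buck" means in practice.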

  When it comes to the full transition to renewable energy, that means we could be in for a long wait. Wind and solar only began to be a measurable part of the global energy mix in the 1980s.50 Even today—despite years of rapid growth—non-hydro renewables account for less than 4 percent of global energy consumption.51 A 2018 report by the International Energy Agency found that as fast as renewables are growing—displacing fossil fuels like coal while they do so—it’s not nearly fast enough to prevent dangerous, perhaps even catastrophic levels of warming.52 This is exactly what Smil has predicted.

  And the renewable energy transition must overcome another obstacle, one that previous transitions didn’t face. Instead of shifting to a fuel source that has greater energy density than the current incumbent, the renewable revolution demands that we move to a less dense energy source. Wind and solar and biofuels require more land and produce less energy per unit of land than fossil fuels like oil and coal, which is a problem, since land is one of the few resources on this planet that are truly scarce. Renewables are also intermittent—wind turbines only produce power in the breeze, and solar panels only produce electricity when the sun shines—while fossil fuels can be stored and burned anytime and anywhere. Scientists are hard at work developing technologies that could economically store the energy produced by wind and solar so it can be used around the clock, but a barrel of crude oil is already a stable, movable battery in liquid form, one that is especially useful for powering transportation, which accounts for 14 percent of global greenhouse gas emissions.53 “Give me mass-scale storage and I don’t worry at all. With my wind and [solar] photovoltaics I can take care of everything,” Smil told Science in 2018. But, he added, “we are nowhere close to it.”54
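
  The "battery in liquid form" point can be checked with back-of-the-envelope arithmetic. The sketch below uses widely cited round numbers (about 6.1 gigajoules per barrel of oil equivalent, and about 250 watt-hours per kilogram for lithium-ion cells); these are illustrative assumptions of mine, not figures from the book:

```python
# How much lithium-ion battery would it take to hold the energy in one
# barrel of crude oil? Round, widely cited figures; illustrative only.
BARREL_OF_OIL_J = 6.1e9     # ~5.8 million BTU per barrel of oil equivalent
LI_ION_WH_PER_KG = 250      # typical specific energy of a lithium-ion cell

barrel_kwh = BARREL_OF_OIL_J / 3.6e6             # joules -> kilowatt-hours
battery_kg = barrel_kwh * 1_000 / LI_ION_WH_PER_KG

print(f"One barrel of crude holds roughly {barrel_kwh:,.0f} kWh")
print(f"Matching it with lithium-ion cells takes roughly {battery_kg / 1_000:.1f} tonnes")

# A barrel itself weighs about 136 kg, and engines waste much of the oil's
# energy as heat, which narrows the gap in practice. Even so, the mass
# difference suggests why liquid fuels are so hard to displace in transport.
```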

  Smil believes the transition away from fossil fuels must happen and will happen, both because he is worried about climate change and because while we’ve gotten better and better at digging them out of the Earth, fossil fuels are ultimately finite in a way that the wind and the sun aren’t. Fossil fuels also have dire costs besides climate change, like the air pollution they create that hastens the deaths of millions of people per year.55 (The smoggy skies of New Delhi or Beijing—or Los Angeles on a bad day—are largely the side effect of burning oil and coal.) There are security and performance benefits to switching to renewables, especially solar, which doesn’t require a sprawling and vulnerable electrical grid.

  All of these factors—in addition to the drive to reduce carbon emissions—explain why more solar power capacity was added globally in 2017 than fossil fuels and nuclear power combined.56 The price of solar power has fallen by an incredible 99 percent over the last four decades, driven by a mix of government policy, technological improvements, and simple experience.57 But the basic barriers Smil identified haven’t been broken. “We are an overwhelmingly fossil-fueled civilization,” he has written. “Given the slow pace of major resource substitutions, there are no practical ways to change this reality for decades to come.”58

  It’s not for lack of trying—not exactly, at least. According to a 2017 report from Stanford University, the world is spending about $746 billion a year on renewable energy, energy efficiency, and other low-carbon technologies.59 More money is probably earmarked to fight climate change than every other existential risk in this book—combined. We’re just not spending enough, if enough is defined as forcing an energy transition that is fast enough to eliminate the risk of dangerous climate change. The Stanford report calculated that if we want to keep global carbon concentrations below 450 ppm, the private sector needs to start spending $2.3 trillion a year, approximately three times what we spend now and more than the global budget for defense.60 And many scientists and environmentalists believe we actually need to bring carbon concentration down to 350 ppm to ensure our safety, closer to the level the Earth experienced during the preindustrial age, which would surely require trillions upon trillions of dollars more.

  Could we do it? Sure—if we decided to mobilize the entire global economy and population around the singular goal of cutting carbon as fast as possible. That’s what the United States did during World War II, only with the aim of defeating Japan and Germany. Washington nationalized much of the steel, coal, and transportation industries, and required car companies, for example, to stop making vehicles for domestic consumption and instead churn out planes and tanks. It worked because the government forced business to do its will, with the support of virtually the entire American population, a unity that is unimaginable today. Meat, sugar, gasoline, and more were strictly rationed. The United States became the arsenal of democracy and won the war chiefly because it could outproduce its enemies—though the labor of some physicists at Los Alamos played a part, too.

  Bill McKibben, the author and environmentalist who helps lead the 350.org climate activist movement, cited the example of World War II in a 2016 article for the New Republic.61 McKibben wrote: “If Nazis were the ones threatening destruction [from climate change] on such a global scale today, America and its allies would already be mobilizing for a full-scale war.” But World War II lasted only a few years, after which the government loosened the reins and the private sector got back to the business of business. A war against climate change, by contrast, would be open-ended. And Nazi Germany represented a clear existential threat to the world, at least the world as we valued it. Can we say climate change is an existential threat in the same way?

  That the natural world is degraded from what it once was is indisputable. If Christopher Columbus were to arrive in the Americas today aboard his Niña, Pinta, and Santa María, he would find 30 percent less biodiversity than in 1492.62 The global population of vertebrates declined by an estimated 52 percent between 1970 and 2010.63 The current extinction rate is 100 to 1,000 times higher than it has been during normal—meaning non-mass extinction—periods in biological history,64 with amphibians going extinct 45,000 times faster than the norm.65 One point eight trillion pieces of plastic trash, weighing 79,000 tons, now occupy an area three times the size of France in the Pacific Ocean66—and this Great Pacific Garbage Patch is expected to grow 22 percent by 2025.67 And of course there is the climate change that has happened and the climate change that is to come.

  Yet amid all this natural loss, human beings, on the aggregate, have largely thrived. That’s definitely true compared to the days of Columbus—the economist Angus Maddison estimates that between 1500 and 2008, global average per-capita gross domestic product (GDP) multiplied more than thirteenfold.68 Much of that gain has occurred in recent years, as globalization helped lift more than a billion people out of extreme poverty in the developing world since 1990 alone69—the same years when environmental damage, including the first signs of climate change, began compounding. Nor is this simply a story of GDP. The Human Development Index (HDI) is a composite statistic developed by the Pakistani economist Mahbub ul Haq to track life expectancy and education, as well as per capita income. Graph every country for HDI since 1990 and you’ll see—with the occasional ups and downs of individual nations—a decidedly rising trend that shows no signs of reversing.70
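
  For the curious, the sketch below shows how an HDI-style number is assembled, following my reading of the post-2010 UNDP methodology: each dimension is normalized against fixed "goalposts," and the three indices are combined with a geometric mean. Treat the goalpost values as approximations, and note that the inputs are invented for illustration:

```python
from math import log

def hdi(life_expectancy, mean_years_school, expected_years_school, gni_per_capita):
    """Approximate post-2010 UNDP Human Development Index.

    Each dimension is scaled to [0, 1] against fixed "goalposts"
    (the official method also caps each index at 1, omitted here),
    then the three are combined with a geometric mean.
    """
    health = (life_expectancy - 20) / (85 - 20)
    education = (mean_years_school / 15 + expected_years_school / 18) / 2
    income = (log(gni_per_capita) - log(100)) / (log(75_000) - log(100))
    return (health * education * income) ** (1 / 3)

# Invented, illustrative inputs (not real country data):
print(round(hdi(78, 12, 16, 45_000), 3))   # ~0.886
```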

  Perhaps a better way to understand that picture is to look at those years when economic growth temporarily halted, and rolled backward: the 2008 global financial crisis and its immediate aftermath. The Great Recession led to a loss of more than $2 trillion in economic growth globally, a reduction of nearly 4 percent.71 By one estimate the crisis cost every American nearly $70,000 in lifetime income on average.72 But that’s just money. The real human cost of the financial crisis was in broken families, sickness, even death. One study found that the crisis was associated with at least 260,000 excess cancer-related deaths around the world, many of them treatable.73 As unemployment rates rose, so did suicide rates—a correlation that has been seen in past financial crises.74 Nothing we’ve experienced with climate change or any other environmental threat compares—yet—to the sheer suffering that was inflicted on the world when the wheels of growth temporarily stopped. “We’re still dealing with the political and economic blowback from the 2008 financial crisis, and everything about it makes climate change seem like a walk in the park,” Ted Nordhaus told me.

  This picture—a degrading natural world poised against a generally improving humanity—has a name: the environmentalist’s paradox. In 2010, a team of researchers led by Ciara Raudsepp-Hearne at McGill University tried to figure out what could explain it. They came up with four hypotheses. It could be that humanity only appears to be better off—but the numbers, including those, like the HDI, that measure more than just raw economic growth, belie the conclusion that improved human well-being is an illusion. It may be that the remarkable improvements in food production over the past century are so important that they simply outweigh any environmental drawbacks, however severe. For most of human history deadly famine was just one bad harvest away. That’s no longer the case for most of the world.

  It may be that technological advances and increased wealth make us less dependent on a healthy ecosystem. That third hypothesis—a brand of technoptimism—is implicit in climate economics. The Stern Review, put out by the British economist Lord Nicholas Stern in 2006, is one of the most exhaustive explorations ever completed of how climate change might damage the global economy. It’s also one of the most pessimistic. Yet even the most extreme scenario that Stern outlines has climate change triggering a 32 percent decline in economic output relative to the baseline in 2200. That sounds like a lot, and it is—but since Stern also assumes that per capita economic growth will continue through all of those years, humans at the dawn of the twenty-third century will still be nine times richer per person than they were when the Stern Review was published.75 In other words, climate change will make us poorer than we would be in its absence—but we’ll keep getting richer, much richer, than we are today.
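
  The arithmetic behind that counterintuitive result is easy to verify. A minimal sketch, assuming a constant per-capita growth rate of 1.35 percent a year between 2006 and 2200 (an illustrative round number, not Stern's actual growth path):

```python
# Compound per-capita growth from 2006 to 2200, then apply the Stern
# Review's worst-case 32 percent decline relative to baseline.
# The 1.35 percent growth rate is an illustrative assumption, not a
# figure taken from the review itself.
GROWTH_RATE = 0.0135
YEARS = 2200 - 2006          # 194 years
CLIMATE_DAMAGE = 0.32        # worst-case output decline vs. baseline

baseline = (1 + GROWTH_RATE) ** YEARS
damaged = baseline * (1 - CLIMATE_DAMAGE)
print(f"No-damage baseline: {baseline:.1f}x richer per person than 2006")
print(f"After the 32% hit:  {damaged:.1f}x richer per person than 2006")
```

  Even a severe percentage loss, applied to an exponentially growing baseline, still leaves the endpoint far above today, which is exactly why this hypothesis reads as technoptimism.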

  If that really is how climate change will unfold, then it is not an existential risk. The natural world will continue to suffer and degrade, and we’ll be affected—some of us far more than others, especially those who have done the least to contribute to the problem—but humanity on the whole will be okay. More than okay, at least by our generally grim historical standards.

  But there’s a fourth possible answer to the environmentalist’s paradox: the worst is yet to come. In this hypothesis, the material benefits that humanity has enjoyed over the last few decades have been purchased on overextended carbon credit, and just as previous credit bubbles have inevitably led to painful economic contractions, so it will be with our subprime environmental loans. In this scenario the desire of 7-plus billion human beings for an American-style middle-class lifestyle, with all the energy and meat and pollution that entails, is not sustainable, environmentally or economically. The bill will come due, and when it does, no degree of innovation will save us from collapse. Climate change in this scenario is very much an existential risk. Not only are there environmental limits that we can’t innovate our way around, but there are also tipping points for the climate—and we exceed them at our existential peril.

  The uncertainty at the core of the environmentalist’s paradox is emblematic of global warming science more generally. Why provide only one answer when you can have four—or more? Climate scientists try to predict the future by constructing models that anticipate both how humans will react to warming—by reducing carbon emissions or substituting energy sources—and how the climate itself will respond to those efforts. Those models spit out an array of different scenarios for the future—more than a thousand in the IPCC’s most recent assessment76—that are further whittled down into projections of a range of possible future climates.

  Skeptics often seize on the uncertainty as evidence that climate change isn’t real or isn’t something we need to worry about. They’re dead wrong—the physical basics behind the science of man-made climate change are well established, and existing climate models have been confirmed by real-world warming over the past several decades. But in part to avoid giving skeptics more ammunition, scientists tend to emphasize the most likely, middle-of-the-road projections for future warming, rather than the extremes.

  In its 2014 assessment of climate science, the IPCC reported that without significant emissions reductions, we’re most likely to experience warming in the range of 7 degrees Fahrenheit above preindustrial levels by the end of the century.77 That would be almost twice the 3.6-degree rise that the UN climate system was originally formed to prevent, all the way back in 1992. It would be a climate warmer than anything Homo sapiens has ever experienced before, warmer than it was during the Pliocene epoch more than 3 million years ago, when sea levels were as much as eighty-two feet higher than they are today.78 Would that represent an existential threat? Perhaps—although some of the worst impacts, like the coastline-reshaping rise in sea level, would still take centuries more to play out, giving us time to adapt, perhaps in ways we can’t begin to anticipate now. “Will the planet still be around then, too?” Gernot Wagner asked himself when I visited his office in Cambridge, Massachusetts. “Yes. Will society as we know it? No. The rich will be fine. They’ll buy a second air conditioner and fly their private jet to Aspen. The poor as usual will suffer extraordinarily more than the rich. But is it an existential risk for them? No.”

 
