End Times: A Brief Guide to the End of the World

by Bryan Walsh


  A year and a half after my trip to the Arctic, I traveled to Copenhagen for the 2009 UN climate change summit. The annual meeting brings together the more than 190 countries that are party to the United Nations Framework Convention on Climate Change (UNFCCC), the environmental treaty that launched international climate diplomacy in 1992. If it was in Greenland that I witnessed climate change with my own eyes, it was in Copenhagen that I began to understand that we would not solve climate change—not in the way we hoped, at least.

  It was the second UN climate summit I’d attended as the environment correspondent for Time magazine. The first, two years earlier, had been held on the considerably more pleasant Indonesian island of Bali, where if nothing else the sweltering tropical temperatures were proof that we all really were working on global warming. But Bali had been a bust. In one of his earliest acts in office, then-president George W. Bush withdrew the United States from the Kyoto Protocol, the first global deal mandating specific reductions in greenhouse gas emissions. For the remainder of their eight years in office, Bush’s team acted as spoilers, blocking most meaningful efforts to create a stronger international climate pact and ensuring that summits like Bali were largely exercises in futility. Though I did enjoy witnessing stuffy UN technocrats sweating in the Indonesian batik shirts that were included in the summit’s formal dress code.19

  But the Copenhagen conference was going to be different. Climate change had become a global priority. Everyone, it seemed, from activists to corporations, wanted to be seen as good and green. Most promising of all, there was a new U.S. president, Barack Obama, who had come into office determined to restore American leadership on climate change. Obama would personally attend the conference, along with more than one hundred other world leaders.20 By the time they were scheduled to arrive in the last days of the summit, diplomats would have hammered out a successor treaty to the expiring Kyoto Protocol, one that would have committed the world—including both the United States, which had left Kyoto, and major developing nations like China, which had been exempt from the treaty’s mandated cuts—to reducing carbon emissions. Copenhagen was set to be the place where we turned the tide on climate change—or so I was told again and again that week, as I trudged through the city’s snowy streets from event to event.

  It didn’t work out that way. By the time Obama arrived for the final official day of the summit, the talks had all but collapsed. The major sticking point was the responsibility of those big developing nations. UN climate talks had always divided the world into developed and developing nations. In the UN’s phrase, those two blocs had “common but differentiated responsibilities” toward climate change action. The common part was that global warming was a global problem with global causes and global effects, and therefore every country had a common responsibility to do something about it. The differentiated part recognized that nations that had industrialized first—the bloc of developed countries—had historically contributed much more to warming, simply because they had been emitting carbon at greater levels and for a longer period of time. That meant they had a responsibility to cut carbon first, before developing nations did.

  It sounded fair, and it was, if you were comparing the United States with its 16.5 tons of CO2 per person to, say, Rwanda with its 0.1 tons of CO2 per capita.21 But the bloc of developing nations included not just desperately poor countries but major industrializing powers like China that were growing explosively, with all the carbon emissions that growth demanded. China had passed the United States as the world’s top carbon emitter as early as 2005, and by the Copenhagen summit in 2009 China’s emissions were 65 percent greater than America’s.22 New administration or not, U.S. diplomats still insisted that a new climate deal had to include limits on those major developing nations, but the representatives of those nations refused to accept any restrictions on their countries’ right to emit greenhouse gases in the future, pointing to the fact that developed countries had only become developed by treating the atmosphere as a free carbon dump.

  Each side had a point: developed nations like the United States had emitted the most carbon historically, but major developing nations like China were projected to produce the bulk of future carbon emissions23—the only kind of carbon emissions, after all, that a treaty could try to restrict. The result was paralysis. As the days and nights in Copenhagen ticked on without progress, it seemed possible that the entire UN climate system, which had been in place since the original UNFCCC treaty had been negotiated at the Earth Summit in Rio de Janeiro in 1992, would disintegrate. And if that happened, there was no plan B for the planet.

  It is to President Obama’s credit that he helped salvage the day. While reporters stalked the halls of Copenhagen’s Bella Center, trolling for rumors and leaks, Obama personally negotiated with other world leaders, at one point crashing a meeting between China, Brazil, and India.24 Together they finally managed to hammer out what became known as the Copenhagen Accord—an agreement, essentially, to agree about the importance of fighting climate change and allow individual countries to make their own pledges to mitigate carbon emissions in the future.25 It sidestepped the common but differentiated responsibilities question by allowing nations to decide their own responsibilities, while putting in place a global system that would regularly check the progress of those pledges.

  The Copenhagen Accord fully satisfied no one—especially not the most ardent environmentalists or the representatives of island nations that would be swallowed up by rising seas—but it kept the game going. Copenhagen ultimately helped pave the way for a better outcome six years later at the 2015 UN climate summit in Paris, where all major emitters—including China, which now emits more CO2 than the United States and the European Union (EU) combined26—made voluntary pledges to reduce carbon emissions and work to keep global temperatures from rising more than 2.7 degrees above preindustrial levels.27

  By 2015 I had been promoted to become Time’s international editor in New York, so I missed the UN climate summit in Paris that year, which is a shame because I hear the French capital is lovely in the fall, and because it might have been nice to experience just a measure of optimism as a climate journalist. Obama was certainly optimistic after Paris. “We came together around the strong agreement the world needed,” he said from the White House after the summit concluded. “We met the moment.”28

  But I think it’s the mood of Copenhagen in 2009—dark, despairing, wintry—that better captures the state of the global fight against climate change, even years later. That last night in Denmark, after Obama left on Air Force One—to get ahead of a blizzard bearing down on the east coast,29 one of the many weather ironies of an ice-cold global warming summit—it fell to the delegates at the Conference of the Parties to debate the finer points of the Copenhagen Accord, vote on the thing, and let us all go home. And yet the talks stretched on for hours upon hours, through the night and into the following morning, while bleary-eyed reporters watched. Final agreement remained elusive as the proceedings bogged down in UN legalese. I remember one delegate, from Saudi Arabia of all places, finally exploding in frustration: “This is without exception the worst plenary I have ever attended, including the management of the process, the timing, everything.”30

  And this was at a conference where almost everyone attending believed that climate change was real and must be addressed. The point was hammered home again and again: the time to act was now. We could delay no longer. Our very future was at stake. Yet inside the Bella Center, inside the corridors of power, where it mattered, no one really wanted to do all that much to slow climate change. Not if it carried any political or economic risk. Not if it could cost them their job, or restrict their citizens in almost any way. Climate change was important, sure—but not that important. Not if you judged political and business leaders—and ordinary people, too—on what they did, not what they said.

  Copenhagen represented the death of a specific kind of environmental dream: that we would come together and in a single broad stroke legislate the planetary threat of climate change out of existence. We would not save the Earth at a conference, after all. That was true as well of 2015’s Paris Agreement, which allowed countries to pick and choose their own carbon emission targets, and which unlike the Kyoto Protocol was nonbinding. Just how nonbinding became clear less than a year after the agreement was signed, when a climate-change-denying Donald Trump became president. True to a pledge he made during his campaign, on June 1, 2017, Trump announced his intention to withdraw the United States from the Paris Agreement.31

  Even before Trump, however, all the world’s good intentions weren’t nearly good enough to prevent dangerous global warming. Tally up the pledges made by countries at Paris—including the United States—and experts say that greenhouse gas emissions in 2030 will exceed the level needed to keep eventual global temperature rise below 3.6 degrees by 12 to 14 billion tons of CO2.32 A 3.6-degree increase has long been considered a rough red line for dangerous climate change—while not precise, scientists believe that anything greater than that is more likely to lock in devastating environmental damage, including enough sea level rise to eventually erase entire island nations.33 The gap between what we need to do and what we will actually do is likely even greater—as of 2017 not a single major industrial country was on track to meet its Paris pledges.34 After three years of remaining roughly flat, global carbon emissions rose 1.6 percent in 2017 and an estimated 2.7 percent in 2018, reaching a historic high.35 While clean renewable energy sources like solar are growing rapidly, so is overall demand for energy—and the vast majority of that energy demand is still being met by fossil fuels, especially in developing countries.36 Polluting fuels like coal and oil still make up about 80 percent of global energy consumption, roughly the same share as more than thirty years ago,37 before climate change had begun to penetrate the public consciousness.

  Sixteen years ago, a climate scientist named Ken Caldeira calculated that the world would need to add about 1,100 megawatts’ worth of clean-energy capacity—roughly what’s produced by a single nuclear power plant—every day between 2000 and 2050 to avoid dangerous climate change.38 In 2018 Caldeira checked on our progress and found that we were falling more than 85 percent short of where we needed to be.
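
  For readers who want to check that math, here is a rough, purely illustrative sketch of what Caldeira's pace implies, using only the figures quoted above (1,100 megawatts a day over fifty years, and a shortfall of "more than 85 percent," treated here as exactly 85 percent):

```python
# Back-of-the-envelope check of Ken Caldeira's clean-energy arithmetic,
# using only the figures quoted in the text above (illustrative, not authoritative).

MW_PER_DAY_NEEDED = 1_100   # clean capacity to be added per day, 2000-2050
DAYS_PER_YEAR = 365
YEARS = 50                  # 2000 through 2050
SHORTFALL = 0.85            # "falling more than 85 percent short" (2018 estimate)

total_mw_needed = MW_PER_DAY_NEEDED * DAYS_PER_YEAR * YEARS
total_tw_needed = total_mw_needed / 1_000_000   # 1 terawatt = 1,000,000 megawatts

implied_actual_pace = MW_PER_DAY_NEEDED * (1 - SHORTFALL)

print(f"Clean capacity implied by Caldeira's pace: ~{total_tw_needed:.0f} TW by 2050")
print(f"Pace implied by an 85 percent shortfall: ~{implied_actual_pace:.0f} MW per day")
# Prints roughly 20 TW needed, versus a real-world pace on the order of 165 MW per day.
```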

  In an article that same year in MIT Technology Review, the journalist James Temple also calculated that at the pace we’re on, it would take not three decades to remake our energy system, but nearly four centuries.39 In a 2018 report from the IPCC, scientists concluded that net greenhouse gas emissions had to reach zero by midcentury in order to have a decent shot at keeping temperature increase below 2.7 degrees, which means we have at most fourteen years of current emissions levels before that becomes impossible.40 It has been more than a century since the basics of the greenhouse effect were discovered by the Swedish chemist Svante Arrhenius, and nearly thirty years since the 1992 Earth Summit in Rio de Janeiro. Yet we’re being borne backward against the current of our carbon emissions. Optimism in the light of these facts is ignorance.
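
  The "fourteen years" figure is, at bottom, a carbon budget divided by an emissions rate. A rough sketch of that division, assuming global CO2 emissions of about 37 billion tons a year (an outside estimate for 2018, not a figure taken from this chapter):

```python
# Rough carbon-budget arithmetic behind "fourteen years of current emissions levels."
# The 37-billion-ton annual figure is an assumption (approximate global CO2 emissions
# around 2018); it does not come from the text.

ANNUAL_EMISSIONS_GT = 37   # assumed global CO2 emissions, billions of tons per year
YEARS_OF_HEADROOM = 14     # the window cited from the 2018 IPCC report

implied_budget_gt = ANNUAL_EMISSIONS_GT * YEARS_OF_HEADROOM
print(f"Implied remaining carbon budget: ~{implied_budget_gt} billion tons of CO2")
# Prints roughly 518, i.e. a budget on the order of 500 billion tons of CO2 for keeping
# warming below 2.7 degrees Fahrenheit (1.5 degrees Celsius).
```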

  Part of the problem is that despite the apocalyptic rhetoric, we treat climate change not as the existential risk it may be, but as the inconvenience we hope it will be. This is true not only of our world leaders, but also of many of the environmentalists, activists, and scientists who passionately care about climate change. Ted Nordhaus, the executive director of the Breakthrough Institute, an environmental and energy think tank with a sometimes contrarian bent, has noted a disconnect between the apocalyptic rhetoric of hard-core climate activists and what they actually do with their own lives. While they warn that climate change is an existential threat and that the world will quite literally end if we don’t take radical actions immediately, they don’t act as if they’re in a fight to the death. They fly to international climate conferences like the one I attended in Copenhagen; they eat meat (if organic and grass-fed); and they oppose nuclear power, which produces virtually zero carbon emissions, because it conflicts with conventional environmental opinion. That’s not how you would act if you truly believed that a climate asteroid was on its way. “For all the discourse about climate change being an existential threat,” Nordhaus told me, “when you look at what people actually do, it doesn’t look like that at all.”

  That might seem like hypocrisy, but it’s more accurately a measure of how deeply challenging climate change is, how embedded the causes of global warming are within our daily lives. It’s easy to blame climate-change-denying politicians or big oil companies—and there’s no doubt they play an outsize role in opposing commonsense climate solutions, even to the point of spreading outright lies. But the real fault lies in ourselves, and in the systems we’ve built.

  Copenhagen taught me one thing: we will not, as countries or as individuals, make the radical changes in our lifestyles and our energy use that would be needed to fully eliminate the risk from global warming using current technology. That means we will almost certainly blow past the various red lines climate scientists have set. We won’t keep atmospheric carbon concentrations below the 450 parts per million that climate diplomats have aimed for, not when levels have already passed 410 ppm and are rising by the year. We won’t keep global temperatures from rising above the 2.7-degree limit that the Paris Agreement calls for, and we’ll probably miss the 3.6-degree limit established by the original UN climate agreement. Without an ethical or political or technological revolution, seas will rise, forests will burn, superstorms will strike, tens of millions of people will be displaced, and who knows how many people will die. Whatever we might say about it, whatever our fears are, our ultimate attitude toward climate change has been this: we hope it won’t be that bad. But hope isn’t a policy—not when the world might be at stake.

  We can’t say we weren’t warned. All the way back in 1861, the Irish physicist John Tyndall established that gases like CO2 could warm the planet. By the 1950s, scientists were noticing a pronounced global warming trend, and began to build crude computer models of the climate. In 1965, a panel of experts on the President’s Science Advisory Committee (the forerunner of today’s President’s Council of Advisors on Science and Technology, or PCAST) reported to President Johnson that growing levels of atmospheric CO2 produced by the burning of fossil fuels would almost certainly cause significant changes in the climate, changes that “could be deleterious from the point of view of human beings.” In the decades that followed there would be millions of hours and hundreds of millions of dollars spent further refining climate science, but the basics were already established.

  So was the political reaction: don’t do much. The idea of deliberately reducing fossil fuel emissions wasn’t even suggested to Johnson in that 1965 report. Instead his advisers thought that spreading reflective particles over 5 million square miles of ocean, in order to reflect one percent of incoming sunlight back into space, might do the trick. Nothing came of it, though this was one of the first times anyone raised the possibility of what is now called geoengineering, the attempt to intentionally shape the Earth’s climate through direct technological intervention—a last-ditch option we’ll return to later. It was also an idea typical of the high nuclear age, a time when scientists were far less humble about tinkering with the machinery of the planet. In 1971, the Soviet Union even attempted to use three underground nuclear explosions to build a massive canal in an effort to reconstruct the Volga River basin. It didn’t work, though the effort had the unexpected side effect of convincing the United States to launch a climate change research program of its own.

  From the beginning, everything about the way climate change worked as a physical process made it easy to ignore politically. One ton of carbon burned will on average lead to approximately 0.0000000000027 degrees Fahrenheit of warming, according to a 2009 study by scientists at Concordia University in Montreal.41 There are other greenhouse gases, like methane, and thousands of other factors—water vapor, ocean heat circulation, El Niños—that influence changes in the climate. But at its foundation, man-made climate change really is that straightforward. And that’s a problem, because carbon is everywhere in the modern world.

  We expel carbon every time we breathe, and when we burn plant life to make room for farms or cities or anything else we need space for, carbon is released into the atmosphere. The fossil fuels like coal and oil that have made the modern world go since the dawn of the Industrial Revolution are mostly carbon. And unlike other potential pollutants—the radiation from nuclear explosions, for example—the sources of fossil fuel emissions are everywhere, from everyone. The reason Johnson’s scientific advisers didn’t even consider calling for a reduction in fossil fuel emissions in 1965 is that at that point, reducing the supply of fossil fuels would have been like reducing the supply of air. We’ve made progress—renewables contributed almost half of the growth in global energy generation in 2017, and are poised to keep growing as technologies improve and prices drop.42 But not nearly enough.

  Perhaps the best way to understand the unique challenge of climate change is to compare it to another environmental threat that once imperiled the world, and which is now largely forgotten: the destruction of the ozone layer. In the 1930s, the chemicals known as chlorofluorocarbons (CFCs) were developed as refrigerants. The chemicals quickly found their way into air conditioners, refrigerators, and aerosol sprays, in part because they were nontoxic to humans, unlike earlier artificial refrigerants. But in the early 1970s scientists discovered that when CFCs reached the stratosphere, they triggered a chemical reaction that eroded the ozone layer. When ozone is stripped away, more of the sun’s harmful ultraviolet radiation reaches the Earth’s surface. That radiation—the same radiation we use sunscreen to shield our skin from—can damage DNA, increase the chance of skin cancer, and harm animals and plants. Without the ozone layer, life as we know it would not exist.

  This was bad, existentially bad, as the University of California, Irvine chemist F. Sherwood Rowland realized during his work on CFCs in the early 1970s. Rowland told his wife one day, when she inquired about his research, “The work is going very well. But it may mean the end of the world.”

 
