At the same time in the U.K., an Irish scientist named John Tyndall, with the greater benefits and resources the educational system provided to the men of that era, was inventing the delicate scientific instruments needed to measure precisely how much heat was absorbed by carbon dioxide and “coal gas” (primarily methane). By the late 1800s, scientists could calculate exactly how much the planet would warm as carbon dioxide in the atmosphere increased. By the 1930s, British engineer Guy Callendar could actually measure how temperature had changed since the 1880s due to burning fossil fuels.
By 1965, White House science advisors were confident enough that climate change was real, humans were causing it, and the impacts were serious to formally warn a U.S. president, Lyndon Johnson, of the dangers increasing atmospheric carbon dioxide posed to Earth’s climate. “Within a few short centuries, we are returning to the air a significant part of the carbon that was extracted by plants and buried in the sediments during half a billion years,” scientists wrote in their report. “By the year 2000 the increase in CO2 will be close to 25 percent [relative to pre-industrial times]. This may be sufficient to produce measurable and perhaps marked changes in climate.” As with many early projections, this proved remarkably accurate.
In 1987, TIME magazine put a burning planet in a greenhouse on its cover. The next year, NASA scientist Jim Hansen testified to Congress that global warming was real. That same year, the United Nations Intergovernmental Panel on Climate Change (IPCC) was formed, and in 1990 it released the first of its now six exhaustive and ever-expanding assessment reports. These reports document everything scientists know about how climate is changing and how it will impact our world.
The Earth Summit was held in Rio de Janeiro in 1992. Nearly every country in the world, including the U.S., signed the resulting U.N. Framework Convention on Climate Change. In it, they agreed to “prevent dangerous anthropogenic [i.e., human] interference with the climate system”—but it took twenty-one more meetings before they could agree on what counted as dangerous. Not until the Paris climate conference in 2015 was the world able to agree to keep “global temperature rise this century well below 2 degrees Celsius above pre-industrial levels and to pursue efforts to limit the temperature increase even further to 1.5 degrees Celsius.” This is what’s known as the Paris Agreement.
In 2016, thirty-one scientific organizations sent a letter to Congress: “We, as leaders of major scientific organizations, write to remind you of the consensus scientific view of climate change,” they wrote. “Observations throughout the world make it clear that climate change is occurring, and rigorous scientific research concludes that the greenhouse gases emitted by human activities are the primary driver. This conclusion is based on multiple independent lines of evidence and the vast body of peer-reviewed science.” By 2020, eighteen scientific societies in the United States, from the American Geophysical Union to the American Medical Association, had issued official statements on climate change and one hundred and ninety-eight scientific organizations worldwide had formally stated that climate change has been caused by humans. That’s how sure we scientists are.
RULING OUT THE OTHER SUSPECTS
Despite this overwhelming history of scientific understanding and global consensus, I still hear objections to the science every day. They don’t come from other scientists, but from people asking, “Don’t you know it’s been warmer before? Humans weren’t causing climate change millions of years ago. So why would you think we’re responsible now?”
What many don’t realize is that scientists don’t automatically assume it’s human-caused without checking any other options first. Just as a responsible and knowledgeable physician would first rule out all common causes of a persistent low-grade fever—infection? autoimmune disease? cancer?—so, too, have scientists rigorously examined and tested all other reasons why climate could be changing naturally. That’s why we’re so sure it’s humans this time: because natural factors all have an alibi. Here’s how we know.
IS IT THE SUN?
The Sun is the first and biggest “natural suspect.” That’s because the Earth gets nearly all of its energy from our nearest star. The Sun’s brightness fluctuates over time—a lot, over astronomical timescales, and a little, over human timescales. When energy from the Sun increases, the planet warms up slightly. This is similar to how a room brightens when you turn up the dimmer on a lamp. When the Sun’s energy decreases over decades to centuries, the Earth gets slightly cooler.
During some past cooler periods, such as the Little Ice Age from the 1400s to the 1800s, the Sun’s energy was slightly below average. In the northern hemisphere, winter temperatures dropped by 1 to 2°C (1.8 to 3.6°F) for several centuries. In London, it became common for the Thames to freeze over, and Londoners held “frost fairs” out on the ice. Some winters, so much sea ice encircled Iceland that the island was unreachable by shipping.
For today’s warming to be due to the Sun, though, its energy would have to be increasing—and it’s not. Since the 1970s, satellite radiometer data show that the Sun’s energy has been decreasing. So if the Sun were controlling our climate right now, we’d be getting cooler. Instead, Earth’s temperature continues to increase. The Sun has an alibi.
WHAT ABOUT VOLCANOES?
There’s a popular myth that one volcanic eruption produces ten times more carbon pollution than all 8 billion of us humans put together. In reality, though, volcanic eruptions don’t warm the Earth: they mainly cool it. When volcanoes erupt, they expel vast clouds of sulfur dioxide into the atmosphere. These molecules combine with water vapor to create sulfuric acid “aerosols.” These aerosol particles absorb some of the Sun’s rays and reflect others back to space, acting like an umbrella to cool the Earth.
One of the largest eruptions in human history occurred in 1815 when Mount Tambora, on the Indonesian island of Sumbawa, spewed out over 60 megatons of sulfur dioxide. For three years afterward, global temperatures dropped noticeably. This eruption has been blamed for everything from massive crop failures across the northeastern United States to famine in Europe to the disruption of the Southeast Asian monsoon season. The year 1816 became known as “the year without a summer.” Mary Shelley spent much of that dreary summer indoors in Switzerland, and the gloom inspired her to write Frankenstein and likely The Last Man, an apocalyptic and eerily prescient novel that covers plague, climate refugees, and yes, reports of a “black sun” that leads to mass panic. The Mount Tambora eruption spawned outbreaks of typhus across Europe and cholera in India, and even the heavy rainfall and flooding that were ultimately responsible for Napoleon’s defeat at Waterloo. But it didn’t heat the Earth.
It’s true that in geologically active regions such as Iceland, Sicily, and Yellowstone National Park, heat-trapping gases seep from the Earth’s crust into the atmosphere. These do have a warming effect. Small amounts of heat-trapping gases are released by eruptions as well. But natural geologic emissions amount to around 1 percent of the carbon dioxide and less than 15 percent of the methane that human activities contribute to the atmosphere every year. All geologic emissions put together are equivalent to the human emissions of about three midsized U.S. states, such as Virginia, Tennessee, and Oklahoma. That’s minimal compared to how much carbon humans have been pumping into the atmosphere every year. So no, volcanoes aren’t causing the planet to warm, either.
COULD IT BE ORBITAL CYCLES?
A third legitimate climate change suspect is orbital cycles. These are caused by periodic variations in the Earth’s orbit around the Sun. Orbital cycles are responsible for the ice ages, or glacial maxima, that our planet has experienced in the distant past. They’re also responsible for the warm interglacial periods, such as the one the Earth has been experiencing for the last twelve thousand years or so. But are they responsible for the warming over the past one and a half centuries? No, and here’s why.
In the early 1800s, scientists realized that large sheets of ice once covered Europe and North America. For a long time, though, they didn’t know what had caused these ice ages. It wasn’t until the 1920s that a brilliant young Serbian civil engineer and mathematician named Milutin Milanković figured it out. Over time, the varying gravitational pull of the larger planets stretches the Earth’s orbit around the Sun from a circle to an ellipse and back again. The axis of Earth’s own rotation also wobbles like a top. Charting six hundred thousand years of these variations by hand, he discovered that their cumulative effect creates cycles of about one hundred thousand years—the same length as the longest ice age cycles. These cycles alter how sunlight falls on the Earth, which in turn triggers the growth and retreat of the ice sheets.
The last major glacial maximum was twenty thousand years ago, so people often wonder if the planet is warming today because it’s still recovering from the most recent ice age. Sadly, it isn’t. The warming due to orbital cycles peaked about six to eight thousand years ago, in the early days of human civilization. At that point, the Earth’s temperature started very gradually decreasing. According to orbital cycles, the next major glacial maximum was due to begin about fifteen hundred years from now. Was due, that is, until about a hundred and fifty years ago, when the planet started getting warmer instead.
COULD IT BE A NATURAL CYCLE?
If the Sun, volcanoes, and orbital cycles cannot be the cause of the current warming, that leaves just one more main natural suspect, and it’s the one that’s most commonly invoked: natural cycles. But what exactly is a natural cycle, and how does it warm or cool the planet?
Natural cycles can’t create heat out of nothing. Rather, they help distribute energy around the planet by moving heat between the ocean and the atmosphere, or from east to west, and back again. They warm one part of the planet while simultaneously cooling another.
Some of the most well-known cycles even have names you might have heard of, like El Niño. During an El Niño episode, such as occurred in 2015, ocean temperatures off the coast of Peru and westward across the Pacific are warmer than average. As a result, the ocean releases heat into the atmosphere, which slightly raises the average global air temperature. It also typically brings drier-than-usual conditions to Australia and India, and wetter-than-usual conditions to the southern U.S.
In contrast, during a La Niña episode, such as occurred in 2020, cooler-than-average waters in the tropical Pacific absorb more heat from the atmosphere. Average global temperature drops slightly, flipping the pattern. It also brings wetter conditions to Southeast Asia and much of Australia and drier conditions to the southern U.S.
During the so-called Medieval Warm Period, temperatures over the North Atlantic were about half a degree to one degree Celsius warmer than average for several centuries. This helped the Vikings to settle Greenland and reach northeastern Canada. Why “so-called”? Because it depends on your perspective: in Siberia during that time, it was actually the Medieval Cold Period. Temperatures over Siberia were colder than average, by about the same amount that temperatures over the North Atlantic were warmer. That’s what a natural cycle looks like.
Today, however, the entire planet is warming, particularly the oceans; so it isn’t just a natural cycle moving heat around. Over the last fifty years, the oceans have absorbed more than 90 percent of the heat being trapped inside the climate system. This means ocean heat content has increased about fifteen times more than that of the Earth’s atmosphere, land surface, and cryosphere (the Earth’s total ice and snow) combined.
Using the change in ocean heat content as a measure of climate change is far more accurate than tracking changes in air temperature. There’s little year-to-year variability when you’re looking at the steady increase in the heat content of the entire climate system rather than just one part of it, the atmosphere. So why do we hear so much about the increase in global air temperature, and not the ocean? Once again, it’s because of our perspective. What’s happening in the ocean—warming, acidification, and more—is even bigger and more alarming than what’s happening on land. If we lived underwater, we’d realize that.
HUMANS ARE RESPONSIBLE
The bottom line is this: scientists have known since the 1850s that carbon dioxide traps heat. It’s been building up in the atmosphere from all the coal, oil, and gas we’ve burned since the start of the Industrial Revolution to generate electricity, heat our homes, power our factories, and, eventually, run our cars, ships, and planes. Dozens of studies indicate that the most likely amount of warming humans are responsible for is more than 100 percent. How could it be more than 100 percent? Because according to natural factors, the planet should be cooling, not warming. We are the cause of all of the observed warming—and then some.
The planet has experienced warmer and colder times before. But as far back as we can look—and through paleoclimate records, we’re able to look back millions of years—our present-day situation is unprecedented. The development of agriculture, with its large-scale deforestation and growing herds of methane-belching cows and other ruminants, was likely already enough to stave off the next ice age and stabilize climate. And that was a good thing: we don’t want an ice-covered planet. But the Industrial Revolution kicked climate change into overdrive, and now the situation is dire.
Before the dawn of the Industrial Revolution, carbon dioxide levels in the atmosphere averaged around 280 parts per million (ppm), according to ice core records. Now, carbon dioxide levels are already more than 420 ppm, a 50 percent jump. The last time carbon dioxide levels in the atmosphere were this high was most likely over 15 million years ago. And the last time climate warmed at a similar pace to today was some 55 million years ago, during what scientists term the Paleocene-Eocene Thermal Maximum. That’s when global temperatures rose by 5 to 8°C (9–14°F) over about one hundred thousand years, and sea level was over sixty meters (two hundred feet) higher than today. Scientists consider that period to be one of the most extreme examples of natural climate change on record. Even so, it’s estimated that we are currently emitting carbon into the atmosphere at ten times the pace of the natural emissions that drove this previous change.
What is the best temperature for humans? Neither hotter nor colder: it’s the Goldilocks temperature we’ve had up until now. That’s the temperature at which human civilization developed. It’s the temperature at which we allocated our water resources, designed and built our infrastructure, and parceled out our agricultural land. These are the conditions under which we have developed our socioeconomic systems, outlined our political boundaries, and staked our ownership of natural resources.
We don’t want another ice age, but today we’ve left it far behind. We are heading way too quickly in the opposite direction. Our climate isn’t changing right now because of the Sun, or volcanoes, or natural cycles. Our change is human-caused. We humans are conducting a truly unprecedented experiment with the only home we have.
I. The Sun is very hot, so most of the energy it gives off is in the shorter visible and near-infrared wavelengths. Heat-trapping gases don’t absorb much energy at these wavelengths. In contrast, the Earth is a lot cooler, so most of the energy it gives off is in the longer infrared wavelengths. That’s exactly where heat-trapping or “greenhouse” gases absorb the most heat.
5 THE PROBLEM WITH FACTS
“We are so locked into our political identities that there is virtually no candidate, no information, no condition that can force us to change our minds.”
EZRA KLEIN, WHY WE’RE POLARIZED
“I’d like to agree with you. But if I agree with you, I have to agree with Al Gore, and I could never do that.”
FARMER SPEAKING TO KATHARINE
“Yes they are!”
“No they’re not!”
At the podium was a meteorologist from the National Oceanic and Atmospheric Administration (NOAA) hurricane division. He was showing data indicating that hurricane frequency had not changed, long-term. Usually congenial, the meteorologist was clearly getting hot under the collar, because shouting at him from the other side of the platform was a scientist from a NOAA research lab. This scientist’s presentation, given immediately before, had shown exactly the opposite.
It was the annual meeting of the American Meteorological Society in January 2006. The record-breaking 2005 hurricane season had just ended. There’d been so many storms that, for the first time since the U.S. began to name Atlantic hurricanes in 1953, scientists had reached the end of the alphabet and had to switch to Greek letters instead.I The eleventh named storm of 2005, Hurricane Katrina, was the most expensive tropical storm on record. It caused over eighteen hundred deaths, $125 billion in damages, and countless personal losses for the inhabitants of New Orleans and the surrounding area. So it makes sense that scientists were wondering: where does climate change come in?
I was mesmerized. I’d never seen scientists spar like this before. Scientific disagreements are usually conducted via penned barbs and sarcastic asides—not face-to-face confrontations. But even though the two scientists were red-faced and loud-voiced, no fists flew, and no personal insults were hurled. Data was the only weapon they used, and eventually, data resolved the argument.
Today scientists know that the overall number of hurricanes isn’t increasing, but the number of strong hurricanes is. We know why, too: warmer ocean waters fuel bigger, stronger storms and cause them to intensify faster. It turned out both scientists were correct; they were just looking at the data from different angles. Now we have a clearer picture of the whole. In the world of science, facts usually do win the day.
DEBUNKING FAKE NEWS
I spend a lot of time debunking science myths on social media. There, unfortunately, the same rules don’t apply. In 2020, when wildfires destroyed record areas of the western U.S., people were claiming that climate change had nothing to do with it. “It’s all arson,” they argued, “or lack of forest management; or maybe the fires aren’t even real. After all, just look at Canada on this map—there are no fires there!”II