The End of Doom
The Emerging Climate and Energy Consensus
“The Kyoto Protocol is dead. There will be no further global treaties that set binding limits on the emissions of greenhouse gases (GHG) after Kyoto runs out in 2012.” That’s what I wrote back in 2004 when I was reporting on the 10th Conference of the Parties to the UNFCCC in Buenos Aires, Argentina. I also cited the prediction of Taishi Sugiyama, a senior researcher at Japan’s Central Research Institute of Electric Power Industry, who flatly stated that Kyoto signatories Canada, Japan, and Russia would withdraw from the treaty after 2012. As noted earlier, Sugiyama’s prediction has come true.
Instead of UN carbon-rationing schemes, Sugiyama recommended in 2004 that a clean energy technology-push approach be formalized in a Zero Emissions Technology Treaty. Such a treaty would have greater appeal because it avoids the inevitable conflicts over allocating emissions targets and because most countries recognize the importance of long-term technological progress. Sugiyama presciently argued that a global cap-and-trade system is premature for developing countries to join because effective low-cost ways to cut carbon emissions simply don’t exist. “I cannot imagine a cap-and-trade system over the whole globe without low-cost energy and emissions control technologies,” said Sugiyama. Ten years later, Sugiyama’s insight that the Kyoto Protocol is a dead end and that the best way to address man-made global warming is to develop clean energy sources so that they become cheaper than fossil fuels is emerging as the new energy technology consensus.
In May 2014, former undersecretary for global affairs Timothy Wirth, who was the Clinton administration’s lead negotiator for the Kyoto Protocol, and former South Dakota senator Thomas Daschle conceded that “the international community should stop chasing the chimera of a binding treaty to limit CO2 emissions.” They note that more than two decades of UN climate negotiations have failed because “nations could not agree on who is to blame, on how to allocate emissions, or on projections for the future.” Both firmly believe that man-made global warming poses significant risks to humanity.
To address those risks, Wirth and Daschle now advocate that the climate negotiators adopt a system of “pledge and review” at the 2015 Paris conference of the parties to the UNFCCC. In such a scheme, nations would make specific pledges to cut their carbon emissions, to adopt clean energy technologies, and to wring more GDP out of each ton of carbon emitted. The parties would review their progress toward reducing greenhouse gas emissions every three years and make further pledges as necessary to achieve the goal of keeping the increase in average global temperature under 2°C. Since there would be no legally binding targets, there would be no treaty that would require politically difficult ratification. If insufficient progress is being made by 2020, they argue that countries should consider adopting a globally coordinated price on carbon.
Wirth and Daschle note that the markets for renewable energy, especially wind and solar, as well as natural gas, the least carbon-intensive fossil fuel, are expanding. Crucially the two believe that the 2015 UN climate change conference in Paris should aim to “help accelerate the pace of technological adoption and change, toward the day when the cleanest energy sources are also the cheapest and thus become dominant.” Clearly they have joined the emerging consensus that schemes to prevent climate change by rationing carbon—for example, by imposing a cap-and-trade scheme or taxation—are doomed to failure.
Why failure? Because of the “iron law of climate policy,” argues University of Colorado political scientist Roger Pielke Jr. Pielke’s iron law declares that “when policies focused on economic growth confront policies focused on emissions reductions, it is economic growth that will win out every time.” People and their governments are very reluctant to give up the immediate benefits of economic growth—more goods and services, jobs, better education, and improved health—that access to modern fuels makes possible in order to avert the distant harms of climate change.
Make Clean Energy Cheaper Than Fossil Fuels
“The paramount goal of climate policy should be to make the unsubsidized cost of clean energy cheaper than fossil fuels so that all countries deploy clean energy because it makes economic sense,” is how the Information Technology and Innovation Foundation sums up the new consensus. This perspective is also endorsed by many other policy groups, including the Breakthrough Institute and the Brookings Institution.
“Societies that are able to meet their energy needs become wealthier, more resilient, and better able to navigate social and environmental hazards like climate change,” correctly notes the Breakthrough Institute’s 2014 Our High Energy Planet report. Keeping people in developing countries in comparative energy poverty will only slow energy innovation. “The way we produce and use energy will become increasingly clean not by limiting its consumption, but by using expanded access to energy to unleash human ingenuity in support of innovating toward an equitable, low-carbon global energy system,” asserts the Breakthrough Institute report.
The first plank of the new consensus is that it is wrong to try to restrain the growth of greenhouse gas emissions by denying adequate access to modern fuels to the poor. For example, the Breakthrough Institute report rejects the International Energy Agency’s anemic recommendation that annual access to 100 kilowatt-hours of electricity per person is sufficient. That is the amount of electricity that the average American burns in three days and the average European consumes in five days. One reasonable threshold might be 8,000 kilowatt-hours, which is the quantity that the average Japanese citizen uses in a year.
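The annual consumption figures implied by these day-count comparisons can be checked with quick arithmetic (a sketch using only the numbers given in the text):

```python
# Back-of-the-envelope check of the electricity-access comparisons.
# The IEA recommendation discussed above is 100 kWh per person per year.
iea_threshold_kwh = 100

# If the average American burns that much in 3 days and the average
# European in 5 days, their implied annual consumption would be:
us_annual = iea_threshold_kwh / 3 * 365   # ~12,167 kWh per year
eu_annual = iea_threshold_kwh / 5 * 365   # ~7,300 kWh per year
japan_annual = 8_000                      # threshold cited in the text

print(f"Implied US annual use: {us_annual:,.0f} kWh")
print(f"Implied EU annual use: {eu_annual:,.0f} kWh")
print(f"Japanese benchmark:    {japan_annual:,} kWh")
```

On these figures, the proposed 8,000 kWh threshold sits between average European and average American consumption, which is why the text calls the IEA's 100 kWh figure anemic.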
Second, activist opposition to safe hydraulic fracturing to release vast quantities of natural gas trapped in deep underground shale formations is counterproductive. Burning natural gas releases about half the carbon dioxide that burning coal does. In fact, the 2013 IPCC Physical Science report identifies power generation using natural gas as a “bridge technology” that can be deployed now. Consequently, the IPCC report notes, “Greenhouse gas emissions from energy supply can be reduced significantly by replacing current world average coal-fired power plants with modern, highly efficient natural gas combined-cycle power plants.” Coal-fired electric power plants that emit lots of carbon dioxide are largely being shut down in the United States because they cost more than plants that emit far less carbon dioxide by burning cheap natural gas produced through fracking.
Third, environmentalist hostility to all forms of nuclear power is similarly perverse. Generating electricity using nuclear power emits almost no greenhouse gases while assuring a stable supply of baseload power. Activist resistance to nuclear power may be lessening. No one would accuse climate researchers James Hansen, Kerry Emanuel, Ken Caldeira, and Tom Wigley of excessive moderation when it comes to banging the climate crisis drum. In November 2013 the four joined the new consensus by issuing an open letter challenging the broad environmental movement to stop fighting nuclear power and embrace it as a crucial technology for averting the possibility of a climate catastrophe by supplying zero-carbon energy. The letter point-blank states that “continued opposition to nuclear power threatens humanity’s ability to avoid dangerous climate change.” The four add, “While it may be theoretically possible to stabilize the climate without nuclear power, in the real world there is no credible path to climate stabilization that does not include a substantial role for nuclear power.”
The fourth and most provocative plank of the new energy technology consensus is that government research and development spending on zero-carbon forms of energy supply must be dramatically ramped up. “Robust government support, including significant investment for clean energy research, development, and demonstration (RD&D), is necessary to make energy technologies cheaper than fossil fuels,” argues the ITIF.
Those of us who appreciate the power of competition and market incentives to call forth new technologies and drive down prices must recognize that governments have been massively meddling in energy markets for more than a century. Consequently, it’s really impossible to know what the actual price of energy supplies would be in a free market. Notorious examples include the attempts at petroleum cartelization by the Organization of Petroleum Exporting Countries (OPEC) and Russia’s constant jiggering of the price of natural gas sold to European countries. In many countries, electricity is generated and distributed by government agencies that are not accountable to consumers.
For example, in the United States, a system of electricity regulation that chiefly benefited producers at the expense of consumers was established at the end of the nineteenth and the beginning of the twentieth centuries. And no segment of energy supply has gone unmolested by the federal government. A comprehensive 2011 analysis of US federal government energy tax, regulatory, and research and development incentives finds that they have amounted to more than $837 billion (2010 constant dollars) since 1950. Of that amount, $153 billion was spent on energy research and development. By 2010, nuclear energy had received $74 billion in R&D funding; coal, $36 billion; and renewables, $24 billion. Federal R&D for oil, natural gas, and geothermal totaled $21 billion. These subsidies undoubtedly distorted both the supply of and demand for these forms of energy, thus masking the actual comparative costs and benefits of each.
The better course would be to establish a level playing field by eliminating all energy subsidies and incentives and letting the cheapest technologies developed by innovators win in the marketplace. Proponents of markets must continue to push policy in this direction, but given the history of pervasive government intervention in energy markets, it is unlikely that governments will suddenly step back and allow markets to decide how to innovate and produce energy in the future. Energy production, especially for electricity, approximates a government-sanctioned monopoly that has the unfortunate side effect of stifling private innovation in energy production technology. Given that situation, the new consensus in favor of government-subsidized energy production research and development that aims to make zero-carbon energy supplies cheaper than fossil fuels looks like the least bad likely policy option for addressing concerns about climate change.
How much do proponents of the new consensus want to spend on clean energy R&D? The ITIF report suggests investing $70 billion per year globally, which amounts to less than 13 percent of the funds spent worldwide on fossil fuel subsidies now. In addition, $70 billion in R&D funding represents only about a quarter of the $254 billion spent in 2013 on deploying currently expensive and technologically clunky renewable power technologies. Given those figures, the ITIF’s estimate of what it would take to develop cheap zero-carbon technologies looks like a bargain.
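The two ratios in this paragraph can be reproduced from the figures given (a sketch; only the $70 billion and $254 billion figures appear in the text, so the worldwide fossil fuel subsidy total is inferred here as a floor from the "less than 13 percent" claim, not stated directly):

```python
rd_proposal = 70e9       # ITIF's suggested annual global clean energy R&D
renewables_2013 = 254e9  # 2013 spending on deploying renewable power

# Share of renewable-deployment spending: about a quarter, as stated.
share_of_deployment = rd_proposal / renewables_2013
print(f"Share of 2013 renewables spending: {share_of_deployment:.1%}")

# "Less than 13 percent" of fossil fuel subsidies implies those
# subsidies total at least roughly $538 billion per year.
implied_subsidy_floor = rd_proposal / 0.13
print(f"Implied fossil subsidy floor: ${implied_subsidy_floor / 1e9:,.0f} billion")
```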
The Climate Change Bottom Line
Despite the current pause in global warming and the real failings in climate computer model projections, the balance of the scientific evidence suggests that man-made climate change could become a significant problem by the end of this century. As we have seen, political progressives and environmentalists like Naomi Klein fervently promote the “climate crisis” as a pretext for radically transforming the world’s economy in ways that ratify their own ideological predilections. Thus they advocate the imposition of vast top-down regulatory schemes that ultimately amount to various forms of carbon and energy rationing.
As a response, lots of supporters of free markets and economic growth tend to underplay the science that suggests the possibility that continued unrestrained emissions of greenhouse gases could have really undesirable effects on the planet’s climate by the end of the century. Why? Because they have fallen for the false dilemma posed by progressive environmentalists of supposedly having to choose between economic growth and averting the possibility of disruptive climate change. A far better strategy for challenging radical progressive proposals is to advocate policies that further enable market-driven advances in science and technology to cut through the climate/energy conundrum. Among other things, this would include eliminating all energy subsidies, most especially those to fossil fuels.
It is surely the case that if one wants to help future generations deal with climate change, the best policies are those that encourage rapid economic growth. This would endow future generations with the wealth and superior technologies necessary to handle whatever comes at them, including climate change. In other words, in order to truly address the problem of climate change, responsible policymakers should select courses of action that move humanity from a slow-growth trajectory to a high-growth trajectory, especially for the poorest developing countries. Whatever slows down economic growth will also slow down environmental cleanup and renewal.
7
Is the Ark Sinking?
When I was a boy, my fascination with the plight of the whooping cranes was kindled by the book The Whooping Crane by National Audubon Society ornithologist Robert P. Allen. Allen was the man most responsible for bringing America’s tallest bird back from the brink of extinction. The total population in the wild had fallen from an estimated 10,000 before European settlement to just 15 birds by the 1930s. I was so taken by Allen’s intrepid and passionate story that during a mid-1960s visit to my Texas grandparents I whined and wheedled my parents into taking me to the San Antonio Zoo to see the two captive whoopers, Rosie and her mate, Crip. Fortunately, I was not viewing the last representatives of a species on its way out, but one on its way back. The good news is that the wild migratory population has recovered to around 280 birds, and some 290 others are captive or part of reintroduction efforts. Biologists believe that nurturing the species to 1,000 birds with 250 breeding pairs would pull the whooping cranes safely back from the threshold of extinction.
While the fortunes of the whoopers may be improving, many biologists and conservationists are urgently warning that humanity is on the verge of wiping out hundreds of thousands of other species in this century. “A large fraction of both terrestrial and freshwater species faces increased extinction risk under projected climate change during and beyond the 21st century,” states the 2014 IPCC Adaptation report. “Current rates of extinction are about 1000 times the likely background rate of extinction,” starkly asserts a May 2014 review article in Science by Duke University biologist Stuart Pimm and his colleagues. “Scientists estimate we’re now losing species at 1,000 to 10,000 times the background rate, with literally dozens going extinct every day,” warns the Center for Biological Diversity. The CBD adds, “It could be a scary future indeed, with as many as 30 to 50 percent of all species possibly heading toward extinction by mid-century.” Eminent Harvard University biologist E. O. Wilson agrees. “We’re destroying the rest of life in one century. We’ll be down to half the species of plants and animals by the end of the century if we keep at this rate.” University of California at Berkeley biologist Anthony Barnosky similarly notes, “It looks like modern extinction rates resemble mass extinction rates.” Assuming that species loss continues unabated, Barnosky adds, “The sixth mass extinction could arrive within as little as three to 22 centuries.”
The Sixth Mass Extinction?
Barnosky is comparing contemporary estimates of species loss to the five prior mass extinctions that occurred during the past 540 million years in which around 75 percent of all then-living species died off each time. The most famous extinction episode—likely triggered by an asteroid crashing into the earth—killed off the dinosaurs 65 million years ago. The asserted cause of the sixth extinction event is human activity, chiefly the result of cutting down forests and warming the planet. About 1.9 million species have so far been described by researchers, whose estimates of the total number of species on the planet range from 3 to 10 million.
Let’s assume 5 million species. If Wilson is right that half could be gone by the middle of this century, that implies that species are disappearing at a rate of about 71,000 per year, or just under 200 per day. Contrast this implied extinction rate with Pimm and his colleagues, who estimate that the background rate of extinction without human influence is about 0.1 species per million species-years. This means that if one followed the fates of 1 million species, one would expect to observe about one species going extinct every ten years. Their new estimate is 100 species going extinct per million species-years. So if the world contains 5 million species, that suggests 500 are going extinct every year. Obviously, there is a huge gap between Wilson’s off-the-cuff estimate and Pimm’s more cautious calculations, but both assessments are troubling.
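The gap between the two estimates can be laid out explicitly (a sketch using the chapter's own assumptions: 5 million species total and a mid-century horizon of roughly thirty-five years):

```python
species_total = 5_000_000

# Wilson: half of all species gone by mid-century (~35 years out).
wilson_annual = (species_total / 2) / 35
print(f"Wilson implies ~{wilson_annual:,.0f} extinctions/year "
      f"(~{wilson_annual / 365:.0f}/day)")

# Pimm et al.: background rate of 0.1 and current rate of 100,
# both in extinctions per million species-years.
background_annual = 0.1 * species_total / 1_000_000   # 0.5 per year
current_annual = 100 * species_total / 1_000_000      # 500 per year
print(f"Background rate: {background_annual} extinctions/year")
print(f"Pimm's current rate: {current_annual:.0f} extinctions/year")
```

Wilson's figure is thus more than a hundredfold higher than Pimm's, which is the "huge gap" noted above.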
Earlier Extinction Predictions
However, this is not the first time that biologists have sounded the alarm over purportedly accelerated species extinctions. In 1970, Dr. S. Dillon Ripley, secretary of the Smithsonian Institution, predicted that in twenty-five years, somewhere between 75 and 80 percent of all the species of living animals would be extinct. That is, 75 to 80 percent of all animal species would be extinct by 1995. Happily, that did not happen. In 1975, Paul Ehrlich and his biologist wife, Anne Ehrlich, predicted that “since more than nine-tenths of the original tropical rainforests will be removed in most areas within the next thirty years or so, it is expected that half of the organisms in these areas will vanish with it.” It’s now nearly forty years later, and nowhere near 90 percent of the rain forests have been cut, and no one thinks that half of the species inhabiting tropical forests have vanished.
In 1979, Oxford University biologist Norman Myers stated in his book The Sinking Ark that 40,000 species per year were going extinct and that 1 million species would be gone by the year 2000. Myers suggested that the world could “lose one-quarter of all species by the year 2000.” At a 1979 symposium at Brigham Young University, Thomas Lovejoy, former president of the H. John Heinz III Center for Science, Economics, and the Environment, announced that he had made “an estimate of extinctions that will take place between now and the end of the century. Attempting to be conservative wherever possible, I still came up with a reduction of global diversity between one-seventh and one-fifth.” Lovejoy drew up the first projections of global extinction rates for the Global 2000 Report to the President in 1980. If Lovejoy had been right, between 15 and 20 percent of all species alive in 1980 would be extinct right now. No one believes that extinctions of this magnitude have occurred over the last three decades.