Pandora's Seed
During the First World War, armies around the world manufactured the gunpowder used for ammunition using a slightly refined version of a centuries-old method. Combine charcoal (a source of carbon), saltpeter (potassium or sodium nitrate, a naturally occurring source of nitrogen), and sulfur in a particular ratio and voilà: a mixture that burns rapidly and explosively when ignited. The problem was that while charcoal was easy to obtain, most of the world’s saltpeter came from sources halfway around the globe, particularly in Chile. During the war, Germany was cut off from these supplies and had to find another source or face defeat. As it happened, in 1910 a German chemist named Fritz Haber had developed a way of manufacturing ammonia (which also contains nitrogen) from gases in the air. By scaling up this method, known as the Haber-Bosch process, German workers were able to produce a substitute for Chilean saltpeter and continue the war effort. Similarly, when subject to Allied fuel blockades in the Second World War, the Nazis developed a way of manufacturing a gasoline substitute from coal. In both cases, the old maxim about necessity being the mother of invention proved true.
The point of the anecdote is not to illustrate how clever the German war machine was in these two devastating conflicts but to show that circumstances can dictate when change is necessary. If the cost is high enough, it pays to innovate. Humans, uniquely among all of the earth’s species, have developed cultural inventions that allow them to adapt to virtually any circumstance. No oil? Make it from another source. Want to walk on the moon? Build a rocket to take you there. Too many people to feed? Breed higher-yielding crop varieties and create government policies that reduce the population growth rate. The more intense the crisis, the greater the incentive to develop a solution.
Now, as we move into the twenty-first century, we are facing another period of crisis. The last episode of intense climate change we experienced as a species, during the Younger Dryas, led a few hunter-gatherers living in the right locations to start cultivating crops, which enabled all of the subsequent technological innovations of the Neolithic era. These included significant genetic changes to the crops themselves, the domestication of animals, the development of complex irrigation systems, and the rise of urbanism and multilevel government. All were developed in response to changes ultimately set in motion by the climatic shifts at the end of the last ice age. The question facing us now concerns what will be developed in this new era of climatic change—whether you believe that humans are responsible or we are simply contributing to a longer-term warming trend. In other words, what is the opportunity in this crisis?
It is clear that modern humans live an incredibly energy-intensive lifestyle. Americans use an average of more than 270 kilowatt-hours (kWh) each day, while people in India and Africa average just over 10—the energy content of a liter of oil. Not surprisingly, hunter-gatherers use even less, perhaps under half of this amount. As a species, though, we have been in an accelerating trend of energy consumption since the dawn of the Industrial Revolution. This has brought us many of the conveniences that we cherish in modern life—particularly mobility, as around 40 percent of our energy consumption goes to power automobiles, airplanes, and the transport of goods. Authors and politicians bemoan our “energy addiction,” but it is unlikely to go away anytime soon. So are we like heroin addicts, heading toward rock bottom as we burn through the last of our hydrocarbon sources in the next century, oblivious to the carbon emissions they are spewing into the atmosphere?
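To put those figures in perspective, here is a minimal back-of-the-envelope sketch in Python. It uses only the round numbers quoted above (roughly 270 kWh per day for an American, about 10 kWh for a resident of India or Africa, and about 10 kWh in a liter of oil); the values and the conversion to a continuous power draw are approximations for illustration, not precise measurements.

# Rough energy-budget arithmetic using the figures quoted in the text.
# All numbers are illustrative round values, not precise measurements.
KWH_PER_LITER_OIL = 10.0   # approximate energy content of a liter of oil
us_daily_kwh = 270.0       # quoted average American consumption per day
india_daily_kwh = 10.0     # quoted average for India and much of Africa

for label, kwh in [("United States", us_daily_kwh), ("India/Africa", india_daily_kwh)]:
    liters_oil = kwh / KWH_PER_LITER_OIL        # daily budget as liters of oil
    average_watts = kwh * 1000 / 24             # kWh per day as a continuous draw in watts
    print(f"{label}: {kwh:.0f} kWh/day = about {liters_oil:.0f} L of oil, "
          f"or roughly {average_watts:,.0f} W around the clock")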
Fortunately, thanks to the laws of economics, the answer seems likely to be no. As with other crisis points in the past, once the costs become great enough, we begin to look for alternatives. Like the Natufians forced to roam farther and farther from their permanent encampments in the Fertile Crescent 12,000 years ago, we are beginning to feel the costs associated with using a nonrenewable resource like hydrocarbons. This is the thinking behind Kyoto and other proposals for taxing carbon emissions: to assign an environmental cost to the burning of fossil fuels. It seems, too, that with the arrival of what has come to be called “peak oil,” the marketplace may be stepping in to regulate what governments cannot.
“Peak oil” refers to the point at which the rate of oil production from a particular source reaches its maximum and begins to decline. The concept, first proposed by the geologist M. King Hubbert in 1956, accurately predicted that United States oil production would peak around 1970. Supplies in other countries have followed suit, and even Saudi Arabia may have passed its peak. Meanwhile, world demand for oil is increasing—particularly in the developing world, where India, China, and others are speeding toward higher levels of oil consumption. Inevitably, according to the law of supply and demand, the price of oil must increase.
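Hubbert modeled cumulative extraction with a logistic curve, so that annual production traces a bell shape that peaks when roughly half the recoverable oil is gone. The short Python sketch below plots such a curve for a hypothetical field; the recoverable total, peak year, and steepness are invented parameters chosen only to show the shape, not figures from the text.

import math

# Toy Hubbert curve: production is the derivative of a logistic
# cumulative-extraction curve. All parameters are hypothetical.
URR = 200.0       # ultimate recoverable resource (billions of barrels), assumed
PEAK_YEAR = 1970  # year of maximum production, assumed
STEEPNESS = 0.08  # controls how sharply production rises and falls, assumed

def production(year):
    """Annual production (billions of barrels per year) under the toy model."""
    x = math.exp(-STEEPNESS * (year - PEAK_YEAR))
    return URR * STEEPNESS * x / (1 + x) ** 2

for year in range(1930, 2011, 10):
    rate = production(year)
    print(f"{year}: {rate:.2f} Gbbl/yr {'#' * int(rate * 10)}")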
New sources may come online soon, and there are so-called unconventional reserves—tar sands, oil shale, and the like—but these are far more expensive to exploit than the giant underground petroleum bubbles that have fueled the industrial explosion of the twentieth century. The new hydrocarbon economics leave us with two possibilities: either we change our lifestyles radically or we find some other way to power our energy-guzzling society.
Several technologies are currently vying for a role in the new world of “alternative energy,” as nonhydrocarbon energy sources are called. The chemical bonds between the atoms in petroleum compounds are simply nature’s way of storing energy, like batteries. Plants living millions of years ago captured energy from the sun, and when they died some of it was trapped in ancient swamps and bogs. Over time, under the influence of intense heat and pressure deep in the earth’s crust, these stored organic compounds underwent complex transformations into the substances we know as petroleum and coal. They are the most energy-rich naturally occurring substances on earth; as Thomas Homer-Dixon points out in his compelling book The Upside of Down, “when we fill our car with gas we are pouring into the tank the energy equivalent of about two years of human manual labor.” It’s no wonder that we’re so addicted to them—nothing else even comes close in the energy sweepstakes.
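Homer-Dixon’s comparison is easy to sanity-check with rough numbers. The sketch below assumes a 50-liter fuel tank, about 9.7 kWh of chemical energy per liter of gasoline, and roughly 0.6 kWh of useful work from a hard day of manual labor; all three inputs are my own approximations for illustration, not figures from the book.

# Back-of-the-envelope check of the "two years of manual labor" comparison.
# All inputs are rough assumptions chosen for illustration.
TANK_LITERS = 50.0        # typical car fuel tank, assumed
KWH_PER_LITER = 9.7       # approximate energy content of gasoline
KWH_PER_LABOR_DAY = 0.6   # roughly 75 W of useful output sustained for 8 hours

tank_energy_kwh = TANK_LITERS * KWH_PER_LITER
labor_days = tank_energy_kwh / KWH_PER_LABOR_DAY
print(f"One tank: {tank_energy_kwh:.0f} kWh, or about {labor_days:.0f} days "
      f"({labor_days / 365:.1f} years) of manual labor")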
The irony is that the original source of the energy in hydrocarbons is still ubiquitous and free: solar energy. While we may still have a way to go in developing high-efficiency solar collectors, the way forward seems clear: improve our ability to harness the ultimate source of energy on earth, the sun. Whether through photovoltaic cells, concentrating solar power (in which mirrors are used to focus the sun onto pipes that generate steam for electrical generators), wind farms, or wave-powered generators, all of the energy we hope to collect ultimately traces back to the sun and its powerful stream of free photons barreling down on us every day.
What even the most ardent supporters of solar energy admit, of course, is that it alone cannot power our current consumption patterns, at least not yet. In fact, even a combination of solar and other renewable energy sources wouldn’t allow us to keep up the highly mobile, energy-guzzling lifestyles those of us in the developed world enjoy today. There are two solutions to this problem: use less energy or find another source. Energy efficiency is improving all the time—with the exception of average gas mileage for America’s automobiles, which in 2007 was actually lower than that of a Ford Model T produced a century before—but it’s unlikely that this will lead to enough of a reduction in consumption to allow us to live entirely off solar and other renewables. You also need quite a bit of energy to manufacture solar panels and wind turbines, which reduces their green credentials somewhat.
One possible answer is to move one step up the food chain from solar—to nuclear, the ultimate source of the sun’s power. Though power produced through the fusion of hydrogen atoms, which is what takes place in the sun, is unlikely to become a viable model of power generation here on earth anytime soon, the splitting of heavy atoms into lighter ones, or fission, has already been applied successfully for over half a century. France produces around 80 percent of its electricity through nuclear fission, South Korea and Japan over 30 percent, and the United States nearly 20 percent. Most countries, however, concerned about the dangers of another Chernobyl-like disaster, as well as the political difficulties of waste disposal (who wants to live near a nuclear waste storage facility?), have not been as pro-nuclear, and overall only around 15 percent of the world’s electricity comes from nuclear power.
This looks set to change over the next century, as nuclear waste disposal methods become increasingly sophisticated and power plants become safer and more efficient. One of the most promising new technologies is currently being tried in China and South Africa, a variant known as the pebble bed reactor. Most of the immense bulk of a conventional nuclear reactor is actually devoted to the cooling system, which circulates water around the fuel rods and through a complex system of pipes and towers in order to dissipate the heat absorbed from the fission reaction. The pebble bed reactor obviates the need for this complexity (and substantially reduces the size of the plant) by using an inert gas such as helium as the coolant, passing it around pebbles containing a mix of the nuclear fuel (e.g., uranium) and graphite. These pebbles generate the energy, and the heat is dissipated into the gas. The design is inherently safer than a conventional water-cooled design because the higher the temperature gets, the less power is generated, until eventually the fuel stops reacting altogether.
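That self-limiting behavior, with power falling as temperature rises, can be illustrated with a deliberately crude feedback loop. The Python sketch below is a toy model with invented coefficients and units, not real reactor physics; it only shows how a negative temperature coefficient drives the system toward a stable equilibrium rather than a runaway.

# Toy illustration of a negative temperature coefficient: as the core heats
# up, power output drops, so the temperature settles instead of running away.
# Coefficients and units are invented for demonstration purposes only.
MAX_POWER = 100.0      # power produced when the fuel is cold (arbitrary units)
SHUTOFF_TEMP = 900.0   # temperature at which the reaction effectively stops
COOLING_RATE = 0.05    # fraction of excess heat carried off by the gas each step
HEAT_PER_POWER = 0.5   # temperature rise per unit of power each step

temperature = 20.0
for step in range(200):
    power = max(0.0, MAX_POWER * (1 - temperature / SHUTOFF_TEMP))
    temperature += HEAT_PER_POWER * power                 # heating from fission
    temperature -= COOLING_RATE * (temperature - 20.0)    # heat removed by the coolant
    if step % 40 == 0:
        print(f"step {step:3d}: power {power:6.1f}, temperature {temperature:6.1f}")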
Other advances, such as hydrogen fuel cells (generating hydrogen and using it as a fuel) and ocean thermal energy conversion (making use of the difference in temperature between deep and shallow water), seem to be less promising and have substantial technical hurdles to overcome. In the case of the former, the problem is how to efficiently generate the large quantities of hydrogen that will be needed—currently the cheapest source is hydrocarbons, chiefly natural gas. In the case of the latter, the amount of energy obtained is so small that it would merit application only in a very limited number of cases—on atoll islands, for instance. Similarly, geothermal power is useful in places like Iceland—a small country that sits on a gap in the earth’s crust, allowing easy access to the hot magma that can be harnessed to power steam-driven electrical generators—but not in Kansas.
Advances such as the pebble bed reactor, improved battery technology allowing all-electric cars with reasonable power and range, more efficient solar panels, and wind turbines that rotate in lighter breezes promise to replace much of our current reliance on fossil fuels. As the price of oil and natural gas rises over the next several decades, it will eventually become cheaper to use these alternative forms of energy, speeding their adoption. It is possible to foresee a combination of all of them being used in varying degrees—solar panels supplying a substantial amount of power at the household level in sunny areas, backed up by wind power where it is easy to install turbines (in the midwestern United States, for instance), with nuclear providing the balance and generating enough electricity to recharge battery-powered cars and light machinery. Only aircraft and heavy machinery will likely need to use petroleum, because of their huge power needs, and even these may one day run on electricity.
While this scenario may sound like the naïve musings of an eco-nut to some, and surely glosses over the complexity of a transition from petroleum that will take place gradually throughout this century, it does seem like the only sustainable long-term solution. Eventually the oil supply will run out, or will become so expensive to extract from the ground that alternative forms of energy will be the only logical choice. We are just beginning to sense the long-term costs of inaction on climate change and peak oil, and the ease of modern communication is increasing the rate of information transfer between far-flung places. In a tentative but perceptible way, we are coming to see ourselves as part of a global community, not simply as Americans or Indians. If the world is flat, as Thomas Friedman famously pointed out, its residents are also becoming more aware of their rather perilous shared future.
It is possible that the sticking point in the Kyoto negotiations—carbon-emitting countries in the developing world—could actually end up showing us the way forward with sustainable energy. Recent announcements by the Chinese government suggest a move in this direction. And in a world of expensive hydrocarbon fuels, perhaps it will be easier to implement alternative solutions in places with less of a deep, systemic commitment to a petroleum-based lifestyle. Like cell phones, which leapfrogged landline usage in the developing world throughout the late 1990s and early 2000s because cellular networks were cheaper and easier to install than wired ones, easy-to-implement alternative energy solutions may first find their greatest audience in the less developed parts of the globe. In Mongolia, I’ve encountered nomads who obtain all of their electricity from solar panels; there simply is no power grid to plug into on the steppe. Tuvalu is experimenting with a method of generating methane for cooking stoves from pig manure. Yes, it’s a hydrocarbon, but at least it’s more sustainable than shipping in natural gas from thousands of miles away. Necessity, to reiterate, is often the mother of invention.
The coming fuel and climate crisis may, in fact, yield as a dividend an era of innovation unlike any other we’ve experienced. During this new era, we will start to innovate with the realization that we’re all connected, that what we do “here” affects someone else “over there,” as well as future generations. It is notable that President Obama chose as his energy secretary Steven Chu, a Nobel Prize–winning physicist known for his commitment to alternative energy research. If this and other appointments are indicative of a new resolve to channel research dollars into sustainable energy, it should have a huge impact on the development of these technologies.
BACK TO THE SEA
Australia is the driest inhabited continent on earth. It is also the most urbanized, with around 90 percent of its population living in cities. Despite its vast size, which gives it a relatively modest population density of 6.4 people per square mile (the United States, in contrast, has 76, while Bangladesh has around 2,200), most people are actually packed into a narrow strip of land along the coasts, with one-quarter of them living in Sydney alone. The reason is the harsh climate of the interior, which makes life all but impossible without complex systems to bring in water from the coasts.
People have long been coastal dwellers, for many reasons. A steady supply of protein can be obtained from coastal resources—fish and shellfish—that can’t always be matched inland. The earliest evidence of human exploitation of coastal resources comes from Eritrea, where around 125,000 years ago people were already eating enough shellfish to produce a pile of empty shells large enough to survive to the present day. It’s not only reliable protein sources that drive people to the coast, though—as in Australia, water has long played an important role. Moisture from the ocean falls as rain on the windward side of coastal mountains, filling rivers and allowing plants—and humans—to thrive.
With the current trends in climate change, more and more people will be forced to leave behind unproductive land in continental interiors. Drought, famine, and disease, as well as the opportunities on offer in the world’s cities, will serve as powerful incentives to migration. The world is becoming more urbanized every year, and more than half of us now live in cities, most of which are located in coastal regions. This massive migration to the sea promises to overwhelm already highly stretched water supplies, leading to a full-scale water crisis. California estimates that dwindling supplies, population growth, and climate change will lead to a massive water shortfall by 2020, one for which the government currently has no solution.
The earth has a finite supply of fresh water, and much of it is wasted. While California faces the specter of widespread drought and draconian water rationing within the next decade, developers in parts of the western United States continue to build golf courses in the desert. Suburban home owners water lawns that are completely alien to the arid environments where they are being grown. Aquifers are drained to support farms whose long-term sustainability is in doubt. The Colorado River, which used to empty into the Gulf of California, is now depleted of water long before it reaches Mexico, except in unusually wet years. This is not just a North American problem, though—China and India are draining their own aquifers to irrigate farmland for their growing populations. The Aral Sea, in central Asia, has lost 80 percent of its volume as water from the two rivers that feed it has been diverted for irrigating cotton fields. The Sahel region in Africa, an arid grassland just to the south of the Sahara Desert, is experiencing significantly more drought years than in the past, and climate projections suggest that the trend will continue.
The response to such worrying trends, at least for the western United States, might be to use less water. Clearly, conservation is important, but conservation alone can’t provide enough water for everyone. Purifying wastewater, which is currently thrown away in most countries, is one possibility, but it is expensive and difficult to implement. Since much of the world’s existing fresh water is tied up in the ice sheets at the poles, theoretically they could be tapped for human consumption; transporting that water to where it is needed, however, would be staggeringly expensive. Ultimately, new supplies must be found or people must be relocated to areas with more reliable water supplies. While the latter is virtually impossible to envision on political grounds, the former may be possible in some situations through the application of technology, making use of the huge water reservoir in the world’s oceans. The only problem, of course, is how to get rid of the salt.
Desalination has a relatively recent history. Though it was suggested by Thomas Jefferson over two centuries ago, and was used in the nineteenth century to generate fresh water for steamship boilers when they were out at sea, it was not investigated seriously as a way to provide drinking water until the middle of the twentieth century. It is still incredibly expensive and uses enormous amounts of energy, which means that it has been widely applied only in relatively wealthy places like the Persian Gulf states or on large ships. Worldwide it accounts for only around 12 billion gallons of fresh-water production every day—less than half of the residential water consumption in the United States, and far less than the total amount used once you include industry and agriculture. Given that global fresh-water usage, including all industrial and agricultural consumption, is around 800 trillion gallons a year, the 12 billion gallons a day produced by desalination plants is literally a drop in the bucket.
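The scale of that mismatch is easy to quantify. The short Python sketch below uses the figures quoted above, roughly 12 billion gallons a day of desalinated water against roughly 800 trillion gallons a year of total fresh-water use, and simply computes the fraction of demand that desalination currently covers; both inputs are treated as round approximations.

# Rough comparison of desalination output with total fresh-water use,
# using the round figures quoted in the text.
desalination_gal_per_day = 12e9     # about 12 billion gallons per day
global_use_gal_per_year = 800e12    # about 800 trillion gallons per year

desalination_gal_per_year = desalination_gal_per_day * 365
share = desalination_gal_per_year / global_use_gal_per_year
print(f"Desalination supplies about {share:.2%} of global fresh-water use")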