Enlightenment Now


by Steven Pinker


  Writing in 2000, the economist Stephen Devereux summarized the world’s progress in the 20th century:

  Vulnerability to famine appears to have been virtually eradicated from all regions outside Africa. . . . Famine as an endemic problem in Asia and Europe seems to have been consigned to history. The grim label “land of famine” has left China, Russia, India and Bangladesh, and since the 1970s has resided only in Ethiopia and Sudan.

  [In addition,] the link from crop failure to famine has been broken. Most recent drought- or flood-triggered food crises have been adequately met by a combination of local and international humanitarian response. . . .

  If this trend continues, the 20th century should go down as the last during which tens of millions of people died for lack of access to food.9

  Figure 7-4: Famine deaths, 1860–2016

  Sources: Our World in Data, Hasell & Roser 2017, based on data from Devereux 2000; Ó Gráda 2009; White 2011; and EM-DAT, The International Disaster Database, http://www.emdat.be/; and other sources. “Famine” is defined as in Ó Gráda 2009.

  So far, the trend has continued. There is still hunger (including among the poor in developed countries), and there were famines in East Africa in 2011, the Sahel in 2012, and South Sudan in 2016, together with near-famines in Somalia, Nigeria, and Yemen. But they did not kill on the scale of the catastrophes that were regular occurrences in earlier centuries.

  None of this was supposed to happen. In 1798 Thomas Malthus explained that the frequent famines of his era were unavoidable and would only get worse, because “population, when unchecked, increases in a geometrical ratio. Subsistence increases only in an arithmetical ratio. A slight acquaintance with numbers will show the immensity of the first power in comparison with the second.” The implication was that efforts to feed the hungry would only lead to more misery, because they would breed more children who were doomed to hunger in their turn.
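  To see the arithmetic behind Malthus’s claim, his two ratios can be written as a pair of simple growth laws (an illustrative sketch; the notation is mine, not Malthus’s):

  P(t) = P_0 · r^t, with r > 1 (population: geometric)

  S(t) = S_0 + k·t (subsistence: arithmetic)

  Since an exponential eventually outruns any straight line, the ratio P(t)/S(t) grows without bound, which is the “immensity of the first power in comparison with the second.”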

  Not long ago, Malthusian thinking was revived with a vengeance. In 1967 William and Paul Paddock wrote Famine 1975!, and in 1968 the biologist Paul R. Ehrlich wrote The Population Bomb, in which he proclaimed that “the battle to feed all of humanity is over” and predicted that by the 1980s sixty-five million Americans and four billion other people would starve to death. New York Times Magazine readers were introduced to the battlefield term triage (the emergency practice of separating wounded soldiers into the savable and the doomed) and to philosophy-seminar arguments about whether it is morally permissible to throw someone overboard from a crowded lifeboat to prevent it from capsizing and drowning everyone.10 Ehrlich and other environmentalists argued for cutting off food aid to countries they deemed basket cases.11 Robert McNamara, president of the World Bank from 1968 to 1981, discouraged financing of health care “unless it was very strictly related to population control, because usually health facilities contributed to the decline of the death rate, and thereby to the population explosion.” Population-control programs in India and China (especially under China’s one-child policy) coerced women into sterilizations, abortions, and being implanted with painful and septic IUDs.12

  Where did Malthus’s math go wrong? Looking at the first of his curves, we already saw that population growth needn’t increase in a geometric ratio indefinitely, because when people get richer and more of their babies survive, they have fewer babies (see also figure 10-1). Conversely, famines don’t reduce population growth for long. They disproportionately kill children and the elderly, and when conditions improve, the survivors quickly replenish the population.13 As Hans Rosling put it, “You can’t stop population growth by letting poor children die.”14

  Looking at the second curve, we discover that the food supply can grow geometrically when knowledge is applied to increase the amount of food that can be coaxed out of a patch of land. Since the birth of agriculture ten thousand years ago, humans have been genetically engineering plants and animals by selectively breeding the ones that had the most calories and fewest toxins and that were the easiest to plant and harvest. The wild ancestor of corn was a grass with a few tough seeds; the ancestor of carrots looked and tasted like a dandelion root; the ancestors of many wild fruits were bitter, astringent, and more stone than flesh. Clever farmers also tinkered with irrigation, plows, and organic fertilizers, but Malthus always had the last word.

  It was only at the time of the Enlightenment and the Industrial Revolution that people figured out how to bend the curve upward.15 In Jonathan Swift’s 1726 novel, the moral imperative was explained to Gulliver by the King of Brobdingnag: “Whoever makes two ears of corn, or two blades of grass to grow where only one grew before, deserves better of humanity, and does more essential service to his country than the whole race of politicians put together.” Soon after that, as figure 7-1 shows, more ears of corn were indeed made to grow, in what has been called the British Agricultural Revolution.16 Crop rotation and improvements to plows and seed drills were followed by mechanization, with fossil fuels replacing human and animal muscle. In the mid-19th century it took twenty-five men a full day to harvest and thresh a ton of grain; today one person operating a combine harvester can do it in six minutes.17
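  To put those two figures on one scale (assuming a ten-hour working day, an assumption of mine, not the book’s): twenty-five men for a day is 25 × 10 × 60 = 15,000 person-minutes per ton of grain, against 6 person-minutes for the combine operator, a productivity gain of roughly 2,500-fold.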

  Machines also solve an inherent problem with food. As any zucchini gardener in August knows, a lot becomes available all at once, and then it quickly rots or gets eaten by vermin. Railroads, canals, trucks, granaries, and refrigeration evened out the peaks and troughs in the supply and matched it with demand, coordinated by the information carried in prices. But the truly gargantuan boost would come from chemistry. The N in SPONCH, the acronym taught to schoolchildren for the chemical elements that make up the bulk of our bodies, stands for nitrogen, a major ingredient of protein, DNA, chlorophyll, and the energy carrier ATP. Nitrogen atoms are plentiful in the air but bound in pairs (hence the chemical formula N2), which are hard to split apart so that plants can use them. In 1909 Carl Bosch perfected a process invented by Fritz Haber which used methane and steam to pull nitrogen out of the air and turn it into fertilizer on an industrial scale, replacing the massive quantities of bird poop that had previously been needed to return nitrogen to depleted soils. Those two chemists top the list of the 20th-century scientists who saved the greatest number of lives in history, with 2.7 billion.18
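  In outline, the chemistry runs as follows (a standard textbook summary, not Pinker’s text): steam reforming of methane supplies hydrogen, which is then combined with atmospheric nitrogen over an iron catalyst at high temperature and pressure:

  CH4 + H2O → CO + 3 H2 (steam reforming)

  N2 + 3 H2 → 2 NH3 (Haber–Bosch ammonia synthesis)

  The resulting ammonia is the feedstock for nitrogen fertilizers such as urea and ammonium nitrate.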

  So forget arithmetic ratios: over the past century, grain yields per hectare have swooped upward while real prices have plunged. The savings are mind-boggling. If the food grown today had to be grown with pre-nitrogen-farming techniques, an area the size of Russia would go under the plow.19 In the United States in 1901, an hour’s wages could buy around three quarts of milk; a century later, the same wages would buy sixteen quarts. The amount of every other foodstuff that can be bought with an hour of labor has multiplied as well: from a pound of butter to five pounds, a dozen eggs to twelve dozen, two pounds of pork chops to five pounds, and nine pounds of flour to forty-nine pounds.20

  In the 1950s and ’60s, another giga-lifesaver, Norman Borlaug, outsmarted evolution to foment the Green Revolution in the developing world.21 Plants in nature invest a lot of energy and nutrients in woody stalks that raise their leaves and blossoms above the shade of neighboring weeds and of each other. Like fans at a rock concert, everyone stands up, but no one gets a better view. That’s the way evolution works: it myopically selects for individual advantage, not the greater good of the species, let alone the good of some other species. From a farmer’s perspective, not only do tall wheat plants waste energy in inedible stalks, but when they are enriched with fertilizer they collapse under the weight of the heavy seedhead. Borlaug took evolution into his own hands, crossing thousands of strains of wheat and then selecting the offspring with dwarfed stalks, high yields, resistance to rust, and an insensitivity to day length. After several years of this “mind-warpingly tedious work,” Borlaug evolved strains of wheat (and then corn and rice) with many times the yield of their ancestors. By combining these strains with modern techniques of irrigation, fertilization, and crop management, Borlaug turned Mexico and then India, Pakistan, and other famine-prone countries into grain exporters almost overnight. The Green Revolution continues—it has been called “Africa’s best-kept secret”—driven by improvements in sorghum, millet, cassava, and tubers.22

  Thanks to the Green Revolution, the world needs less than a third of the land it used to need to produce a given amount of food.23 Another way of stating the bounty is that between 1961 and 2009 the amount of land used to grow food increased by 12 percent, but the amount of food that was grown increased by 300 percent.24 In addition to beating back hunger, the ability to grow more food from less land has been, on the whole, good for the planet. Despite their bucolic charm, farms are biological deserts which sprawl over the landscape at the expense of forests and grasslands. Now that farms have receded in some parts of the world, temperate forests have been bouncing back, a phenomenon we will return to in chapter 10.25 If agricultural efficiency had remained the same over the past fifty years while the world grew the same amount of food, an area the size of the United States, Canada, and China combined would have had to be cleared and plowed.26 The environmental scientist Jesse Ausubel has estimated that the world has reached Peak Farmland: we may never again need as much as we use today.27
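  The “less than a third” figure follows directly from those two numbers: land under cultivation grew by a factor of 1.12 while the food grown on it grew by a factor of 4.0 (a 300 percent increase), so the land required per unit of food fell to 1.12/4.0 ≈ 0.28, under a third of its 1961 level.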

  Like all advances, the Green Revolution came under attack as soon as it began. High-tech agriculture, the critics said, consumes fossil fuels and groundwater, uses herbicides and pesticides, disrupts traditional subsistence agriculture, is biologically unnatural, and generates profits for corporations. Given that it saved a billion lives and helped consign major famines to the dustbin of history, this seems to me like a reasonable price to pay. More important, the price need not be with us forever. The beauty of scientific progress is that it never locks us into a technology but can develop new ones with fewer problems than the old ones (a dynamic we will return to here).

  Genetic engineering can now accomplish in days what traditional farmers accomplished in millennia and Borlaug accomplished in his years of “mind-warping tedium.” Transgenic crops are being developed with high yields, lifesaving vitamins, tolerance of drought and salinity, resistance to disease, pests, and spoilage, and reduced need for land, fertilizer, and plowing. Hundreds of studies, every major health and science organization, and more than a hundred Nobel laureates have testified to their safety (unsurprisingly, since there is no such thing as a genetically unmodified crop).28 Yet traditional environmentalist groups, with what the ecology writer Stewart Brand has called their “customary indifference to starvation,” have prosecuted a fanatical crusade to keep transgenic crops from people—not just from whole-food gourmets in rich countries but from poor farmers in developing ones.29 Their opposition begins with a commitment to the sacred yet meaningless value of “naturalness,” which leads them to decry “genetic pollution” and “playing with nature” and to promote “real food” based on “ecological agriculture.” From there they capitalize on primitive intuitions of essentialism and contamination among the scientifically illiterate public. Depressing studies have shown that about half of the populace believes that ordinary tomatoes don’t have genes but genetically modified ones do, that a gene inserted into a food might migrate into the genomes of people who eat it, and that a spinach gene inserted into an orange would make it taste like spinach. Eighty percent favored a law that would mandate labels on all foods “containing DNA.”30 As Brand put it, “I daresay the environmental movement has done more harm with its opposition to genetic engineering than with any other thing we’ve been wrong about. We’ve starved people, hindered science, hurt the natural environment, and denied our own practitioners a crucial tool.”31

  One reason for Brand’s harsh judgment is that opposition to transgenic crops has been perniciously effective in the part of the world that could most benefit from it. Sub-Saharan Africa has been cursed by nature with thin soil, capricious rainfall, and a paucity of harbors and navigable rivers, and it never developed an extensive network of roads, rails, or canals.32 Like all farmed land, its soils have been depleted, but unlike those in the rest of the world, Africa’s have not been replenished with synthetic fertilizer. Adoption of transgenic crops, both those already in use and ones customized for Africa, grown with other modern practices such as no-till farming and drip irrigation, could allow Africa to leapfrog the more invasive practices of the first Green Revolution and eliminate its remaining undernourishment.

  For all the importance of agronomy, food security is not just about farming. Famines are caused not only when food is scarce but when people can’t afford it, when armies prevent them from getting it, or when their governments don’t care how much of it they have.33 The pinnacles and valleys in figure 7-4 show that the conquest of famine was not a story of steady gains in agricultural efficiency. In the 19th century, famines were triggered by the usual droughts and blights, but they were exacerbated in colonial India and Africa by the callousness, bungling, and sometimes deliberate policies of administrators who had no benevolent interest in their subjects’ welfare.34 By the early 20th century, colonial policies had become more responsive to food crises, and advances in agriculture had taken a bite out of hunger.35 But then a horror show of political catastrophes triggered sporadic famines for the rest of the century.

  Of the seventy million people who died in major 20th-century famines, 80 percent were victims of Communist regimes’ forced collectivization, punitive confiscation, and totalitarian central planning.36 These included famines in the Soviet Union in the aftermaths of the Russian Revolution, the Russian Civil War, and World War II; Stalin’s Holodomor (terror-famine) in Ukraine in 1932–33; Mao’s Great Leap Forward in 1958–61; Pol Pot’s Year Zero in 1975–79; and Kim Jong-il’s Arduous March in North Korea as recently as the late 1990s. The first governments in postcolonial Africa and Asia often implemented ideologically fashionable but economically disastrous policies such as the mass collectivization of farming, import restrictions to promote “self-sufficiency,” and artificially low food prices which benefited politically influential city-dwellers at the expense of farmers.37 When the countries fell into civil war, as they so often did, not only was food distribution disrupted, but both sides could use hunger as a weapon, sometimes with the complicity of their Cold War patrons.

  Fortunately, since the 1990s the prerequisites to plenty have been falling into place in more of the world. Once the secrets to growing food in abundance are unlocked and the infrastructure to move it around is in place, the decline of famine depends on the decline of poverty, war, and autocracy. Let’s turn to the progress that has been made against each of these scourges.

  CHAPTER 8

  WEALTH

  “Poverty has no causes,” wrote the economist Peter Bauer. “Wealth has causes.” In a world governed by entropy and evolution, the streets are not paved with pastry, and cooked fish do not land at our feet. But it’s easy to forget this truism and think that wealth has always been with us. History is written not so much by the victors as by the affluent, the sliver of humanity with the leisure and education to write about it. As the economist Nathan Rosenberg and the legal scholar L. E. Birdzell Jr. point out, “We are led to forget the dominating misery of other times in part by the grace of literature, poetry, romance, and legend, which celebrate those who lived well and forget those who lived in the silence of poverty. The eras of misery have been mythologized and may even be remembered as golden ages of pastoral simplicity. They were not.”1

  Norberg, drawing on Braudel, offers vignettes of this era of misery, when the definition of poverty was simple: “if you could afford to buy bread to survive another day, you were not poor.”

  In wealthy Genoa, poor people sold themselves as galley slaves every winter. In Paris the very poor were chained together in pairs and forced to do the hard work of cleaning the drains. In England, the poor had to work in workhouses to get relief, where they worked long hours for almost no pay. Some were instructed to crush dog, horse and cattle bones for use as fertilizer, until an inspection of a workhouse in 1845 showed that hungry paupers were fighting over the rotting bones to suck out the marrow.2

  Another historian, Carlo Cipolla, noted:

  In preindustrial Europe, the purchase of a garment or of the cloth for a garment remained a luxury the common people could only afford a few times in their lives. One of the main preoccupations of hospital administration was to ensure that the clothes of the deceased should not be usurped but should be given to lawful inheritors. During epidemics of plague, the town authorities had to struggle to confiscate the clothes of the dead and to burn them: people waited for others to die so as to take over their clothes—which generally had the effect of spreading the epidemic.3

  The need to explain the creation of wealth is obscured yet again by political debates within modern societies on how wealth ought to be distributed, which presuppose that wealth worth distributing exists in the first place. Economists speak of a “lump fallacy” or “physical fallacy” in which a finite amount of wealth has existed since the beginning of time, like a lode of gold, and people have been fighting over how to divide it up ever since.4 Among the brainchildren of the Enlightenment is the realization that wealth is created.5 It is created primarily by knowledge and cooperation: networks of people arrange matter into improbable but useful configurations and combine the fruits of their ingenuity and labor. The corollary, just as radical, is that we can figure out how to make more of it.

  The endurance of poverty and the transition to modern affluence can be shown in a simple but stunning graph. It plots, for the past two thousand years, a standard measure of wealth creation, the Gross World Product, measured in 2011 international dollars. (An international dollar is a hypothetical unit of currency equivalent to a US dollar in a particular reference year, adjusted for inflation and for purchasing-power parity. The latter compensates for differences in the prices of comparable goods and services in different places—the fact that a haircut, for example, is cheaper in Dhaka than in London.)
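  A stylized example of the conversion (the numbers are invented for illustration): if a reference basket of goods costs 40,000 taka in Dhaka and 1,000 dollars in the reference-year United States, the PPP rate for that basket is 40 taka per international dollar, so a Dhaka income of 400,000 taka counts as 10,000 international dollars, even though the market exchange rate would convert it to far fewer US dollars.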

 
