An Edible History of Humanity


by Tom Standage


  In this sense the industrialized countries have not escaped Malthus’s trap after all, but have merely exchanged one crisis, in which the limiting factor was agricultural land, for another, in which the limiting factor is the atmosphere’s ability to absorb carbon dioxide. The possibility that the switch to fossil fuels might provide only a temporary respite from Malthusian pressures occurred even to nineteenth-century writers, notably William Stanley Jevons, an English economist and author of The Coal Question, published in 1865. “For the present,” he wrote, “our cheap supplies of coal and our skill in its employment, and the freedom of our commerce with other wider lands, render us independent of the limited agricultural area of these islands, and apparently take us out of the scope of Malthus’s doctrine.” The word “apparently” did not appear in the first edition of the book, but Jevons added it to a later edition shortly before his death in 1882.

  He was right to worry. In the early twenty-first century, concerns about the connection between energy supplies and the availability of sufficient land for food production have been raised once again by the growing enthusiasm for biofuels, such as ethanol made from maize and biodiesel made from palm oil. Making fuel from such crops is appealing because it is a renewable source of energy (you can grow more next year) and over its life cycle it can produce fewer carbon emissions than fossil fuels. As plants grow, they absorb carbon dioxide from the air; they are then processed into biofuel, and the carbon dioxide goes back into the atmosphere when the fuel is burned. The whole process would be carbon neutral, were it not for the emissions associated with growing the crops in the first place (fertilizer, fuel for tractors, and so on) and then processing them into biofuels (something that usually requires a lot of heat). But exactly how much energy is required to produce various biofuels, and the level of associated carbon emissions, varies from crop to crop. So some biofuels make more sense than others.

  The type that makes the least sense is ethanol made from maize (corn), which is, unfortunately, the predominant form of biofuel, accounting for 40 percent of world production in 2007, most of it in the United States. The best-guess figures suggest that burning a gallon of corn ethanol produces only about 30 percent more energy than was needed to produce it, and reduces greenhouse-gas emissions by about 13 percent compared with conventional fossil fuel. That may sound impressive, but the corresponding figures for Brazilian sugarcane ethanol are about 700 percent and 85 percent respectively; for biodiesel made in Germany they are 150 percent and 50 percent. Put another way, making a gallon of corn ethanol requires four fifths of a gallon of fossil fuel (not to mention hundreds of gallons of water), and does not reduce greenhouse-gas emissions by very much. America’s corn-ethanol drive makes even less sense on economic grounds: To achieve these meager reductions in emissions, the United States government subsidizes corn-ethanol production to the tune of some seven billion dollars a year, and also imposes a tariff on sugarcane ethanol from Brazil to discourage imports. Corn ethanol seems to be an elaborate scheme to justify farming subsidies, rather than a serious effort to reduce greenhouse-gas emissions. England abolished its farmer-friendly Corn Laws in 1846, but America has just introduced new ones.
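
  To see where the “four fifths of a gallon” figure comes from, consider a quick back-of-envelope calculation. The short sketch below, written in Python purely for illustration, takes the approximate energy-return figures cited above (about 30 percent more energy out than in for corn ethanol, and about 700 percent more for sugarcane) and converts them into the fossil-fuel input needed per unit of fuel energy delivered. The ratios are the rough best-guess figures quoted in the text, not precise measurements.

    # Back-of-envelope: fossil-energy input per unit of biofuel energy delivered.
    # Energy-return ratios are the approximate figures cited in the text:
    # corn ethanol yields ~30% more energy than it takes to make (ratio 1.3),
    # sugarcane ethanol ~700% more (ratio 8.0).

    def fossil_input_per_unit(energy_return_ratio):
        """Units of fossil energy needed to deliver one unit of biofuel energy."""
        return 1.0 / energy_return_ratio

    corn = fossil_input_per_unit(1.3)       # ~0.77, roughly four fifths
    sugarcane = fossil_input_per_unit(8.0)  # ~0.13, roughly an eighth

    print(f"Corn ethanol:      {corn:.2f} units of fossil energy per unit delivered")
    print(f"Sugarcane ethanol: {sugarcane:.2f} units of fossil energy per unit delivered")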

  Enthusiasm for corn ethanol and other biofuels is one of the factors that have helped to drive up food prices as crops are diverted into fuel production, so that they are in effect fed to cars, not people. Opponents of biofuels like to point out that the maize needed to fill a vehicle’s twenty-five-gallon tank with ethanol would be enough to feed one person for a year. Since maize is also used as an animal feed, its higher price makes meat and milk more expensive, too. And as farmers switch their land from growing other crops to growing corn instead, those other crops (such as soy) become scarcer, and their prices also rise. Food and fuel are, it seems, once again competing for agricultural land. Cheap coal meant that English landowners in the eighteenth century realized their land was more valuable for growing food than fuel; concern about expensive oil today means American farmers are making the opposite choice, and growing crops for fuel rather than for food.

  Biofuels need not always compete with food production, however. In some cases, it may be possible to grow biofuel feedstocks on marginal land that is unsuitable for other forms of agriculture. And those feedstocks need not be food crops. One potentially promising approach is that of cellulosic ethanol, in which ethanol is made from fast-growing, woody shrubs, or even from trees. In theory, this would be several times more energy efficient than even sugarcane ethanol, could reduce greenhouse-gas emissions by almost as much (a reduction of around 70 percent compared with fossil fuels), and would not encroach upon agricultural land. The problem is that the field is still immature, and expensive enzymes are needed to break down the cellulose into a form that can be made into ethanol. Another approach involves making biofuel from algae, but again the technology is still in its early days.

  What is clear is that the use of food crops for fuel is a step backward. The next logical step forward, after the Neolithic and Industrial revolutions, is surely to find new ways to harness solar energy beyond growing crops or digging up fossil fuels. Solar panels and wind turbines are the most obvious examples, but it may also be possible to tinker with the biological mechanism of photosynthesis to produce more efficient solar cells, or to create genetically engineered microbes capable of churning out biofuels. The trade-off between food and fuel has resurfaced in the present, but it belongs in the past.

  PART V

  FOOD AS A WEAPON

  9

  THE FUEL OF WAR

  Amateurs talk tactics, but professionals talk logistics.

  —ANONYMOUS

  The fate of Europe and all further calculations depend upon the question of food. If only I have bread, it will be child’s play to beat the Russians.

  —NAPOLEON BONAPARTE

  “MORE SAVAGE THAN THE SWORD”

  What is the most devastating and effective weapon in the history of warfare? It is not the sword, the machine gun, the tank, or the atom bomb. Another weapon has killed far more people and determined the outcomes of numerous conflicts. It is something so obvious that it is easy to overlook: food, or more precisely, control of the food supply. Food’s power as a weapon has been acknowledged since ancient times. “Starvation destroys an army more often than does battle, and hunger is more savage than the sword,” noted Vegetius, a Roman military writer who lived in the fourth century A.D. He quoted a military maxim that “whoever does not provide for food and other necessities, is conquered without fighting.”

  For most of human history, food was literally the fuel of war. In the era before firearms, when armies consisted of soldiers carrying swords, spears, and shields, food sustained them on the march and gave them the energy to wield their weapons in battle. Food, including fodder for animals, was in effect both ammunition and fuel. Maintaining the supply of food was therefore critical to military success; a lack of food, or its denial by the enemy, would lead swiftly to defeat. Before the advent of mechanized transport, keeping an army supplied with food and fodder often imposed significant constraints on where and when it could fight, and on how fast it could move. Although other aspects of warfare changed dramatically from ancient times to the Napoleonic era, the constraints imposed by food persisted. Soldiers could only carry a few days’ worth of supplies on their backs; using pack animals or carts allowed an army to carry more supplies and equipment, but fodder for the animals was then needed, and the army’s speed and mobility suffered.

  This was recognized in the fourth century B.C. by Philip II of Macedonia, who introduced a number of reforms that were extended by his son, Alexander, to create the fastest, lightest, and most agile force of its day. Families, servants, and other followers, who sometimes equalled the soldiers in number, were restricted to an absolute minimum, allowing the army to throw off its immense tail of slow-moving people and carts. Soldiers were also required to carry much of their own equipment and supplies, with pack animals rather than carts carrying the rest. With fewer animals there was less need to find fodder, and the army became more mobile, particularly over difficult terrain. All this gave Alexander’s army a clear advantage, allowing him to launch lightning strikes that struck fear into his enemies, according to Greek historians. Satibarzanes, a Persian governor, “learning of Alexander’s proximity and astounded at the swiftness of his approach, fled with a few Arian horsemen.” The Uxians, a Persian hill tribe, were “astounded by Alexander’s swiftness, and fled without so much as coming to close quarters.” And Bessus, a treacherous Persian nobleman, was “greatly terrified by Alexander’s speed.” Alexander’s mastery of the mechanics of supplying his army—a field known today as logistics—enabled him to mount one of the longest and most successful military campaigns in history, conquering a swath of territory from Greece to the Himalayas.

  Armies in history rarely brought along all of their own food supplies, however, and Alexander’s was no exception. Food and fodder were also drawn from the surrounding country as the soldiers marched through. Such foraging could be an efficient way to feed an army, but it had the disadvantage that if the soldiers stopped moving, the local area would be rapidly depleted. Initially the army would have plenty of food at its disposal, but on each successive day foraging parties would have to travel farther to reach areas that had not yet been stripped of food. Alexander’s rule of thumb, which was still valid centuries later, was that an army could only forage within a four-day radius of its camp, because a pack animal devours its own load within eight days. An animal that travels four days through barren country to gather food must carry four days’ worth of food for its outward journey; it can then load up with eight days’ worth of food, but will consume half of this on the return journey, leaving four days’ worth—in other words, the amount it started off with. The length of time an army could stay in one place therefore depended on the richness of the surrounding country, which in turn depended on the population density (more people would generally have more food that could be appropriated) and the time of year (there would be plenty of food available just after the harvest, and very little available just before it). Alexander and other generals had to take these factors into account when choosing the routes of marches and the timing of campaigns.
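
  Alexander’s rule of thumb can be checked with simple arithmetic. The sketch below is a Python illustration rather than anything from the historical record; it assumes, as the passage implies, that a pack animal eats one day’s ration per day and can carry at most eight days’ worth of food, and then computes the net food a single animal delivers to camp from a foraging trip of a given length.

    # Net gain to the camp from one foraging round trip by a pack animal.
    # Assumes the animal eats one day's ration per day and can carry at most
    # eight days' worth of food (it "devours its own load within eight days").

    def net_gain(days_each_way, capacity_days=8, ration_per_day=1):
        """Days' worth of food added to camp stores by one round trip."""
        taken_from_camp = days_each_way * ration_per_day               # eaten on the way out
        brought_back = capacity_days - days_each_way * ration_per_day  # full load minus the return ration
        return brought_back - taken_from_camp

    for d in range(1, 6):
        print(f"{d} days each way: net gain of {net_gain(d)} days' worth of food")

    # At four days each way the net gain falls to zero, which is why
    # foraging beyond a four-day radius was not worthwhile.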

  Delivering supplies in bulk to an army on campaign was best done by ship, which was the only way to move large quantities of food quickly in the ancient world. Pack animals or carts could then carry supplies the last few miles from the port to the army’s inland bases when necessary. This compelled armies to operate relatively close to rivers and coasts. As Alexander conquered the lands around the Mediterranean he was able to rely on his fleet to deliver supplies, provided his soldiers secured the ports along the coast beforehand. Moving from port to port, the soldiers carried a few days’ worth of supplies and supplemented them by living off the land when possible. In the centuries after Alexander’s death, the Romans took his logistic prowess a stage further. They established a network of roads and supply depots throughout their territory to ensure that supplies could be moved quickly and in quantity when needed. Their depots were resupplied by ship, which made it difficult for Roman armies to operate more than seventy-five miles from a coast or a large river. This helps to explain why Rome conquered the lands around the Mediterranean, and why the northern boundaries of its territory were defined by rivers. Maintaining permanent supply depots meant that a large force could move quickly through Roman territory without having to worry about finding food or fodder. The Roman army also introduced rules to govern the process of foraging while on campaign.

  In enemy territory, demanding food requisitions from the surrounding area served two purposes: It fed the invading army and impoverished the local community. Food in such situations was literally a weapon: A marauding army could strip a region bare and cause immense hardship. As a medieval Chinese military handbook puts it, “If you occupy your enemies’ store houses and granaries and seize his accumulated resources in order to provision your army continuously, you will be victorious.” Sometimes merely the threat of seizure was enough. In Alexander’s case, local officials often surrendered to him before he entered their territory and agreed to provide food for his army, in return for more lenient treatment. As Alexander advanced into the Persian Empire, this was a deal that local governors were increasingly happy to agree to.

  Conversely, removing or destroying all food and fodder in the path of an advancing army (a so-called scorched-earth policy) provided a way to use food defensively. An early example came during the Second Punic War between Rome and Carthage, during which Hannibal, the Carthaginian general, humiliated the Romans by rampaging around Italy with his army for several years. In an effort to stop him, a proclamation was issued that “all the population settled in the districts through which Hannibal was likely to march should abandon their farms, after first burning their houses and destroying their produce, so that he might not have any supplies to fall back upon.” This ploy failed, but on other occasions in history it was highly effective. Another defensive strategy was to deny the enemy access to food-processing equipment. In order to delay the advance of Spanish troops in 1636, French generals were instructed to “send out before them seven or eight companies of cavalry in a number of places, with workers to break all the ovens and mills in an area stretching from their own fronts to as close as possible to the enemy.” Without ovens and mills, seized grain could not be turned into bread, and soldiers would have to make camp for a couple of days to set up portable ovens.

  All these food-related constraints on the waging of war persisted throughout most of human history, despite the emergence of new technologies such as firearms. But over time the supply systems used by armies invariably became more elaborate. In particular, warfare in eighteenth-century Europe became increasingly formalized, and armies came to rely less on requisitions and foraging, which they regarded as old-fashioned and uncivilized, and more on supplies amassed in depots and delivered by wagon trains. Professional soldiers expected to be fed and paid while on campaign; they did not expect to have to forage for food. The resulting need to build up supplies beforehand meant that campaigns had to be planned long in advance. With armies tethered to their supply depots, lightning strikes or long marches were out of the question. One historian has likened wars of this period to “the jousting of turtles.”

  The American Revolutionary War of 1775–1783 provides a microcosm of how logistical considerations could still be crucial in determining the outcome of a conflict, centuries after Alexander and Hannibal. In theory, the British should easily have been able to put down the rebellion among their American colonists. Britain was the greatest military and naval power of its day, presiding over a vast empire. In practice, however, supplying an army of tens of thousands of men operating some three thousand miles away posed enormous difficulties. Britain’s 35,000 soldiers required 37 tons of food a day among them (a pound of beef each, plus some peas, bread, and rum); their 4,000 horses needed a further 57 tons.
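
  Those figures imply a daily ration of a little over two pounds per soldier and roughly thirty pounds of food and fodder per horse. The quick check below is illustrative only, and assumes British long tons of 2,240 pounds, a unit the figures above do not actually specify.

    # Rough per-head daily rations implied by the supply figures above.
    # Assumes British long tons of 2,240 pounds (an assumption, not stated in the text).

    POUNDS_PER_TON = 2240

    soldiers, food_tons = 35_000, 37
    horses, fodder_tons = 4_000, 57

    print(f"Per soldier: {food_tons * POUNDS_PER_TON / soldiers:.1f} lb of food per day")
    print(f"Per horse:   {fodder_tons * POUNDS_PER_TON / horses:.1f} lb of food and fodder per day")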

  To start with, the British commanders expected their soldiers’ reliance on supplies delivered across the Atlantic by ship to be temporary. They hoped that American loyalists would rally to their cause, allowing the army to draw food and fodder from the country in loyalist areas. But this proved to be impractical, both because of the quantities required and because requisitioning food alienated the loyalists on whose support the British strategy depended. Many of the British troops, accustomed to Europe’s more formal style of warfare, lacked experience in foraging and felt that it was beneath them. The troops found themselves penned up near ports for security, dependent on supplies brought in by sea and unable to move very far inland. Attempts to enlarge the area under control provided a larger area in which to forage, but this caused resentment among the colonists, who refused to continue food production or mounted guerrilla resistance. Foraging expeditions sent beyond the British lines required covering forces of hundreds of troops. A small group of rebels could harass a much larger foraging party, picking off men using ambushes and snipers. The British lost as many men in such skirmishes as they did in larger pitched battles.

  Unwilling to venture inland, where their movements would end up being determined by the needs of supply rather than military strategy, the British concluded that they would need to build up a reserve of at least six months’ worth of food (and ideally a year’s worth) before mounting a major offensive, a condition that was met only twice over the course of the eight-year war. The shortage of supplies also meant that the British were unable to press their advantage when the opportunity arose, repeatedly giving their opponents the chance to regroup. The British failed to strike a decisive blow in the early years of the conflict, and after other European powers entered the war on America’s side it became clear that Britain could not win.

 
