The Locavore's Dilemma
Another problem is that reliance on long-distance trade proved a less desirable alternative than autarky only in the context of prolonged blockades and stationary fronts. In the context of aerial bombing, by contrast, the destruction of critical local infrastructure would soon cripple locavore communities left with no capacity to tap into the agricultural surplus of more distant regions. Autarkic policies also always mandate the use of more marginal agricultural lands, whose long-term productivity is more likely to be affected by erosion and other problems. Finally, post-war reconstruction is much easier when distant resources can be accessed. While future world conflicts are always possible, such distant and hopefully remote possibilities should not be invoked to keep hundreds of millions of people in a state of hunger and malnutrition in the present. In our opinion, it would be better to work towards world peace and greater global economic integration. Countless city walls were torn down in the last two centuries. Agricultural tariffs, quotas, and subsidies should meet the same fate.
This being said, the study of past military blockades and agricultural countermeasures is helpful in gaining a more concrete appreciation of the real-world consequences of locavorism. True, in wartime many valuable resources that would be put to good agricultural use in the locavores’ utopia are instead diverted towards destructive goals. Nonetheless, the basic trade-offs involved in increasing local food production remain the same. For instance, in Europe over the last two centuries, the following substitutions in terms of both food production and consumption were widespread among warring countries that had been largely cut off from international trade:56
• a switch from livestock and fruit production to high-yielding crops (the conversion of grassland and orchards to cropland devoted to grain and potato production);
• a switch from beef to dairy cattle (the replacement of meat by dairy production);
• the culling of chickens and hogs whose feed could be consumed by humans;
• the elimination of a large number of “luxury animals” such as horses and dogs;
• a significant reduction in the volume of grain used for brewing, distilling, and luxury products, such as pastries;
• the replacement of white bread by bread made with whole wheat flour and further diluted with potato and barley flour.
A brief discussion of specific cases can also provide further insights. We begin with the Allied blockade of Germany during the First World War,57 an empire whose technical and scientific capabilities were comparable to those of any other country at the time; whose landscape was reasonably large and varied; and whose territory remained virtually untouched by opposing forces during the conflict. Most notable, though, is that while German political authorities had built up some food stocks before the beginning of hostilities, they had not launched large-scale autarkic agricultural efforts.
In spite of some protectionist policies, before the beginning of hostilities the Reich’s agricultural sector was, in the words of agricultural economist Karl Brandt, very much integrated into the “international co-operation and division of labor,”58 and, as a result, in 1913 German agriculture was more productive than ever before, with some commentators suggesting that the subjects of the Kaiser were perhaps even “better or more richly fed than the inhabitants of France or England.”59 To give a sense of the progress made in previous decades, between 1878 and 1912 Germany’s total rye production rose from 6.9 million metric tons (MMT) to 11.6 MMT; wheat from 2.6 to 4.4 MMT; oats from 5 to 8.5 MMT; barley from 2.3 to 3.5 MMT; and potatoes from 23.6 to 50.2 MMT. While some of this progress could be attributed to increased acreage, most of it could be traced back to greater yields. Because the German population grew from 41 million people in 1871 to 68 million in 1913, the Reich’s inhabitants relied on foreign imports for approximately a third of their direct food supplies and, through the importation of large quantities of fertilizers and high-protein animal feed, for much of their “local” crop and livestock production. Like many of their Northern European counterparts, many German producers had reoriented their activities towards livestock and, when possible, fruits and vegetables.
Although they had made preparations for blockades and counter-blockades, German military leaders never envisaged a prolonged conflict and were soon unable to find substitutes for imported phosphates (a critical fertilizer), while the absence of imported concentrated feeding stuffs for livestock not only drastically reduced milk and meat production but also diminished the volume and quality of animal manure. Many animals eventually died of inanition despite attempts to develop substitutes for forage that included “drying lees, grinding straw, weeds, carcasses, fish [and] working up food refuse.”60
Making matters even worse were weather conditions so bad that they “would have caused a large diminution in the agricultural yield of the country even if all other conditions had been normal.”61 Largely because of this, the potato crop failed in 1916, at a time when potatoes occupied about a fifth of German arable land. From a prewar figure of 52 million tons, only 21 million tons were harvested, which, due to bad handling and inadequate storage by government authorities, translated into only 17 million usable tons of potatoes. By 1917, potatoes had largely been replaced by turnips and cabbage. Despite the introduction of meat ration cards and meatless days, valuable breeding stock that had been developed over decades was gradually gobbled up by starving individuals, often in a deliberate attempt to preserve for human consumption what had previously been considered substandard fare. All sorts of leaves, berries, roasted acorns, and beans were also being steeped in hot water at the time as tea and coffee substitutes.
By late 1917, a two-pound can of preserved marmalade made of apples, carrots, and pumpkins was sold to each German household during the Christmas season and was expected to serve the family for the following year. By the end of the war, malnutrition might have killed as many as 700,000 civilians on top of the Empire’s 1,800,000 battlefield casualties. As one contemporary observer commented, in “the condition of dull apathy and mental prostration resulting from the deprivation of food the course of the War no longer seemed of importance. Food filled [Germans’] thoughts by day and their dreams by night, and the only desire was to end the War by any possible means that might lead to a slackening of the blockade and the free entry of food into the country.”62 Things got even worse for the German population after the armistice, as Allied occupying forces presented numerous obstacles to the re-opening of trading activities. While the territorial losses imposed by the Treaty of Versailles are often blamed for later German belligerence and the rise of Nazism, the Allied food blockade both during and after the war played a crucial part in fueling future German aggression.
Lest one think that the previous example was deliberately chosen because of its dire outcome, we will now turn to a brief discussion of what is generally acknowledged to have been the most successful case of local agricultural production under wartime conditions: Denmark during World War II.63
In their massive study on the Management of Agriculture and Food in the German-Occupied and Other Areas of Fortress Europe, Karl Brandt and his collaborators, Otto Schiller and Franz Ahlgrimm, observed that Denmark “received by far the most gentle treatment of all the German-occupied countries;” that there was no German military government; that “the Danish contributions to the German war-food economy were among the most important;” that the “production record of Danish agriculture in World War II is one of the most remarkable in the annals of world agriculture,” comparable to and probably even superior to “the achievements of American farmers;” and that the war actually “became a period of extraordinary prosperity for [Danish] agriculture.”64 This outcome can undoubtedly be traced back to the Germans’ “quasi-benevolent” attitude towards the country because of its racial profile (Danes were said to be Aryans), the country’s military insignificance, and the fact that, proportionally speaking at least, the Danes did not need to devote many resources to the war effort. Yet this case is hardly a vindication of locavorism.
Like their counterparts in Great Britain, the Netherlands, and Belgium, 19th-century Danish political authorities did not react to the “invasion” of cheap North American and Eastern European grains by imposing higher tariffs to protect their domestic producers. On the contrary, a majority of Danish farmers were opposed to the idea, as they realized that cheap foreign animal feed would give them the opportunity to specialize in more lucrative livestock and dairy farming, making it possible to expand dairy production from summer-only to year-round. Although the sector was mostly made up of small- and medium-sized independent operations, the creation of large agricultural cooperatives gave Danish farmers the capacity to develop significant economies of scale and a reputation for excellence that made their products highly sought after in lucrative foreign markets such as Great Britain. The result was that, from the mid-1870s to the mid-1920s, the Danish cattle herd doubled, the pig herd grew more than sixfold, and the chicken flock fourfold. Even more remarkable, Danish crop growers expanded their production by a factor of almost three, as the five-year average of all crops expressed in a measurement known as barley equivalent (i.e., as if they had all been converted to their nutritional equivalent in barley) went from around 27 million tons to no less than 74 million tons during this period.65 By 1938, the British and German markets absorbed more than 76% of Danish exports, which were mostly made up of livestock products such as butter, eggs, lard, and bacon.
As Karl Brandt argued in 1945, far from proving the assertions of agricultural protectionists that all trade liberalization “would financially ruin millions of European family farms and reduce the farmers to abject misery and poverty… [or a] general depression of their living standards,” Danish agriculture from the middle of the 19th century to the eve of World War II illustrated that, by embracing free trade, Danish farmers had not only learned to “discover the fields of production in which they had the best opportunity to compete successfully with the farmers of the world, but they also were able to develop their own abilities, their agricultural production and marketing plants to almost functional perfection,” the result being “a most remarkable degree of culture and the art of decent living.”66
The Nazi invasion of Denmark in April 1940, however, quickly cut off the importation of foreign oilseeds and grains, along with access to the British market. Based on their past experience, German administrators devised plans for the massive slaughter of dairy cows, hogs, and chickens in order to save as much grain as possible for human consumption. Danish farmers overwhelmingly opposed these measures and threatened to engage in general passive resistance if they were enforced. Meanwhile, Danish authorities let it be known to their occupiers that they weren’t concerned with where food surpluses would end up (i.e., feeding members of the Third Reich) as long as the local population did not go hungry. Because of this, and because of the difficulty of effectively rationing a large agricultural sector made up of numerous small production units, German administrators settled instead upon economic incentives such as export quotas, higher prices for farm products, and artificially low prices for farm inputs such as nitrogen and potash fertilizers, thus ensuring that “farming was made exceptionally profitable.”67
As could be expected, Danish farmers responded by maintaining the land area devoted to grain at the prewar average; slightly increasing the area devoted to root crops; converting perhaps as much as 16% of their pastureland to flax, vegetable seed, and vegetable production; reducing fallow land by about two-thirds; and hiring individuals previously unemployed or active in other lines of work. The end result was that, in terms of barley equivalent, the harvests during the last three years of the war were actually higher than the prewar average.
Reminiscent of the situation of the Second Reich during the previous conflict, however, phosphate fertilizer use fell to around one-tenth of the prewar level, while the quality of animal manure declined because of poorer animal nutrition. German authorities also enforced a number of restrictions to ensure greater volumes of export surpluses, such as prohibiting the manufacture of margarine, full-fat cheese, fluid cream, and beer with a high alcohol content; increasing rates of extraction in flour mills; and reducing the fat content of fluid milk. The reduced availability of animal feed also meant that by 1942 the culling of two-thirds of the hog and chicken population had become unavoidable, although by 1945 production had recovered to above half of prewar levels.
At the end of the war, per capita food consumption in Denmark was about 20% lower than at its beginning, and the country had experienced a 5% decrease in its overall national wealth. True, these results were not bad considering the context, but the Danish case still does not support the notion that increased self-sufficiency delivers economic and security benefits. As one University of Copenhagen economist observed at the time, the Danish agricultural performance was only as good as it turned out to be because the “Germans paid for the war effort,” and, overall, the Danish economy consumed some of its accumulated capital and suffered “heavy financial loss[es].”68 In Denmark, as elsewhere, increased national self-sufficiency would have been unsustainable in the long run, and the adoption of this policy on an ever smaller geographical scale (i.e., locavorism) even less so.
Peak Oil and Locavorism69
Another common belief among locavores in terms of food security is that their prescription prepares us for the unavoidable re-localization of our food system that will follow the imminent peak and later depletion of our supply of “cheap petroleum” in the next century. This argument, however, is fallacious whether or not one believes in the peak oil rhetoric. For starters, even assuming a world in which hostile aliens have emptied all of our best oil fields, all credible analysts (there are always a few pessimistic outliers) tell us that, with minimal efforts to look beyond the ample economically recoverable reserves available at the moment, we could easily have enough coal to last us several centuries.70 It is true that rebuilding our global food supply chain around (mostly liquefied) coal would be more expensive, inconvenient, and environmentally damaging than around petroleum-derived liquid fuels (which is why the latter displaced coal in the first place), but it does not present any insurmountable problem—indeed, South Africa’s Sasol, the world’s largest producer of coal-based liquids, already manufactures a completely synthetic jet fuel. Locavores should remember that relatively inefficient and expensive coal-powered railroads and steamships laid the foundations of our global food supply chain. The world didn’t simply go from a hearty diet of local organic deliciousness to eating globalized petroleum in one fell swoop—there was a substantial course of coal in between.
Reverting to coal in the 21st century would also not mean reverting to 19th-century engine technologies, but simply to a more expensive and inconvenient fuel that would compete with other unconventional sources (from shale oil to Canadian oil sands). Furthermore, because the fraction of petroleum used to power container ships is for the most part the dirtiest and cheapest (so-called “bunker fuel”), for which there is at present little other demand,71 higher crude prices would have a much more pronounced effect on other segments of the food supply chain than on long-distance maritime transportation.
At any rate, the Peak Oil rhetoric should not be taken seriously. Pessimistic energy forecasts have a long history—predictions of imminent petroleum shortages were even made before the first oil well was drilled in Western Pennsylvania in the middle of the 19th century—and a truly awful track record.72 The main problem historically is that energy doomsayers never quite understood that humans are not only mouths to feed, but also brains to think and arms to work, along with the fact that resources are not fixed and permanent things that exist in and of themselves, but instead are created by always renewable human intellect and labor.
Despite the obvious finiteness of their surroundings, humans have long been able to develop profitable new technologies to achieve the same or better results while using fewer resources; to extract valuable materials from more remote locations (for example, offshore drilling) or from less interesting materials (such as less concentrated ores); and to create substitutes out of previously worthless raw materials and industrial residuals that proved their advantage over previous alternatives, such as being more powerful and/or abundant; stronger and/or lighter; and/or easier to produce, handle, transport, and/or store. For instance, because they did not burn cleanly without modern technologies, coal and petroleum were not very valuable for most of human history. In recent years, in North America alone, the advent of shale gas, increased onshore oil production from shale rock, new recovery techniques that make it economical to extract oil left in old wells, new oil field discoveries in the Gulf of Mexico, and advances that have drastically reduced production costs in the Canadian oil sands have all ensured an abundant supply of affordable transportation fuels for at least several decades, if not a few centuries.73
Sure, one day humanity will move beyond fossil fuels, but not because we will have run out of them; rather, because better alternatives will have come along. Justifying a preemptive move towards locavorism on the basis of irrational fears of an imminent energy shortage is an untenable proposition. At any rate, if one truly believes that economic resources are finite, then sustainable development becomes a theoretical impossibility, for humanity will unavoidably run out of everything and collapse at some future date. Saving for the future in this context is nonsensical, for it at best delays the unavoidable. Better, then, to save farmers from labor-intensive toil now while also sparing future generations a horrible fate by making sure that we consume as much energy as we can in the present and collapse in style as soon as possible!