Sex, Bombs and Burgers

by Peter Nowak


  Once stability and exports were established, profit became the main motivator for continued innovation in the food industry. This led to consolidation into fewer and fewer major powerhouses over the second half of the twentieth century. These huge, publicly owned companies must earn profits for shareholders or risk being consolidated themselves. Many of them are based in the United States and several, including Pepsi, Kraft, Coca-Cola, Tyson Foods (the world’s biggest chicken processor) and McDonald’s, are Fortune 500 companies. Together, they comprise the largest industry in the world, dwarfing even the military, with annual revenue of about $4.8 trillion, or 10 percent of global economic output.32 With that much money at stake, the industry is incredibly competitive—companies must continually come up with new ways to generate profits. There are two ways to do this: squeeze new operating efficiencies into the business or think up new products.

  When it comes to improving efficiencies, the same rules apply as in boosting economic productivity. In economics, a worker’s productivity is measured in units produced per hour worked. Economists agree that there are basically two ways to boost this output—either a business hires more employees to spread the workload or it invests in technology that allows each individual worker to do more. This rule also applies to agriculture, more specifically to farmland. The only way to boost agricultural output is to increase the amount of land being used or invest in more efficient technology. Since populations and cities are growing and arable land is decreasing proportionately, well ... there’s really only one option. During the Green Revolution, this meant hybrid plants that made more efficient use of the land. But it hasn’t been enough, so biotechnology companies such as Monsanto have turned to genetically modified plants and animals to boost efficiency once again. If you can grow a stalk of corn that produces more useful kernels and fewer useless leaves and stalks on the same amount of land, why not do so? Or so goes the rationale.

  At the product level, processors are continually looking for new ways to cater to customers’ wants, which have typically centred on food that’s cheaper or takes less time to make. Recently, consumers have also started demanding healthier foodstuffs. Companies such as Hormel and Unilever, for example, are incorporating a processing method that uses highly pressurized water in conjunction with microwaves to precook meats and vegetables. This new method cooks much faster than the old steam-based system and the food retains more of its natural taste and nutrients. The process was originally developed for the military but is now going mainstream because it gives the companies a competitive edge—healthier foods—over rivals. “They put in a lot of effort because they see there’s a benefit for their civilian market,” says Patrick Dunne, a food scientist at the U.S. military’s Natick food lab, which co-developed the technology with the companies.33

  Every food company is looking to save money in production, which is why, to take just one example, you may have noticed a steady increase in the number of foods at the grocery store that now come packaged in pouches rather than cans. These “retort” or flexible pouches, again developed by the military, are lighter than cans, which cuts down on shipping costs.

  During the second half of the twentieth century, many families saw both parents go off to work for the first time, so less time was available to prepare meals. Not only did this force companies to come up with quick and easy-to-prepare meals, it also spurred the rise of fast-food chains. Technology figured prominently in both. Through the power of processed foods and new cooking innovations such as the microwave oven, a meal that took one hour to produce in 1965 was shaved down to only thirty-five minutes by the mid-nineties.34 Fast-food chains such as McDonald’s built themselves on technology great and small—from big innovations such as frozen burgers and fries down to the humble ketchup-squirting gun—to provide inexpensive and easy meals for consumers in a hurry. All of this technology dramatically changed eating patterns—by 2007, percent of Americans ate food produced outside of the home every day, much of it fast food.35

  McDonald’s, the largest restaurant chain in the world with more than 31,000 outlets and annual revenue exceeding $20 billion, is constantly searching for ways to improve food safety and quality and customer satisfaction, in order to ensure the return visits it relies on. The company recently introduced automatic fry dispenser baskets and automated drink-pouring machines to help cut seconds off order times. McDonald’s has also established a Technology Leadership Board, where store managers contribute ideas for improving the system. Customers are hired to visit its test lab in Romeoville, a suburb of Chicago, to try out new inventions. “Our customers don’t come in for technology, they come in to get great service and great food in a clean environment,” says Dave Rogers, senior director of the restaurant solutions group for McDonald’s Canada. “We look at it as an incredibly important enabler to run our restaurants more efficiently. We’re always looking for ways to save seconds.”36 Given that McDonald’s is the world’s largest purchaser of beef, pork and potatoes and the second-biggest buyer of chicken (KFC is first), its technological advances have major impacts, not just on the rest of the fast-food industry but on the world’s entire food-production system.

  There is also one other reason to invest in food innovation: hunger is a major motivator for turning people toward war and terrorism. When you and your family are starving and the local army or al Qaeda offers you food, you probably won’t ask too many questions. With the world’s population set to double in the next fifty years and global warming expected to create food shortages for an additional 200 million to 600 million people in the developing world, the problem is only going to get worse.37 Part of the solution will be to move food around better (including making use of the staggering amount being thrown out by Americans) and part of it will involve more efficient food production. Both will involve more technology, not less.

  Side Effects

  The technologies of war, sex and food haven’t just changed the goods we buy, they have also changed us as people and as a society—sometimes for the better, sometimes for the worse—in ways we rarely consider. These issues are not new. Archeological evidence suggests that one of humanity’s first inventions, fire, was used for the purposes of war, sex and food. Prehistoric man used fire to frighten enemies and animals. He also used it to illuminate caves so that he could paint, among other things, depictions of sexual acts on the walls. And cave men (and women) used fire to sterilize their food, perhaps as Fred Flintstone did in cooking his Brontosaurus burgers. Moving forward through time, ancient societies invented iron and used it to create weapons, while the development of shipbuilding allowed for the spread of empires. Gutenberg’s first printed book may have been the Bible, but it was soon followed by erotically charged works such as The Decameron and The Canterbury Tales, which helped pay his bills. The invention of canning, meanwhile, helped Napoleon march his troops around Europe. Had the microwave oven been invented, he might even have succeeded in his attempt to conquer Russia.

  The modern technological world began in the mid-twentieth century as global war erupted. The Second World War was an unprecedented conflagration involving more nations than any previous conflict. The stakes had never been so high—the Axis powers were bent on genocidal world domination while the Allies were determined to prevent a future under Fascism. Both sides looked to technology to gain an advantage over their enemies, and both sides invested heavily. A host of world-changing technologies emerged from this deadly competition, from jet airplanes to computers to rockets that took us to the moon. Technology ultimately provided the exclamation point to the end of the war, with the atomic bomb explosions in Japan heralding a new age of future hopes—and fears. The Second World War set in motion our continual, modern-age quest for new and better technology, a turbo-charged sprint to a world that is more advanced than even the most imaginative of science-fiction writers envisioned a hundred years ago. The effects of this epic war are around us everywhere today, so it’s a good place to begin our exploration of the technology of sex, bombs and burgers.

  1

  WEAPONS OF MASS CONSUMPTION

  And he shall judge among the nations, and shall rebuke many people: and they shall beat their swords into plowshares, and their spears into pruning hooks: nation shall not lift up sword against nation, neither shall they learn war any more.

  —ISAIAH 2:4

  After total war ... total living.

  —AD IN LIFE MAGAZINE FOR REVERE COPPER AND BRASS INCORPORATED, OCTOBER 1942

  The view from atop Coventry Cathedral differs from that of many of Europe’s old towers. Rather than cobblestone streets and centuries-old edifices, Coventry’s oldest structure is surrounded by shopping malls. Off in one direction you can see a soccer stadium, in another a big blue Ikea store. There are few tourists scrambling for photos of landmarks here, just busy locals hurrying to get some shopping done during their lunch breaks as police cars zip by on the paved roads, sirens wailing. Radiating out from the cathedral in all directions are the hallmarks of twentieth-century construction: steel, concrete and glass buildings, both low- and high-rise, housing businesses and residences. There is little trace of antiquity in this city of more than 300,000 inhabitants, smack dab in the middle of England. Coventry resembles the new, manufactured metropolises of North America more than it does the Old World. But that wasn’t always the case.

  Coventry has had three cathedrals in its history. The earliest, dedicated to St. Mary, was established in 1043 as a Benedictine community by Leofric, the Earl of Mercia, and his wife, Lady Godiva—the same woman who, according to legend, rode a horse naked through the streets to protest the excessive taxes her husband had imposed on residents. A statue commemorating her ride now stands in the middle of a pedestrianized outdoor shopping mall, just a stone’s throw from the current cathedral.

  Over the next few centuries the settlement around the church grew, mainly on the strength of its textile trade, and by 1300 Coventry was England’s fourth-largest city. During the following two centuries, it became the kingdom’s temporary capital on several occasions when the monarchy relocated to avoid rebellions in London. St. Mary’s was also replaced by a grand Gothic church, St. Michael’s, but this fell into disrepair during the sixteenth century when Britain’s monasteries were dissolved. By the early twentieth century, Coventry had evolved into a major manufacturing centre, particularly for cars—the city was the Detroit of the United Kingdom, headquarters for heavyweights Jaguar and Rover—and the population had risen to a quarter of a million. St. Michael’s church, meanwhile, was elevated in status in 1918 to become the city’s second cathedral. With the outbreak of the Second World War, Coventry, touted by its government as the best preserved medieval city in England, also became one of the country’s top producers of airplanes and tanks, a status that made it a prime target for the Nazis.1

  On the evening of November 14, 1940, German bombers commenced Operation Moonlight Sonata, Hitler’s most ambitious and vicious attack on England. Luftwaffe bombers pounded Coventry with wave after wave of high explosives and incendiary bombs from dusk till dawn, killing more than 550 civilians, injuring thousands, destroying more than 4,300 homes and damaging three-quarters of the city’s factories.2 St. Michael’s Cathedral, the city’s figurative and historical heart, was destroyed, save for the miraculous sparing of its spire. The German attack, intended to hurt England’s production capability and soften it up for an all-out invasion, was called “one of the worst bombardments from the air since the Wright brothers presented wings to mankind.”3 Hermann Göring, the Luftwaffe commander, boasted of the destruction and coined a new word to mark the occasion: Koventrieren, to “Coventrate” or raze to the ground.

  In an editorial decrying the bombing, the New York Times pointed out that the horror of the Blitz had only happened because there was no defence against such a night-time assault. Anti-aircraft guns and mines strapped to balloons, a tactic seemingly borrowed from a Road Runner cartoon, brought down only a handful of attacking planes, which meant that “other great industrial centers and ports in England are exposed to the same fate whenever the moon is bright and the weather favorable to raiders.” Until a new defence could be developed, Prime Minister Winston Churchill’s warnings that “death and sorrow will be our companions on the journey, hardship our garment, constancy and valour our only shield” would continue to ring true.4

  The development of such a defence was secretly underway in the ruins of Birmingham, which had been similarly “Coventrated.” Physicists John Randall and Harry Boot were experimenting with an improved version of the cavity magnetron, a copper tube that generated microwaves. At the magnetron’s centre was a cathode that pumped out electrons, which were spun around the tube by an attached electromagnet, a process that gave the device its name. The electrons spun past cavities drilled into the tube and produced radio waves. Those waves were then emitted and, if they hit something, bounced back to their source. This echo effect let the device operator know that something had been detected, and pinpointed the object’s position. Earlier versions of the magnetron, developed in the United States and Germany, were of limited use because they didn’t generate much power and had poor range. Randall and Boot boosted the tube’s power output a hundred-fold by drilling eight cavities into the magnetron instead of the standard two, and by adding a liquid cooling system. The result was a more powerful magnetron that was compact enough to be fitted into aircraft. The British government believed that giving planes the ability to see enemies at night would be a major advantage, perhaps enough to turn the tide of the war. The problem, however, was that Britain was cut off from its traditional European allies, now all under Hitler’s thumb, and lacked the production capacity and manpower to produce the device in the large numbers needed.
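
  To get a rough sense of how that echo pinpointed a target’s position, it helps to put the principle in numbers (the figures below are an illustrative sketch, not measurements from the wartime record): radio waves travel at the speed of light, so an operator who measures the delay between sending a pulse and receiving its echo can halve the round trip to get the range.

  % Range from a radar echo, assuming free-space propagation at the speed of light c.
  \[
  R = \frac{c\,\Delta t}{2}, \qquad c \approx 3 \times 10^{8}\ \text{m/s}
  \]
  % Example: a round-trip delay of 67 microseconds corresponds to a target roughly 10 km away.
  \[
  \Delta t = 67\ \mu\text{s} \quad\Rightarrow\quad R \approx \frac{(3 \times 10^{8}\ \text{m/s})(67 \times 10^{-6}\ \text{s})}{2} \approx 10\ \text{km}
  \]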

  Bring in the Yanks

  Britain turned to its long-time pal, the United States, for help. With the Nazis pressing and time running out, Henry Tizard, head of British aeronautical research, set out on a voyage across the Atlantic in late September 1940, taking with him the nation’s most valuable technological secrets, including blueprints and diagrams for explosives, rockets, self-sealing fuel tanks, the beginnings of plans for an atomic bomb and, the main attraction, the magnetron. Tizard put the magnetron’s fate into the hands of Vannevar Bush (no relation to the presidents), an inventor, electrical engineer, entrepreneur and patriot who resembled a beardless Uncle Sam.5

  In his early adulthood in the 1910s, Bush had supplemented his undergraduate studies at Tufts College near Boston by working at General Electric and as research director for American Radio, a small company started by his fellow students Charles Smith and Al Spencer. (The company achieved some minor success during the First World War with Smith’s invention of the S-Tube, which eliminated the need to use batteries in radios, but was all but wiped out by the Great Depression.) In 1917 Bush received his doctorate in electrical engineering from Harvard and the Massachusetts Institute of Technology (MIT), and by 1923 had become a professor at the latter. In 1922 Bush and fellow Tufts engineering student Laurence Marshall teamed up with Smith and set up the American Appliance Company to market another of Smith’s inventions, a refrigerator with no moving parts—its solid state making it less prone to breaking—but failed miserably when they found no takers. The trio’s backup plan was an improved version of the S-Tube. They brought Al Spencer back on board, along with his younger brother Percy, and by 1925 American Appliance was earning a profit.6 To avoid problems with a similarly named company operating out of the Midwest, the group renamed the business Raytheon Manufacturing—adding “theon,” Greek for “of the gods,” to the rays of light their tubes produced. For the beleaguered British, Raytheon proved to be a godsend indeed.

  In his public service life, Bush had helped develop a submarine detector for the American government during the First World War, but the system was never used because of bureaucratic confusion between industry and the military. “That experience forced into my mind pretty solidly the complete lack of proper liaison between the military and the civilian in the development of weapons in time of war, and what that lack meant,” he later recalled.7 In 1932 Bush became vice-president and dean of engineering at MIT, then moved to the prestigious Carnegie Institution of Washington as president in 1939, to be closer to the corridors of government power. Lack of co-operation was something he would not tolerate during the new conflict. Along with a group of fellow science administrators, including MIT president Karl Compton, Bush pitched President Franklin D. Roosevelt on an organization that would oversee research and development work between industry and the military. Bush showed Roosevelt a letter that proposed his National Defense Research Committee and the president approved it on the spot. “The whole audience lasted less than ten minutes ... I came out with my ‘OK–FDR’ and all the wheels began to turn,” he later wrote.8 On June 12, 1940, the American military-industrial complex was born, with the patriot Vannevar Bush as its beaming father.

  Bush was the chairman of the new NDRC while Compton was put in charge of developing radar. The first few meetings between Tizard’s delegation and the new American military-industrial brain trust were cautious, like a high-stakes poker game with neither side wanting to reveal its hand. Compton cautiously showed the British visitors the low-powered magnetrons developed by American scientists, which thawed the atmosphere between the two camps. After seeing that the two nations were on the same path, Tizard proudly demonstrated the high-powered British magnetron to the astonishment of his hosts, prompting the envious Compton to order the immediate establishment of the Radiation Laboratory at MIT to develop the device further. Large electronics manufacturers including General Electric, Bell Telephone and Western Electric were brought in to mass-produce the magnetron, but they encountered a problem: because the gizmo had to be machine-chiselled from a solid copper block, producing it in mass quantities was difficult, time-consuming and expensive.

 
