by Ted Koppel
That carnage, immortalized in Picasso’s Guernica, would have been fresh in the minds of the British, but so would earlier reports of the Italian air force spraying mustard gas on civilian targets in Abyssinia in 1935 and 1936. Consider these events the emotional framework within which the German attacks on England were anticipated. The British government knew that it had to prepare, but for what, exactly, was unclear. Poison gas had been widely used during World War I, and the Italian attacks in Abyssinia were a more recent warning. It was hardly unreasonable to assume that Adolf Hitler, who would go on to exterminate millions in specially constructed gas chambers, might order the use of poison gas against the British population.
The resulting civil defense planning was a strange mixture of thoughtful preparation and misplaced emphasis. Parliament had passed the Air Raid Precautions Act in 1937, which provided the lion’s share of funding and an organizational structure that connected local and central government, but public bomb shelters were woefully inadequate. The great fear was poison gas. Thousands of decontamination chambers were set up, and nurses were trained to deal with the aftereffects of gas poisoning.
Great Britain declared war on Germany in September 1939, following the Nazi invasion of Poland; but Germany’s massive bombing campaign against Britain did not begin until almost a year later. That year’s grace period made a world of difference. The fact that Britain was a nation at war created an appropriate mindset; the delay provided the opportunity to prepare. “Between 1939 and 1940,” wrote Overy, “an army of regulars and volunteers was created capable of manning the front line; for the rest of the civil population habits of obedience to the blackout regulations, gas-mask drills, air-raid alerts and evacuation imposed on everyone an exceptional pattern of wartime behavior that persisted until the very end of the war.”
Both of my parents were refugees from Germany, denied British citizenship until the war was over. Until that time, my father was also denied permission to work. Age and his ambiguous national status made service in the British armed forces impossible, but he could, and did, serve in the Home Guard. And there begins my first, vague awareness of civil defense. As best I can recall, my father and a neighbor, also a volunteer in the Home Guard, would patrol the neighborhood after dark armed with a long-handled whisk broom and the metal cover of a rubbish bin. It is not a heroic image, but in the catalogue of civil defense measures implemented by the British government, even these teams may have been useful. Beyond the practical impact of saving the occasional home from fire, such programs gave participating civilians a feeling of usefulness, a sense of connection with a larger mission.
The Battle of Britain and the Blitz that followed raged in the skies over London and other English cities through 1940 and into 1941. That was the worst of the aerial bombardment, but even afterward German bombers carried out missions intent on destroying England’s infrastructure and terrorizing its civilian population. Among their weapons was the incendiary bomb, a small, finned magnesium tube filled with thermite. It would ignite on impact, burning with an intense flame for up to fifteen minutes. Left alone, incendiary bombs were deadly and highly destructive, but relatively simple measures were sometimes enough to counteract them. An instructional film shown at civil defense training sessions and before the feature presentation at local cinemas urged audiences to acquire hand pumps. One person was shown immersing the pump in a bucket of water and working the plunger up and down, while another member of the family directed the stream onto a burning incendiary bomb.
That was the high-tech approach. My father and his partner had been instructed to apply what might have been called the “sweep and smother” method. If an incendiary device landed on top of a home, it would in short order burn through the roof and set the house on fire. The broom man on my father’s team had been instructed to climb onto the roof and sweep the bomb to the ground, where his rubbish lid cohort would smother the thing. Whether my father ever had the occasion to apply this technique, I do not know. That, however, was the plan, and a plan can be a virtue in and of itself.
We had a small bomb shelter in our garden. The owner of the house must have had it installed before the war, and I believe we used it only once. It was cold and damp, and my good German mother, who rarely asserted herself, said she would rather take her chances in the house. My father had acquired a heavy metal desk, and throughout the war I slept under that desk, protected, if not from a direct hit, at least from the danger of flying glass and shrapnel.
With the clarity of hindsight we can apply clinical analysis to what worked, what didn’t, and why. That bomb shelter of ours at 19 Haslemere Gardens was an Anderson shelter, a cheap construct of curved metal panels installed several feet deep in the soil of our backyard. These were, according to Overy, “not even remotely bombproof.” My mother would have felt quietly vindicated.
There were, the government concluded, only two viable options: shelter and evacuation. By March 1940, major cities contained shelter space for almost half of their population of 27.6 million, but this was less impressive than it sounds. Thirty-nine percent of domestic shelters were regarded as likely ineffective. Nor were the public shelters much better, and Prime Minister Winston Churchill and his cabinet were initially disinclined to devote the resources or manpower that would have been necessary to improve their quality. So the emphasis was placed on evacuation. On September 1, 1939, Overy noted, 1,473,500 people—children, pregnant women, new mothers, the disabled—left England’s cities for the comparative safety of the countryside, where they would be rehoused. Despite these huge initial numbers, it turned out to be a highly unpopular option, particularly in light of the fact that German bombing had not yet begun. By January 1940, wrote Overy, around 900,000 of the evacuees had returned to their urban homes. Once the bombing did begin, evacuation was no longer an option. “On 7 September [1940], the first day and night of heavy bombing in London, several thousand Londoners bought tickets for the Underground and stayed put in the stations and tunnels.” During the early weeks of the bombing more than 120,000 Londoners took shelter in that fashion. Around 65,000 stayed on through the winter, even as the bombing declined. The platforms and tunnels had no toilet facilities, and they grew increasingly filthy. There was nothing to eat or drink, nor had anyone thought to provide cots for sleeping.
There had never been a sustained bombing campaign against an urban target. With no precedent to guide it, the British government simply assumed that most German bombing would occur during daytime, so no preparations had been made to turn public or domestic shelters into dormitories. In the final analysis, the vast majority of Londoners opted for neither evacuation nor the bomb shelters. People took their chances at home.
For all that, there was an organized structure in place. There was a level of civilian discipline that served the country well when the horrendous bombing finally began. Churchill’s eloquence and the royal family’s refusal to evacuate from Buckingham Palace reinforced the population’s self-image of enduring hardship with a stiff upper lip. The British were emotionally prepared for whatever might come, even if what ultimately came was not what had been expected.
During the latter years of the war the Germans unveiled a new class of weapons to be deployed against civilian targets: the so-called Vergeltungswaffen, or retaliatory weapons. We children knew them as “doodlebugs” or “buzz bombs.” They were invented and designed by scientists, some of whom (depending on where and by whom they were captured) would, after the war, end up in either the U.S. or Soviet space programs. The V-1 was a precursor of the cruise missile; the V-2 was an early version of the liquid-fueled ballistic missile. We were unwitting witnesses to the dawn of a new age. All I knew as a child was that doodlebugs emitted a high-pitched whistle as they hurtled through the sky, and then, when they were over their targets, the whistle would stop as the missile fell to earth. There would be a brief period, perhaps ten seconds of silence, before impact and the sound of an explosion. Nothing would have protected our family against a direct hit, but I didn’t know that. I knew that when I heard the whistle of a V-1 I was to race for my father’s study and duck under his desk. It was a useful concept, not just as a practical matter but also for the confidence it instilled in a child who regarded the whole exercise as something of a lark.
One additional memory from my childhood, which may have been formed precisely on May 8, 1945: VE Day. It was after dark when my father took me out onto the street to show me a sight I had never seen before: lighted street lamps. My entire life until that moment had been lived under what we knew as the “blackout.” Every window in every building throughout England had some variation of a blackout shade, intended to keep even glimmers of light from serving as beacons to German bombardiers. There was a popular song at the time: “When the Lights Go On Again (All Over the World).” For me, they weren’t going on again; they were being lighted for the very first time. Much of Britain’s civilian population had faced infinitely more harrowing circumstances than ours. What lingers, after all these years, is the sense of preparedness, of having a plan, of being ready for whatever might come.
—
In a sense, preparing for the unknown has always been the challenge facing civil defense planners. How does a country’s leadership draw the appropriate line between prudence and paranoia when neither the timing nor the exact nature of a threat to national security can be defined? That was the question confronting the United States in the years following the Soviet Union’s first successful test of an atomic bomb in 1949. Like the British ten years earlier, American civil defense planners concluded that their options essentially boiled down to shelters and evacuation.
General Dwight D. Eisenhower, who masterminded the D-Day invasion as Supreme Allied Commander, had been impressed by the quality of Germany’s famous highway system, the autobahn. In 1956 President Eisenhower succeeded in ramming legislation through Congress to approve the U.S. Interstate Highway System. Eisenhower, the former military commander, envisioned the easier movement of troops and matériel in times of national crisis and the massive evacuation of cities in the event of an atomic attack. What that highway system produced, perhaps unintentionally, was almost immediate social and economic transformation. It led to, among other things, the vast exodus of industry and businesses from America’s cities, which in turn encouraged the development of suburbs, with their shopping centers and networks of gas stations, and a new category of Americans—commuters. Not that Eisenhower’s fears of a nuclear attack seemed unjustified in the 1950s. Indeed, a number of communities around the country ran vast evacuation exercises.
One such was Binghamton, New York. On Sunday, May 5, 1957, 1,448 men, women, and children from Binghamton took part in an exercise code-named Operation Evac-12. It would take them from the Twelfth Ward on the city’s east side to the appropriately named village of Deposit, some twenty-eight miles away. A contemporaneous account in a local newspaper, the Binghamton Press, reported that this would be “the first mass evacuation of such magnitude in the nation.” The newspaper went on to report: “At 1:30, the first of about 500 people emulating panic left their homes and climbed into about 120 automobiles. Others [who] weren’t displaying panic, as planned, got into automobiles, and all began their 28-mile trip along State Route 17. Nearly 110 people boarded an Erie-Lackawanna passenger train, which served as a traveling hospital for patients. School buses were also deployed. Had this been a real attack, Deposit would’ve been expected to handle upward of 10,500 people.”
The exercise was a one-day affair. It lasted only until that same evening, when the village of Deposit hosted about three thousand evacuees, civil defense personnel, and observers for a dinner of chicken, biscuits, coffee, and doughnuts. Our reporter of that day’s events apparently did not ask the villagers how they would have handled the flood of “upward of 10,500 people” expected in the aftermath of an actual atomic bomb attack, nor did anyone speculate on how Deposit would have managed such a horde through a stay of indefinite duration.
What mattered at the time was not so much the likelihood of actual survival as the perception of a ready-for-anything level of preparedness. Michael Chertoff, who served as secretary of homeland security during the administration of George W. Bush, told me that civil defense planners in the 1950s may not have been as naive as they seemed. They were under no more illusions about the feasibility of mass evacuations than they were about the effectiveness of “duck and cover” drills in preparing for an atomic attack. These types of exercises, said Chertoff, were largely intended to convey to the Soviet Union the impression that the United States was determined to defend Western Europe even at the risk of all-out war.
That was during the brief age of the atomic bomb. When the even more deadly hydrogen bomb became a reality, it seemed that the government grew more invested in optimizing survival plans. In 1956 Congressman Chester E. Holifield, whose considerable clout derived from his chairmanship of the House Committee on Government Operations, publicly stated that there wouldn’t be time for evacuation and that exercises such as “duck and cover” were little more than a charade. He chastised the Federal Civil Defense Administration (FCDA) for proposing these “cheap substitutes for atomic shelter.” The FCDA ended the debate by calling the chairman’s bluff, proposing a distinctly non-cheap, $32 billion program providing tax incentives and special low mortgage rates to households that built or included shelters. Again, this was 1956, when a $32 billion federal program would have been considered exorbitantly expensive. President Eisenhower and Congress compromised by building a classified underground bunker for members of Congress beneath the Greenbrier Resort in West Virginia.
Once the impact of a hydrogen bomb explosion was fully understood by the public, however, even New York City’s ubiquitous bomb shelters, stocked as they were with blankets, biscuits, and water, were revealed as largely ineffective. It had all been part of a campaign to convey to the Soviets Washington’s confidence in the ability of American cities to survive the impact of a nuclear explosion—and, therefore, the firmness of the U.S. commitment to defend Europe against a Soviet attack.
Looking back, it’s a little difficult to determine where efforts to reassure the American public began and the campaign to mislead Moscow ended. Exercises were widespread and conducted with absolute seriousness. In July 1957 mock atomic bombs were dropped on a hundred American cities, and Boy Scouts were assigned to work with Civil Defense, searching for lost individuals, administering first aid, rescuing people, and protecting animals.
Nebraska, home to Strategic Air Command outside Omaha and to a number of nuclear missile silos, had always been considered a prime Soviet target. Thinking about the unthinkable in 1963 included a bizarre exercise that involved a two-week survival test for thirty-five cows, one bull, and two student cowhands. History records that Dennis DeFrain and Ike Anderson cared for the test herd in a shelter buried under five feet of dirt and provisioned with cattle feed and water in a 10,000-gallon tank. There was an auxiliary generator that would be “available if electrical power was interrupted by an atomic blast.” While it’s easy now to ridicule some of Nebraska’s civil defense experiments, it may in fact have been the realization that sheltering a couple of million head of livestock—or finding durable shelter for, or evacuating, tens of millions of people—was impractical, to put it gently, that stimulated alternative thinking in Moscow and Washington. Ultimately, such exercises in speculation led to the evolution of a strategy that abandoned both shelter and evacuation as viable protection for millions of civilians. The new strategy was based on the proposition that as long as the United States and the Soviet Union could maintain a reasonable equilibrium of terror, as long as each side retained the capacity to respond to a nuclear attack with equal or greater force, there would be a disincentive to launch a surprise first strike.
This tenuous but crucial détente was possible only because of the deep awareness both nations had cultivated of the threat of nuclear war. Both sides needed to go through the motions to reach the conclusion that civil defense planning was essentially useless in the context of a nuclear war, which in turn led to the relative safe harbor of strategic arms limitation. Likewise, seventy-five years ago, the reality of an unprecedented campaign of aerial bombardment against its civilian population obliged the British government to rethink its earliest concept of civil defense. The initial planning had been imperfect, the focus was often misdirected, but there were preparations, which gave the British a sense of direction amid chaos and without which the destruction wreaked by the Luftwaffe would have been even more devastating.
There is, as yet, no real sense of alarm attached to the prospect of cyber war. The initial probes—into our banks and credit card companies, into newspapers and government agencies—have tended to leave us unmoved. The public was engaged by the North Korean hacking attack on Sony, with the saucy, gossipy tidbits capturing attention for a few weeks, but there was no real sense of outrage or danger. In reality, the ranks of our potential enemies have never been this deep. Our points of vulnerable access are more numerous than at any time in history, yet we have barely begun to focus on the actual danger that cyber warfare presents to our national infrastructure. Past experience in preparing for the unexpected teaches us that, more often than not, we get it wrong. It also teaches that there is value in the act of searching for answers. Acknowledging ignorance is often the first step toward finding a solution. The next step entails identifying the problem. Here it is: for the first time in the history of warfare, governments need to worry about force projection by individual laptop. Those charged with restoring the nation after such an attack will have to come to terms with the notion that the Internet, among its many, many virtues, is also a weapon of mass destruction.