The premise of the story was, of course, grossly incorrect. An estimated seventy thousand people were killed in the initial attack on Hiroshima, but the final body count was closer to one hundred thousand as radiation sickness and, eventually, cancer took their toll. Laurence was in an excellent position to know about the radiation, thanks to his level of scientific access to Los Alamos and to the measurable levels of gamma rays in the air immediately after Trinity—enough to kill rabbits far away from the epicenter and to create, as he later admitted, gray ulcers on the hides of faraway cows.
The Associated Press was already running speculation that the “uncanny effects” on the Japanese were due to gamma rays—an entirely correct assessment. But this would not have been the first time that Laurence introduced deliberate falsehood into the record. Before the Trinity test, Groves had ordered him to prepare a statement blaming the flash in the desert on the accidental ignition of an ammo dump. The local New Mexico newspapers printed the bogus story without question. “The secret had to be kept at all costs and so a plausible tale had to be ready for immediate release,” Laurence explained later, without a trace of regret.
Laurence’s series on the making of the atomic bomb, including his eyewitness account of the Nagasaki explosion, won a Pulitzer Prize. A later memoir, titled Dawn Over Zero and written with the same breathless moxie as his news style, became an international bestseller. He was promoted to science editor and took an office on the Times’s coveted tenth floor. At one point after the war, he had the unique honor of learning that FBI agents had been sent out to public libraries to remove all copies of the 1940 edition of the Saturday Evening Post that carried his story on uranium. A translated copy of the same story, wrapped in cellophane, was discovered after the war inside the safe of a German physicist assigned to work on the hapless Nazi weapons program. Even the enemy had apparently considered his writings prophetic. Until he died of a blood clot while vacationing in Spain in 1977, Laurence was treated as an authority on nuclear energy and as a minor celebrity. His obituary in the Times noted: “He occupied honored places on daises at major affairs of scientific and other organizations, and his short, chunky frame frequently stood on the lecture platform.”
Laurence was not completely blind to the moral ambiguities raised by the dawn of the age that he had trumpeted so loudly. His descriptions of Nagasaki made reference to the “living substance” boiling within the rainbow cloud. He acknowledged the “great forebodings” of the power now under America’s control. It was he who had extracted and publicized that ominous reverie from J. Robert Oppenheimer: “I am become death, the destroyer of worlds.” And in a 1948 article for the magazine Woman’s Home Companion, he allowed that the future with uranium could be a hell instead of a heaven if it were mishandled. The image he used was by that time already shopworn—nearly as much as “the genie is out of the bottle”—but it was still an admission of an alternative to nuclear Eden. Laurence being Laurence, he made it stark.
“Today we are standing at a major crossroads,” he wrote. “One fork of the road has a signpost inscribed with the word Paradise, the other fork has a signpost bearing the word Doomsday.”
The country began to accept Armageddon as a possibility, but one relegated to the future. The United States, after all, was still the only nation to possess atomic weapons, and the newspapers were quick to note that most of the uranium inside the earth was now under direct American control. Experts also reassured the public that the technological abilities of other countries—especially the Russians—were hopelessly backward. The celebrations that followed Japan’s surrender eight days after Hiroshima also helped lighten the mood. There was time to relax.
The word atomic soon became an adjective for all things mighty and exotic. Cleaning services and diners renamed themselves with the neologism. A set of salt-and-pepper shakers in the shapes of Fat Man and Little Boy—the bombs dropped on Nagasaki and Hiroshima—was a popular seller. A few enterprising bartenders concocted strange-tasting drinks and called them “atomic cocktails,” invariably following up with a call to the local newspaper in hopes of a write-up. The Hotel Last Frontier blended vodka and ginger beer; “a perfect toast to victory,” enthused the Las Vegas Review-Journal. At the Washington Press Club, the recipe was Pernod and gin. Yet other versions mixed aquavit and beer; gin and grapefruit juice; vodka, brandy, and champagne. The proliferation inspired the jazz vocalist Slim Gaillard to write a new song.
It’s the drink you don’t pour
Now when you take one sip you won’t need anymore
You’re as small as a beetle or big as a whale
BOOM!
Atomic Cocktail
From the corpse of the Manhattan Project, a new government agency was formed: the Atomic Energy Commission. Its chairman, David Lilienthal, flew around the country making speeches to civic groups and schools. At a dramatic point in his address, he would pull a lump of coal from his pocket. A piece of uranium this big, he would say, could keep Minneapolis warm for a whole winter. Utopian projections such as this led the New Yorker magazine to propose that the “pea” be adopted as the new standard unit for the measurement of energy. Lilienthal was privately skeptical about the utopian promises for atomic power, but in public he was as irrepressible as William L. Laurence. He helped stage a Walt Disney-style exhibit in New York’s Central Park called Man and the Atom, featuring displays sponsored by the government’s biggest atomic contractors. One of the handouts from Westinghouse was a slick comic book called Dagwood Splits the Atom, in which the goofy husband from the strip Blondie blows a neutron through a straw and shatters a fat, puffy nucleus of U-235. “General Groves himself, in yet another of the high-level policy decisions of his career, chose Dagwood as the central character,” noted the historian Paul Boyer.
Lilienthal’s agency had the contradictory—some said impossible—mission of encouraging peaceful atomic energy while also gathering uranium for military use. One of its important early choices was picking a suitably useless piece of land as a spot to test new weapon prototypes. A stretch of basin and range one hour north of Las Vegas was sealed off and renamed the Nevada Proving Ground. This was the badlands described by the explorer John C. Frémont a hundred years before as “more Asiatic than American in its character”—a waterless sinkhole of brush and weeds hemmed in by peaks of basalt and granite. More than one hundred atomic bombs would be detonated there between 1951 and 1963, and hundreds more would be set off inside tunnels below the desert floor.
The surface blasts were plainly visible from Las Vegas and made for spectacular viewing. AEC officials made repeated assurances that the tests were perfectly safe, and the city embraced its new role as “The A-Bomb Capital of the West” with characteristic brio. A stylist at the Flamingo hotel created a hairdo in the shape of a mushroom cloud; it involved wire mesh and silver sprinkles and cost $75. A motel renamed itself the Atomic View and told its guests they could watch the detonations from poolside chairs. Dio’s Supermarket on 5th Street boasted of prices that were “Atomic—in the sense of being small.” Roulette balls and craps dice were occasionally nudged by blasts; casinos posted signs that made such events subject to the ruling of the pit boss. The Las Vegas High School class of 1951 adopted the mushroom cloud as its official mascot and painted one outside the entrance to the school. Clark County redesigned its official seal to include the same logo. “Dazzled by atomic eye candy,” wrote the Nevada historian Dina Titus, “citizens were virtually hypnotized into acceptance.” A local housewife named Violet Keppers was taken out to the viewing stands to witness a blast, and she later wrote about it for Parade magazine. “All my life I’ll remember the atomic cloud drifting on the wind after the blast,” she wrote. “It looked like a stairway to hell.”
The first indication that anything was wrong came after a May 19, 1953, explosion—later nicknamed “Dirty Harry”—in which the wind had shifted unexpectedly. Ranchers in neighboring Utah watched with puzzlement, and then anger, as herds of their lambs and ewes fell sick and died. Under their wool, many were showing ugly running sores similar to those seen by William L. Laurence on cattle far away from the Trinity blast. The AEC told them their livestock had died of malnutrition and cold weather. But on one hard-hit ranch, an official with a Geiger counter was overheard hollering to his companion, “This sheep is as hot as a two-dollar pistol!” The ranchers sued for damages and lost.
The radioactive clouds had scattered dust over the nearby town of St. George, Utah, where, the following year, John Wayne and Susan Hayward would spend three months in a canyon filming The Conqueror, an epic about Genghis Khan (today generally regarded as Wayne’s worst movie). The canyon was breezy, and the cast and crew were constantly spitting dust from their mouths and wiping it from their eyes. Almost half of them, including Wayne and Hayward, would eventually die from assorted cancers, a rate three times above the norm. The downwind plume from the test site, which generally blew to the north and east, proved to be an accurate map for later elevated thyroid cancer occurrences. The number of people sent to early graves is still a matter of controversy; most estimates put the count at well more than ten thousand, spread across five mountain states.
But the testing was not about to stop. When a state senator called for a ban, he was vilified by the local newspapers. An AEC board member said, “People have got to live with the facts of life and part of the facts of life are [sic] fallout.” There were repeated calls for Nevada to embrace the explosions as a symbol of the continuing American pioneer spirit. Most residents also welcomed the high-wage federal jobs and the prestige. “More power to the AEC and its atomic detonations,” cheered the Las Vegas Review-Journal. “We in Clark County who are closest to the shots aren’t even batting an eye.” The new weapons were part of a grand strategy to grow the U.S. arsenal to deter a perceived threat from the USSR—a decision to swell, rather than shrink, the American dependence on uranium as a centerpiece of its defense posture.
But this decision came only after a serious discussion about creating an international body to safeguard the world’s supply of uranium and nuclear weapons—a scenario straight from The World Set Free.
President Truman had asked both Lilienthal and Undersecretary of State Dean Acheson to write up a proposal for a powerful multinational regime to purchase St. Joachimsthal and Shinkolobwe and all new mines, destroy all the bombs the United States had already manufactured, and assume guardianship of all the atomic facilities in every nation. Atomic power would be encouraged for peaceful use, leading eventually to total disarmament and the obsolescence of war. That such a dovish proposal could have ever come from the White House of Harry Truman, even when he believed the Russians were incapable of building the bomb, is testament to the millennial panic and social readjustment that had swept through the country after Hiroshima—as well as to the influence of newspaper prophets such as “Atomic Bill” Laurence.
A revised outline was presented to the first meeting of the United Nations Atomic Energy Commission on June 14, 1946, in a speech by the silver-haired financier Bernard Baruch. He began in high Laurentine style:
We are here to make a choice between the quick and the dead. That is our business. Behind the black portent of the new atomic age lies a hope which, seized upon with faith, can work our salvation. If we fail, then we have damned every man to be the slave of fear. Let us not deceive ourselves: we must elect world peace or world destruction.
The White House already knew that the plan would be dead on arrival. The USSR objected to the idea of giving up its uranium deposits and enrichment facilities, both of which it was frantically developing in secret. The United States also continued to manufacture weapons even while the plan was under debate, which gave the Soviet UN delegate Andrei Gromyko license to accuse his negotiating partners of hypocrisy. Baruch eventually resigned from the commission, and the proposal died quietly several months later.
This ploy came at a time when the geopolitical hierarchy was in the midst of a shift not seen since the defeat of Napoleon Bonaparte. Britain was in full retreat from its empire, granting independence to India, Pakistan, Israel, and dozens of other territories. The USSR was aggressively pushing its mandate in Eastern Europe and backing Communist rump governments and guerrilla fighters in places as disparate as Greece and North Korea. The American possession of atomic bombs, and the infrastructure it took to make them, were seen as a strategic asset in an unstable world—the ultimate trump card. Uranium did not march like an army: It was a silver scythe that could decapitate a nation in an hour. As Leslie Groves concluded in a memo, “If there are to be atomic weapons in the world, we must have the best, the biggest, and the most.”
American nuclear thinking was thereby solidified for the next forty years, crystallized in an elaborate paradox that sounded like a geometric proof. There could be no defending against a nuclear strike. But such an attack would invite retaliation, eliminating both nations. No sane leader would trigger such a thing. Therefore the devices most able to vaporize the enemy were also useful for ensuring that such a thing would never occur.
The warheads might just as well have been made of cardboard—it was their abstract threat that counted. As it was in H. G. Wells’s time, uranium’s physical powers were far secondary to the power of the narrative that man could craft around them. The United States and the USSR would be locked together in this crude but effective story for almost forty years. J. Robert Oppenheimer would memorably call the standoff “two scorpions in a bottle, each capable of killing the other but only at the risk of his own life.”
This psychological doctrine, first called massive retaliation and later known as mutually assured destruction, or MAD, was originally formulated by a young instructor at Yale named Bernard Brodie, who published the influential book The Absolute Weapon: Atomic Power and World Order in 1946. “Thus far the chief purpose of our military establishment has been to win wars. From now on its chief purpose must be to avert them,” he wrote in the book’s most quoted passage. Suspicion of one’s rival was a healthy thing, argued Brodie. That was, in fact, the basis on which a future peace would be secured. Deterrence was best achieved through preparing for war.
This was the intellectual cornerstone of the arms race. If war did come, it would be, in the popular phrase of the day, a “push-button war.” By the beginning of 1953, the United States had about a thousand nuclear weapons in its arsenal. Less than ten years later, there were twenty-one thousand, with a grandiose air, land, and submarine deployment pattern—the “triad”—and enough firepower to level Hiroshima again an estimated million and a half times. Pentagon officials struggled to find Russian targets to match the bombs and not the other way around. The buildup eventually cost taxpayers an average of $98 billion a year for weapons that were effectively useless except as public relations tools. “If you go on with this nuclear arms race,” warned Winston Churchill in 1955, “all you are going to do is make the rubble bounce.” The warning went unheeded. This ring of uranium would eventually cost the United States more than $10 trillion in armaments and support; the historian Richard Rhodes has pointed out that this figure is larger than the entire economic output of the United States in the nineteenth century.
MAD had other weaknesses. Atomic strength would prove useless in regional conflicts where the United States had an interest. In fact, it was worse than useless. The bombs sucked away money and attention from a conventional fighting apparatus, and tended to encourage magic-bullet thinking. President Eisenhower’s National Security Council had pressed him in 1954 to use nuclear weapons on the Vietnamese insurgents who had a hapless French colonial garrison surrounded at Dien Bien Phu. “You boys must be crazy,” he told them. “We can’t use those awful things against the Asians for the second time in ten years. My God.”
There may have been another, more distinctly human purpose for the United States to have sped up its pursuit of a uranium-based defense.
Institutional momentum is a hard—if not impossible—thing to stop. Careers and budgets were now at stake. The American military had been slow to awaken to the possibilities of uranium during the war—there was that year-long wait after Einstein’s letter—but now a $2 billion assembly line spread across three states and two continents was humming at warp speed. It was the showcase for the ingenuity and prestige of the nation, and the backbone of what William L. Laurence and others had hailed as the most important scientific advance of all time.
To have just locked the doors of the Manhattan Project and walked back toward the banality of nitrate bombs would have been a denial of one of man’s basic urges, that of creation and discovery. “Nuclear explosions have a glitter more seductive than gold to those who play with them,” said the physicist Freeman Dyson. “To command nature to release in a pint pot the energy that fuels the stars, to lift by pure thought a million tons of rock into the sky, these are exercises of the human will that produce an illusion of illimitable power.” As the writer Rebecca Solnit has observed, a nuclear test gives man the power to make a star, if only for a moment. Now a huge federal apparatus was pushing this Dionysian urge, with a lavish budget and patriotic fervor behind it.