Seven Elements That Have Changed the World

by John Browne


  In September 1942, the US government approved the acquisition of over 200 square kilometres of land surrounding the small town of Oak Ridge, Tennessee, to create the Clinton Engineer Works. As one of three main sites in the Manhattan Project, it was tasked with producing the enriched uranium for an atomic bomb. The other two sites under the umbrella of the Manhattan Project were Hanford, another production site for bomb material, and Los Alamos, the ‘mind centre’ of the project. Secrecy was a top priority for all three of the sites. They did not exist on maps and were referred to only by their code names of X, Y and Z.

  As soon as Major General Leslie Groves, the director of the Manhattan Project, saw the site he knew it was right. Hidden away in the middle of nowhere, Oak Ridge was perfectly positioned for the project’s secrecy and security needs. Being far from the coast reduced the risk of enemy attack and nearby rivers provided a plentiful supply of water and hydroelectric power, vital for the colossal industrial effort about to be undertaken.

  With characteristic efficiency, Groves evicted 1,000 families from the area, some with only two weeks’ notice. They had no choice. The site had been chosen, and no one was going to get in the way of America building the bomb. It was ‘child’s play’ according to one official from the US Army Corps of Engineers.19

  Scale was everything for the Clinton Engineer Works. At its peak it employed 80,000 workers. The small town of Oak Ridge quickly grew to become the fifth largest city in the state of Tennessee and held a greater concentration of PhDs per capita than any other city in the country. Twelve thousand people worked in the K-25 uranium-processing building alone. At the time, it covered more area than any structure ever built. Inside K-25, uranium was enriched by passing it, in gaseous form, through a series of membranes. Lighter molecules pass through a fine membrane faster than heavier ones, so the percentage of uranium-235, which is lighter than uranium-238, gradually increased.20
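  The arithmetic behind that cascade helps explain why K-25 had to be so vast. The short sketch below is my own illustration rather than anything from the book: it assumes the uranium is handled as uranium hexafluoride (UF6) gas and that each barrier stage achieves, at best, the ideal Graham's-law separation factor; the rounded molar masses and the helper function stages_needed are illustrative choices.

```python
# A minimal sketch of the gaseous-diffusion arithmetic (illustrative assumptions,
# not figures from the book): uranium handled as UF6 gas, ideal barrier stages.
from math import sqrt, log

M_U235F6 = 235.04 + 6 * 19.00   # molar mass of UF6 containing uranium-235
M_U238F6 = 238.05 + 6 * 19.00   # molar mass of UF6 containing uranium-238

# Lighter molecules diffuse through the membrane faster; the ideal per-stage
# enrichment factor is the square root of the mass ratio -- only about 1.004.
alpha = sqrt(M_U238F6 / M_U235F6)

def stages_needed(start_fraction, target_fraction, factor=alpha):
    """Ideal number of cascade stages to raise the uranium-235 fraction."""
    r0 = start_fraction / (1 - start_fraction)    # initial U-235 : U-238 ratio
    r1 = target_fraction / (1 - target_fraction)  # target U-235 : U-238 ratio
    return log(r1 / r0) / log(factor)

print(f"ideal separation factor per stage: {alpha:.4f}")
print(f"stages from 0.72% to 90% uranium-235: {stages_needed(0.0072, 0.90):.0f}")
```

  Even under this idealised assumption, well over a thousand stages are needed to reach weapons-grade material; real barriers fall short of the ideal, so the plant required thousands of stages, which is why K-25 covered so much ground.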

  The US did not know whether their prodigious experiment would work, but they had to try. Only an atomic bomb, harnessing the incomparably destructive energy of uranium, held the potential to win the war in an instant. They also understood that speed in this endeavour was essential. They feared that Germany might beat them to it and unleash the devastating power of uranium on them.

  To the US, these uniquely dark circumstances justified the enormous expense of the project and the forced evictions at Oak Ridge. They also enabled the country to enlist the world’s brightest and best scientific minds at the Los Alamos site, the ‘mind centre’. Robert Oppenheimer, the director at Los Alamos, was among the first to witness the success of the US’s endeavours at the Trinity bomb test site on 16 July 1945. Later he recalled that, as he watched the bright atomic flash in the New Mexico desert, he was reminded of a line from Hindu scripture: ‘Now, I am become Death, the destroyer of worlds.’21

  Today, scientific challenges are more diverse and the solutions less clear. This was evident when I visited the site of the Clinton Engineer Works, now called Oak Ridge National Laboratory, in March 2009. The square industrial buildings sit out of place among Tennessee’s rolling forest landscape and the expansive countryside belies the true size of the Laboratory. On a tour of the site I was shown the X-10 graphite reactor, the second nuclear reactor in the world, and now a National Historic Landmark. The K-25 uranium enrichment facility is currently being demolished.

  Research priorities have long since moved on, and these new priorities were the reason for my visit. I was there in my role as a partner in a private equity firm, which at the time managed the world’s largest renewable and alternative energy investment fund. I came to learn about the Laboratory’s recent approaches to the production of biofuel from non-food crops, such as grasses and trees. Biofuel can be made from the sugars contained in plant cellulose. But non-food crops also contain a lot of lignin, which forms strong bonds with these sugars, making them difficult to extract. A particular interest at the laboratory was poplar trees, which show wide variation in many natural traits. Researchers were searching over 1,000 poplar tree varieties for traits that would produce the greatest amounts of sugar.22 By producing economically competitive biofuel, the US hoped to further reduce its dependence on foreign oil. Oak Ridge may have moved on from uranium, but once again it was working in the interests of national security.

  On 25 July 1945, the last shipment of enriched uranium needed for the Hiroshima bomb left Oak Ridge, arriving at the Pacific island of Tinian two days later. Here the three-metre-long atomic bomb, called ‘Little Boy’, which was soon to be dropped on the city of Hiroshima, was assembled. From this point on, the science was frighteningly simple. Lumps of enriched uranium would be slammed together to form a critical mass, initiating an uncontrollable, runaway nuclear reaction.

  To create a bomb that would destroy a city, another city had been created. In total, two billion dollars were spent ‘on the greatest scientific gamble in history’, a gamble that ultimately paid off.23 The Manhattan Project is a rare example of a government successfully picking winners, yet in this case the choice was clear: only one weapon held the potential to end a war in an instant. The battles in the laboratories were as instrumental in securing Allied victory as those in the air, on land and at sea. In dropping the bomb, humanity had unleashed ‘the basic power of the universe’.24 But the bomb also made us fearfully aware of our newly found capacity for self-destruction; in doing so it symbolised the beginning of the modern age.

  Up and Atom!

  ‘At the instant of fission, Captain Atom was not flesh, bone and blood at all … The desiccated molecular skeleton was intact but a change, never known to man, had taken place! Nothing … absolutely nothing … was left to mark the existence of what had once been a huge missile! Nor was there a trace of the man inside!’25

  Captain Atom, ‘radioactive as pure uranium-235’, was born inside the explosion of an atomic warhead in the March 1960 issue of Space Adventures, a popular American comic book. Through the late 1950s and early 1960s, I eagerly read these and other science fiction stories. The mysterious power of the atom was a gift for comic book writers. They created a whole array of thrilling superheroes who could harness atomic energy as a force for good against the evils of the world. At the time, the greatest global threat, at least as far as America was concerned, was all-out nuclear warfare with the Soviet Union. In Captain Atom’s debut adventure, he saves the world from destruction by intercepting a communist nuclear missile. ‘You, more than any other weapon, will serve as a deterrent of war!’ exclaimed the illustration of President Eisenhower on Captain Atom’s jubilant return to earth.

  Ever since the Little Boy bomb brought the power of uranium into the public eye, the possibilities of the Atomic Age seemed endless. While some atomic superheroes used their powers ‘to crush every evil influence in the world’ (Atomic Man) and ‘save mankind from itself’ (Atomic Thunderbolt), others were more villainous.26 Mister Atom, a power-crazy, nuclear-powered robot, was hell-bent on taking over the world.

  The same choice confronted the real world. General Groves, the director of the Manhattan Project and regarded by many as one of the fathers of the atomic bomb, sternly warned that we had to choose the ‘right path’: between weapons leading to atomic holocaust and a bright Utopian atomic future.27 Humanity, it seemed, stood at a fork in the road to a new atomic world.

  The same rhetoric was apparent in the real President Eisenhower’s 1953 address to the UN General Assembly. This came to be known as the ‘Atoms for Peace’ speech. Spurred by the sudden growth in the nuclear weapons arsenals of both the US and the Soviet Union, Eisenhower called on the world to ‘strip the military casing’ of the atomic bomb and adapt atomic energy for the benefit of humankind. He wanted America to lead the way in reducing nuclear stockpiles and to open up dialogue between the world’s great nuclear superpowers. Eisenhower pledged that the US would ‘devote its entire heart and mind to finding the way by which the miraculous inventiveness of man shall not be dedicated to his death, but consecrated to his life’.

  In this imagined atomic Utopia, it was believed that the unlimited source of neutrons produced in the splitting of uranium atoms would enable us to produce atoms of any type artificially in the laboratory. Since the beginning of humanity, we have sought to understand and harness the basic constituents of matter; now, it seemed, we could take control of the elements in a way far more potent than any the ancient alchemists had imagined.

  Just as miraculous were the predicted medical benefits of radiation. Radioactive elements, people believed, would soon put an end to cancer, the most feared of diseases: one cartoon depicted a skeleton labelled ‘CANCER’ fleeing lightning bolts of ‘ATOMIC ENERGY’.28 So, too, would these radioactive elements enable us to trace a whole host of diseases as they made their way through the human body. By understanding these pathways, we hoped to develop a medical toolkit that would give a long and healthy life to all. And tracing similar pathways in plants would unlock the secrets of photosynthesis, harnessing the power of the sun and enhancing food production.

  Of all the benefits of splitting the uranium atom, the most obvious was its promise as a simple, abundant energy source. That much was only too apparent in the destruction wreaked by the Hiroshima bomb. Once the power of the uranium nucleus was harnessed, fuel crises would become a thing of the past: we would soon all be driving atomic cars, each running on its own mini nuclear power generator.

  Before Captain Atom hit the newsstands in 1960, my mainstay was the weekly Eagle, the leading British boys’ comic of the 1950s. I remember studying the intricate centrefold cutaways of atomic submarines and aircraft carriers. Another image illustrated an ‘Atomic Locomotive’, ‘the shape of things to come’, moving at breakneck speed, powered by the unlimited energy of uranium.29 With no need for refuelling or stoking, nuclear power was seen as superior to oil and coal. Using uranium, we would be able to travel across land and underwater farther and faster than ever before.

  It was even thought that atomic energy would give us complete control of our climate. Artificial suns would control the weather and might even, as one writer suggested, be used to melt the ice caps to create a warm and temperate global climate.

  The new nuclear energy source was unlike anything encountered before. Comic books and futurist writers give us a sense of the awe that uranium inspired in the 1950s and 1960s. Uranium was not unique in having its technological potential exaggerated: it is one of a trio of ‘post-war wonder elements’, along with titanium and silicon, whose stories are also explored in this book. The hyperbole surrounding uranium in this period was, however, greater than for any other element. The extraordinary power of uranium was self-evident in the images of mushroom clouds over Hiroshima. The fantastic imaginings that followed only raised uranium to a greater height from which, in many of its applications, it would ultimately fall.

  Atomic-powered transport was, for most uses, deemed impractical and unsafe, while the notion of artificially heating our climate using uranium now seems ridiculous and, in light of anthropogenic climate change, anachronistic. Radiation is an important medical tool, but has by no means cured cancer. However, in one industry excitement appears to have been justified. Nuclear power stations, generating electricity from the heat produced in nuclear fission, soon began to appear across the globe.

  An ambivalent institution

  On 17 October 1956, Queen Elizabeth II pressed the switch at the Calder Hall nuclear power station in Cumbria, UK. For the first time, uranium’s inherent energy was delivered directly to homes on a commercial scale.30 Standing in the shadow of Calder Hall’s cooling towers, the Queen presented the occasion as a solution to the dangers of atomic energy ‘which has proved itself to be such a terrifying weapon of destruction’.31 By using nuclear energy ‘for the common good of our community’, Britain wanted to give the impression of leading the way to the peaceful uses of atomic energy.32

  The sudden development of nuclear power in Britain came about by necessity rather than choice: the harsh winter of 1947 had led to a national fuel crisis. Over 90 per cent of Britain’s energy needs were supplied by coal in the post-war years, and in 1948 demand growth began to outstrip new supply. Reserves were fast disappearing, a shock for a nation which had once been one of the world’s greatest coal exporters. Industrial Britain had prospered on these once abundant reserves; and if she was to continue as a major global economic power, a new energy source was needed.

  Oil was one possible contender. In July 1954 the Minister of Fuel and Power announced that coal-fired power stations would be supplemented by imported oil. But this was only regarded as a short-term expedient. Concerns over diminishing oil reserves, which ultimately proved to be unfounded, demanded a longer-term solution. To Lord Cherwell, Chairman of the Atomic Energy Council, the mathematics was simple: ‘one pound of uranium equals 1,000 tons of coal.’33
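  As a rough check on Cherwell’s rule of thumb, one can compare the energy released by fissioning a pound of uranium-235 with the heat from burning a thousand tonnes of coal. The sketch below is my own back-of-the-envelope illustration, not a calculation from the book; the round figures of 200 MeV per fission and 25 MJ per kilogram of coal are assumptions.

```python
# A rough back-of-the-envelope comparison (illustrative round figures, not from
# the book): complete fission of one pound of U-235 versus burning 1,000 tonnes of coal.
AVOGADRO = 6.022e23            # atoms per mole
MEV_TO_J = 1.602e-13           # joules per MeV
ENERGY_PER_FISSION_MEV = 200   # approximate energy released per U-235 fission
COAL_ENERGY_J_PER_KG = 25e6    # assumed heat content of coal, about 25 MJ/kg

# Energy from fissioning one pound (roughly 454 grams) of uranium-235
atoms_of_u235 = 454 / 235 * AVOGADRO
uranium_joules = atoms_of_u235 * ENERGY_PER_FISSION_MEV * MEV_TO_J

# Energy from burning 1,000 tonnes (1,000,000 kg) of coal
coal_joules = 1_000_000 * COAL_ENERGY_J_PER_KG

print(f"1 lb of uranium-235, fully fissioned: {uranium_joules:.2e} J")
print(f"1,000 tonnes of coal, burned:         {coal_joules:.2e} J")
```

  Both figures come out at a few times 10^13 joules, so the comparison is at least the right order of magnitude.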

  At the time I was living in Iran, where my father worked for the Anglo-Iranian Oil Company in the Masjid-i-Suleiman oilfields. Amid the excitement of well fires and the fascination with the mysterious process of oil production, concerns over energy security were far from my mind. Only later, back in the UK and considering university, did the importance of the nuclear power industry become apparent to me. The UK was beginning to invest in its second generation of nuclear power stations, and the fast-growing, highly technological nuclear industry needed to attract the best minds. University education was a rarity then: less than 5 per cent of young people went to university, and the industry offered lucrative scholarships to attract them. To me, the nuclear industry had a real sense of modernity. It seemed then that it was building the future, and so I applied for a scholarship from the UK Atomic Energy Authority.

  In the end I accepted a scholarship from BP, but it was a tough decision to make. I was attracted by BP’s international dimension: the challenges the company faced seemed of greater scope and complexity. However, throughout my career in the oil industry, nuclear power has persisted in the background, coming to the fore in times of concern over oil supplies or following industrial accidents.

  We are in just such a period now. The risk of anthropogenically induced climate change has once again brought the nuclear power debate to the surface. Nuclear power could help meet our increasing global energy demands in a low-carbon economy, but its growth suffers from its continued association with nuclear explosions, an association which, from the very beginning, was interwoven with the production of nuclear power at Calder Hall.

  Although held up as a paragon of peaceful nuclear energy, behind the scenes Calder Hall was used to create plutonium for atomic bombs. It was an ambivalent institution. Following the US bombings of Hiroshima and, subsequently, Nagasaki, the British government, like most other governments at the time, wanted an atomic bomb. In October 1946, the Prime Minister, Clement Attlee, called a cabinet meeting to discuss uranium enrichment for a nuclear weapon. They were about to decide against this on grounds of cost when Ernest Bevin, the Foreign Secretary, intervened. He was adamant and, in an act of sadly human competitive behaviour, said: ‘We’ve got to have this … I don’t mind it for myself, but I don’t want any other Foreign Secretary of this country to be talked to or at by the Secretary of State in the United States as I have just had … We’ve got to have this thing over here, whatever it costs … We’ve got to have the bloody Union Jack on top of it.’34

  The plutonium used in Britain’s first atomic bomb was produced at the site where Calder Hall was soon to be built. Calder Hall’s reactor design was chosen primarily for its ability to produce the plutonium needed to keep pace in the global nuclear arms race. Uranium-238 is converted into plutonium when it is irradiated by neutrons released from nuclear fission reactions taking place inside the reactor. Rather than waste the heat produced in this process, electricity generators were also incorporated into the reactor’s design. In its early life there was a trade-off between plutonium and power production: maximising one would mean diminishing the other. And often it was electricity output that lost out in favour of the British government’s desire for a growing nuclear weapons arsenal.35

  In the 1960s, as the US and Soviet Union became the focus of the global arms race, the UK’s need for nuclear weapons decreased. And so Calder Hall prioritised the generation of electricity over the production of weapons; Britain increasingly separated its military and civilian nuclear programmes. Between the Queen’s opening of Calder Hall in 1956 and the start of 2011, the world’s fleet of nuclear power reactors had expanded to over 440, providing about a seventh of global electricity. The growth of capacity had been slowing since the building boom of the seventies and eighties, but many industry analysts were predicting a ‘nuclear renaissance’ in the coming decade. At the start of 2011, Britain was considering building ten new nuclear power stations; China was planning to quadruple its nuclear power capacity by 2015; even traditionally anti-nuclear Germany was extending the lifetime of existing nuclear reactors. The bright future of nuclear power symbolised by Calder Hall seemed to be an ever-increasing reality. A few months later a single event would change everything.

  Nuclear fear

  On 11 March 2011, the Tōhoku earthquake sent a tsunami speeding towards the north-east coast of Japan. Almost 16,000 people died in the disaster, the great majority drowning in the tsunami flood waters.

  The Fukushima Dai-ichi nuclear power plant stood 180 kilometres from the earthquake’s epicentre. It survived the initial magnitude 9.0 earthquake, one of the five most powerful shocks ever recorded anywhere on Earth. However, just under an hour later a towering 15-metre tsunami wave broke over the power station’s flood defences. The complete loss of power and the subsequent failure of equipment led to a series of nuclear meltdowns and explosions, releasing radioactive material into the environment.

 
