The Boy Who Played with Fusion


by Tom Clynes


  His brain was racing now, imagining possibilities that went beyond himself, beyond his grandmother, far beyond his garage laboratory and his hometown. If he could design a reactor small and cheap and safe enough, people everywhere, even in the middle of Africa, could get earlier diagnoses at a small fraction of the current cost.

  But the process of building a miniature sun on Earth—accelerating particles to speeds and temperatures high enough to fuse atoms—is extraordinarily complicated, something that government-sponsored research laboratories spend tens of billions of dollars on. At that point in Taylor’s life, only a couple of dozen people had managed to build a working fusion reactor without the support of institutions or governments. Carl Willis was one of them. But Carl had two advanced degrees, access to a high-tech laboratory, and a nuclear engineer’s salary to spend on precision equipment. How could an eleven-year-old kid in southern Arkansas ever hope to make his own star?

  PART III

  14

  * * *

  Bringing the Stars Down to Earth

  NUCLEAR FUSION IS, for starters, the opposite of nuclear fission. Fission splits apart hefty, unstable atoms like uranium or plutonium to release their energy, giving us electricity, atomic bombs, and nuclear waste. Fusion combines the nuclei of two light, stable atoms into a single, heavier atom that is slightly less massive than the sum of its parts. The missing mass is released as energy. Fusion energy is extraordinarily abundant in our universe; it’s what fills our world with heat and light, and it’s what fuels life on Earth. The sun—our planet’s distant power plant—and other stars are powered by fusion. In fact, much of what we see in the visible universe originates in nuclear fusion. Most of the elements we’re made of, the food we eat, and the Earth itself were born of fusion reactions.

  Like all stable stars, the sun is a self-perpetuating thermonuclear fusion reactor fueled by hydrogen, the simplest and most abundant element. Each second, some six hundred million tons of hydrogen collide so forcefully in the sun’s high-pressure plasma core that their nuclei fuse, transforming single-proton hydrogen nuclei into double-proton helium nuclei and converting roughly four million tons of mass into energy.

  In 1905, Albert Einstein proposed his famous E = mc² formula, which made it possible to understand the conversion process that powers the sun and other nuclear fusion reactions. Einstein’s equation explained the fundamental relationship between mass and energy: that the energy content (E) of a body is equal to the mass (m) of the body times the speed of light (c) squared. In a fusion reaction, with each coupling of atomic nuclei, only a tiny amount of mass is lost, but the amount of energy released is immense—many times the amount of energy needed to bring the atoms together.
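
  A rough back-of-the-envelope sketch (an editorial illustration, not from the book) makes that scale concrete by applying E = mc² to a single deuterium-tritium fusion reaction; the atomic masses used are standard published values, rounded.

```python
# Rough sketch: applying E = mc^2 to one deuterium-tritium fusion reaction.
# Atomic masses are standard published values (in atomic mass units), rounded.

U_TO_KG = 1.6605390666e-27      # kilograms per atomic mass unit
C = 2.99792458e8                # speed of light, meters per second
JOULES_PER_MEV = 1.602176634e-13

m_deuterium = 2.014102          # u
m_tritium   = 3.016049          # u
m_helium4   = 4.002602          # u
m_neutron   = 1.008665          # u

# Mass before minus mass after: the "missing" mass that becomes energy.
mass_lost_u = (m_deuterium + m_tritium) - (m_helium4 + m_neutron)
energy_joules = mass_lost_u * U_TO_KG * C**2

print(f"Mass converted: {mass_lost_u:.6f} u "
      f"(about {mass_lost_u / (m_deuterium + m_tritium):.2%} of the fuel mass)")
print(f"Energy released: {energy_joules / JOULES_PER_MEV:.1f} MeV per reaction")
# Roughly 17.6 MeV -- millions of times the few electron volts released
# when a molecule of fossil fuel burns.
```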

  Flowing out of his theory of special relativity, Einstein’s E = mc² insight spawned a whole new branch of science known as high-energy particle physics. Physicists who work in this field thrive on E = mc² conversions. Without a thorough understanding of the equation it would be impossible to build fusion reactors or other particle accelerators, impossible to understand the behavior of particles that are constantly colliding, releasing dollops of energy, and transmuting into newly formed particles. Once physicists had a basis for understanding these reactions, they began to imagine the possibilities of unleashing energy by splitting or combining atoms.

  Fusion power has long been the Holy Grail of energy production, since it offers the possibility of abundant, clean electricity. Unlike fossil fuels or nuclear fission, fusion could give us energy without pollution, long-lasting radioactive waste, or the threat of a catastrophic release of deadly radiation. Fusion power plants would most likely be fueled with deuterium, a virtually inexhaustible hydrogen isotope found in seawater; and tritium, which is bred in reactors from lithium, a plentiful element. Two and a half pounds of these fuels can produce as much energy as eighteen million pounds of coal. Apart from energy, fusion produces helium, a useful inert gas that doesn’t contribute to climate change. Although a fusion reactor’s walls become radioactive after sustained neutron bombardment and need to be periodically replaced, this low-level waste decays to a safe level much more quickly—a hundred years versus tens of thousands of years—than the waste that collects in yellow drums outside fission power plants.
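
  As a rough sanity check on that comparison (my own arithmetic, not the book’s), one can compare the energy stored per kilogram of deuterium-tritium fuel with that of coal. The sketch below assumes a complete burn of the fuel and a typical coal heating value of about 24 megajoules per kilogram, so treat it only as an order-of-magnitude check.

```python
# Rough sketch comparing the energy per kilogram of D-T fusion fuel with coal.
# Assumes complete burn of the fuel and a typical coal heating value; both
# are idealizations, so treat the result as an order-of-magnitude check.

JOULES_PER_MEV = 1.602176634e-13
U_TO_KG = 1.6605390666e-27

ENERGY_PER_REACTION_MEV = 17.6            # D-T fusion yield per reaction
FUEL_MASS_PER_REACTION_U = 2.014 + 3.016  # one deuteron plus one triton
COAL_ENERGY_PER_KG = 24e6                 # ~24 MJ/kg, typical bituminous coal

fusion_energy_per_kg = (ENERGY_PER_REACTION_MEV * JOULES_PER_MEV
                        / (FUEL_MASS_PER_REACTION_U * U_TO_KG))

ratio = fusion_energy_per_kg / COAL_ENERGY_PER_KG
print(f"D-T fuel: ~{fusion_energy_per_kg:.2e} J/kg")
print(f"Coal:     ~{COAL_ENERGY_PER_KG:.2e} J/kg")
print(f"Ratio:    roughly {ratio:,.0f} to 1")
# On the order of ten million to one -- consistent with a few pounds of fusion
# fuel standing in for millions of pounds of coal.
```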

  Fusion energy could conceivably be used to power far-roving spaceships. But to many of the thousands of scientists working to create fusion power on Earth, the quest to produce electricity by fusing atomic nuclei is no less than a race to enable humans to continue to survive on this planet. Climate scientists understand that continuing to burn fossil fuels at anywhere close to our current rate (about 80 percent of our energy is derived from fossil fuels) will raise atmospheric carbon dioxide (CO₂) levels to about five hundred parts per million by midcentury. The runaway growth in greenhouse-gas emissions has thus far swamped efforts to take meaningful action, but much of the world is waking up to the changes already happening to our climate. By 2100, the effects of these human-caused changes will be impossible to ignore as they lead to a series of devastating and possibly irreversible ecological impacts.

  Unfortunately, no existing form of renewable energy is ready to serve as a prime-time replacement for fossil fuels—at least, not at a price most people are willing or able to pay. “The world needs a technology that can be switched on within a few decades, and preferably a lot sooner,” says physicist Steven Cowley, director of the Culham Centre for Fusion Energy and CEO of the United Kingdom Atomic Energy Authority, “one that’s compatible with current power grids, affordable, non-polluting, and impossible to use to make nuclear weapons.”

  Fusion energy does all that. Theoretically, it’s the perfect energy source: safe and clean, fueled by virtually inexhaustible resources, and carbon-free. But there’s a catch: Though nuclear fusion has powered the sun for 4.6 billion years, it has never produced usable energy on Earth, despite more than six decades of effort by the world’s brightest scientists and engineers. The problem isn’t whether fusion can work. Physics laboratories, even a few individuals, have successfully fused the nuclei of light atoms together, liberating their copious energy. But to actually produce energy on a useful scale, simply fusing atoms isn’t enough. A viable fusion reactor would have to produce more energy than is poured into it.

  That’s the tricky part. Because hydrogen nuclei are positively charged, they repel each other and will fuse only if they collide with enough energy to overcome this repulsive electromagnetic force. Only when the nuclei get close enough to be drawn together by what scientists call the strong nuclear force—the extremely short-range force (reaching less than about 10⁻¹⁵ meters, a quadrillionth of a meter) that binds atomic nuclei—can fusion occur.
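
  A rough estimate (mine, not the book’s) shows why such violent collisions are needed: the electrostatic energy of two hydrogen nuclei pushed to within the strong force’s reach dwarfs their ordinary thermal energy, and only the fastest nuclei, helped by quantum tunneling, make it through. The one-femtometer separation below is a round figure assumed for the strong force’s range.

```python
# Rough sketch of the Coulomb barrier two hydrogen nuclei must overcome.
# Constants are textbook values; the 1-femtometer separation is a round
# figure for the reach of the strong nuclear force.

K_COULOMB = 8.9875517923e9      # Coulomb constant, N*m^2/C^2
E_CHARGE = 1.602176634e-19      # elementary charge, C
K_BOLTZMANN = 1.380649e-23      # Boltzmann constant, J/K
JOULES_PER_KEV = 1.602176634e-16

separation = 1e-15              # meters: roughly where the strong force takes over

# Electrostatic potential energy of two single-proton nuclei at that distance.
barrier_joules = K_COULOMB * E_CHARGE**2 / separation

# Typical thermal energy of a nucleus in a 100-million-kelvin plasma.
thermal_joules = K_BOLTZMANN * 100e6

print(f"Coulomb barrier: ~{barrier_joules / JOULES_PER_KEV:,.0f} keV")
print(f"Thermal energy at 100 million K: ~{thermal_joules / JOULES_PER_KEV:.0f} keV")
# The barrier (~1,400 keV) sits far above the average thermal energy (~9 keV);
# fusion relies on the fastest nuclei in the distribution plus quantum tunneling.
```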

  The sun overcomes electromagnetic repulsion with its massive gravity and extremely high pressure and core temperature (roughly 27 million degrees F, or 15 million degrees C). On Earth, fusion requires much higher energies and temperatures—at least 180 million degrees F, or 100 million degrees C.

  Thus far, the methods that physicists have used to create environments conducive to fusing atoms have required more energy than the actual fusion reactions produce. Though one set of experiments reached so-called scientific break-even (producing as much energy from fusion as was delivered to the fuel), a viable nuclear fusion power plant needs to go beyond both scientific break-even and engineering break-even; the energy produced and sent into the electric grid must exceed all the energy needed to create and sustain the reaction.
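
  To make the distinction concrete, here is a small illustrative sketch (my own framing, with round, hypothetical efficiency figures rather than numbers from the book) of why a power plant must clear a much higher bar than scientific break-even.

```python
# Illustrative sketch of scientific vs. engineering break-even.
# The efficiency figures below are round, hypothetical numbers chosen only
# to show the shape of the problem; real plant designs will differ.

def scientific_q(fusion_power_mw, heating_power_mw):
    """Fusion power out divided by heating power delivered to the plasma."""
    return fusion_power_mw / heating_power_mw

HEATING_EFFICIENCY = 0.4   # assumed fraction of grid power that ends up heating the plasma
THERMAL_TO_ELECTRIC = 0.4  # assumed fraction of fusion heat converted back to electricity

# For the plant to send net power to the grid, roughly:
#   fusion_power * THERMAL_TO_ELECTRIC > heating_power / HEATING_EFFICIENCY
# which rearranges to a minimum scientific Q:
min_q_for_engineering_breakeven = 1 / (HEATING_EFFICIENCY * THERMAL_TO_ELECTRIC)

print(f"Scientific break-even:  Q = {scientific_q(50, 50):.0f}")
print(f"Engineering break-even needs roughly Q > {min_q_for_engineering_breakeven:.1f}")
# With these assumed efficiencies, the plasma must yield several times more
# energy than it absorbs before the plant as a whole breaks even.
```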

  Einstein’s formula inspired the physics community to explore the possibilities of bringing the power of the stars down to Earth. Progress and funding for experiments were scarce until the 1950s, when some American physicists who had worked on the hydrogen bomb began to shift their research toward controlled thermonuclear reactions. The most promising approach seemed to be magnetic confinement fusion (MCF), which uses a powerful magnetic field to confine and stabilize a superheated plasma in which atomic nuclei collide and fuse.

  Plasma, which consists of free electrons and positive ions, is the only state of matter in which self-sustaining thermonuclear reactions can occur. Plasma is similar to a gas, but in a gas the electrons remain in their normal state, bound to the nuclei of atoms. In a plasma, the electrons are stripped away from the nuclei (which become positive ions) and move around freely.

  Fusion research in the United States was at first classified, but when technological progress stalled, the U.S. and the USSR came to an agreement, signed in 1955, to open a worldwide exchange of fusion research and technology for peaceful uses. When scientists from the East and the West got together, the American physicists discovered that they’d gotten the better end of the deal; their Russian counterparts had developed a much more elegant arrangement of magnetic fields that produced denser, hotter plasma. The Russians called their doughnut-shaped design a tokamak, short for “toroidal chamber with an axial magnetic field.”

  By the 1980s, most major physics laboratories in the U.S. had built their own tokamaks and were experimenting with ways to stabilize the swirling, turbulent storms of plasma. Physicists theorized that a bigger chamber volume would increase a tokamak’s capacity to stabilize the plasma and give the ions more time to collide and react. In 1985, Ronald Reagan and Mikhail Gorbachev agreed to collaborate on a massive tokamak, now called the International Thermonuclear Experimental Reactor, or ITER. Since then, the European Union, China, India, Japan, and South Korea have thrown their resources behind the experimental reactor, which is currently being built at the Cadarache Research Centre in Saint-Paul-lez-Durance, France.

  The twenty-three-thousand-ton machine, under construction since 2007, won’t be operational until 2019 at the earliest. Physicists expect that ITER’s system of superconducting magnets—the most powerful ever built; they can exert a combined force of sixty meganewtons, enough to lift thirteen million pounds—will confine the plasma effectively enough to achieve fusion’s long-sought breakthrough: a starlike self-sustaining reaction that produces more power than it consumes.

  The experiment’s goal is to generate five hundred megawatts of fusion power from just fifty megawatts of heating power, a tenfold return; a commercial plant with that output could power roughly five hundred thousand homes. But while most of ITER’s physics problems have been worked out, huge engineering challenges remain. Foremost is the question of what the vacuum chamber’s walls will be made of, as the material must be able to withstand constant bombardment by the neutrons released by the fusion reactions (unlike the charged particles in the plasma, neutrons can’t be contained by magnets, since they have no charge).

  If the experiment succeeds, the next step will be a larger demonstration power plant. But ITER’s project leaders don’t really know exactly what will happen when they flip the switch on their massive magnetic bottle, the most complex machine—it will contain more than 10 million individual parts—mankind has ever built. Their often-frustrating, decades-long experiment may usher the world into a new era of safe, clean, and abundant energy—or it may be a twenty-billion-dollar fiasco.

  Meanwhile, a fundamentally different approach to nuclear fusion, called inertial confinement fusion, or ICF, shows promise. ICF machines typically use lasers to flash-heat a deuterium/tritium fuel capsule. Though the capsule is only the size of a pinhead, it has the energy content of a full barrel of oil. To release that energy, a circle of lasers hits the capsule’s outer layer with a high-energy pulse, compressing and heating the atoms at the center to the point that their nuclei fuse. Ignition, in the case of ICF, would occur when these reactions force the surrounding fuel into fusion and create a self-sustaining burn. Although inertial confinement reaction vessels are much smaller than tokamaks, the lasers and the systems needed to power them are enormous. The main technical challenge of ICF is that when the lasers fire, some of the fuel in the pellets can escape before a significant portion of the fuel has a chance to undergo fusion.

  The fusion physics community has thrown much of its intellectual and financial weight behind ITER, which is currently eating up between one billion and two billion dollars per year. At the current pace of progress and funding, it will be at least thirty years before fusion energy production becomes a reality through ITER or any other technology. Given the potential payoffs and high stakes—filling the world’s energy needs for millennia to come and saving the planet from environmental catastrophe—we need to ask whether we’re willing to wait that long.

  In 1961, John F. Kennedy declared that America would send astronauts to the moon and bring them home safely by the end of the decade. It was a bold idea and a difficult challenge, but the nation rallied behind the ambitious Apollo program. Inspired by the president’s energetic vision and motivated by the perceived Soviet threat, politicians gave von Braun and his NASA team the resources they needed to fast-track the space race.

  “What if we approached fusion energy like the U.S. approached Apollo?” asks Ralf Kaiser, a fusion physicist with the International Atomic Energy Agency (IAEA). “Instead of scaling back ITER funding [as has been done several times], what if we instead put it on an ultra–fast track, with the entire world making an Apollo-like commitment?”

  Between 1963 and 1972, the U.S. spent a little more than a hundred billion dollars (in today’s dollars) on lunar programs. An Apollo-like budget (which would be underwritten by thirty-five countries rather than just one) in the neighborhood of ten billion dollars a year would cost just one-seventh of 1 percent of the seven trillion dollars the world spends each year on energy. With that kind of funding, top fusion physicists believe they could deliver fusion power within a decade. Betting that big on fusion would take a lot of political will and imagination—which was exactly what made Apollo possible.

  Of course, the need for a safe, cheap, nonpolluting energy source is much more pressing than the need for a man to walk on the moon. And getting fusion to work will be much harder. “Getting to the moon is almost trivial in comparison to nuclear fusion energy,” says plasma physicist Ron Phaneuf of the University of Nevada–Reno. “Fusion is by far the most significant scientific and technical challenge mankind has ever attempted.”

  All of which made nuclear fusion such a compelling challenge for Taylor, who already, at the age of eleven, seemed incapable of attempting anything on what most people would consider a reasonable scale.

  “Someone saying it can’t be done, or it’s extremely hard to do, just makes me want to do it,” Taylor says. “I just don’t accept that I can’t. I really do think that someday we’ll have fusion power and that I can be part of the breakthroughs that make it happen.”

  But first, he wanted to use nuclear fusion as a means to generate neutrons. At eleven years old, Taylor was just beginning to see the connections among the disciplines he’d studied, and he was captivated by the idea that he might find answers to some very big questions hidden in those connections.

  E. Paul Torrance, the noted creativity researcher, wrote elegantly about “the dreadful importance of falling in love with ‘something’—a dream, an image of the future”:

  Positive images of the future are a powerful and magnetic force. These images of the future draw us on and energize us, giving us the courage and will to take important initiatives and move forward to new solutions and achievements. To dream and to plan, to be curious about the future and to wonder how much it can be influenced by our efforts, are important aspects of our being human. In fact, life’s most energizing and exciting moments occur in those split seconds when our strugglings and searchings are suddenly transformed into the dazzling aura of the profoundly new, an image of the future.

  Torrance’s mid- and late-twentieth-century take on the importance of one’s image of his or her own future has been affirmed by modern psychologists who have researched motivation, creativity, and achievement; we now know that our beliefs about our abilities and potential drive our behaviors and predict our successes even more accurately than past performance does. That sort of future-oriented inspiration, researchers say, is an oft-overlooked factor that can affect one’s motivation and capacity to work persistently toward long-term goals and help one prevail over obstacles and setbacks. There’s also evidence that these motivational/inspirational factors can boost both cognitive efficiency and overall productivity.

  From that point on, Taylor went forward not just out of curiosity but out of a genuine love for his subject and a desire to create something that could change the world and make it better. His ambition was not, for an eleven-year-old, particularly realistic. But the biographies and memoirs of many world-changers make it clear that the source of the sustained creative energy that fueled their breakthrough achievements was an early obsession with something that stayed with them for their entire lives.

  Anthony Fauci, the renowned immunologist whose teams have achieved several breakthroughs in the treatment of HIV/AIDS, maintains that a lack of experience can at times actually be a benefit, since younger people, who are natural outsiders, often have less constricted, more creative views of the world.

  “The trick is to work with young scientists and put them on projects that they don’t know can’t be done,” says Fauci, who directs the National Institute of Allergy and Infectious Diseases. “When you have experienced and inexperienced people working in the lab together, you can balance the skepticism that accomplished scientists often have. They’ll say, ‘You’ll never be able to do that, why would you waste your time?’ But young people are more open to taking a chance and so you often get a very important observation because someone went down a road a more experienced person wouldn’t have.”

 
