Films from the Future


by Andrew Maynard


  While Birnley is fixated on the short-term profits he’s going to make off of Stratton’s invention, others in the textile industry realize that this is not going to end well. They need their products to wear out and need replacing if they’re to stay in business, and the very last thing they need is clothes that last forever. So they hatch a plan to persuade Stratton to sign over the rights to his invention, so they can bury it.

  To make matters worse, it quickly becomes apparent that the mill owners and their investors aren’t the only ones who stand to lose from Sidney’s invention. If the industry collapsed because of his new textile, the workforce would be out on the streets. And so, in a Luddite-like wave of self-interest, they also set about challenging Sidney, not because they are anti-science, but because they are pro-having jobs that pay the bills.

  The more people hear about Stratton’s invention, the more they realize that this seemingly great discovery is going to make life harder for them. Even Sidney’s landlady plaintively asks, “Why can’t you scientists leave things alone? What about my bit of washing, when there’s no washing to do?” It becomes clear that, in his naïvety, Stratton didn’t give a second thought to the people he claimed he was doing his research for, and, as a result, he hits roadblocks he never imagined existed.

  As everything comes to a head, Sidney finds himself in his white suit, made of the new indestructible, unstainable cloth, being chased by manufacturers, laborers, colleagues, and pretty much everyone else who has realized that what they really cannot abide is a smart-ass scientist who didn’t think to talk to them before doing research he claimed was for their own good.

  Just as he’s cornered by the mob, Sidney discovers the full extent of his hubris. Far from being indestructible, his new fabric has a fatal flaw. His wonder material is unstable, and after a few days, it begins to disintegrate. And so, in front of the crowd, his clothes begin to quite literally fall apart. Scientific hubris turns to humility and ridicule, and everyone but Stratton leaves secure in the knowledge that, clever as they might be, scientists like Sidney are, at the end of the day, not particularly smart.

  And Stratton? His pride is dented, but not his ambition—nor his scientific myopia, it would seem. In an admirable display of disdain for learning the lessons of his social failures, he begins work on fixing the science he got wrong in his quest to create the perfect fabric.

  The Man in the White Suit admittedly feels a little dated these days, and, even by 1950s British comedy standards, it’s dry. Yet the movie successfully manages to address some of the biggest challenges we face in developing socially responsible and responsive technologies, including institutional narrow-mindedness, scientific myopia and hubris, ignorance of the broader social implications, human greed and self-interest, and the inevitability of unintended outcomes. And of course, it’s remarkably prescient of Eddie Bauer’s nano pants and the protests they inspired. And while the movie uses polymer chemistry as its driving technology, much of it applies directly to the emerging science of nanoscale design and engineering that led to the nano pants, and a myriad of other nanotechnology-based products.

  Mastering the Material World

  On December 29, 1959, the physicist Richard Feynman gave a talk at the annual meeting of the American Physical Society, which was held that year at the California Institute of Technology. In his opening comments, Feynman noted:

  “I would like to describe a field, in which little has been done, but in which an enormous amount can be done in principle. This field is not quite the same as the others in that it will not tell us much of fundamental physics (in the sense of, “What are the strange particles?”) but it is more like solid-state physics in the sense that it might tell us much of great interest about the strange phenomena that occur in complex situations. Furthermore, a point that is most important is that it would have an enormous number of technical applications.

  “What I want to talk about is the problem of manipulating and controlling things on a small scale.”145

  Feynman was intrigued with what could be achieved if we could only manipulate matter at the scale of individual atoms and molecules. At the time, he was convinced that scientists and engineers had barely scratched the surface of what was possible here, so much so that he offered a $1,000 prize to the first person who could work out how to write a page of a book in type so minuscule that it was reduced to 1/25,000 scale.146

  Feynman’s talk didn’t garner that much attention at first. But, over the following decades, it was increasingly seen as a milestone in thinking about what could be achieved if we extended our engineering mastery to the nanometer scale of atoms and molecules. In 1986, Eric Drexler took this up in his book Engines of Creation and popularized the term “nanotechnology.” Yet it wasn’t until the 1990s, when the US government became involved, that the emerging field of nanotechnology hit the big time.

  What intrigued Feynman, Drexler, and the scientists who followed them was the potential of engineering with the finest building blocks available, the atoms and molecules that everything’s made of (the “base code” of physical materials, in the language of chapter nine). As well as the finesse achievable with atomic-scale engineering,147 scientists were becoming increasingly excited by some of the more unusual properties that matter exhibits at the nanoscale, including changes in conductivity and magnetism, and a whole range of unusual optical behaviors. What they saw was an exciting new set of ways they could play with the “code of atoms” to make new materials and products.

  In the 1980s, this emerging vision was very much in line with Drexler’s ideas. But in the 1990s, there was an abrupt change in direction and expectations. And it occurred at about the time the US federal government made the decision to invest heavily in nanotechnology.

  In the 1990s, biomedical science in the US was undergoing something of a renaissance, and federal funding was flowing freely into the US’s premier biomedical research agency, the National Institutes of Health. This influx of research funding was so prominent that scientists at the National Science Foundation—NIH’s sister agency—worried that their agency was in danger of being marginalized. What they needed was a big idea, one big enough to sell to Congress and the President as being worthy of a massive injection of research dollars.

  Building on the thinking of Feynman, Drexler, and others, the NSF began to develop the concept of nanotechnology as something they could sell to policy makers. It was a smart move, and one that was made all the smarter by the decision to conceive of this as a cross-agency initiative. Smarter still was the idea to pitch nanotechnology as a truly interdisciplinary endeavor that wove together emerging advances in physics, chemistry, and biology, and that had something for everyone in it. What emerged was a technological platform that large numbers of researchers could align their work with in some way, that had a futuristic feel, and that was backed by scientific and business heavyweights. At the heart of this platform was the promise that, by shaping the world atom by atom, we could redefine our future and usher in “the next Industrial Revolution.”148

  This particular framing of nanotechnology caught on, buoyed up by claims that the future of US jobs and economic prosperity depended on investing in it. In 2000, President Clinton formed the US National Nanotechnology Initiative, a cross-agency initiative that continues to oversee billions of dollars of federal research and development investment in nanotechnology.149

  Eighteen years later, the NNI is still going strong. As an initiative, it has supported some incredible advances in nanoscale science and engineering, and it has led the growth of nanotechnology the world over. Yet, despite the NNI’s successes, it has not delivered on what Eric Drexler and a number of others originally had in mind. Early on, there was a sharp and bitter split between Drexler and those who became proponents of mainstream nanotechnology, as Drexler’s vision of atomically precise manufacturing was replaced by more mundane visions of nanoscale materials science.

  With hindsight, this isn’t too surprising. Drexler’s ideas were bold and revolutionary, and definitely not broadly inclusive of existing research and development. In contrast, because mainstream nanotechnology became a convenient way to repackage existing trends in science and engineering, it was accessible to a wide range of researchers. Regardless of whether you were a materials scientist, a colloid chemist, an electron microscopist, a molecular biologist, or even a toxicologist, you could, with little effort, rebrand yourself as a nanotechnologist. Yet despite the excitement and the hype—and some rather Transcendence-like speculation—what has come to be known as nanotechnology actually has its roots in early-twentieth-century breakthroughs.

  In 1911, the physicist Ernest Rutherford proposed a novel model of the atom. Drawing on groundbreaking experiments from a couple of years earlier, Rutherford revolutionized our understanding of atoms, and his model underpinned a growing understanding of not only how atoms and molecules come together to make materials, but how their specific arrangements affect the properties of those materials.

  Building on Rutherford’s work, scientists began to develop increasingly sophisticated ways to map out the atomic composition and structure of materials. In 1912, it was discovered that the regular arrangement of atoms in crystalline materials could diffract X-rays in ways that allowed their structure to be deduced. In 1931, the first electron microscope was constructed. By the 1950s, scientists like Rosalind Franklin were using X-rays to determine the atomic structure of biological molecules. This early work on the atomic and molecular makeup of materials laid the foundations for the discovery of DNA’s structure, the emergence of transistors and integrated circuits, and the growing field of materials science. It was a heady period of discovery, spurred on by the realization that atoms, and how they’re arranged, are the key to how materials behave.

  By the time Feynman gave his lecture in 1959, scientists were well on the way to understanding how the precise arrangement of atoms in a material determines what properties it might exhibit. What they weren’t so good at was using this emerging knowledge to design and engineer new materials. They were beginning to understand how things worked at the nanoscale, but they still lacked the tools and the engineering dexterity to take advantage of this knowledge.

  This is not to say that there weren’t advances being made in nanoscale engineering at the time—there were. The emergence of increasingly sophisticated synthetic chemicals, for instance, depended critically on scientists being able to form new molecules by arranging the atoms they were made of in precise ways, and, in the early 1900s, scientists were creating a growing arsenal of new chemicals. At the same time, scientists and engineers were getting better at making smaller and smaller particles, and using some of the convenient properties that come with “smallness,” like adding strength to composite materials and preventing powders from caking. By the 1950s, companies were intentionally manufacturing a range of nanometer-scale powders out of materials like silicon dioxide and carbon.

  As the decades moved on, materials scientists became increasingly adept at manufacturing nanoscopically small particles with precisely designed properties, especially in the area of catalysts. Catalysts work by increasing the speed and likelihood of specific chemical reactions taking place, while reducing the energy needed to initiate them. From the early 1900s, using fine particles as catalysts—so-called heterogeneous catalysts—became increasingly important in industry, as they slashed the costs and energy overheads of chemical processing. Because catalytic reactions occur at the surface of these particles, the smaller the particles, the more overall surface area there is for reactions to take place on, and the more effective the catalyst is.
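
  To put rough numbers on this, a back-of-the-envelope sketch helps (assuming, purely for illustration, spherical particles of uniform density $\rho$, which real catalyst particles rarely are): the surface area available per unit mass of material scales inversely with the particle radius,

\[
\frac{A}{m} \;=\; \frac{4\pi r^{2}}{\tfrac{4}{3}\pi r^{3}\rho} \;=\; \frac{3}{\rho r},
\]

  so cutting the particle radius by a factor of ten provides roughly ten times as much reactive surface from the same mass of catalyst.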

  This led to increasing interest in creating nanometer-sized catalytic particles. But there was another advantage to using microscopically small particles in this way. When particles get so small that they are made of only a few hundred to a few thousand atoms, the precise arrangement of the atoms in them can lead to unexpected behaviors. For instance, some particles that aren’t catalytic at larger sizes become catalytic at the nanoscale. Other particles interact with light differently; gold particles, for instance, appear red below a certain size. Others still can flip from being extremely inert to being highly reactive.

  As scientists began to understand how particle size changes material behavior, they began developing increasingly sophisticated particle-based catalysts that were designed to speed up reactions and help produce specific industrial chemicals. But they also began to understand how the precise atomic configuration of everything around us affects the properties of materials, and can in principle be used to design how a material behaves.

  This realization led to the field of materials science growing rapidly in the 1970s, and to the emergence of novel electronic components, integrated circuits, computer chips, hard drives, and pretty much every piece of digital gadgetry we now rely on. It also paved the way for the specific formulation of nanotechnology adopted by the US government and by governments and scientists around the world.

  In this way, the NNI successfully rebranded a trend in science, engineering, and technology that stretched back nearly one hundred years. And because so many people were already invested in research and development involving atoms and molecules, they simply had to attach the term “nanotechnology” to their work, and watch the dollars flow. This tactic was so successful that, some years ago, a colleague of mine cynically defined nanotechnology as “a fourteen-letter fast track to funding.”

  Despite the cynicism, “brand nanotechnology” has been phenomenally successful in encouraging interdisciplinary research and development, generating new knowledge, and inspiring a new generation of scientists and engineers. It’s also opened the way to combining atomic-scale design and engineering with breakthroughs in biological and cyber sciences, and in doing so it has stimulated technological advances at the convergence of these areas. But “brand nanotechnology” is most definitely not what was envisioned by Eric Drexler in the 1980s.

  The divergence between Drexler’s vision of nanotechnology and today’s mainstream ideas goes back to the 1990s and a widely publicized clash of opinions between Drexler and chemist Richard Smalley.150 Where Drexler was a visionary, Smalley was a pragmatist. More than this, as the co-discoverer of the carbon-60 molecule (for which he was awarded the Nobel Prize in 1996, along with Robert Curl and Harry Kroto) and a developer of carbon nanotubes (a highly novel nanoscale form of carbon), he held considerable sway within established scientific circles. As the US government’s concept of nanotechnology began to take form, it was Smalley’s version that won out and Drexler’s version that ended up being sidelined.

  Because of this, the nanoscale science and engineering of today looks far more like the technology in The Man in the White Suit than the nanobots in Transcendence. Yet, despite the hype behind “brand nano,” nanoscale science and engineering is continuing to open up tremendous opportunities, and not just in the area of stain-resistant fabrics. By precisely designing and engineering complex, multifunctional particles, scientists are developing new ways to design and deliver powerful new cancer treatments. Nanoscale engineering is leading to batteries that hold more energy per gram of material, and release it faster, than any previous battery technology. Nanomaterials are leading to better solar cells, faster electronics, and more powerful computers. Scientists are even programming DNA to create new nanomaterials. Hype aside, we are learning to master the material world, and to become adept at coding in the language of atoms and molecules. But just as with Stratton’s wonder material, many of these amazing breakthroughs arising from nanoscale science and engineering come with unintended consequences that need to be grappled with.

  Myopically Benevolent Science

  In 2000, I published a scientific paper with the somewhat impenetrable title “A simple model of axial flow cyclone performance under laminar flow conditions.” It was the culmination of two years’ research into predicting the performance of a new type of airborne dust sampler. At the time, I was pretty excited by the mathematics and computer modeling involved. But despite the research and its publication, I suspect that the work never had much impact beyond adorning the pages of an esoteric scientific journal.151

  Like many scientists, I was much more wrapped up in the scientific puzzles I was trying to untangle than in how relevant the work was to others. Certainly, I justified the research by saying it could lead to better ways of protecting workers from inhaling dangerous levels of dust. If I was honest, though, I was more interested in the science than its outcomes. At the same time, I was quite happy to co-opt a narrative of social good so that I could continue to satisfy my scientific curiosity.

  I suspect the same is true for many researchers. And this isn’t necessarily a bad thing. Science progresses because some people are driven by their curiosity, their desire to discover new things and to see what they can do with their new knowledge. While this is often inspired by making the world a better place or solving tough challenges, I suspect that it’s the process of discovery, or the thrill of making something that works, that keeps many scientists and engineers going.

  This is actually why I ended up pursuing a career in science. From a young age, I wanted to do something that would improve people’s lives (I was, I admit, a bit of an earnest child). But my true love was physics. I was awestruck by the insights that physics provided into how the universe works. And I was utterly enthralled by how a grasp of the mathematics, laws, and principles of physics opened up new ways of seeing the world. To me physics was—and still is—a disciplined way of thinking and understanding that is both awe-inspiring and humbling, revealing the beauty and elegance of the universe we live in while making it very clear that we are little more than privileged observers in the grand scheme of things. It challenged me with irresistible puzzles, and filled me with amazement as I made new discoveries in the process of trying to solve them. While I’ve always been mindful of the responsibility of science to serve society, I must confess that it’s often the science itself that has been my deepest inspiration.

 
