
End Times: A Brief Guide to the End of the World


by Bryan Walsh


  The new rules around such work represent a step forward, although critics were unhappy that the details of the government review weren’t made public, a decision that, Lipsitch and Inglesby wrote in the Washington Post, “could put health and lives at risk.”43 But all we have to do is look to the example of He Jiankui and his gene-edited babies to know that there are limits to the scientific community’s ability to regulate itself. When science was mostly the province of a handful of countries, it might have been possible for top researchers to effectively control the spread of knowledge, but those days are long past. In 2010 China produced 117,000 PhDs in all fields, more than twice as many as the United States44 and ten times more than it graduated in 1999.45 That number has only continued to grow. Even before He, Chinese scientists were the first to CRISPR monkeys and nonviable embryos, and the first to put CRISPR’d cells in a live human adult.46 It’s not that Chinese scientists are less ethical than their Western peers, but they have demonstrated a willingness to push the envelope on biotechnology. The unilateralist’s curse is a multiplier, after all—the more scientists who are capable of carrying out potentially dangerous research, the more who will actually go ahead and do it.

  Too little attention is paid to the possibility that science is inadvertently creating low-probability but high-consequence existential risks—especially in biotechnology. The default remains: do the science to the best of your ability, and let the chips fall where they may. Scientists have historically been quicker to see the danger of overregulation than underregulation.

  But not every scientist.

  With a towering frame, a long white beard, and a deep voice that begins at the bottom of his shoes, George Church could audition to play the role of God. And playing God is exactly what Church has been accused of from time to time. The Harvard geneticist is one of the foremost figures in the science of reading, writing, and editing DNA—and he’s certainly the boldest. Church has worked with the Long Now Foundation to bring the woolly mammoth back from the grave of extinction by editing the genetic traits of mammoths into the genome of their close relative, the Asian elephant. He was among the first to use CRISPR on mammalian cells and has explored the possibility of synthesizing—meaning writing afresh—an entire human genome, all three billion DNA base pairs. If the synthetic biology revolution is going to change how we live, how we work, and even how we die—or don’t—Church will get much of the credit. Or the blame.

  Church is unusual among the scientists in his field for another reason. While most researchers soft-pedal how new tools like CRISPR might change medicine or society—partly out of fear of public overreaction—Church is more than willing to talk through the full consequences of his revolutionary work, up to and including the possible end of the world. In addition to his labs at Harvard and in the southern Chinese city of Shenzhen, Church has helped found countless biotech start-ups that seek to commercialize his discoveries. (As a result, a conflict-of-interest slide that Church provides at the start of his talks—listing all the various companies he has a financial interest in—is as convoluted as a map of the Tokyo subway.) One of his most exciting experiments involves using CRISPR to edit the genome of pigs so that they could be used to grow organs for direct transplant into human beings—what’s known as xenotransplantation—without any concerns over immune rejection.47 He has a start-up for that discovery, too.

  But as eager as Church is to explore what biotechnology can do, he is also well aware that we may not be ready for the power we’re beginning to wield. That’s why he is on the board of the Centre for the Study of Existential Risk at Cambridge University and the Future of Life Institute in Boston. Church takes a position that is unusual among scientists, even in the life sciences with their institutional review boards and independent ethics committees. He asks to be regulated, almost as Odysseus asked to be lashed to the mast of his ship as he approached the Sirens with their deadly songs.

  “In 2004, I was worried about the proliferation of DNA synthesis, being able to make anything,” Church told me when I met him at a synthetic biology conference in Boston in 2018. “I was worried again when I started coding digital information into DNA, that it would suddenly create a gigantic market for DNA… when we start making an Internet of things and any person can build any option. I worry that people could make DNA synthesizers that don’t follow the rules and regulations. I worry that once people learn to put pig organs into humans they’ll get viruses from the pigs, and those will evolve inside an immunosuppressed patient, creating something worse. I’m worried that any cellular-based therapy could cause cancer, even if the goal is to cure cancer. These are subsets. I could go on and on.”

  Church knows that if his field, which is changing the stuff of life, is seen to be out of control, the backlash from the public could be disastrous. “You want equitable distribution so that everyone can have access to the positive aspects of this technology,” he told me in an earlier interview. “You don’t want people creating killer viruses, and that will require some regulation. I am a fan of the regulatory agencies. Without them we would be at risk for much bigger setbacks. When a lot of people die—that’s when you get the real setbacks.”

  Biotechnology has already experienced that kind of fatal setback. In 1999, eighteen-year-old Jesse Gelsinger died during a gene therapy trial, a tragedy that halted progress in the then-promising field of correcting genetic disorders through the replacement of defective genes with transplanted healthy ones.48 In the aftermath of He Jiankui’s CRISPR’d babies, scientists worried that politicians might demand a total ban on the use of gene editing in embryos, cutting off what could be an effective technique to treat incurable genetic disorders.49 But this is the question we must answer, for biotechnology and for other man-made existential risks caused by emerging technologies: how do we regulate a rapidly advancing science that has consequences—positive and negative—we can’t yet fully understand?

  The truth is that it’s almost impossible to thread the needle, allowing new technologies to develop without exposing society, and perhaps the entire human race, to unknown levels of risk, in part because we can rarely know what those risks will be before it’s too late. When Facebook first emerged in the mid-2000s, social media was just a small part of the internet, the economy, and our lives. The companies providing it weren’t very powerful. It would have been easy for the government at the time to regulate social media—but without a crystal ball, how would Washington possibly have known what to regulate and why? Now, more than a decade later, Facebook is worth hundreds of billions of dollars, and social media has been employed to hack a presidential election. Facebook, Twitter, and their peers have changed our habits, culture, and even our minds in ways we’re just beginning to comprehend. Yet the government is struggling to figure out how to effectively regulate a global industry in a manner that these now very rich and influential companies can’t sidestep or subvert.

  This is what is known as the Collingridge dilemma. David Collingridge was an academic at the University of Aston in Britain when he published a book in 1980 called The Social Control of Technology. Collingridge maintained that we can either regulate a technology in the nascent stages, when both its potential risks and its potential benefits are still unknown, or we can wait until it has more fully matured, by which time it may have spread so widely that effective regulation becomes impossible.50 Either way we may lose.

  Facebook isn’t going to end the world—probably—but biotechnology could. Synthetic biology holds enormous potential—cures for disease, engineered organs, cell lines resistant to all viruses, biofuels that could replace oil, meat made without killing animals. Holding back the development of this field would cost lives, and may even make it more difficult to combat existential risks like climate change. The potential downside of synthetic biology, however, could be absolute—and if we wait until after an engineered pathogen is released into the public, it will be too late.

  On the same trip where I met with George Church, I stopped by the South Boston offices of Ginkgo Bioworks. Ginkgo is the first company in synthetic biology to become a “unicorn,” meaning it has a private valuation of more than $1 billion. Ginkgo bills itself as the “organism company,” and its bioengineers design and build custom microbes. Originally its microbes were mostly used to synthesize flavors or fragrances derived from plants. In one case Ginkgo partnered with a French perfume company to create a rose fragrance by extracting the genes from real roses, injecting them into yeast, and then engineering the microbe’s biosynthetic pathways to produce the smell of a rose—which, it turns out, smells just as sweet when emitted from a genetically modified yeast. The company is also working on extracting DNA molecules from preserved plant specimens to synthesize the fragrances of flowers that have gone extinct, like a hibiscus from Maui that vanished from the wild around 1911.51

  Fragrances were just the beginning, however. In 2017 Ginkgo partnered with the German life sciences giant Bayer on a joint venture that will engineer microbes capable of providing nitrogen directly to plants, reducing the need for artificial chemical fertilizer.52 Already Ginkgo estimates that it is responsible for at least one-third of all gene-length DNA synthesis.53 (Twist Bioscience is Ginkgo’s biggest supplier.) The facility I visited in South Boston is less a lab than a biological factory, pumping out synthetic organisms the way a Ford factory pumps out F-150 pickups. Christina Agapakis, Ginkgo’s creative director, showed me around the factory floor, as casually dressed biology PhDs busily pipetted and assayed. But what struck me—beyond the Jurassic Park T-shirts a few of them were wearing, which hit a little too close to home—was how few of them there were. Much of the work at the Ginkgo factory—or “foundry,” as they call it—was now automated, a shift that has already taken place in conventional manufacturing.

  The synthetic biology revolution isn’t just about what scientists can do, but how they can do it. The growing automation at a biotech company like Ginkgo is an example of a trend called deskilling. With each passing year, the scientific expertise needed to pull off a specific experiment in synthetic biology falls. What might have recently required the hard work of a postdoctoral student can now be done by undergraduates, and might soon show up in an ambitious high school student’s science fair project. Synthetic biology is getting easier and it’s getting faster and it’s getting cheaper, which means more and more people can do it. And so the information hazard accumulates.

  Deskilling has helped Ginkgo design and produce more than 10,000 genes per month.54 To give a sense of how much that is, Jason Kelly, Ginkgo’s cofounder and CEO, told me he designed all of 50,000 base pairs of DNA throughout the entirety of his time as a grad student at MIT in the mid-2000s. All of his work then was done by hand. But now synthetic biology is shifting from an artisanal practice carried out in the laboratory by highly trained experts to a true industry. Everything is sped up, allowing researchers to design an organism, build it, test it, analyze the test, and then start the whole cycle over and over again. “Things that would have made a great PhD thesis a few years back can now be done by people in two weeks here,” Agapakis told me.
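
  To get a rough sense of the scale of that shift, here is a back-of-the-envelope sketch. It assumes an average gene length of about 1,000 base pairs and a five-year grad-school career, neither of which is a figure given in the text:

    # Back-of-the-envelope comparison of DNA design throughput.
    # ASSUMPTIONS (not from the text): ~1,000 base pairs per gene, and a
    # five-year grad-school career behind Kelly's 50,000 bp total.
    ginkgo_bp_per_month = 10_000 * 1_000      # 10,000 genes a month
    kelly_bp_per_month = 50_000 / (5 * 12)    # roughly 833 bp a month

    print(f"Ginkgo: ~{ginkgo_bp_per_month:,} bp designed per month")
    print(f"Kelly:  ~{kelly_bp_per_month:,.0f} bp designed per month")
    print(f"Ratio:  roughly {ginkgo_bp_per_month / kelly_bp_per_month:,.0f}x")

  On those assumptions the gap is on the order of ten thousandfold, which is the difference between a craft and an industry.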

  The personal history of another one of Ginkgo’s founders, Tom Knight, illustrates just how far synthetic biology has come, and where it is poised to go. Knight was a computer engineering prodigy at MIT in the 1960s. Back then, Knight did his programming on bulky computers that required users to manually enter instructions from deck after deck of punched cards. Today programming is just a matter of typing code into a computer, but for decades biological programming moved no faster than Knight’s punch-card processing, limiting what could be done. “In any sort of engineering discipline, the critical thing that controls what you can ultimately do is the speed at which you can try things out,” said Knight.

  During Knight’s first act as a computer engineer in the 1960s, ’70s, and ’80s—before he switched to the nascent field of synthetic biology—he benefited from the computing revolution described by Moore’s law. Laid down by Gordon Moore, the cofounder of the chip company Intel, Moore’s law predicted that computing power would double roughly every eighteen months. Supercomputers, laptops, iPhones—they were all possible because Moore’s law turned out to be correct. And in recent years the same trend has taken place in the reading and writing of DNA, which has fallen in price and increased in speed. This has made possible the industrialization of synthetic biology. Synthetic biologists may never be able to program as seamlessly as their counterparts in tech—biology is made up of bits of life, however tiny, whereas computer code is just code—but they’re getting better and faster all the time.
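
  The arithmetic behind that doubling is worth spelling out, because it compounds quickly. The sketch below simply applies a factor of two every eighteen months; the function name and the sample time horizons are illustrative only, not a model of any particular chip or DNA synthesizer:

    # Compounding under Moore's law: capability doubles every 18 months,
    # i.e. grows by a factor of 2 ** (months / 18).
    def moores_law_factor(years, doubling_months=18):
        """Growth factor after `years` of steady doubling."""
        return 2 ** (years * 12 / doubling_months)

    for years in (3, 15, 30):
        print(f"After {years:>2} years: ~{moores_law_factor(years):,.0f}x")
    # After  3 years: ~4x
    # After 15 years: ~1,024x
    # After 30 years: ~1,048,576x

  Three years buys a fourfold improvement; thirty years buys a millionfold one. That is the curve Knight rode as a computer engineer, and the one synthetic biology is now starting to climb.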

  Students today are practically ordered to learn computer coding, but in the future everyone might be an amateur synthetic biologist, programming the stuff of life. If digital apps on our smartphones have transformed how we live and work, imagine what bioengineering apps might do. “Designing genomes will be a personal thing, a new art form as creative as painting or sculpture,” as the physicist and writer Freeman Dyson envisioned in a 2007 article for the New York Review of Books. “Few of the new creations will be masterpieces, but a great many will bring joy to their creators and variety to our fauna and flora. The final step… will be biotech games, designed like computer games for children down to kindergarten age, but played with real eggs and seeds rather than with images on a screen.”55

  That’s the optimistic version. But deskilling something as powerful as synthetic biology will inevitably amplify existential risk. The artificial intelligence (AI) risk scholar Eliezer Yudkowsky has a smart take on the effects of deskilling that he has called “Moore’s law of mad science”: “Every 18 months, the minimum IQ necessary to destroy the world drops by one point.”56 It’s not quite as precise as the original Moore’s law—nor was it meant to be—but Yudkowsky is on to something. Consider computer programming. The relative ease of coding has created the multitrillion-dollar tech industry as we know it, but it has also empowered thousands of people to create malware for crime, for espionage, and sometimes just for kicks—to make computer viruses, in other words, viruses that cost the global economy more than $600 billion in 2017.57

  Now imagine if it became almost as easy to program biology as it is to program computers, just as Dyson envisioned. Creativity would be unleashed, but so would our darker impulses. Far fewer people would likely use synthetic biology to program and release a killer virus into the world than the number who create malware for crime, in part because there are far more thieves among the population than there are murderers. But murderers do exist. And this technology could empower them in a way that threatens us all.

  In 2013, Cody Wilson—a crypto-anarchist, law student, and one of the fifteen most dangerous people in the world according to Wired magazine—posted plans on the internet for a 3-D printable gun. Called “the Liberator,” the gun was a single-shot pistol made mostly of plastic. If you had access to a 3-D printer, and a few extra ingredients, you could download and print your own untraceable gun, with no regulation whatsoever from the government. Shortly after the blueprints were put online, the State Department ordered them removed, citing a possible violation of firearm export rules. Wilson sued, and in June 2018 the State Department—now under the control of the pro-gun rights Trump administration—decided to settle his case, a deal that won him the right to put the plans back online along with nearly $40,000 in legal fees from the U.S. government. Gun-rights advocates celebrated the settlement as “the end of gun control,” in the words of Fox News columnist John Lott Jr.58

  On August 1, 2018, a federal judge issued a restraining order temporarily halting the release of the blueprints, pending lawsuits by several states, and while Wilson insisted he could still sell the plans online, in September he left Defense Distributed after a separate arrest on sexual assault charges.59 But the Liberator is an early sign of how technologies like 3-D printing and synthetic biology can radically lower the bar of expertise and empower individual rogue actors while eroding the ability of the government to protect us from ourselves.

  If everyone eventually gains the power to potentially end the world, and governments are largely helpless to stop them, then the continued existence of the world depends on the collective action of all of us—all 7.7 billion and counting—to actively choose not to destroy it. The Stanford political scientist James Fearon developed a thought experiment he outlined in a 2003 talk, back when the global population was closer to 5 billion. He imagined a time when each person had the ability to destroy the world by pushing a button on their cell phone. “How long do you think the world would last if five billion individuals each had the capacity to blow the whole thing up?” he asked. “No one could plausibly defend an answer of anything more than a second. Expected life span would hardly be longer if only one million people had these cell phones, and even if there were 10,000 you’d have to think that an eventual global holocaust would be pretty likely. 10,000 is only two-millionth of five billion.”
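
  Fearon’s arithmetic can be made concrete with a simple probability sketch. Suppose, purely for illustration, that each button-holder independently has a one-in-a-billion chance of pressing the button in any given second; the expected wait until someone does shrinks in proportion to the number of holders:

    # Expected time until someone presses a world-ending button, assuming
    # N holders who each press independently with probability p per second.
    # ASSUMPTION: p = 1e-9 is an illustrative guess, not a figure from Fearon.
    p = 1e-9

    for n in (10_000, 1_000_000, 5_000_000_000):
        expected_seconds = 1 / (n * p)    # approximate mean of the geometric wait
        print(f"{n:>13,} holders -> ~{expected_seconds:,.1f} seconds expected")

  Even with that forgiving guess, ten thousand holders buy the world roughly a day, a million buy it about seventeen minutes, and five billion buy it a fraction of a second, which is Fearon’s point.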

 
