“When your hard drive dies you can go to the nearest computer store, buy a new one, and swap it out,” Keasling said. “That’s because it’s a standard part in a machine. The entire electronics industry is based on a plug-and-play mentality. Get a transistor, plug it in, and off you go. What works in one cell phone or laptop should work in another. That is true for almost everything we build: when you go to Home Depot you don’t think about the thread size on the bolts you buy because they’re all made to the same standard. Why shouldn’t we use biological parts in the same way?” Keasling and others in the field—who have formed a bicoastal cluster in the San Francisco Bay Area and in Cambridge, Massachusetts—see cells as hardware and genetic code as the software required to make them run. Synthetic biologists are convinced that with enough knowledge, they will be able to write programs to control those genetic components, which would not only let them alter nature, but guide human evolution as well.
In 2000, Keasling was looking for a chemical compound that could demonstrate the utility of these biological tools. He settled on a diverse class of organic molecules known as isoprenoids, which are responsible for the scents, flavors, and even colors in many plants: eucalyptus, ginger, and cinnamon, for example, as well as the yellow in sunflowers and red in tomatoes. “One day a graduate student stopped by and said, ‘Look at this paper that just came out on amorphadiene synthase,’ ” Keasling told me as we sat in his office in Emeryville, across the Bay Bridge from San Francisco. He had recently been named chief executive officer of the new Department of Energy Joint BioEnergy Institute (JBEI), a partnership between three national laboratories and three research universities, led by the Lawrence Berkeley National Laboratory. The consortium’s principal goal is to design and manufacture artificial fuels that emit little or no greenhouse gases—one of President Barack Obama’s most frequently cited priorities.
Keasling wasn’t sure what to tell his student. “ ‘Amorphadiene,’ I said. ‘What’s that?’ He told me that it was a precursor to artemisinin. I said, ‘What’s that?’ and he said it was supposedly an effective antimalarial. I had never worked on malaria. As a microbiology student I had read about the life cycle of the falciparum parasite; it was fascinating and complicated. But that was pretty much all that I remembered. So I got to studying and quickly realized that this precursor was in the general class we were planning to investigate. And I thought, amorphadiene is as good a target as any. Let’s work on that.”
Malaria infects as many as five hundred million of the world’s poorest people every year. For centuries the standard treatment was quinine, and then the chemically related compound chloroquine. At ten cents per treatment, chloroquine was cheap, simple to make, and it saved millions of lives. In Asia, though, by the height of the Vietnam War, the most virulent malaria parasite—falciparum—had grown resistant to the drug. Eventually, that resistance spread to Africa, where malaria commonly kills up to a million people every year, 85 percent of whom are under the age of five. Worse, the second line of treatment, sulfadoxine-pyrimethamine, or SP, had also failed widely.
Artemisinin, when taken in combination with other drugs, has become the only consistently successful treatment that remains. (Relying on any single drug increases the chances that the malaria parasite will develop resistance; if taken by itself even artemisinin poses dangers, and for that reason the treatment has already begun to fail in parts of Cambodia.) The drug is derived from Artemisia annua, or sweet wormwood, an herb that grows wild in many places but until recently was used mostly in Asia. Supplies vary and so does the price, particularly since 2005, when the World Health Organization officially recommended that all countries with endemic malaria adopt artemisinin-based combination therapy as their first line of defense.
That approach, while unavoidable, has serious drawbacks: combination therapy costs ten to twenty times more than chloroquine, and despite growing assistance from international charities, that is far too much money for most Africans or their governments. In Uganda, for example, one course of artemisinin-based medicine would cost a typical family as much as it spends in two months for food. Sweet wormwood is not an easy crop to cultivate. Once harvested, the leaves and stems have to be processed rapidly or the artemisinin they contain will be destroyed by exposure to ultraviolet light. Yields are low, and production is expensive. Although several thousand African farmers have begun to plant the herb, the World Health Organization expects that for the next several years the annual demand—as many as five hundred million courses of treatment per year—will far exceed the supply. Should that supply disappear, the impact would be incalculable. “Losing artemisinin would set us back years—if not decades,” Kent Campbell, a former chief of the malaria branch at the Centers for Disease Control, and head of the Malaria Control and Evaluation Partnership in Africa, said. “One can envision any number of theoretical public health disasters in the world. But this is not theoretical. This is real. Without artemisinin, millions of people could die.”
JAY KEASLING is not a man of limited ambitions. “We have gotten to the point in human history where we simply do not have to accept what nature has given us,” he told me. It has become his motto. “We can modify nature to suit our aims. I believe that completely.” It didn’t take long before he realized that making amorphadiene presented an ideal way to prove his point. His goal was, in effect, to dispense with nature entirely, which would mean forgetting about artemisinin harvests and the two years it takes to turn those leaves into drugs. If each cell became its own factory, churning out the chemical required to make artemisinin, there would be no need for an elaborate and costly manufacturing process either. He wondered, why not try to build the drug out of genetic parts? How many millions of lives would be saved if, by using the tools of synthetic biology, he could construct a cell to manufacture that particular chemical, amorphadiene? It would require Keasling and his team to dismantle several different organisms, then use parts from nearly a dozen of their genes to cobble together a custom-built package of DNA. They would then need to create an entirely new metabolic pathway, one that did not exist in the natural world.
In 2003, the team reported its first success, publishing a paper in Nature Biotechnology that described how they constructed that pathway—a chemical circuit the cell needs to do its job—by inserting genes from three organisms into E. coli, one of the world’s most common bacteria. The paper was well received, but it was only the first step in a difficult process; still, the research helped Keasling secure a $42.6 million grant from the Bill and Melinda Gates Foundation. It takes years, millions of dollars, much effort, and usually a healthy dose of luck to transform even the most ingenious idea into a product you can place on the shelf of your medicine cabinet. Keasling wasn’t interested in simply proving the science worked; he wanted to do it on a scale that would help the world fight malaria. “Making a few micrograms of artemisinin would have been a neat scientific trick,” he said. “But it doesn’t do anybody in Africa any good if all we can do is a cool experiment in a Berkeley lab. We needed to make it on an industrial scale.”
To translate the science into a product, Keasling helped start a company, Amyris Biotechnologies, to refine the raw organism, then figure out how to produce it more efficiently. Slowly, the company’s scientists coaxed greater yields from each cell. What began as 100 micrograms per liter of yeast eventually became 25 grams per liter. The goal was to bring the cost of artemisinin down from more than ten dollars a course to less than one dollar. Within a decade, by honing the chemical sequences until they produced the right compound in the right concentration, the company increased the amount of artemisinic acid that each cell could produce by a factor of one million. Keasling, who makes the cellular toolkit available to other researchers at no cost, insists that nobody profit from its sale. (He and the University of California have patented the process in order to make it freely available.) “I’m fine with earning money from research in this field,” he said. “I just don’t think we need to profit from the poorest people on earth.”
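Those yield figures are easy to read past. A quick back-of-envelope sketch in Python (the calculation and variable names are ours, using only the numbers quoted above) shows the scale of the per-liter improvement:

```python
# Illustrative check of the yields quoted in the text (assumed figures).
starting_yield_g_per_liter = 100e-6  # 100 micrograms per liter
final_yield_g_per_liter = 25.0       # 25 grams per liter

fold_improvement = final_yield_g_per_liter / starting_yield_g_per_liter
print(f"{fold_improvement:,.0f}-fold per-liter improvement")  # 250,000-fold

# The "factor of one million" quoted in the text is a per-cell measure,
# so it need not match this per-liter ratio.
```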
Amyris then joined the nonprofit Institute for OneWorld Health, in San Francisco, and in 2008 they signed an agreement with the Paris-based pharmaceutical company Sanofi-Aventis to produce the drug, which they hope to have on the market by the end of 2011. Scientific response has been largely reverential—it is, after all, the first bona fide product of synthetic biology, proof of a principle that we need not rely on the unpredictable whims of nature to address the world’s most pressing crises. But there are those who wonder what synthetic artemisinin will mean for the thousands of farmers who have begun to plant the crop. “What happens to struggling farmers when laboratory vats in California replace [wormwood] farms in Asia and East Africa?” asked Jim Thomas, an activist with ETC Group, a technology watchdog based in Canada. Thomas has argued that while the science of synthetic biology has advanced rapidly, there has been little discussion of the ethical and cultural implications involved in altering nature so fundamentally, and he is right. “Scientists are making strands of DNA that have never existed,” Thomas said. “So there is nothing to compare them to. There’s no agreed mechanisms for safety, no policies.”
Keasling, too, believes we need to have a national conversation about the potential impact of this technology, but he is mystified by opposition to what would be the world’s most reliable source of cheap artemisinin. “We can’t let what happened with genetically engineered foods”—which have been opposed by millions of people for decades—“happen again,” he said. “Just for a moment imagine that we replaced artemisinin with a cancer drug. And let’s have the entire Western world rely on some farmers in China and Africa who may or may not plant their crop. And let’s have a lot of American children die because of that. It’s so easy to say, ‘Gee, let’s take it slow’ about something that can save a child thousands of miles away. I don’t buy it. They should have access to Western technology just as we do. Look at the world and tell me we shouldn’t be doing this. It’s not people in Africa who see malaria who say, ‘Whoa, let’s put the brakes on.’ ”
Keasling sees artemisinin as the first part of a much larger program. “We ought to be able to make any compound produced by a plant inside a microbe,” he said. “We ought to have all these metabolic pathways. You need this drug? Okay, we pull this piece, this part, and this one off the shelf. You put them into a microbe and two weeks later out comes your product.”
That’s the approach Amyris has taken in its efforts to develop new fuels. “Artemisinin is a hydrocarbon and we built a microbial platform to produce it,” Keasling said. “We can remove a few of the genes to take out artemisinin and put in a different hydrocarbon to make biofuels.” Amyris, led by John Melo, who spent years as a senior executive at British Petroleum, has already engineered three molecules that can convert sugar to fuel. “It is thrilling to address problems that only a decade ago seemed insoluble,” Keasling said. “We still have lots to learn and lots of problems to solve. I am well aware that makes people anxious, and I understand why. Anything so powerful and new is troubling. But I don’t think the answer to the future is to race into the past.”
FOR THE FIRST four billion years, life on earth was shaped entirely by nature. Propelled by the forces of selection and chance, the most efficient genes survived and evolution ensured they would thrive. The long, beautiful Darwinian process of creeping forward by trial and error, struggle and survival, persisted for billions of years. Then, about ten thousand years ago, our ancestors began to gather in villages, grow crops, and domesticate animals. That led to new technologies—stone axes and looms, which in turn led to better crops and the kind of varied food supply that could support a larger civilization. Breeding goats and pigs gave way to the fabrication of metal and machines. Throughout it all, new species, built on the power of their collected traits, emerged, while others were cast aside.
As the world became larger and more complex, the focus of our discoveries kept shrinking—from the size of the planet, to a species, and then to individual civilizations. By the beginning of the twenty-first century we had essentially become a society fixated on cells. Our ability to modify the smallest components of life through molecular biology has endowed humans with a power that even those who exercise it most proficiently cannot claim to fully comprehend. Man’s mastery over nature has been predicted for centuries—Bacon insisted on it, Blake feared it profoundly. Little more than one hundred years have passed, however, since Gregor Mendel demonstrated that the defining characteristics of a pea plant—its shape, size, and the color of the seeds, for example—are transmitted from one generation to the next in ways that can be predicted, repeated, and codified.
Since then, the central project of biology has been to break that code and learn to read it—to understand how DNA creates and perpetuates life. As an idea, synthetic biology has been around for many years. It took most of the past century to acquire the knowledge, develop the computing power, and figure out how to apply it all to DNA. But the potential impact has long been evident. The physiologist Jacques Loeb was perhaps the first to predict that we would eventually control our own evolution by creating and manipulating new forms of life. He considered artificial synthesis of life the “goal of biology,” and encouraged his students to meet that goal. In 1912, Loeb, one of the founders of modern biochemistry, wrote that “nothing indicates . . . that the artificial production of living matter is beyond the possibilities of science. . . . We must succeed in producing living matter artificially or we must find the reasons why this is impossible.”
The Nobel Prize-winning geneticist Hermann J. Muller attempted to do that. By demonstrating that exposure to X-rays can cause mutations in the genes and chromosomes of living cells, he was the first to prove that heredity could be affected by something other than natural selection. He wasn’t entirely certain that humanity would use the information responsibly, though. “If we did attain to any such knowledge or powers there is no doubt in my mind that we would eventually use them,” Muller wrote in 1916. “Man is a megalomaniac among animals—if he sees mountains he will try to imitate them by building pyramids, and if he sees some grand process like evolution, and thinks it would be at all possible for him to be in on that game, he would irreverently have to have his whack at that too.”
We have been having that “whack” ever since. Without Darwin’s most important—and contentious—contribution, none of it would have been possible, because the theory of evolution explained that every species on earth is related in some way to every other species; more important, we carry a record of that history in each of our bodies. In 1953, James Watson and Francis Crick began to make it possible to understand why, by explaining how DNA arranges itself. The language of just four chemical letters—adenine, guanine, cytosine, and thymine—comes in the form of enormous chains of nucleotides. When joined together, the arrangement of their sequences determines how each human differs from every other, and from all other living beings.
By the 1970s, recombinant DNA technology permitted scientists to cut long, unwieldy molecules of nucleotides into digestible sentences of genetic letters and paste them into other cells. Researchers could suddenly combine the genes of two creatures that would never have been able to mate in nature. In 1975, concerned about the risks of this new technology, scientists from around the world convened a conference in Asilomar, California. They focused primarily on laboratory and environmental safety, and concluded that the field required only minimal regulation. (There was no real discussion of deliberate abuse—at the time it didn’t seem necessary.)
In retrospect at least, Asilomar came to be seen as an intellectual Woodstock, an epochal event in the history of molecular biology. Looking back nearly thirty years later, one of the conference’s organizers, the Nobel laureate Paul Berg, wrote that “this unique conference marked the beginning of an exceptional era for science and for the public discussion of science policy. Its success permitted the then contentious technology of recombinant DNA to emerge and flourish. Now the use of the recombinant DNA technology dominates research in biology. It has altered both the way questions are formulated and the way solutions are sought.”
Scientists at the meeting understood what was at stake. “We can outdo evolution,” said David Baltimore, genuinely awed by this new power to explore the vocabulary of life. Another researcher joked about joining duck DNA with orange DNA. “In early 1975, however, the new techniques hardly aspired to either duck or orange DNA,” Michael Rogers wrote in the 1977 book Biohazard, his riveting account of the meeting at Asilomar and of the scientists’ attempts to confront the ethical as well as biological impact of their new technology. “They worked essentially only with bacteria and viruses—organisms so small that most human beings only notice them when they make us ill.”
That was precisely the problem. Promising as these techniques were, they also made it possible for scientists to transfer viruses—and cancer cells—from one organism to another. That could create diseases anticipated by no one and for which there would be no natural protection, treatment, or cure. The initial fear “was not that someone might do so on purpose,” Rogers wrote—that would come much later—“but rather that novel microorganisms would be created and released altogether accidentally, in the innocent course of legitimate research.”
Decoding sequences of DNA was tedious work. It could take a scientist a year to complete a stretch ten or twelve base pairs long (our DNA consists of three billion such pairs). By the late 1980s automated sequencing had simplified the procedure, and today machines are capable of processing that information, and more, in seconds. Another new tool—polymerase chain reaction—was required to complete the merger of the digital and biological worlds. Using PCR, a scientist can take a single DNA molecule and copy it many times, making it easier to read and manipulate. That permits scientists to treat living cells like complex packages of digital information that happen to be arranged in the most elegant possible way.
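The power of PCR lies in simple doubling: each thermal cycle can copy every strand present, so the number of molecules grows exponentially. A minimal sketch of that arithmetic (illustrative only; it assumes perfect efficiency, which real reactions never reach):

```python
# Idealized PCR amplification: every cycle doubles the DNA present.
def pcr_copies(starting_molecules: int, cycles: int) -> int:
    """Molecule count after a given number of perfect doubling cycles."""
    return starting_molecules * 2 ** cycles

# One molecule, thirty cycles: roughly a billion copies.
print(pcr_copies(1, 30))  # 1073741824
```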