Regenesis


by George M. Church


  * Robert Lanza, of Advanced Cell Technology, an adviser to the bucardo project, speculated that if a pulmonary surfactant, which aids breathing, had been promptly and properly administered, the animal might have lived.

  CHAPTER 7

  -10,000 YR, NEOLITHIC

  Industrial Revolutions

  The Agricultural Revolution and Synthetic Genomics

  The BioFab Manifesto

  Industrial Revolutions

  The Neolithic era began roughly 10,000 years ago in the Middle East, at the tail end of the Stone Age. By this time in history the genus Homo had winnowed itself down to only one remaining human species, Homo sapiens—us—which had by then vanquished, assimilated, or otherwise out-survived Neanderthal man, the Denisovans, and all earlier examples of archaic humankind. The period was put on the map and immortalized by the development and use of polished stone implements—and nice-looking ones at that—as opposed to the chipped or found stone tools utilized in the earlier Paleolithic.

  But the Neolithic is noted for something far more important than stone tools—the invention of agriculture. Other than the massive set of effects wrought by the industrial revolution, the single greatest transformation in human history occurred during the Neolithic, as people turned from hunting and gathering to farming and animal husbandry.

  The agricultural and other industrial revolutions are major turning points in human history because they allowed us to make immense leaps in our understanding of and control over nature. They were revolutions in knowledge and in toolmaking, and they have clear analogs in synthetic biology, which is likewise a product of specialized knowledge and a unique set of tools.

  Human history includes at least six different “industrial revolutions.” Arguably we are now in the midst of the sixth industrial revolution, and the tools and knowledge it encompasses have given us the power to remake ourselves. Revolutions are sometimes scary, but they do not have to be. Each revolution begins with a period of tinkering by trial and error. A prehistoric “scientist” stumbles across a fire and tries adding dry leaves that start to burn, hot as the sun. Then he tries adding sand but discovers that this puts the fire out. Revolutions spread outward from the center with vague ways of communicating intentions and degrees of progress, as when our fire man tries to tell a friend that fire is hot, and that friend tells others. Eventually we develop measurements, in this case scalar indications of temperature, and models that enable prediction and design.

  Revolutions can have unanticipated positive and negative consequences—as when a fire rages out of control and perhaps incinerates its maker. Bearing this in mind, we will chart the course of the revolutions that have led to the power to control our future biological development—to understand and then manipulate the evolving genome of life itself.

  The first industrial revolution was centered on the notion of time. It began 15,000 years ago, when we had no idea of what 15,000 years meant or what a revolution was. Those of us who looked human (including Neanderthals and Denisovans) had spread far beyond Africa and were discovering a need to understand time, so that we could predict the seasons. We could get by in the transition from the season of gathering foods to the season of planting them just by waiting for the warmth of spring. Why, then, bother measuring time?

  Because it was reassuring during the winter to visualize the time remaining until spring in order to pace the use of stored food. Floods and droughts recurred, somewhat predictably, each year. Some life-altering events took place on longer time frames, like the five-year cycle of El Niño. Some catastrophes occurred less frequently and lacked a periodic component but required a collective memory to maintain preparedness. Fortunately, the crucial measurements tended to be easy and digital and could be checked against other measures. Thirty sunrises corresponded to one lunar cycle. Twelve lunar cycles made up a year. The day wasn’t digital but rather smoothly analog, and dividing it into hours with a sundial and precise seconds with a mechanical clock probably wasn’t crucial until we started serious navigation.
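
  Those round numbers also roughly check against one another. As a quick bit of arithmetic (my own, using the approximations in the text):

  $$30 \times 12 = 360 \ \text{days},$$

  within about five days of the roughly 365-day solar year, which is part of why such counts could be checked against other measures.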

  What was the killer app, or tool, for measuring time? In terms of tempo, biological systems exhibit natural cycles that are synchronized with some astrophysical cycles. Biological cycles such as times of hibernating, mating, and flowering match up with the earth’s tilted revolution around the sun. Bears wake up slowly at the end of winter, and then their prey animals get a fast and rude awakening as the season’s first bear claws penetrate into their resting places. Matching the lunar cycles most evidently are the tidal behaviors. Less obviously, some animals (e.g., primates) menstruate monthly, while other mammals generally have nonmonthly estrous cycles. Matching the rotation of our home planet, almost all life has circadian, diurnal cycles of metabolism. Probably all animals with brains have tendencies toward sleep patterns synchronized to the sun. Cave-dwelling animals lost this synchronization over the course of many millennia.

  At the scale of seconds, we notice heartbeats and wing beats. In the millisecond range, whales and bats produce ultrasonic vibrations of 100,000 cycles per second, at up to 180 decibels, to navigate and communicate. Some biological systems purposefully avoid simple patterns in order to thwart predation; for example, seventeen-year cicadas and the eighty-year blooming cycles of bamboo. The point is that the first clocks were hardwired into living things of all stripes, and then human beings started reinventing them, soft-wiring them into our culture. Initially this was in service to the gods of agriculture, but the study and engineering of time has since spread aggressively into many of our technologies. Our close relatives the great apes tend to think on very short time frames—instant gratification. The ability to tell long narratives in the form of epic poems and songs, and to draw cave paintings (as far back as 32,000 BCE), went hand in hand with a growing awareness of causality and the advantages that such awareness brings. This contributed to developing strategies for hunts and for warfare that required more coordination and timing than even the remarkable skills of wolf packs.

  Keep in mind that it doesn’t take much of an advantage for a revolutionary advance to sweep through a population. A 5 percent advantage compounded annually for twenty years is a 260 percent advantage, and over two hundred years is a 17,000-fold advantage.
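
  As a rough check on those figures (my own arithmetic, reading “advantage” as the multiple of the starting value):

  $$1.05^{20} \approx 2.65 \qquad \text{and} \qquad 1.05^{200} \approx 1.7 \times 10^{4},$$

  which correspond to the roughly 260 percent and 17,000-fold figures quoted above.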

  As with most technologies, the taming of time bore unwelcome and unanticipated consequences. Today, as we face impending deadlines, a hectic pace of life, and existential risks of all kinds, it’s tempting to think that stress was less severe in prehistoric times. But we have had many generations to adapt evolutionarily, while the revolutionary concepts of time and causality may have had a comparatively rapid onset. The unwelcome consequences of warfare and stealth and deception reverberate in our culture and inherited psyche today.

  The moral of the story is that progress comes with hidden costs, risks, and unpleasant surprises. As I chart the course of genomic technologies, I will do my best to point them out.

  The Second Industrial Revolution, 4000 BCE:

  The Agricultural Revolution and Synthetic Genomics

  Agriculture, the domestication of animals and crops, and the trade that resulted from it encouraged the concentration of people and led to cities. Probably the first domesticated crop was emmer wheat (Triticum dicoccoides), found growing wild in the ancient Near East. Two wild grasses, Triticum urartu and Aegilops speltoides, had intergenus sex. They were diploid (2X), meaning that they had one copy of each chromosome from their mother and father, but their intergeneric children were tetraploid (4X), meaning that they kept two copies each—a full set from all four grandparents. This is rare in any given generation but common over evolutionary time, and ranges from triploid (3X) watermelons and water bears to dodecaploid (12X) plumed cockscomb (Celosia argentea) and clawed frogs (Xenopus ruwenzoriensis). The tetraploid wheat hybrids were adopted by humans possibly as early as 17,000 BCE (based on carbon-14 isotopic dating), in what is now southern Turkey (based on DNA studies), and then spread as far as Egypt to feed the pharaonic dynasties. Along the Yangtze River we see another dramatic domestication process dating from 12,000 BCE: changes in the morphology of rice phytoliths. And yet another in the Balsas River valley of southwestern Mexico around 6700 BCE, when an annual grass, Zea mays, began its long transformation into modern corn. Domestication of thousands of additional species of plants and animals followed.

  Possibly predating agriculture were dense stores of animal, mineral, or vegetable resources that hunter-gatherer tribes concentrated and then began to trade and protect. Advantages of concentrating people and their material goods included the scaling of construction—for example, the number of people within a walled enclosure goes up with the square of the amount of material needed to build the wall. As the density of such wealth grew, so did the civil engineering of buildings, walls, boats, and bridges. This required the invention of measurements of length, weight, and cost. The consequences of poor measurements could be fatal. The misalignment of a wall could result in the collapse of a building when tested by a storm or invaders. Ancient architects are said to have been required to stand under their arches when the arches were first load-tested. More recently, in 1999, the $328 million Mars Climate Orbiter mission failed due to the use of incorrect units of force (pound-force vs. newton).
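
  To make the wall example concrete (a back-of-the-envelope sketch of my own, not a figure from the text): for a square enclosure built from a wall of total length P, the enclosed area is

  $$A = \left(\frac{P}{4}\right)^{2} = \frac{P^{2}}{16},$$

  so doubling the amount of wall material quadruples the area, and hence the number of people, that the same wall can enclose.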

  The unwelcome consequences of the second industrial revolution and the resulting abundance included diseases of crowding. Cholera is caused by the gut bacterium Vibrio cholerae, found in contaminated drinking water; the chance of such contamination rises sharply with the density of people and is fairly rare in other animals. Similarly, Yersinia pestis, the causative agent of the black plague, depends on high concentrations of grain, which bring rats, which in turn bring the fleas that harbor the plague organism. This grain-rat-flea chain is most famously associated with the Black Death, the wave of plague that spread from China to Europe between 1330 and 1360 CE. The first documented instance of a plague epidemic occurred in 1200 BCE, around the time that the Philistines stole the Ark of the Covenant from the Israelites and then returned it (possibly as a means of escaping a curse). A recurrence of the plague in 540 CE, arising in Ethiopia or central Asia and spreading via Egypt by the ships and caravans of the Emperor Justinian’s realm, killed as many as 100 million people. In the Middle Ages another 75 million died (roughly 50 percent of the population of many European villages). Another 10 million died in Asia in 1885 CE. Gabriele de’ Mussi’s contemporary account of the 1346 siege of the Crimean city of Caffa (now Feodosia, Ukraine) describes Mongol soldiers catapulting plague-ridden corpses of their own dead into the double-walled city, one of the first documented instances of biological warfare.

  Malaria arose from expanses of stagnant water in rice paddies and other irrigated crops. Celiac disease (a failure to digest food caused by a hypersensitivity to gluten in the small intestine) arose when wheat became plentiful in our diet before our genome had a chance to adapt—or, more accurately stated, before those adaptations had spread to all wheat eaters. The convenience of monoculture crops brought with it monoculture pests, like locusts. Plowing removed meter-thick roots that fought erosion. The use of plants lacking nitrogen-fixing bacterial ecosystems resulted in soil depletion and the need to fertilize. Fertilization, in turn, resulted in runoff into ponds with consequent blooms of microbes, which consume so much oxygen that fish can’t survive. So they go belly up.

  The switch to agriculture had several further consequences. Whereas hunter-gatherers existed in small, mobile, roving bands, early farmers lived near their fields in order to protect them from predators and plunderers, as well as to harvest and process crops. Harvesting, in turn, required the development and use of new tools and implements such as plows, sickles, and milling and grinding stones. Houses, community centers, and then villages, towns, and cities arose near these fertile areas. These city inhabitants led more sedentary lives than their hunting and gathering forebears. Social life became more complex, structured, and hierarchical than ever before.

  Farmers often grew more food than they could use, which led them to develop storage vessels, bins, and storehouses. More important, the accumulations of foodstuffs prompted the early farmers to trade with other people, which helped create a working economy and led to new concentrations of wealth.

  Further, whereas hunter-gatherers tended to exhaust the resources of a given region and then move on to the next, only to despoil it in turn, the early farmers actually improved and increased the yield of a given piece of land through cultivation and irrigation. Instead of merely letting wild plants resow themselves wherever their seeds happened to fall, the farmers preferentially sowed seeds of plant types that were hardier, bore more fruit, or were better looking, tastier, or otherwise viewed as more desirable than lesser species. This was a form of artificial selection: favoring one sort of plant over another and increasing the numbers of the favored plant at the expense of those considered less attractive.

  The domestication of plants and animals that occurred during the Neolithic era has clear parallels to synthetic biology—the attempt to domesticate microbial, plant, and animal genomes, including those of humans. Synthetic biology has progressed in three distinct phases. The first was the era of genetic engineering or basic biotechnology. Starting in the 1970s, this was the time in which researchers “domesticated” microorganisms by modifying their genomes manually. They first got E. coli to produce insulin, erythropoietin, monoclonal antibodies, and other such substances. The tools they used to modify genomes, while seemingly advanced for their time, are nowadays viewed as more or less Stone Age devices.

  The second phase of synthetic biology is a period of growth and elaboration, with commercial synthetic genomics extended to a wider set of goals such as the discovery and production of new drugs, biofuels, and genetically modified foods. It’s also characterized by the use of more sophisticated tools in the form of automated techniques and machines, and the development of novel methodologies such as the use of induced pluripotent stem cells to create a range of narrowly targeted pharmaceuticals. Doing all this successfully on a mass, industrial scale further required the invention of implements that are comparable in their way to those developed in the mechanization and industrialization of agriculture by means of tractors, harvesters, threshers, combines, automatic cow-milking machines, and the like. In both instances, it was the age of commercial mass production of the respective commodities.

  A third phase of synthetic genomic enterprises is now in the making. These commercial enterprises will try to make a living out of synthesizing entire new genomes. At first glance, this may seem like an unprecedented and entirely novel development. But in actual fact, this advance, like the others we have considered, only recapitulates what nature had already done. Nature, after all, was the pioneer genome synthesizer. Nevertheless, if and when we can duplicate what nature has done and create a new genome with never before seen functionality from scratch, then we might finally be in a position to claim that we really know and understand how life works, from the molecules up.

  The immense impact that agriculture had on social complexity is paralleled by the impact of molecular technologies on the life science industry. In the Neolithic, the simplicity of spears and fire gave way to oxen pulling wheeled carts on roads, to food and seed storage, irrigation, and so on. In like manner, the simple elegance of the dawn of molecular biology can be seen in our ability to sketch out the essentials of the genetic code merely by looking at phage plaques on a petri plate and then binding oligoribonucleotides to ribosomes. That has changed beyond recognition.

  Biotech research has since followed an amusing downhill path of “progress.” In the early days, organismal and molecular biologists would compete with each other to enact hands-on feats of bravado, toughness, and self-reliance. They would hunt and gather in extreme, hostile locales (from snake-ridden jungles to biosafety fume hoods full of radioactive mutagens); they would stop whirring centrifuge rotors with their bare hands, make their own enzymes, and work with high levels of radioisotopes straight out of a nuclear reactor. Toiling for a couple of weeks to make a single enzyme, however, people soon realized that it was just as easy to make enough to last them and all their friends a year as to make a batch that would last one person for a week. In the mid-1970s this realization prompted small companies, such as New England Biolabs and New England Nuclear, to make enzymes used for recombinant DNA and for labeling reagents. That development was similar to an earlier trend for making and supplying long lists of chemicals to researchers, for example, by Sigma and Aldrich (which merged in 1975).

  So the next generation of biology students got lazier and less in touch with the basics underlying the synthesis of the enzymes, chemicals, and reagents; instead of making them, they purchased them (while the old-timers moaned). The next step in the devolution was the idea of kits. Researchers found that they had large collections of expensive stuff that failed to perform as expected, perhaps because of generally bad protocols or a lack of training. The solution that appealed to companies was to sell kits: a set of enzymes and chemicals (often including an exotic ingredient known as water) that were individually necessary and jointly sufficient for success at a common lab task, all of it attractively packaged and presented.

  Big hit! Researchers suddenly became more productive, which meant more sales . . . and eventually repetitive stress syndrome.

  A further step in the evolutionary degeneration of lab researchers was “instruments.” The kits could lead researchers by the hand, as it were, but human errors were still possible: why read that fat manual anyway? The solution this time was to take the human out of the loop. Translate the manual into robot code. And while we’re at it, have the robot use ninety-six pipettes instead of just one, so that ever more enzymes, chemicals, and reagents are needed. Lewis Carroll’s Red Queen comes to mind: “Now, here, you see, it takes all the running you can do, to keep in the same place. If you want to get somewhere else, you must run at least twice as fast as that!”

 
