There is nothing sophisticated about opening a gold capsule; you simply snip open one end and pour out the contents. First I washed the outside of the capsule in organic solvents to avoid any contamination by machine oil or fingerprints. Then I froze it in liquid nitrogen, so that the contents would not leak out during the snip. I positioned the capsule over a glass vial to catch the gold and its contents. Just a little snip … usually does the trick…. Kapow! The gold weld blasted off into some remote corner of the lab, propelled by the sudden release of what must have been several atmospheres of internal gas pressure. A bit shaken, I dropped the capsule into the bottom of the vial, where it lay dormant for a few seconds. But then it began to hiss and foam as a yellow-brown oily substance frothed out, coating the gold and the glass. A pungent odor not unlike Jack Daniels permeated the lab.
The pyruvate had clearly reacted, but it did not look anything like colorless, odorless oxaloacetate. What had we made?
Time to consult George Cody [Plate 1], a recent arrival at the lab and an organic geochemist trained to analyze messy, oily stuff. George is enthusiastic, loquacious, and—luckily for us—he can't seem to say no. He is also an expert in the chemistry of coal; he tends to snow visitors to his office with a blizzard of arcane chemical names and reactions. He thinks out loud and scribbles diagrams of molecules and reactions on any available surface, including the protective windows of his lab's chemical hoods. When I showed him the smelly goo, he knew just what to do.
“GCMS,” he said, “We probably don't need CI.” I nodded as if I understood what he was talking about. “Let's use BF3 propanol as the derivatizing agent. The Supelco column should work fine.”
He had proposed that we analyze our suite of products by passing them, together with a chemically inert gas, through a long, thin tube filled with specially prepared organic molecules. This technique, gas chromatography (the “GC” of GCMS), separates different molecules according to how fast they move through the column. In general, smaller, less reactive molecules move faster than bigger, “sticky” molecules. The gas chromatograph sorts a collection of different molecules into separate little pulses, typically over a period of 30 or 40 minutes.
Then comes the mass spectrometer (the “MS” of GCMS), which measures the relative masses of molecules and their fragments. George's mass spectrometer blasts molecules into lots of smaller pieces of distinctive weights, so each pulse from the GC can be analyzed separately as a suite of characteristic mass fragments, providing a kind of fingerprint of the product molecule.
It took a couple of hours of chemical processing to prepare the concentrated liquid sample for analysis. George filled a syringe with the pale yellow liquid and injected a tiny drop into the GCMS with a practiced, swift motion. We sat back to watch as a spectrum gradually appeared on the computer monitor. The first peak showed up at 10.79 minutes—a small molecule with probably only two or three carbon atoms. Then another peak at 11.71 minutes, and another at 11.96. Faster and faster peaks appeared, piling in on top of each other, every spike representing additional molecular products. By the 20-minute mark, a broad hump decorated with hundreds of sharp spikes was emerging.
“Humpane,” George muttered in disgust. Pyruvate had reacted in our capsules, to be sure. But instead of the simple 3 + 1 = 4 reaction that Morowitz had proposed, we had produced an explosion of molecules—tens of thousands of different kinds of molecules. Not a trace of oxaloacetate was to be found, but a bewildering array of other molecular species had emerged. It might take a lifetime to decipher the contents of just one such molecular suite.
One conclusion was obvious. Some very dynamic organic reactions proceed rapidly at hydrothermal conditions. In one sense, Morowitz's hypothesis had failed: Pyruvate doesn't react with carbon dioxide to form oxaloacetate under those conditions. But we had caught our first glimpse of a robust, emergent carbon chemistry in a hydrothermal environment. This was chemistry worth exploring.
Where to begin? We were faced with choosing from among thousands of simple carbon-based molecules over a wide range of pressure, temperature, and other experimental variables—work to devour a hundred scientific lifetimes.
What were we getting ourselves into?
[Figure: A diverse suite of molecules emerges when pyruvate is subjected to high temperature and pressure. These products appear as numerous sharp peaks superimposed on a broad “humpane” feature on a gas chromatogram.]
Part I
Emergence and the Origin of Life
All origin-of-life researchers face the baffling question of how the biochemical complexity of modern living cells emerged from a barren, primordial geochemical world. The only feasible approach is to reduce biological complexity to a comprehensible sequence of chemistry experiments that can be tackled in the human dimensions of space and time—a lab bench in a few weeks or months. George Cody, Hat Yoder, and I were eager to continue our hydrothermal experiments, but what should come next? We knew that the simplest living cell is intricate beyond imagining, because every cell relies on the interplay of millions of molecules engaged in hundreds of interdependent chemical reactions. Human brains seem ill suited to grasp such multi-dimensional complexity.
Scientists have devised countless sophisticated chemical protocols, and laboratories are overflowing with fancy analytical apparatus. Chemists have learned to synthesize an astonishing array of paints, glues, cosmetics, drugs, and a host of other useful products. Yet when confronted with the question of life's ancient origin, it's easy to become mired in the scientific equivalent of writer's block. How does one begin to tackle the chemical complexity of life?
One approach to understanding life's origin lies in reducing the living cell to its simpler chemical components, the small carbon-based molecules and the structures they form. We can begin by studying relatively simple systems and then work our way up to systems of greater complexity. In such an endeavor, the fascinating new science of emergence points to a promising research strategy.
1
The Missing Law
It is unlikely that a topic as complicated as emergence will submit meekly to a concise definition, and I have no such definition.
John Holland, Emergence: From Chaos to Order, 1998
Hot coffee cools. Clean clothes get dirty. Colors fade. People age and die. No one can escape the laws of thermodynamics.
Two great laws, both codified in the nineteenth century by a small army of scientists and engineers, describe the behavior of energy. The first law of thermodynamics establishes the conservation of energy. Energy, which is a measure of a system's ability to do work, comes in many different forms: heat, light, kinetic energy, gravitational potential, and so forth. Energy can change from any one form to another over and over again, but the total amount of energy does not change. That's the first law's good news.
The bad news is that nature places severe limitations on how we can use energy. The second law of thermodynamics states that heat energy, for example, always flows from warmer to cooler regions, never the other way, so the concentrated heat of a campfire or your car's engine gradually radiates away. That dissipated heat energy still exists, but you can't use it to do anything useful. By the same token, all natural systems tend spontaneously to become messier—they increase in disorder, or “entropy.” So any collection of atoms—be it your shiny new shoes or your supple young body—gradually deteriorates. The second law of thermodynamics is more than a little depressing.
But look around you. You'll find buildings, books, automobiles, bees—all of them exquisitely ordered systems. Despite the second law's dictum that entropy increases, disorder is not the only end point in the universe. Observations of such everyday phenomena as sand dunes, seashells, and slime mold reveal that the two laws of thermodynamics may not tell the entire story. Indeed, some scientists go so far as to claim that a fundamental law of nature, the law describing the emergence of complex ordered systems (including every living cell), is missing from our textbooks.
THE LAWS OF NATURE
The discovery of a dozen or so natural laws represents the crowning scientific achievement of the past four centuries. Newton's laws of motion, the law of gravity, the laws of thermodynamics, and Maxwell's equations for electromagnetism collectively quantify the behavior of matter, energy, forces, and motions in almost every human experience. The power of these laws lies in their universality. Each law can be expressed as an equation that applies to an infinite number of events, from the interactions of atoms to the formation of galaxies. Armed with these laws, scientists and engineers confidently analyze almost any physical system, from steam engines to stars.
So sweeping and inclusive are these natural laws that some scholars of the late nineteenth century suggested that the entire theoretical framework of science had been deduced. All that remained to be discovered were relatively minor details, like filling in the few remaining gaps in a stamp collection. Though this turned out not to be the case—modern physics research has revealed new phenomena at the extreme scales of the very small, the very fast, and the very massive—the classic laws do indeed still hold sway in our everyday lives.
Yet in spite of centuries of labor by many thousands of scientists, we do not fully understand one of nature's most transforming phenomena—the emergence of complexity. Systems as a whole do tend to become more disordered with time, but at the local scale of a cell, an ant colony, or your conscious brain, remarkable complexity emerges. In the 1970s, the Russian-born chemist Ilya Prigogine recognized that these so-called complex emergent systems arise when energy flows through a collection of many interacting particles. The arms of spiral galaxies, the rings of Saturn, hurricanes, rainbows, sand dunes, life, consciousness, cities, and symphonies all are ordered structures that emerge when many interacting particles, or “agents”—be they molecules, stars, cells, or people—are subjected to a flow of energy. In the jargon of thermodynamics, the formation of patterns in these systems helps to speed up the dissipation of energy as mandated by the second law. Scientists and nonscientists alike tend to value the surprising order and novelty of such emergent systems.
The recognition and description of such emergent systems provides a foundation for origin-of-life research, for life is the quintessential emergent phenomenon. From lifeless molecules emerged the first living cell. If we can understand the principles governing such systems, we may be able to apply those insights to our experimental programs.
DESCRIBING EMERGENT SYSTEMS
If you want to enunciate a law that characterizes emergent systems, then the first step is to examine everyday examples. You can observe emergent behavior in countless systems all around us, including the interactions of atoms, or of automobiles, or of ants. This universal tendency for systems to display increased order when lots of objects interact, while fully consistent with the first and second laws of thermodynamics, is not addressed explicitly in either of those laws. We have yet to discover if all emergent systems possess a unifying mathematical behavior, though our present ignorance should not seem too unsettling. It took more than a half-century for each of the first two laws of thermodynamics—describing the behavior of energy and entropy, respectively—to develop from qualitative ideas into quantitative laws. I suspect that a mathematical formulation of emergence will be discovered much sooner than that, perhaps within the next decade or two.
Scientists have already identified key aspects of the problem. Many familiar natural systems lie close to equilibrium—that is, they are stable and unchanging—and thus they do not display emergent behavior. Water gradually cooled to below the freezing point equilibrates to become a clear chunk of ice. Water gradually heated above the boiling point similarly equilibrates by converting to steam. For centuries, scientists have documented such equilibrium processes in countless carefully controlled scientific studies.
Away from equilibrium, dramatically different behavior occurs. Rapidly boiling water, for example, displays complex, turbulent convection. Water flowing downhill in the gravitational gradient of a river valley interacts with sediments to produce the emergent landform patterns of braided streams, meandering rivers, sandbars, and deltas. These patterns arise as energetic water moves.
Emergent systems seem to share this common characteristic: They arise away from equilibrium when energy flows through a collection of many interacting particles. Such systems of agents tend spontaneously to become more ordered and to display new, often surprising behaviors. And as patterns arise, energy is dissipated more efficiently, in accord with the second law of thermodynamics. Ultimately, the resulting behavior appears to be much more than the sum of the parts.
Emergent patterns in water and sand may seem a far cry from living organisms, but for scientists studying life's origins there's a big payoff in understanding such simple systems: Of all known emergent phenomena, none is more dramatic than life, so studies of simpler emergence can provide a conceptual basis, a jumping-off point, for origin-of-life research.
QUANTIFYING THE COMPLEXITY OF EMERGENT SYSTEMS
Even though emergent systems surround us, a rigorous definition (much less a precise mathematical formulation) remains elusive. If we are to discover a natural law that describes the behavior of emergent systems, then we must first identify the essential properties of such systems. But what characteristics distinguish emergent systems from other less interesting collections of interacting objects?
All emergent systems display the rather subjective characteristic of “complexity”—a property that thus far lacks a precise quantitative definition. In a colloquial sense, a complex system has an intricate or patterned structure, as in a complex piece of machinery or a Bach fugue. “Complexity” may also refer to information content: An advanced textbook contains more detailed information, and is thus more complex, than an elementary one. In this sense, the interactions of ants in an ant colony or neurons in the human brain are vastly more complex than the behavior of a pile of sand or a box of Cheerios.
Such complexity is the hallmark of every emergent system. What scientists hope to find, therefore, is an equation that relates the measurable properties of a system (its temperature or pressure, for example, expressed in numbers) on the one hand to the resultant complexity of the system (also expressed as a number) on the other. Such an equation would in fact be the missing “law of emergence.” But before that is possible we need an unambiguous, quantitative definition of the complexity of a physical system. How to proceed?
A small band of scientists, many of them associated with the Santa Fe Institute in New Mexico, have thought long and hard about complex systems and ways to model them mathematically. But their efforts yield surprisingly diverse (some would say divergent) views on how to approach the subject.
John Holland, an ace at computer algorithms and a revered founder of the field of emergence, models emergent systems as computer programs with a fixed set of operating instructions. He suspects that any emergent phenomenon, including sand ripples, ant colonies, the conscious brain, and more, can be reduced to a set of selection rules. Holland and his followers have made great strides in mimicking natural phenomena with a few lines of computer code. Indeed, for Holland and his followers the complexity of a system is closely related to the minimum number of lines of computer code required to mimic that system's behavior.
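Holland's “shortest program” notion echoes what mathematicians call algorithmic complexity: the length of the most compact description of a system's behavior. That quantity cannot be computed exactly, but compressed size is a standard, computable stand-in, since an ordered signal admits a short description while a disordered one does not. The sketch below is my own rough illustration of the idea, not Holland's method:

```python
import os
import zlib

def compressed_size(data: bytes) -> int:
    """Rough proxy for descriptive complexity: approximate the length
    of the shortest description by the compressed length of the data."""
    return len(zlib.compress(data, 9))

# A highly ordered signal: a simple repeating pattern, 1000 bytes long,
# fully describable by a tiny rule ("repeat 'ab' 500 times").
ordered = b"ab" * 500

# A disordered signal: 1000 random bytes with no exploitable pattern.
disordered = os.urandom(1000)

print(compressed_size(ordered))     # tiny: the pattern compresses away
print(compressed_size(disordered))  # near 1000: randomness resists compression
```

By this measure a sand pile and a box of Cheerios score low, while an intricate but patterned system scores somewhere between pure order and pure noise.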
A delightful example of this approach is BOIDS, a simple program written by California programmer Craig Reynolds that duplicates the movements of flocking birds, schooling fish, swarming insects, and other collective animal behaviors with astonishing accuracy. (To check it out on the Internet, just Google “BOIDS.”) Lest you think that this effort is idle play, remember that computer programmers of video games and Hollywood special effects have made a bundle on this type of simulated emergent behavior. Think of BOIDS the next time you watch dinosaur herds on the run in Jurassic Park, swarming locusts in The Mummy, or schools of fish in Finding Nemo.
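Reynolds's boids follow three local steering rules: separation, alignment, and cohesion. The sketch below is a loose Python illustration of the idea, not Reynolds's actual program; for brevity it applies only cohesion (steer toward the flock's center of mass) and alignment (match the average heading), which is enough to make scattered agents gather and travel together.

```python
import random

def step(flock, cohesion=0.01, alignment=0.1):
    """Advance the flock one tick. Each boid is [x, y, vx, vy].
    Cohesion nudges each boid toward the flock's center of mass;
    alignment nudges its velocity toward the flock's average velocity."""
    n = len(flock)
    cx = sum(b[0] for b in flock) / n   # center of mass, x
    cy = sum(b[1] for b in flock) / n   # center of mass, y
    avx = sum(b[2] for b in flock) / n  # average velocity, x
    avy = sum(b[3] for b in flock) / n  # average velocity, y
    for b in flock:
        b[2] += (cx - b[0]) * cohesion + (avx - b[2]) * alignment
        b[3] += (cy - b[1]) * cohesion + (avy - b[3]) * alignment
        b[0] += b[2]
        b[1] += b[3]

def spread(flock):
    """Total extent of the flock: width plus height of its bounding box."""
    xs = [b[0] for b in flock]
    ys = [b[1] for b in flock]
    return (max(xs) - min(xs)) + (max(ys) - min(ys))

random.seed(1)
flock = [[random.uniform(-50, 50), random.uniform(-50, 50),
          random.uniform(-1, 1), random.uniform(-1, 1)] for _ in range(10)]

before = spread(flock)
for _ in range(100):
    step(flock)
after = spread(flock)   # far smaller: the scattered boids have flocked
```

No boid knows anything about the flock as a whole; the collective motion emerges entirely from each agent's local adjustments.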
Physicist Stephen Wolfram, a mathematical prodigy who made millions in his twenties from the elegant, indispensable computer package Mathematica, provides a complementary vision of emergent complexity from simple rules. Like Holland, Wolfram was captivated by the power of simple instructions to generate complex visual patterns. Sensing a new paradigm for the description and analysis of the natural world, he has spent the past 20 years developing what he calls “a new kind of science” (NKS for short). A mammoth tome by that title published in 2002 and an elaborate Web site (www.wolframscience.com) illustrate some of the stunning ways whereby geometric complexity may arise from simple rules. Perhaps, Wolfram argues, the complex evolution of the physical universe and all it contains can be modeled as a set of sequential instructions.
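The workhorse of Wolfram's studies is the elementary cellular automaton: a row of cells, each on or off, where every cell's next state depends only on itself and its two neighbors according to an eight-entry lookup table numbered 0 to 255. The minimal Python sketch below (my illustration, not code from Wolfram's book) runs the much-studied Rule 30, whose pattern grows irregular and seemingly random from a single live cell:

```python
def step(row, rule=30):
    """One generation of an elementary cellular automaton. Each cell's
    next state is the bit of `rule` indexed by its (left, center, right)
    neighborhood read as a 3-bit number. Cells beyond the edges are 0."""
    padded = [0] + row + [0]
    return [(rule >> (padded[i - 1] * 4 + padded[i] * 2 + padded[i + 1])) & 1
            for i in range(1, len(padded) - 1)]

# Start from a single live cell and watch structure unfold.
row = [0] * 15 + [1] + [0] * 15
for _ in range(8):
    print("".join(".#"[c] for c in row))
    row = step(row)
```

Eight bits of rule, one live cell to start, and a pattern emerges that never settles into simple repetition: Wolfram's case for complexity from simple rules in miniature.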
Many other ways to view complex systems have been proposed. The late Danish physicist Per Bak described complex systems in terms of a mathematical characteristic called “self-organized criticality.” These systems evolve by repeatedly achieving a critical point at which they falter and regroup, like a growing pile of sand that avalanches over and over again as new grains are added. Santa Fe theorist Stuart Kauffman proposes another tack, focusing on the emergence of chemical complexity via competitive “autocatalytic networks,” by which collections of chemical compounds catalyze their own formation. And Nobel laureate Murray Gell-Mann, who also works at the Santa Fe Institute, has recently introduced a new parameter he calls “nonextensive entropy”—a measure of the intrinsic complexity of a system—as a path to understanding complex systems.
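Bak's sandpile picture (the Bak-Tang-Wiesenfeld model) is simple enough to simulate directly: grains drop one at a time onto a grid, and any cell holding four or more grains topples, passing one grain to each of its four neighbors, which can trigger chain-reaction avalanches. A minimal sketch, assuming a small grid with grains lost over the edges:

```python
def topple(grid):
    """Relax a sandpile until stable: any cell holding 4 or more grains
    topples, sending one grain to each of its four neighbors. Grains
    pushed past the edge fall off the table and are lost."""
    n = len(grid)
    unstable = True
    while unstable:
        unstable = False
        for i in range(n):
            for j in range(n):
                if grid[i][j] >= 4:
                    unstable = True
                    grid[i][j] -= 4
                    for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        if 0 <= i + di < n and 0 <= j + dj < n:
                            grid[i + di][j + dj] += 1

grid = [[0] * 5 for _ in range(5)]
for _ in range(40):       # drop grains one at a time at the center
    grid[2][2] += 1
    topple(grid)          # each drop may trigger an avalanche
```

The pile keeps organizing itself back to the brink of instability; in larger simulations the avalanche sizes follow the power-law statistics that Bak took as the signature of self-organized criticality.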
Genesis: The Scientific Quest for Life's Origin