
Life's Ratchet: How Molecular Machines Extract Order from Chaos


by Hoffmann, Peter M.


  3

  The Entropy of a Late-Night Robber

  It has been [our] principal indeavour to enlarge and strengthen the Senses . . . by . . . outward Instruments. . . . By this means [we] find . . . that those effects of Bodies, which have been commonly attributed to Qualities, and those confess’d to be occult, are perform’d by the small Machines of Nature, which are not to be discern’d without these helps.

  —ROBERT HOOKE, MICROGRAPHIA

  So nat’ralists observe, a flea

  Hath smaller fleas that on him prey,

  And these have smaller fleas that bite ’em,

  And so proceed ad infinitum.

  —JONATHAN SWIFT

  WHEN ROBERT HOOKE PEERED THROUGH HIS PRIMITIVE microscope, he found a new world of tiny “Machines of Nature,” from dimples on poppy seeds to the sting of a bee. The “machines” he saw through his microscope in the late 1600s were just the beginning: As microscopes improved, all of Hooke’s machines were found to be made of cells, which themselves were entire factories of even smaller machines, each made of smaller parts yet—all the way down to atoms and molecules. It became clear that living things, while immensely complex, were made of the same stuff as the rest of nature.

  How do atoms and molecules assemble into a flower or a human? Where do we cross the threshold from lifeless atoms and molecules to living organisms? What makes an object alive? These hard questions puzzled scientists and philosophers for millennia. Yet, we may be the first generation to glimpse answers to these questions. To understand these answers, we must begin with the basic building blocks of nature: atoms and molecules. Atoms are tiny clumps of matter, so tiny that it takes 300,000 carbon atoms to span the width of a single human hair. A humble E. coli bacterium is only one-quadrillionth (10^-15) the mass of a human, and yet it contains 100,000 billion atoms. Molecules are assemblies of atoms bound together by strong electrical bonds. Molecules can contain as few as two atoms and as many as tens of thousands.
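These scales can be checked with simple arithmetic. The short sketch below uses a hair width of 75 micrometers and a human mass of 70 kilograms; both are assumed round figures, not numbers from the text:

```python
# Back-of-the-envelope check of the scales quoted in the text.
# The hair width (75 micrometers) and the human mass (70 kg) are
# assumed typical values, not figures from the book.
hair_width_m = 75e-6      # width of a human hair (assumption)
atoms_across = 300_000    # carbon atoms spanning it (from the text)

atom_diameter_m = hair_width_m / atoms_across
print(f"carbon atom diameter ~ {atom_diameter_m * 1e9:.2f} nm")  # ~0.25 nm

# One-quadrillionth (10^-15) of a human's mass:
human_mass_kg = 70.0
ecoli_mass_kg = human_mass_kg * 1e-15
print(f"E. coli mass ~ {ecoli_mass_kg * 1e15:.0f} picograms")
```

The quarter-nanometer answer is in fact about the diameter of a carbon atom, which suggests the text's figure is self-consistent.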

  Atoms and molecules are restless. Democritus, Epicurus, and their fellow atomists already understood this important point. In air, molecules of nitrogen, oxygen, carbon dioxide, and water vapor randomly swirl around, colliding at high speeds. Without noticing, we are continuously bombarded by supersonic gas molecules from the surrounding air. The calmness we see around us is an illusion. We are surrounded—no, immersed—in chaos. Yet from such chaos order can arise: On cold winter days, randomly swirling water molecules, high in the clouds, find each other and create beautiful, symmetric snowflakes. The world we see around us—the macroscopic world—is one of order and regularity. A book on a table does not jump suddenly; nor does it spontaneously burst into flames. Yet, seen at a very small scale, a book is a mass of atoms that rattle and shake, collide, and send each other hurtling off into space. How can visible order and life’s complexity arise from such chaos?

  In the late 1800s, this question occupied physicists such as Ludwig Boltzmann in Austria, James Clerk Maxwell in Scotland, and Josiah Willard Gibbs in the United States. For them, the relatively simple example of a gas provided the perfect starting point. A macroscopic volume of gas follows simple laws that relate pressure, volume, and temperature, but how did these laws arise? To find the answers, these scientists turned to the new science of statistics, and invented statistical mechanics. This discipline applies statistics to the mechanics of atoms and molecules. In their thinking, if statistics can describe the height of a thousand men or the marriage age of a thousand women, it sure should be able to describe the behavior of a billion billion atoms. In everyday life, we use statistics to calculate such figures as average income, IQ (which is defined as a standard deviation from average intelligence), and income distributions. Similarly, physicists discovered how to calculate averages, deviations, and distributions of the speeds and energies of atoms. Although atoms move randomly, their collisions conform to physical rules. Maxwell and Boltzmann showed that the distribution of speeds in a gas was just a normal distribution—originally derived to generalize Pascal’s gambling formula. Applying statistics to the chaos of atoms and molecules, they found that averaged over time and space, the randomness of atomic motion gives way to order and regularity.
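The idea that averaging tames randomness can be imitated in a few lines of Python. In the sketch below each atom's velocity is drawn from a normal distribution, as Maxwell and Boltzmann found for real gases; the 300 m/s width is an assumed, room-temperature-like scale, not a value from the text:

```python
import random

random.seed(0)  # fixed seed so the sketch is reproducible

# A toy one-dimensional "gas": each atom gets a velocity drawn from
# a normal distribution with a width of 300 m/s (an assumed scale).
N = 100_000
vx = [random.gauss(0.0, 300.0) for _ in range(N)]

mean_vx = sum(vx) / N                 # average velocity
mean_sq = sum(v * v for v in vx) / N  # average squared speed

# Individually the atoms are wildly random; averaged over many of
# them, regular values emerge: a near-zero mean velocity and a
# stable mean-square speed (which is what sets the temperature).
print(f"average velocity : {mean_vx:8.2f} m/s (close to 0)")
print(f"mean-square speed: {mean_sq:8.0f} (m/s)^2 (close to 300**2 = 90000)")
```

Run it with any seed: the individual speeds change completely, but the averages barely move. That stability of averages is the order that statistical mechanics extracts from atomic chaos.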

  Life is based on molecules. These molecules are subject to the underlying chaos of the molecular storm—which at first glance seems to be a destructive force. How can life survive and possibly benefit from this chaos? This was Schrödinger’s famous question. Schrödinger saw a contradiction between the chaos of atoms and the structure of life. But today we know that the chaotic motions of atoms and molecules—controlled by life’s intricate structure—give rise to life’s activity. There is no contradiction. Life emerges from the random motions of atoms, and statistical mechanics can capture the essence of this emergence.

  Money on the Rocks

  When I was a graduate student in Baltimore in the early 1990s, I had the unfortunate experience of being robbed at gunpoint. All I had on me was ten dollars, so it was not a big loss, but it was an upsetting experience nevertheless. Money is what makes the world go round, but it also makes people do unpleasant things. In physics and biology, we have a different currency that makes things happen—we call it energy. Money and energy have a lot in common. In a transaction, where one party gives money to another party, the total amount of money is conserved: The robber gained ten dollars, and I lost ten dollars. Is it possible, in the same transaction, for me to lose eight dollars and the robber to gain ten dollars? No. Money does not appear out of nowhere.

  The same is true of energy. As Helmholtz had shown, energy conservation is the strictest law of nature. In “energy transactions,” the energy before equals the energy after the transaction. Energy can be transferred from one object to another or converted to a different form, but energy is never gained or lost.

  Imagine you are standing on the moon, in the absence of air friction, and you pick up a rock. As the rock rests in your outstretched hand, it has gravitational energy (stored in the attraction between the rock and the moon). When you drop the rock, it accelerates as it falls. Motion is associated with a form of energy called kinetic energy. Where does the energy for the rock’s motion originate? It comes from the gravitational potential energy that was stored in the attraction between the rock and the moon when you lifted the rock off the ground. The falling rock “pays” for the kinetic energy (motion) by using up gravitational energy. Throughout the rock’s fall, the total energy of the rock (gravitational plus kinetic) is always constant. Energy is conserved.
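This bookkeeping can be written out explicitly. The sketch below assumes a 2 kg rock dropped from 1.5 m under lunar gravity (1.62 m/s^2); the mass and height are illustrative choices, not figures from the text:

```python
g = 1.62    # lunar surface gravity, m/s^2
m = 2.0     # rock mass, kg (assumption)
h0 = 1.5    # height of the outstretched hand, m (assumption)

def energies(t):
    """Gravitational and kinetic energy t seconds into the fall."""
    v = g * t                  # speed gained so far
    h = h0 - 0.5 * g * t * t   # height remaining
    return m * g * h, 0.5 * m * v * v

# Gravitational energy is traded for kinetic energy, but their sum
# never changes during the fall: energy is conserved.
for t in (0.0, 0.5, 1.0):
    pe, ke = energies(t)
    print(f"t = {t:.1f} s  gravitational = {pe:.2f} J  kinetic = {ke:.2f} J  total = {pe + ke:.2f} J")
```

At every instant the total stays at m * g * h0, the energy you invested by lifting the rock in the first place.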

  Suddenly, your rock hits the ground in a cloud of moon dust and stops moving. Having already traded its gravitational energy for kinetic energy during the fall, it now loses its kinetic energy as well. If energy conservation is true, where does the energy go on impact?

  Throughout the eighteenth and nineteenth centuries, physicists studied what happened when moving objects were slowed down by impact or friction. The law of energy conservation was new, and situations like our falling rock presented a conundrum. This changed when Sir Benjamin Thompson, Count Rumford, studied the heat generated while boring a cannon from a cylinder of metal. Finding that motion (kinetic energy) could be continuously turned into heat, he concluded that heat had to be a form of energy. Before this realization, heat was thought to be a fluid (called caloric) that flowed from hot to cold objects, a fluid that eventually ran out. But Rumford refuted this idea: “It is hardly necessary to add that anything which any insulated body can continue to furnish without limitation cannot possibly be a material substance; and it appears to me to be extremely difficult . . . to form any distinct idea of anything capable of being excited and communicated, in the manner the heat was being excited and communicated in these experiments, except it be motion.” With the knowledge that heat could be created from motion, or kinetic energy, scientists wondered, “What kind of energy is heat?”

  Matter is made of atoms, which are in perpetual motion. How do we know this? For a gas sealed in a container, an increase of temperature is always associated with an increase in pressure. Early work by Maxwell, Boltzmann, and others in the kinetic theory of gases explained this pressure increase by relating both temperature and pressure to the motion of atoms. In this view, pressure was the result of innumerable impacts by atoms with the walls of the container. The faster the atoms moved, the harder they hit the walls, and therefore, the greater the pressure. The atoms could be made to move faster if heat was added and temperature was increased. Temperature seemed to be related to the kinetic energy of the atoms in the gas. Kinetic theory neatly explained the macroscopic laws governing gases, but it presumed the existence of hypothetical, continually moving atoms—tiny objects no one had ever observed. Boltzmann, the Austrian father of statistical mechanics, suffered extreme ridicule for suggesting the existence of atoms. This added to his deep depression, which ended in his suicide in 1906.
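The pressure-temperature link that kinetic theory explains is captured by the ideal gas law, P V = N k_B T. As a sketch, assuming roughly 2.5 × 10^22 molecules in a liter of air (an illustrative figure), doubling the temperature of a sealed container doubles the pressure, because the atoms hit the walls faster and harder:

```python
k_B = 1.380649e-23  # Boltzmann constant, J/K

def pressure_pa(n_molecules, volume_m3, temperature_k):
    """Ideal-gas pressure: P = N * k_B * T / V."""
    return n_molecules * k_B * temperature_k / volume_m3

N = 2.5e22   # molecules in a liter of air (rough assumed figure)
V = 1e-3     # one liter, in cubic meters

# Heating the sealed liter from 300 K to 600 K doubles the pressure:
print(f"pressure at 300 K: {pressure_pa(N, V, 300.0) / 1e5:.2f} bar")
print(f"pressure at 600 K: {pressure_pa(N, V, 600.0) / 1e5:.2f} bar")
```

The 300 K answer comes out close to one atmosphere, which is why this rough molecule count is a reasonable assumption for ordinary air.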

  Yet the idea that gases such as air were made of restless atoms was not a new idea. Atomic motion was indirectly discovered by the botanist Robert Brown in 1827. Like his successor, Charles Darwin, Brown made his mark as a naturalist serving on a British surveying expedition. During his voyage, Brown collected thousands of Australian plant specimens, many of them previously unknown species—only to lose most of them in a shipwreck. Nevertheless, he became a well-respected naturalist, who is credited with naming the cell nucleus. Despite his daring travels, physicists best remember Brown for a discovery he made in the safety of his own home. Brown observed that pollen grains suspended in air or liquid perform a jittery dance, as if pushed by an invisible, random force. Today we call this dance Brownian motion. In some sense, Brown really only rediscovered what Democritus had observed two thousand years earlier—the “motes in the air” that were “always in movement, even in complete calm.”

  When Brown discovered the random motion of pollen grains, he wondered if their motion had something to do with the fact that pollen grains were alive. He decided to study suspended dust particles of similar size. They, too, performed the same strange dance. Brown concluded that the random motion was not due to the pollen’s being alive, but was due to some inherent motion in matter. The solution to the mystery came almost a hundred years later, when Albert Einstein proved that the random movement of much smaller particles caused the jittery dance of the pollen or dust grains. Dust and pollen grains are jostled around by random collisions with countless atoms. A pollen grain moves because of small temporary imbalances between the number of atoms hitting from one direction and the number hitting from the other. Einstein suggested experiments to prove his theory of Brownian motion. Using the most sophisticated microscopes of the early 1900s, the French physicist and Nobel laureate Jean Perrin (1870–1942) used Einstein’s theory to prove once and for all that atoms exist and are always in motion. Boltzmann was vindicated, just two years after his tragic death.

  The tiny scale of atoms and molecules is dominated by continuous motion. Scientists call this continuous motion of atoms and molecules thermal motion. Thermal motion does not mean gently floating atoms: At room temperature, air molecules reach speeds in excess of the fastest jet airplane! If we were reduced to the size of a molecule, we would be bombarded by a molecular storm—a storm so fierce, it would make a hurricane look like a breeze. Yet, despite their stupendous speeds, molecules in the air do not get very far, because they frequently collide with each other. When this happens, the colliding molecules bounce like tiny billiard balls. The jittery dance that Brown observed and that Einstein explained is the result of this underlying tempest of colliding atoms.
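Brownian motion is easy to imitate with random numbers. In the sketch below (the number of impacts and the kick size are illustrative assumptions), a grain is hit 1,000 times per step by molecules pushing randomly one way or the other; only the small imbalance between the two sides moves it, yet those imbalances add up to a jittery walk instead of cancelling out:

```python
import random

random.seed(1)  # fixed seed for reproducibility

def brownian_step(hits=1000, kick=1.0):
    """Net push from `hits` molecular impacts, each randomly +kick or -kick."""
    return sum(random.choice((-kick, kick)) for _ in range(hits))

steps = 500
position = 0.0
for _ in range(steps):
    position += brownian_step()

# A typical step's imbalance is only about sqrt(1000) ~ 32 kicks out
# of 1000, but the grain still wanders visibly over many steps.
print(f"net displacement after {steps} steps: {position:.1f}")
```

This is the essence of Einstein's explanation: the grain does not need a net wind pushing it, only the statistical fluctuations of a bombardment that never quite balances.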

  The Mystery of the Missing Energy

  Now we can return to the falling moon rock. If energy is supposed to be conserved, what happened to the rock’s kinetic energy at impact? Where did the kinetic energy of the rock disappear to?

  Atoms cannot move as freely in a solid as they do in a gas or liquid. But the atoms in a solid are still moving; they oscillate at high frequencies about a central position, like the head on a bobblehead doll. Physicists often think of solids as collections of atoms connected by springs, all wobbling about, while on average staying at particular positions. When the rock hit the ground, the kinetic energy of the rock did not disappear. Instead, the kinetic energy of the rock transferred to atoms in the rock and in the ground, making the atoms wobble more vigorously. As a result, the rock and the ground got warmer. The kinetic energy of the rock was converted into thermal energy or heat. Energy remained conserved.
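It is worth estimating how small this warming is. The sketch below assumes a 2 kg rock dropped 1.5 m on the moon, a rock heat capacity of about 800 J/(kg K), and, generously, that all the kinetic energy stays in the rock rather than being shared with the ground; all of these numbers are assumptions, not figures from the text:

```python
g = 1.62        # lunar gravity, m/s^2
m = 2.0         # rock mass, kg (assumption)
h = 1.5         # drop height, m (assumption)
c_rock = 800.0  # specific heat of rock, J/(kg K) (assumption)

kinetic_energy = m * g * h               # all gravitational energy became motion
delta_t = kinetic_energy / (m * c_rock)  # warming if the rock keeps all of it

print(f"kinetic energy at impact: {kinetic_energy:.2f} J")
print(f"temperature rise: about {delta_t * 1000:.0f} millikelvin")
```

A few thousandths of a degree: the energy is fully conserved, but it is spread so thinly among so many atoms that we never feel the warmth.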

  The conservation of energy, coupled with the fact that heat is a type of energy, is called the first law of thermodynamics (we will meet the second law shortly). Thermodynamics is the science that deals with thermal energy (the word comes from the Greek therme, meaning “heat,” and dynamis, meaning “power”) and is the macroscopic “sister science” of statistical mechanics. Thermodynamics is what emerges when we average the random motions of atoms using the tools of statistical mechanics.

  Now that we have solved the mystery of where the energy of the falling rock went on impact, let me ask a dumb question (science has advanced by asking a lot of these): Why don’t rocks extract heat from the ground and jump up spontaneously? This would not violate energy conservation. The rock could take heat from the ground, making the ground cooler, and turn the extracted heat into kinetic energy. Yet, we never see this happen. Rocks don’t spontaneously jump off the ground. Why not?

  After our rock hit the ground, the atoms in the rock and the ground started to shake more violently. Both the rock and the ground became warmer. Atoms in solids are attached to other atoms, so if one atom shakes, neighboring atoms will soon shake as well. When an atom excites its neighbors, the atom loses some energy, which its neighbors gain. In turn, the atom’s neighbors excite their neighbors, and the extra energy provided by the rock’s impact is soon randomly distributed among an astronomical number of atoms in the rock and in the ground.

  In the story of the late-night robber, the robber stole my money, he spent some of it, and the people who received money from him spent their money, too. Imagine that the robber stole a thousand pennies instead of a ten-dollar bill. After a while, one thousand people could potentially each have a penny of my money. It would be highly unlikely that my pennies would be spontaneously reunited, as this would require one thousand people (probably unacquainted) to go to the same merchant at the same time to spend their pennies. Similarly, it would be impossible for all the atoms in our rock to spontaneously concentrate their energy to make the rock jump. If we can believe that it is close to impossible for one thousand pennies to be spontaneously reunited, consider the immense number of atoms in the rock (something like a trillion trillion atoms!). This giant number of atoms would have to simultaneously push in the same direction for the rock to jump up from the ground. Yet, there is no master choreographer that tells the atoms in which direction to shake. They all shake randomly.
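The penny argument can be made quantitative. As a deliberate oversimplification, model each atom as randomly pushing in one of two directions; the chance that N atoms all push the same way is then 2 × (1/2)^N, which is easiest to track as a base-10 exponent:

```python
from math import log10

def log10_odds_all_aligned(n):
    """log10 of the probability that n two-way random pushes all line up."""
    # P = 2 * (1/2)^n, so log10(P) = log10(2) - n * log10(2)
    return log10(2.0) - n * log10(2.0)

for n in (10, 1000, 10**24):
    print(f"n = {n:.0e} atoms: probability ~ 10^{log10_odds_all_aligned(n):.3g}")

# For the thousand pennies the exponent is already about -301; for
# the rock's trillion trillion atoms it is about -3e23. Not
# forbidden by any law, just never going to happen.
```

For comparison, the universe is only about 4 × 10^17 seconds old, so even checking once per second forever would not come close to seeing such an alignment.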

  Consider the argument I have just made. I did not say that it is impossible for a rock to extract heat from the ground and spontaneously jump up from the ground. The word impossible has no place in science. Instead I have made a probabilistic argument: While it is not impossible for the rock to jump up by itself, it is extremely unlikely. Remember, we are dealing with statistical mechanics, so every statement is probabilistic in nature. This is quite different from the physics we learn in school: There is supposed to be only one correct answer, and all others are wrong. When I drop a rock, I know it will fall and not rise. But in reality, it could rise—it is just highly improbable, and nobody has yet seen it happen or likely ever will.

  Not All Energies Are Created Equal

  Impact and friction readily turn kinetic energy into heat, but heat does not easily revert back to kinetic energy. Different types of energy are not always interchangeable. The law of energy conservation tells us that we cannot create or destroy energy, but it does not tell us if a particular type of energy can be converted to some other type. What makes some types of energy more convertible than others?

  So far, we have encountered three types of energy: gravitational energy, kinetic energy, and heat (or thermal energy). Each type of energy is associated with certain properties of a system (system is physicist lingo for a situation containing objects, energies, and forces). Gravitational energy is completely determined by the height of the object above the ground. Similarly for kinetic energy, the only parameter needed is the speed of the object. However, to completely describe the state associated with thermal energy, we need to know the speeds and locations of all the atoms contained in our system—that is, we would need an astronomical amount of information to fully describe the state of a system that contains thermal energy. Because this is not a realistic proposition, physicists use average values instead. For example, the temperature of a gas is given by the average kinetic energy of the atoms multiplied by a constant. Individually, the atoms in a gas can have different kinetic energies. Since temperature is an average, it tells us little about how kinetic energy is distributed among all the atoms of the gas.
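The statement that temperature is the average kinetic energy times a constant can be written down directly. For an ideal monatomic gas the relation is avg KE = (3/2) k_B T, so the constant is 2/(3 k_B); the sketch below simply inverts it:

```python
k_B = 1.380649e-23  # Boltzmann constant, J/K

def temperature_from_avg_ke(avg_ke_joules):
    """Invert avg_KE = (3/2) * k_B * T for an ideal monatomic gas."""
    return 2.0 * avg_ke_joules / (3.0 * k_B)

# At room temperature each atom carries, on average, about 6.2e-21 J:
avg_ke = 1.5 * k_B * 300.0
print(f"average kinetic energy at 300 K: {avg_ke:.2e} J")
print(f"temperature recovered from it : {temperature_from_avg_ke(avg_ke):.0f} K")

# Two gases can share this average (and hence this temperature) while
# spreading the energy among their atoms in very different ways; the
# average alone does not reveal the distribution.
```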

 
