Life's Ratchet: How Molecular Machines Extract Order from Chaos


by Hoffmann, Peter M.


  Smoluchowski’s Trap Door

  Marian von Smoluchowski (1872–1917) spent a great deal of time thinking about the second law. An avid mountain climber, the Polish-Austrian physicist wrote prodigiously about everything, from the folding of mountains and erosion by glaciers to diffusion in colloids and heat transfer in liquids. He published his explanation of Brownian motion almost simultaneously with Albert Einstein, although he later admitted that Einstein’s solution had the correct prefactor, and his didn’t.

  One of his many papers, “Experimentally Demonstrable Molecular Phenomena, Which Contradict Standard Thermodynamics,” was published in 1912. In this paper, he discussed special states of matter where fluctuations—spontaneous deviations from the average value of some property (pressure or density, for example)—suddenly become very large. This happens for some gases close to a phase transformation. Smoluchowski asked, What if we could use these pressure fluctuations to push a one-way door? When the pressure gets high enough, the door will open; when it is too low, the door will remain closed. This would be an automated Maxwell demon: Smoluchowski’s trap door would only let high-velocity (high-pressure) molecules through, while rejecting low-velocity molecules. According to Smoluchowski, several physicists of the age considered such hypothetical contraptions a serious objection to the second law. But Smoluchowski dismissed this idea. The problem, he argued, lay in the strength of the door. If the door were very weak, or easy to open, it would be subject to thermal motion and would randomly open by itself—letting slow molecules through when it shouldn’t, or letting fast molecules escape back into the slow pool. If the door were made strong enough to avoid this problem, it might not open at all. And for values in between? The door would be unreliable: sometimes it would not open when it should, and sometimes it would open when it shouldn’t. Smoluchowski did not present a rigorous calculation in this paper, but he asserted that such a trap door would never work. The second law could not be violated using an automated trap door.

  Toward the end of the paper, Smoluchowski emphatically pointed out that the second law could be violated—if we were willing to wait long enough. This is charmingly illustrated in physicist George Gamow’s series of novels in which the hero, bank clerk Mr. Tompkins, learns about physics. In one book, Maxwell’s demon makes Mr. Tompkins’s highball boil, which prompts Mr. Tompkins’s friend, the “professor,” to excitedly exclaim how lucky they are: “In the billions of years to come, we will still, probably, be the only people who ever had the chance to observe this extraordinary phenomenon!” The second law is a statistical law: it states that most of the time (usually very close to “always”), systems tend toward their most probable state. However, if we are willing to wait long enough, strange things can happen by chance. I could win a million in roulette. Your cold coffee could spontaneously boil. But we’d be waiting a very, very long time for these things to happen.

  How long? The waiting time depends on the probability, which in turn depends on the size of the system. In a large system, one visible with an optical microscope or larger, a violation of the second law will, for all practical purposes, never happen. However, a really small system (a single molecule, for example) can seemingly violate the second law relatively often. This was Maxwell’s point when he invented the demon. Maxwell was not out to disprove the second law. He simply wanted to show that the second law emerges once we talk about a large number of molecules. It is a statistical law. This is why we usually do not apply the second law to single molecules.

  Smoluchowski went on to point out that even though a small system may be able to violate the second law once, you cannot build, from such a small system, a device that can act like Maxwell’s demon. The reason is that violating the second law happens randomly and not repeatedly. Consequently, the second law should be stated this way: You cannot repeatedly extract energy from a uniform heat bath.

  But if it could happen once, why not repeatedly? Let’s look at why not.

  The Demon and the Reset Button

  Maxwell’s demon has been giving physicists a headache for a long time. Although Maxwell invented the little creature to show that the second law was only a statistical law, physicists later saw it as a serious assault on the law itself. In the interest of professional pride, the demon had to be exorcised.

  Several solutions were proposed. One solution, proposed by the physicists Leo Szilard (1898–1964) and Leon Brillouin (1889–1969), posited that the very act of measuring the speeds of molecules would necessarily involve some energy, which would be dissipated. For example, the demon could try to measure the molecular speed with light pulses—but not all the light energy would be recoverable. The dissipated energy would lead to an increase in entropy larger than the decrease in entropy from the sorting of the molecules. However, more recently, the IBM physicists Rolf Landauer (1927–1999) and Charles Bennett (b. 1943) pointed out that the measurement could be done without energy loss or entropy increase. Instead, the entropy increase would occur when the demon erases his short-term memory to make room for the next measurement. This solution to the demon conundrum created an intriguing connection between entropy and information storage, which is still a hot topic today.

  Maxwell’s demon illustrates why there is a connection between missing information and entropy. As discussed in Chapter 3, the reason the entropy of a gas at equilibrium is high is that it could be in many different microstates, which all would be compatible with the observed macrostate (pressure, temperature, and so on). But we do not know what the particular microstate of the gas is at each moment in time. Maxwell’s demon, however, would know the microstate of the system, and thus, in some sense, reduce the entropy of the system just by having more information about the system (reducing the missing information). But in order to repeatedly learn more about the microstate of the gas (which changes all the time as gas molecules collide), the demon would need to erase old information to make space for the new, turning knowledge back into missing information. This erasure, according to Landauer and Bennett, comes with an energy cost. In other words, each time you erase information, you dissipate energy and increase entropy.
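
The link between entropy and missing information can be made quantitative with Boltzmann’s formula S = k ln W: a system that could be in any of W equally likely microstates carries log₂(W) bits of missing information. The following minimal sketch is my own illustration (the toy “gas” of ten two-state particles is an assumption, not an example from the text):

```python
import math

K_B = 1.380649e-23  # Boltzmann's constant, in joules per kelvin

def missing_information_bits(num_microstates):
    """Bits needed to pin down one microstate out of W equally likely ones."""
    return math.log2(num_microstates)

def boltzmann_entropy(num_microstates):
    """Boltzmann entropy S = k_B * ln(W), in J/K."""
    return K_B * math.log(num_microstates)

# A toy "gas" of 10 two-state particles has W = 2**10 = 1024 microstates:
W = 2**10
print(missing_information_bits(W))   # 10.0 bits of missing information
print(boltzmann_entropy(W))          # roughly 9.57e-23 J/K

# A demon that learns the exact microstate removes all 10 bits of missing
# information -- and must later erase them from its memory, which, per
# Landauer, costs at least k_B * T * ln(2) of dissipated energy per bit.
```

The point of the sketch is only the bookkeeping: information gained about the microstate lowers the “missing information” entropy, and erasing that record is where the entropy comes back.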

  In the case of the demon, erasing information restores or “resets” the system to its original state, allowing a new measurement cycle to begin. But since this erasure leads to an increase in entropy, the demon could do his demonic deed only once without paying a price. He would not be able to do it “for free” repeatedly. The same applies to machines. To make a small machine perform repeated motions, a reset step is needed, as the machine needs to be returned to its original state before it can begin a new cycle. And it is this reset step that leads to an inevitable increase in entropy.

  More recently, it has been pointed out that the entropy increase due to erasure of information assumes the second law, and therefore cannot be used to prove the second law. That would be a circular argument. Maxwell’s demon may thus continue to haunt physicists’ dreams. Whatever the correct answer to Maxwell’s demonic challenge, my personal feeling is that Landauer and Bennett were on the right track. Clearly, such demons do not exist—otherwise, highballs would boil spontaneously, as in Gamow’s tale of Mr. Tompkins. Any attempt to make a demon has failed (as evidenced by our inability to devise a perpetual-motion machine). Moreover, recent nanotechnology-based experiments have confirmed Landauer’s conjecture that erasure of information creates entropy.

  At the 2011 American Physical Society meeting in Dallas, physicists Yonggun Jun and John Bechhoefer of Simon Fraser University in Burnaby, Canada, reported on an experiment in which they stored and erased information using a 200-nm plastic bead suspended in water. Following its motion with a microscope, they could calculate the heat released each time a bit of information was erased. A bit is a unit of information, contained in a “yes” or “no,” or a “1” or “0.” Landauer had postulated a minimum amount of dissipated energy when a bit of information is erased: Boltzmann’s constant times the temperature at which the bit is erased, times the natural logarithm of 2. Jun and Bechhoefer found that sometimes the heat released was less than Landauer’s limit. This was due to water molecules occasionally helping the bead along. But this only happened at random times. In the long run—i.e., in the statistical limit—Landauer’s limit stood. This work brilliantly confirmed Smoluchowski’s and Maxwell’s hunch that the second law can be violated, but not repeatedly and not predictably. It is a statistical law.
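
Landauer’s minimum, k_B T ln 2 per erased bit, is easy to evaluate numerically. A minimal sketch (the choice of 300 K as “room temperature” is mine):

```python
import math

K_B = 1.380649e-23  # Boltzmann's constant, in joules per kelvin

def landauer_limit(temperature_kelvin):
    """Minimum heat dissipated when erasing one bit: k_B * T * ln(2)."""
    return K_B * temperature_kelvin * math.log(2)

# At room temperature (about 300 K) the limit is minuscule:
q = landauer_limit(300.0)
print(f"{q:.3e} J per bit")  # roughly 2.87e-21 joules
```

The tiny value explains why the effect took a nanoscale experiment, like Jun and Bechhoefer’s fluctuating bead, to detect.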

  Reversibility

  In 1918, Smoluchowski (posthumously) published a paper titled “About the Concept of Chance and the Origin of the Laws of Probability in Physics.” When this paper was published, the kinetic picture of matter, pioneered by Maxwell and Boltzmann, was already solidly accepted. Even the old critics, who had given Boltzmann such a hard time, had grudgingly accepted the existence of atoms and molecules. Probability now reigned supreme in theories of gases and liquids. The final blow to the anti-atomists was Perrin’s experimental results, which completely confirmed the Einstein-Smoluchowski theory of Brownian motion.

  Yet, just a few decades earlier, Boltzmann seemed to be fighting a losing battle. The kinetic theory was heavily criticized. One of the main objections was what Manfred Eigen would much later call Loschmidt’s demon. Boltzmann had tried to show that the second law was a direct result of the motions of atoms—in other words, even in the case of just a few interacting atoms, the second law would hold. Boltzmann reasoned as follows: If molecules were initially in some low-entropy state, their collisions with other molecules would make their distribution of velocities more random, and increase entropy. However, the Austrian physicist Josef Loschmidt (1821–1895), a friend of Boltzmann’s, pointed out that if that were true, what would happen if we reversed time? Loschmidt’s demon was a powerful creature that could reverse time at will. In the realm of atoms, a collision obeys all known laws of physics, no matter if you play time forward or backward. Think of two billiard balls: Ignoring the player and the cue, concentrate on the moment when the two billiard balls collide. If you were to film this instant and play it to an audience forward or backward in time, it would be impossible to tell which is which. Simple elastic collisions, like collisions between molecules, are time reversible—they look the same run forward or backward. With this in mind, how could a time-irreversible law, like the second law, emerge from the reversible mechanics of molecules?

  The answer to this conundrum was twofold. First, for molecules to move toward a more probable velocity distribution, they must be starting out with a less probable distribution. Thus, to see the second law in action, we have to assume that initially, the velocity distribution was improbable, and the entropy low. Then collisions shook things up, making the distribution more probable and increasing entropy. Thus, irreversibility came from the fact that the initial system was not at equilibrium. That is, it was not in a state of maximum entropy. This has consequences for the entire universe we live in: If there is such a thing as the arrow of time, which points from past to future, this arrow can only be there because the universe started in a very low-entropy state. Stars, galaxies, planets, and living beings have been feeding off the low entropy ever since.

  The second part of the answer to how irreversibility can emerge from the reversible mechanics of particles is that the system has to be large enough—must contain enough molecules—so that collisions always mix things up. This is because in a large system, motions are generally uncorrelated, and molecular chaos reigns. If this is the case, what would happen in small systems?

  In the late 1990s, a Los Alamos nuclear physicist, Christopher Jarzynski, derived an equality that electrified physics, especially the study of small molecular systems. Jarzynski’s equality quantified how often small molecular systems violate the second law. As we have seen, small systems can violate the law at random times—and this is why, strictly speaking, we should not apply the second law to such small systems. Leaving this caveat aside, how often do molecules violate the law, and what would be the consequences?

  The second law tells us that any directed motion of a system will always encounter the resistance of friction. Friction is the result of many randomly moving molecules scavenging energy away from any nonrandom motion. Now let’s imagine that a clever high school student has just learned about the conservation of energy. She devises a scheme for measuring the height of a mountain: Roll a ball down the mountain, starting the ball from rest, and measure its speed at the bottom. Then calculate the height. This calculation is an easy exercise, and I give problems like this to my introductory physics students. All you have to do is realize that, according to energy conservation, the initial gravitational potential energy (ball on top of the mountain) has to equal the final kinetic energy when the ball reaches the bottom of the mountain. Gravitational potential energy is proportional to height, and thus, equating the two energies, we can solve for the height (it comes out to h = v²/(2g), where h is the height of the mountain, v is the speed of the ball at the bottom, and g is the acceleration due to gravity). But in practice, the measurement always falls short of the calculation. The kinetic energy of the ball at the foot of the mountain is a little bit less than the gravitational potential energy at the top of the mountain. This is because some of the energy is lost as heat, due to friction. Now the clever high school student decides to improve her accuracy by repeating the experiment a hundred times. Would that help? Not really; friction will always be there, and every single measurement will fall short.
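
The student’s calculation can be sketched in a few lines. The sample speeds below are hypothetical numbers of my own choosing, picked to illustrate how friction makes the inferred height fall short:

```python
def mountain_height(speed_at_bottom, g=9.81):
    """Height inferred from energy conservation, ignoring friction:
    m*g*h = (1/2)*m*v**2  =>  h = v**2 / (2*g)."""
    return speed_at_bottom**2 / (2 * g)

# Frictionless ideal: a ball rolling from rest down a 100 m hill
# would reach v = sqrt(2*g*h) = sqrt(2 * 9.81 * 100) = about 44.3 m/s.
print(mountain_height(44.3))  # close to the true 100 m

# In practice friction steals some energy, so the measured speed is
# lower and the inferred height underestimates the real one:
print(mountain_height(40.0))  # about 81.5 m -- short of the true 100 m
```

Repeating the run a hundred times does not fix this: every measured speed carries the same one-sided frictional deficit, so every inferred height is biased low.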

  Now let us imagine a similar experiment at the nanoscale. Shrunk to the nanoscale, our high school student repeats her experiment on a nanosize mountain. Most of the time, her measurements show the same trend as the macroscopic measurements: The speed is less than expected from the height of the mountain, because friction has taken its toll. But much to the nanoscale student’s surprise, rarely and at completely random times, the speed of the ball is more than what is expected. The randomly moving atoms in the surroundings did not resist the motion of the ball, as one would expect, but actually pushed the ball along! When systems are small enough, there is a finite probability, though rare, that the atomic chaos surrounding the system actually adds energy to the system, rather than stealing energy.

  Is there a way to combine the measurements and find the height of the nanomountain? Yes, there is. Jarzynski’s equality makes this possible: instead of averaging the measured energies directly, one averages an exponential function of each measured energy. Jarzynski showed that, theoretically, you could obtain energy differences between two states (for example, the top of the mountain and the bottom of the mountain) from measurements made in the presence of molecular chaos, and thus friction.
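
Jarzynski’s equality states ⟨exp(−W/k_BT)⟩ = exp(−ΔF/k_BT), so the true energy difference ΔF can be recovered as −k_BT ln of the exponential average of the measured work values W. The sketch below is a synthetic demonstration, not data from any experiment: I assume a Gaussian spread of work values (for which the equality holds exactly when the mean work exceeds ΔF by σ²/2k_BT), and all numerical values are illustrative:

```python
import math
import random

random.seed(1)
KT = 1.0          # thermal energy k_B*T; sets the unit of energy
DELTA_F = 1.0     # "true" energy difference, in units of k_B*T (assumed)
SIGMA = 0.5       # spread of work values caused by molecular chaos (assumed)

# Simulated work measurements: friction shifts the mean above DELTA_F
# by sigma^2 / (2 kT), but rare samples fall BELOW DELTA_F -- those are
# the occasional "second-law violations."
works = [random.gauss(DELTA_F + SIGMA**2 / (2 * KT), SIGMA)
         for _ in range(200_000)]

naive_estimate = sum(works) / len(works)   # plain average: biased high
jarzynski_estimate = -KT * math.log(
    sum(math.exp(-w / KT) for w in works) / len(works))

print(f"mean work:    {naive_estimate:.3f}")     # about 1.125 (friction's toll)
print(f"Jarzynski dF: {jarzynski_estimate:.3f}") # about 1.0, the true value
```

The rare work values below ΔF dominate the exponential average, which is why the estimate converges on the true energy difference even though almost every individual measurement overshoots it.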

  Experimental confirmation of this astonishing theorem did not have to wait long: Using laser tweezers, the biophysicists Carlos Bustamante and Jan Liphardt at the University of California, Berkeley, pulled on a single RNA molecule containing a loop. They wanted to know the energy difference between RNA molecules with the loop closed and with the loop open. But how could this be measured? Each time they pulled, they got a different answer. The surrounding water molecules created friction and made the measured energy difference between the open- and closed-loop states larger than the actual energy difference between the two states. One way to get close to the correct answer was to do the measurement very, very slowly. Going slow helps, because slow motion is associated with low kinetic energy, and if the kinetic energy is low, the surrounding atoms cannot steal as much. However, when they pulled on the loop with high speed, the measured energy difference was almost always higher than the values measured at slow speeds. Friction had taken its toll.

  Sometimes, they saw the opposite, and the energy difference they measured was less than the minimum energy required to open the loop. This meant that the second law was occasionally violated. In these rare cases, randomly moving water molecules helped open the loop instead of resisting. Applying Jarzynski’s formula, Bustamante and Liphardt averaged all their data, and the correct answer emerged. It was now experimentally confirmed: Nanoscale systems occasionally violate the second law of thermodynamics. At the molecular scale, entropy can sometimes spontaneously decrease (although, strictly speaking, entropy is not defined at this scale). When that happens, it is as if time has reversed.

  Thus at the nanoscale, and for short times, Loschmidt’s and Maxwell’s demons can rouse from their slumbers and seemingly violate the second law. Could life’s machines be Maxwell demons, creating order out of chaos by relying on the rare and unpredictable occasions when the second law is violated?

  Perpetuum Mobile

  The answer to the question posed in the previous section is clearly no. All available evidence shows that life is not based, in any shape or form, on violating the second law. How do we know this? We have known this since Lavoisier, Helmholtz, and many others determined that our bodies do not create energy, but rather waste energy. The efficiency of a human body (i.e., the amount of physical work obtained compared with the food energy intake) is about 20 percent. The rest (80 percent of food energy intake) is either directly turned into heat through friction or serves to maintain basic metabolic processes in our cells.
