
Life's Ratchet: How Molecular Machines Extract Order from Chaos


by Peter M. Hoffmann


  Imagine a gas consisting of just five atoms. All five atoms have the same kinetic energy of 40 meV (meV stands for milli-electron volt, a very small energy unit used by physicists). Then the average energy of the atoms is 40 meV as well. Now think of a different situation. Another gas also consists of five atoms, but two of them have 5 meV each, one atom has 10 meV, another has 50 meV, and the final atom has 130 meV of kinetic energy. Now, the average kinetic energy is (5 + 5 + 10 + 50 + 130)/5 = 200/5 = 40 meV, or the same kinetic energy as our first example. But clearly the situation is not the same. In one case, all the atoms have the same energy, while in the other case, the atoms have very different energies. The difference is the distribution of energy among the atoms. An everyday example of the difference between the average and the distribution of items is household income. The average income in the United States was $50,233 in 2007. But we know that some families scrape by on much less than this, while for others, $50,000 is mere pocket change. The interesting story lies in the distribution, not in the average.
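
  For readers who like to check the arithmetic, here is a minimal sketch in Python (the helper names and the use of variance as a spread measure are my additions, not the book’s):

```python
# A quick check of the two five-atom gases described above: both share
# the same average kinetic energy, but the distributions differ.
uniform_gas = [40, 40, 40, 40, 40]  # kinetic energies in meV
uneven_gas = [5, 5, 10, 50, 130]    # kinetic energies in meV

def average(energies):
    return sum(energies) / len(energies)

def variance(energies):
    # a simple measure of how widely the energy is spread out
    avg = average(energies)
    return sum((e - avg) ** 2 for e in energies) / len(energies)

print(average(uniform_gas), average(uneven_gas))    # 40.0 40.0 -- same average
print(variance(uniform_gas), variance(uneven_gas))  # 0.0 2310.0 -- very different spread
```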

  Now, we are ready to measure how convertible energy is. An important property for convertibility is the distribution of energy among all the atoms in a system. All atoms in a rock have about the same height above ground, and therefore, the same gravitational energy. The distribution of gravitational energy in this case is quite simple and can be accurately described by the height of the rock above the ground. But the distribution of thermal energy among all the atoms is much more difficult to describe. Each atom has a different energy and vibrates randomly at its own pace. All we know about the rock’s thermal energy is its average, given by the temperature, but we know very little about the energy of each atom.

  The Curious Case of the Missing Information

  When physicists talk about the state of a system, they distinguish between macrostates and microstates. The macrostate of a rock can be described by all the things we know about the rock: its height above the ground, its speed, its temperature, and so forth. The microstate is the exact state of all the parts of the rock, that is, the distribution of speeds and positions of all of its atoms—information we do not know. As you can imagine, there are a huge number of microstates compatible with any observed macrostate. Atoms in a rock can wiggle in many different ways, but on average, the rock would still have the same temperature, similar to our example above with the five atoms. Knowing macroscopic parameters, such as temperature, tells us little about the particular microstate of the system.

  All this talk of microstates allows us to zero in on a mysterious, yet powerful quantity: entropy. By one definition, entropy is the amount of unknown information about a system or, in other words, the amount of information we would need to fully describe the microstate. Think back to the robber. When I still had my ten dollars, the microstate was very easy to describe: The ten dollars was in my pocket. After the crook stole my money and spent it, the money spread through many hands. The macrostate stayed the same (the amount of money was still ten dollars), but the microstate became more and more unknown (even to the robber). As the robber gave other people part of my money, they in turn spent their money (and so on). The entropy, as it were, of my money increased.

  Here is another example: Think of the room of a teenager. The macrostate of his room may be stated as follows: It contains sixty-seven pieces of clothing, twenty-three books, a desk, a chair, a lamp, a laptop, and miscellaneous junk. To describe the microstate, we need to know where all these items are located. There are only a few arrangements (microstates) compatible with a tidy room: clothes in the closet, grouped by long-sleeve shirts, short-sleeve shirts, slacks, jeans, and so on; books on the shelves, sorted alphabetically by author; and so forth. However, there are almost unlimited ways the room could be messy: jeans on top of the computer, stat-mech books on the floor, shirts strewn across the bed. The entropy of a tidy room is much lower (there are fewer possible tidy rooms) than the entropy of a messy room. Now you may ask yourself the same question I ask myself daily: Why is my room always a mess?

  This question (not exactly in this form) occupied physicists in the 1800s, and their answer was the second law of thermodynamics. The actual question they addressed is the one we are trying to answer as well: Why are some types of energy more useful than others, specifically, why can some types of energy be converted, while others appear difficult to convert, thus making them useless? Moreover, useful energy eventually turns into useless energy. These unfortunate facts can be understood from our tidy-versus-messy room example. Most of us leave items in random places—places where the items don’t belong. Keeping a room tidy requires a lot of work, but without this work, the room inevitably becomes messy. You may wonder, Why does the random placement of items about the room not lead to a tidy room? Why does it always end up messy? Why can’t you randomly put your books back in alphabetical order on the bookshelf? This is where our microstates come in: If you leave items lying about in random places, you are more likely to end up with a messy rather than a tidy room because there are many ways (microstates) corresponding to a messy situation and only a few corresponding to a tidy one. In a sense, by leaving items in random places, you randomly pick a microstate from all possible microstates, and because there are more microstates that are messy (and fewer tidy states), chances are high you picked one of the messy states.
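
  A toy calculation makes these odds concrete. In this hypothetical model (the numbers are mine, not the book’s), ten items each have exactly one designated place, so only one of the 10! possible arrangements counts as tidy:

```python
from math import factorial

n_items = 10
total_microstates = factorial(n_items)  # 3,628,800 ways to arrange 10 items
tidy_microstates = 1                    # only one arrangement is perfectly tidy

# Leaving items in random places picks a microstate at random;
# the chance of landing on the tidy one is vanishingly small.
print(tidy_microstates / total_microstates)  # about 2.8e-07
```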

  Coming back to energy, few microstates are compatible with a certain amount of gravitational energy. This is obvious, as the only relevant parameter is the height of each atom above the ground. Gravitational energy is a low-entropy energy; it is the equivalent of a tidy room. The situation for heat is quite different. Heat is like a messy room; there are so many possibilities for energy distribution among atoms at a particular temperature. Heat is high-entropy energy.

  During energy transformation, it is difficult to keep track of the energy among all of the atoms involved. Just like the socks in your room, some energy will be left here, some there—in the form of random atomic motion. Friction and impact are great randomizers of energy. When a rock hits the ground, energy is “lost” to thermal motion of atoms. A tidy situation turns messy. Energy that is completely organized, concentrated, and tidy is a rather artificial, low-probability situation. If we let a system do whatever it “wants,” it behaves like an unruly teenager. Energy becomes scattered, dispersed, messy—and unusable.

  This tendency of energy to become more and more dispersed and thus unusable is what the second law of thermodynamics is all about. The second law states that in any transformation in a closed system, entropy always increases. The room gets messier, energy disperses, and the money slips through my fingers.

  The Second Law

  The second law of thermodynamics is one of the most profound, far-reaching, and misunderstood laws of physics (because people do not read the fine print!). Let me restate the second law, but in a more precise fashion: There can be no process whose only result is to convert high-entropy (randomly distributed) energy into low-entropy (simply distributed, or concentrated) energy. Moreover, each time we convert one type of energy into another, we always end up overall with higher-entropy energy. In energy conversions, overall entropy always increases.

  In these statements of the second law, I put certain words in italics—only and overall. As innocuous as these words may look, they are of great importance in our understanding of the second law and how it relates to life. By ignoring these words, creationists have been able to claim that life and evolution violate the second law of thermodynamics. Not at all!

  Let’s look at what these words mean, starting with only. The second law does not say that a process that converts high-entropy (distributed) energy into low-entropy (concentrated) energy is impossible. If it were, you could not eat ice cream tonight, because a refrigerator is a machine that locally reduces entropy (by cooling things down). But cooling ice cream is not the only result of refrigeration. Your fridge also gobbles up electricity (a low-entropy source of energy) and turns most of it into heat (high-entropy energy), which it releases into the kitchen. This is why you cannot cool down your kitchen by leaving the refrigerator door open.

  Overall, your refrigerator increases entropy by a large amount even though it locally decreases entropy. You can locally reduce entropy, but you need to do a lot of work and consume a lot of low-entropy energy in the process (think of tidying your room—it is not impossible, but you end up sweating and cursing—and increasing the entropy of your surroundings by burning low-entropy food energy and turning it into high-entropy heat and waste). The same applies to how life works: Life uses a low-entropy source of energy (food or sunlight) and locally decreases entropy (creating order by growing) at the cost of creating a lot of high-entropy “waste energy” (heat and chemical waste). The creationist claim that the second law does not allow for life or evolution does not hold water.
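
  A rough numerical sketch shows the refrigerator’s entropy bookkeeping (all numbers here are invented for illustration, not the book’s): heat pulled from the cold interior lowers the inside entropy, but the larger amount of heat dumped into the warm kitchen raises the surroundings’ entropy by more:

```python
# Idealized refrigerator bookkeeping (invented illustrative numbers)
T_cold, T_kitchen = 255.0, 295.0   # temperatures in kelvin
Q_cold = 100.0                     # heat pulled out of the freezer (joules)
W = 40.0                           # electrical work consumed (joules)
Q_hot = Q_cold + W                 # heat dumped into the kitchen (energy conservation)

dS_inside = -Q_cold / T_cold       # local entropy decrease inside the fridge
dS_kitchen = Q_hot / T_kitchen     # entropy increase of the surroundings
print(dS_inside + dS_kitchen > 0)  # True: overall entropy still goes up
```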

  Entropy and the second law of thermodynamics are among the most important concepts of physics and among the most misunderstood. Part of the problem is that there are several definitions of entropy, and none of them is straightforward. Often, entropy is simply equated with disorder. This is not a good description unless we clearly understand what is meant by disorder. Again, creationists have exploited this confusion, claiming increases in entropy are incompatible with life. Their argument is that because entropy is disorder and life is order, the second law proves that life could not have emerged spontaneously.

  Equating entropy with disorder is convenient, but it is not a precise definition by any means. Scientists run into this problem frequently: when (usually mathematical) definitions are cast into everyday language, details get lost in translation. Entropy is not the same as the disorder we think of every day (our tidy-versus-messy room example was only an analogy to explain the concept of microstates). Instead, entropy measures the degree to which energy is spread out. Sometimes an orderly-appearing system may have energy that is more dispersed than that of a “disordered” system. Such a system, although seemingly more ordered, would have higher entropy.

  A surprisingly simple example illustrating this difference between entropy and the everyday concept of disorder is a collection of hard spheres (think of marbles). As we put more and more marbles into a container, the marbles reach a critical density (number of marbles per volume) at which the highest-entropy state (i.e., the one we would expect to be most disordered) is the state in which the marbles are perfectly stacked in an ordered pattern. How is this possible? Marbles can fill a container in two fundamentally different ways: We can just pour them into a jar and let them fall where they fall—or we can carefully stack them into an ordered array. Just pouring them leads to a situation where the marbles are touching, but are otherwise in random positions. Such an arrangement is called random stacking by physicists. When marbles are randomly stacked, some marbles become completely wedged and cannot move. Random stacking reduces freedom of motion, leading to a narrower energy distribution and thus lower entropy. Nicely stacked marbles, on the other hand, have on average a little more wiggle room, and therefore higher entropy. Marbles are a good illustration of a simple system in which higher entropy means more order, not less. Simply equating entropy with disorder can be misleading. In biological cells, many ordered structures form spontaneously while increasing entropy. These include assemblies of proteins, cell membrane structures, and fibers. In these cases, the entropy is increased by exporting the disorder to the surrounding water molecules. More about how this works in Chapter 4.

  Once again, we find that the second law does not preclude the emergence and continued presence of life. Entropy can be reduced locally if it is increased globally (in our refrigerator, for example), and sometimes, an increase in entropy (hard spheres, for example) leads to more order. Life takes advantage of both of these apparent loopholes of the second law.

  Free Energy

  Life reduces entropy locally while increasing it globally. The concept of entropy describes this situation reasonably well, but it has one drawback. To square life’s processes with the second law of thermodynamics, we need to analyze the entropy of an organism and the entropy of its surroundings (because the surroundings ensure that the second law is not violated). However, details about the environment may not be well known, and all we really want to know is whether heat (and entropy) can flow to the environment or not. For this we only need to know the temperature of the surroundings. This fact led scientists to establish a new way to analyze thermodynamic systems that are immersed in constant-temperature environments (for example, a living organism or a cell).

  How can the temperature of a system be kept constant? If you are a chemist and your system is a test tube, you can place the test tube into a large bath maintained at constant temperature (for example, a bucket of ice and water at 0 degrees Celsius). If the system begins to heat up from an exothermic (heat-releasing) reaction, the heat will quickly flow into the surrounding bath until the system’s temperature equals that of the bath. If the system cools down (an endothermic reaction), it will draw heat from the surrounding bath. If the bath is large enough, the heat flow from the system will not change the temperature of the bath. Thus the bath serves as an energy reservoir and a temperature control.

  Living organisms also act as heat baths. Our cells are immersed in a large 37-degree-Celsius temperature bath—our bodies. All chemical reactions in our cells happen at this constant temperature. If we want to apply the second law to this situation, it becomes tricky. The second law states that overall entropy increases. Therefore, if I look at a reaction happening in one of my kidney cells, I would need to know not only how the entropy of the participating molecules changes, but also how the entropy of the surrounding tissue changes. I would need to analyze the change of entropy in legions of complicated cells just to determine the outcome of one tiny reaction.

  There is an easier way. The exchange of energy and entropy with the surrounding heat bath can be represented simply by the temperature, as long as we know the temperature is kept constant by the bath. To do this, we use the concept of free energy. Free energy F is the total energy E minus the product of temperature T and the entropy S of the system (or F = E – TS). Because entropy represents how much energy has become dispersed and useless, free energy represents that part of the energy that is still “concentrated” and useful (because we are subtracting the useless part, TS).
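
  As a minimal sketch of this bookkeeping (the helper name is mine, not the book’s), the definition translates directly into code. Because the worked examples below quote the product of temperature and entropy as a single number in electron volts, the helper takes it that way:

```python
def free_energy(total_energy_eV, entropy_term_eV):
    # F = E - T*S, with the product T*S supplied as one number (in eV)
    return total_energy_eV - entropy_term_eV
```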

  To analyze a system within the constant-temperature bath of our own bodies, we need to know how the system’s free energy changes. In the language of free energy, the second law is restated this way: At constant temperature, a system tends to decrease its free energy, until, at equilibrium, free energy has reached a minimum. The second law tells us that useful energy will become degraded, and eventually we will only be left with dispersed, unusable energy.

  Because free energy includes both energy and entropy, the concept of free energy explains many of the paradoxical situations we discussed above. For example, in the random-stacking example, there is not much difference between the energy of marbles stacked randomly or orderly, but the entropy is higher for the ordered stacking. Therefore, the free energy, given by energy minus entropy times temperature, is lower for the ordered marbles, and because the system tends to minimize its free energy, it will tend to an ordered state at equilibrium.
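
  A two-line check with invented numbers (mine, not the book’s) shows the logic: equal energies but a slightly higher entropy for the ordered stacking give the ordered state the lower free energy:

```python
T = 1.0                                  # constant temperature (arbitrary units)
E, S_random, S_ordered = 5.0, 1.0, 1.2   # same E; ordered stacking has slightly higher S
print(E - T * S_random)   # 4.0 -- free energy of random stacking
print(E - T * S_ordered)  # 3.8 -- lower, so the ordered state wins at equilibrium
```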

  We can also think of examples where both the entropy and the free energy of the system are lowered. What is so special about this? When we lower entropy, the free energy tends to go up, because we subtract the entropy term from the total energy. For example, if the total energy of a molecule is 5 eV, and entropy times temperature is 2 eV, then the free energy is F = 5 eV − 2 eV = 3 eV. Now, if the entropy term decreases to 1 eV (for example, by cooling the system), the free energy would increase to 5 eV − 1 eV = 4 eV.

  Can both free energy and entropy decrease at the same time? Keeping the temperature constant, this is only possible if the total energy of the system, E, also decreases, and decreases more than the entropy term. For example, if total energy decreases from 5 eV to 3 eV, while the entropy term decreases, as before, from 2 eV to 1 eV, the new free energy would be 3 eV − 1 eV = 2 eV. This is less than the original 3 eV, despite the fact that entropy has decreased.
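
  Running the three cases above through the same helper confirms the arithmetic (the one-line definition from the earlier sketch is repeated so this one stands alone):

```python
def free_energy(total_energy_eV, entropy_term_eV):
    return total_energy_eV - entropy_term_eV  # F = E - T*S

print(free_energy(5, 2))  # 3 eV: the starting point
print(free_energy(5, 1))  # 4 eV: entropy term shrinks at fixed E, so F rises
print(free_energy(3, 1))  # 2 eV: E falls by more than the entropy term, so F falls
```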

  Why doesn’t this example violate the second law? Isn’t entropy supposed to increase? When the total energy of the system changed from 5 eV to 3 eV, the “missing” energy had to go somewhere (remember energy conservation). Energy passed into the environment, heated it up, and increased the environment’s entropy. As before, overall (system + environment) entropy increased, even though, by itself, the entropy of our small system decreased. You can see how the language of free energy makes it easier to think about such a scenario: As long as the free energy of a system decreases, we are obeying the second law.

  Does nature provide examples of spontaneous decreases in entropy? All the time! Take the creation of a snowflake. Compared with a liquid drop of water, a snowflake has much lower entropy (it is also much more ordered). Yet, snowflakes form spontaneously under the right conditions. This is because the energy (E in our formula) of a snowflake is much lower than the energy of a water droplet. Once the temperature has fallen below the freezing point of water, the entropy reduction is overwhelmed by this reduction in energy, and the free energy is reduced when water freezes into beautiful, ordered crystals. Thus, at low temperature, a snowflake has lower free energy than a water droplet. At high temperature, however, the entropy term (now multiplied by a higher temperature) again wins over the energy (E) term, and water turns liquid. At higher temperature, liquid water has lower free energy than frozen water. This is shown in Figure 3.1.
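
  The snowflake trade-off can also be played out numerically. In this toy sketch the numbers are invented for illustration (they are not real water data); all that matters is that the “ice” phase has lower energy E but also lower entropy S than the “liquid” phase:

```python
def free_energy(E, S, T):
    return E - T * S  # F = E - T*S

# Invented illustrative numbers (arbitrary units): ice has lower E and lower S.
E_ice, S_ice = 0.0, 1.0
E_liq, S_liq = 2.0, 3.0

for T in (0.5, 1.5):  # below and above the crossover at T = 1.0
    if free_energy(E_ice, S_ice, T) < free_energy(E_liq, S_liq, T):
        print(f"T = {T}: ice has the lower free energy (water freezes)")
    else:
        print(f"T = {T}: liquid has the lower free energy (ice melts)")
```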

 
