How Big is Big and How Small is Small


by Timothy Paul Smith


  So back to our A above middle C, which is also called A4 and is the 49th key on the keyboard. Its first overtone, or second harmonic, is 880 Hz, which is also the frequency of A5, the A in the next octave up the scale. If I double the frequency, I jump up an octave. If I divide the frequency in half, I drop down an octave. So for a musical scale 2 is our multiplying factor, much like the Pogson ratio for stars, in which the factor is the fifth root of 100, or about 2.512. Likewise, the word “octave” plays a similar role for music as “magnitude” does for stars.

  But what about all those other notes besides A? You will recall that I said the diatonic scale is also called the heptatonia prima. “Hept-” is seven and tells us about the seven whole notes, or white keys, in an octave. But there are also five black keys, giving us twelve distinct notes in an octave. When we move up the keyboard from one key to the next we increase the frequency at each step by 5.946%. This is curious: a step up the keyboard or up the scale is a jump by a percentage, and not by a set frequency. This also means that the percentage difference between two adjacent keys anywhere on the keyboard is identical. There are twelve steps in an octave, and after those dozen steps our frequency will have doubled. So the key-to-key multiplying factor is the twelfth root of two, about 1.05946, or 1 plus 5.946%.
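  To see that arithmetic in action, here is a minimal sketch in Python. The A4 = 440 Hz reference and the twelfth-root-of-two rule come from the text; the note names are just the standard keyboard labels:

```python
# Equal temperament: each of the 12 semitones in an octave
# multiplies the frequency by the same factor, 2**(1/12).
SEMITONE = 2 ** (1 / 12)  # ~1.05946, i.e. a 5.946% step

A4 = 440.0  # Hz, the A above middle C (the 49th key)

notes = ["A4", "A#4", "B4", "C5", "C#5", "D5",
         "D#5", "E5", "F5", "F#5", "G5", "G#5", "A5"]

freq = A4
for name in notes:
    print(f"{name}: {freq:.2f} Hz")
    freq *= SEMITONE  # one key up the keyboard

# After twelve steps the frequency has exactly doubled:
# A5 comes out at 880.00 Hz, one octave above A4.
```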

  This curious number, 1.05946, also shows up in a more visual and beautiful place: on the fretboard of a guitar (see Figure 4.6). The distance from the saddle or bridge—the point where the string of the guitar is attached to the body—to a fret determines the frequency, and so the note, that the string will make when plucked. If you now move your fingers up one fret, you are a note lower, and the active string is 5.946% longer. This means that the frets of a guitar are a type of logarithmic scale. When a rock star steps out onto a stage with a Fender Stratocaster, the frets under his or her fingers are laid out like a slide rule: a logarithmic scale, just like star magnitudes and the Richter scale.

  Figure 4.6 The frets and related frequencies of a guitar. The frets are laid out logarithmically. In the upper figure, each fret is labeled by the note and frequency it will produce when the E-string (bottom string) is plucked. The lower plot shows the frequencies on a logarithmic scale. Here the frets are evenly spaced.
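  The same factor fixes where the frets must sit. Here is a small sketch of that layout; the 648 mm scale length is an assumed, illustrative number (roughly a Stratocaster-style scale), while the rule itself, dividing the vibrating length by the twelfth root of two at each fret, is the one described above:

```python
# Each fret divides the vibrating length (bridge to fret) by
# 2**(1/12), so the pitch rises one semitone per fret.
SEMITONE = 2 ** (1 / 12)

SCALE = 648.0  # mm, assumed open-string length (bridge to nut)

for fret in range(1, 13):
    vibrating = SCALE / SEMITONE ** fret   # bridge-to-fret length
    from_nut = SCALE - vibrating           # where the fret sits
    print(f"fret {fret:2d}: {from_nut:6.1f} mm from the nut")

# The gaps between successive frets shrink geometrically, which is
# why frets crowd together toward the body of the guitar; fret 12
# lands at exactly half the scale length, one octave up.
```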

  ***

  Somewhere near the end of this book we are going to understand how ancient and how big the universe is and how small and fleeting quarks are. So why have we just spent this whole chapter talking about wind speeds, earthquakes, star brightness and sound? It is firstly because logarithmic scales are all around us and we use them every day when we talk about loudspeakers, hurricanes and pianos. But, more importantly, nature organizes itself in terms of these sorts of scales. When later we move from stellar distances to galactic distances we will not say the distances have had 10¹⁸ km added, but rather that distances have been multiplied by a million. When we go from cells to periwinkles to sequoias we do not add sizes, we multiply them. Nature organizes itself into different distance scales. The atomic scale is 10⁻¹⁰ m, the scale for the nucleus of an atom is 10⁻¹⁵ m, and that is a whole different world with a whole different set of phenomena.

  5

  Little Numbers; Boltzmann’s and Planck’s Constants

  I remember as a small child being told about air and, quite frankly, being skeptical. Here was a substance that I was told was more important for life than either food or water, yet I could not even see it. I could smell good things like lilacs and foul things like barnyards, but I could not really smell air. I could put out my hand and try to grab it, but it was too ethereal and I would always come up empty-handed. Air was far too elusive for me to understand.

  This was in contrast with wind. Wind is much easier to understand than air. Wind lifts up kites and blows away dandelion seeds. Wind can drive a pinwheel and chase a sailboat across the water. Wind is not as invisible as air because I can see it splash through a tree, shaking its leaves and branches. I never doubted wind; it was always very real. And then I understood what air was. Wind is all about something in motion, something all around us, something that flows. Wind is that stuff that other people call air, but only when it is in motion. Therefore air must be that stuff that, when it moves, causes flags to flap, hay fields to wave, and dry leaves to skip across the lawn. Wind is air with motion and so conversely, air is wind, with or without motion. But to me as a child, wind was much more interesting than air because it was happening; it had motion and dynamics. Wind had energy.

  ***

  Our world is intrinsically a dynamic place, always with things in motion. In fact our brains seem to be hardwired to notice motion. You can stare into the woods and not see a single animal among the thousands of branches, leaves and trunks until a chipmunk or a sparrow moves. Then our consciousness focuses on that creature. It is the animal’s dynamics that grabs our attention. We are attuned to motion, dynamics and the expenditure of energy.

  A ball that has been pitched and is curving over the green earth has energy. A puck sliding on ice, a planet in orbit, or an electron circling the nucleus of an atom all have energy. Energy is what makes things happen. Without energy we would (at best) be living in a still life, a place where time is meaningless, a static and silent world.

  Wind is air with energy.

  Energy gives the world life and dynamics, but energy is more than just motion. Energy is also the possibility or the potential of motion. A skier at the top of a snow-filled slope has the potential of great motion by just pointing their skis downhill; dynamite and gasoline can cause motion with the aid of a spark; a sweet dessert ought to drive us to greater motion than it often does.

  There is one more category of energy besides motion and potential: heat. Heat is a type of energy that does not fit. It is not the pent-up energy of the skier or a wound-up spring, and it does not appear to have much to do with motion. The glowing iron in a blacksmith’s forge just sits there. But heat really is energy, and it was the study of heat that led us to quantum mechanics. And in that field, in the act of quantizing light and energy, we will find Boltzmann’s and Planck’s constants and the Planck length, one of the smallest things in nature.

  ***

  Heat is energy. This most basic statement eluded scientists for centuries. Up until the eighteenth century heat was seen as being its own substance: caloric. Wood that could burn contained a lot of caloric, and by burning the wood you were releasing the caloric. But then Count Rumford (Benjamin Thompson, 1753–1814) made a most astute observation. Thompson was an American inventor who, as a loyalist, left New England during the American Revolution. He passed through London and eventually settled in Bavaria, where he was attached to the army. It is at the arsenal in Munich that we start our story of heat.

  At the arsenal cannons were cast, but the casting was rough and the bores needed to be drilled out. This involved a large drill bit and a pair of horses walking around and around. At best it was a long process, and at worst, with a dull bit, it produced a lot of heat. So Rumford convinced the arsenal to perform an experiment for him. He placed the raw cast cannon in a barrel of water, with a very dull drill bit turning in the bore. The barrel was wrapped in flannel to insulate it and keep the heat in. He then started the horses turning the bit and measured the temperature of the water. As time went on the water became hotter and hotter. After two and a half hours the water started to boil! Rumford had calculated how much caloric should have been in the metal, and there was not enough to boil that much water. In fact, according to the caloric theory, he should eventually have been able to drive all the caloric out of the metal, at which time there would be no more heat. But that is not what he observed. As long as the horses kept walking and the drill kept turning, heat was produced and the caloric was not exhausted. Rumford concluded that the energy in the motion of the drill, which was really the energy from the horses, was being converted into heat, and that heat was energy and not its own separate substance.

  ***

  The fact that energy could be transformed into heat was only one side of the coin. Nicolas Léonard Sadi Carnot (1796–1832) was a French military engineer who was studying the efficiency of steam engines and in 1824 wrote a book entitled Reflections on the Motive Power of Fire. In other words, he was addressing how you get motion out of heat, which is what a steam engine is doing. One of his chief discoveries is something that we still call a Carnot cycle. A Carnot cycle is often presented as a graph of the state of an engine in terms of heat, pressure and temperature. Various stages of a cycle are characterized by the expansion of gases, or temperature changes, or falling pressure. In a steam engine, steam is heated and the pressure rises. The steam expands, driving the piston, and the pressure drops. The steam is condensed or released, the temperature drops, and the cycle starts again.

  What was revealed in Carnot’s graphs and analysis was the efficiency of the engine. The efficiency told you how much fuel you had to put into an engine to get a certain amount of motion out of it. This was terrifically useful at the dawn of the age of steam, but it also contained a curiosity. It showed that there was always an inefficiency, a piece of the energy in the wood or coal that we could never quite tap into, no matter how clever we were with pressures, pistons, temperatures and condensers. Here we see the first hint of entropy: heat energy in an unusable form.

  One way to picture entropy is to think of the energy in a ball propped up on a hillside. We can extract energy from the ball; that is, we can convert its potential energy into motion by tipping the ball and letting it roll down the hill to the bottom of the valley. The amount of motion is related to the differences in height between the hilltop and the valley bottom. But the ball has not given up all of its energy. If the valley flowed down into a plain or the ocean the ball could roll further and gather greater speed. Or if there was a deep mine shaft it could fall further. However, it does not. It is at the bottom of the basin and there is no place left for it to roll. Local geography gives it no further opportunity. It is at its lowest potential and we can extract no more energy from it.

  Carnot’s heat engine can only be perfectly efficient if the heat source is infinitely hot, or the condensers work at absolute zero. Otherwise there is still energy we cannot tap into.
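  That limit has a remarkably compact modern form: efficiency = 1 − T_cold/T_hot, with temperatures measured from absolute zero. The sketch below uses made-up, illustrative temperatures; only the formula is the standard Carnot result:

```python
# Carnot efficiency: the largest fraction of heat any engine can
# turn into motion, set only by the hot and cold temperatures (in K).
def carnot_efficiency(t_hot_k: float, t_cold_k: float) -> float:
    return 1.0 - t_cold_k / t_hot_k

# Illustrative numbers: boiler steam at 450 K, condenser at 300 K.
print(carnot_efficiency(450.0, 300.0))   # ~0.33: two-thirds untapped

# The limit reaches 1 only in the cases the text names:
print(carnot_efficiency(1e9, 300.0))     # infinitely hot source -> ~1
print(carnot_efficiency(450.0, 0.0))     # condenser at absolute zero -> 1
```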

  ***

  I am trying to tell a story of how our ideas of heat and thermodynamics lead us logically along a path where first we realize that heat is a type of energy, then that energy is conserved (the first law of thermodynamics), and finally that entropy always increases (the second law of thermodynamics); it would be nice if history followed that same sequence. But history has its own agenda and did not produce events in the order I would like them to have done. I would have liked to start with the most basic statement about energy, namely that energy is conserved. But that idea really was a latecomer. After Carnot it looked like some energy was lost into entropy every time it was transformed. It is only after demonstrating that heat is energy and that heat may have a very real but unusable form called entropy, that we can formulate the law of the conservation of energy.

  This is what Rudolf Julius Emanuel Clausius (1822–1888) finally did in 1850. He welded together energy, heat and entropy in a coherent form that we now know as the first two laws of thermodynamics. First, energy is conserved. It is neither created nor destroyed, but it can change its form. Secondly, when it changes its form some of it may become an unusable form that is called entropy. Within a system entropy will always rise and that rise cannot be reversed.

  The second law of thermodynamics was a very new and different type of law from what people were used to. It states that as time goes on entropy can only rise. This is in contrast to the great laws of physics: Newton’s laws of motion. Newton’s findings served as the archetypes of such laws: they showed us what physical laws should be like. Newtonian mechanics is also about motion and energy. According to Newton’s laws, if billiard balls collide, energy is conserved by being transferred from one ball to another. But all the events that people used these laws for could be reversed. Let a bat hit a ball and Newton’s laws tell us what to expect. If you reversed time and ran the video backwards, you would see a ball hitting a bat and still Newton’s laws would tell you the outcome: Newton’s laws work forwards and backwards in time. Thermodynamics is very different; time flows in only one direction.

  There have been a number of scientists, including the astronomer Eddington and physicist Einstein, who have said that the second law of thermodynamics is the most fundamental principle in nature. It has also been suggested that entropy is tied to the fact that time only flows into the future. Without entropy we could, in some sense, not distinguish yesterday from tomorrow. So in the second half of the nineteenth century the second law, the continuous increase of entropy, was not only seen as fundamental, but as sacred.

  It would take Ludwig Boltzmann to finally link Newton’s laws and the second law of thermodynamics, and he would trigger a revolution in the process. Because he dared to touch the second law he would bring an avalanche of criticism and persecution upon himself, and would be driven to the brink of insanity.

  ***

  Back in Chapter 3 we encountered James Maxwell and his description of a collection of molecules bouncing around inside a bottle. What he was developing was his kinetic theory of gases. His goal was to start with a model of molecules bouncing around, let them follow Newton’s laws of motion, and try to derive such properties as pressure and temperature. Raise the temperature and the molecules have more energy; they move faster and have harder collisions with the walls of the container, which we see as an increase in pressure.

  A number of other people, including Daniel Bernoulli (1700–1782), had tried to derive gas laws from Newton’s equations, but Maxwell brought a great deal of mathematical prowess to the problem. For instance, instead of just describing the molecules as all having some motion, Maxwell derived the distribution of motions the molecules must have. His work was a real tour de force, but its sophistication may have defeated a number of his contemporaries. In Austria, however, it struck a resonance.

  In the 1860s Joseph Stefan (1835–1893) was building up a physics institute at the University of Vienna. He had already attracted Loschmidt, who made the first estimate of the size of a molecule. He also captured a young student, Ludwig Eduard Boltzmann (1844–1906), and gave him a copy of Maxwell’s writings and a book of English grammar to help him read it. Boltzmann completed his PhD in 1866 and in 1869 obtained the chair of mathematical physics at Graz, in the southwest corner of modern Austria. From his position there he published a paper on the kinetic theory of gases, in which he put Maxwell’s distribution on an even firmer mathematical footing. In fact we now call that distribution the Maxwell–Boltzmann distribution.

  Boltzmann also derived the second law of thermodynamics from the kinetic theory of gases. In doing so, he stepped into a hornets’ nest that would dominate his reputation and career for the rest of his life. Boltzmann had a hard time publishing his description of heat because, as the journal’s editor reminded him, molecules were only “hypothetical.” Chemists, following Dalton, were well ahead of physicists at this time.

  The key to Boltzmann’s success was that instead of dealing with the motion of individual molecules he dealt with the statistical distribution of their motion. Instead of talking about a molecule traveling at “930 feet per second” (283 m/s) he would talk about the probability of there being molecules traveling at that speed. He would also talk about the evolution of the probability distribution and not the evolution of any one molecule’s trajectory. This was taking Maxwell’s ideas of distribution and extending them to one more level of abstraction. We now call the method developed by Boltzmann statistical mechanics, and it concerns the dynamics and evolution of the statistics and not the dynamics of the particles.
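  To make that abstraction concrete, here is a minimal sketch of the Maxwell–Boltzmann speed distribution; the choice of nitrogen at room temperature is an illustrative assumption, not something from the text:

```python
import math

# Maxwell-Boltzmann speed distribution: the probability density f(v)
# of finding a molecule at speed v, for a gas of mass-m molecules at
# temperature T. This is the distribution Boltzmann put on a firmer
# mathematical footing.
K_B = 1.380649e-23          # Boltzmann's constant, J/K
M_N2 = 28 * 1.6605e-27      # mass of an N2 molecule, kg (assumed gas)
T = 293.0                   # room temperature, K (assumed)

def speed_density(v: float) -> float:
    a = M_N2 / (2 * K_B * T)
    return 4 * math.pi * (a / math.pi) ** 1.5 * v * v * math.exp(-a * v * v)

# The text's example speed: 930 feet per second, about 283 m/s.
print(f"density at 283 m/s: {speed_density(283.0):.2e} per (m/s)")

# The most probable speed, where the distribution peaks:
v_p = math.sqrt(2 * K_B * T / M_N2)
print(f"most probable speed: {v_p:.0f} m/s")
```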

  In 1873 Boltzmann took a post at Vienna, back at the cosmopolitan capital of the country and at the center of active physics. It was here that his old friend Loschmidt explained to him the problem of statistical mechanics—what is now referred to as Loschmidt’s paradox. Statistical mechanics is based on Newton’s laws of motion, which are reversible. However, Boltzmann had derived the second law of thermodynamics—the fact that entropy always increases—which is not time reversible. The two theories therefore appeared to be logically inconsistent and incompatible. A number of Boltzmann’s contemporaries felt that there must be a deep flaw in statistical mechanics to have produced this inconsistency, but it was not clear how.

  To Boltzmann, a system starts with a high degree of order and decays into a state of disorder. Order and entropy are seen as being opposites. To illustrate this point, picture a pool table with fifteen balls racked up in a triangle with a game ready to start (see Figure 5.1). At this moment there is a high degree of order. Now when the cue ball breaks the racked balls, the balls scatter in all directions and roll for a few seconds. The balls are now chaotic, with a low degree of order and a high degree of entropy, even though every ball and every collision exactly followed Newton’s laws of motion.

 
