How Big is Big and How Small is Small


by Smith, Timothy Paul


  Time is something that we can measure very well because we can isolate atoms and get them to do the same atomic transition again and again in a very repeatable fashion. The heart of a cesium clock is the transition between two atomic states in cesium-133 atoms. Light from this transition has a unique frequency and when that light oscillates 9,192,631,770 times, that is our modern definition of one second. The challenge in accurate clocks has been to separate those cesium atoms from external effects. So the “fountain” in the clock’s name is a way of letting the atoms free-fall, to remove gravitational effects. They are also cooled with lasers to remove thermal jitter.

  With an accuracy of 10⁻¹⁶, these clocks represent our best measurement of any type, better than length or mass or energy. The second is exactly 9,192,631,770 oscillations, and a day is defined as 86,400 of these cesium-based seconds. That is close to the time it takes the Earth to rotate once, but the rotation rate no longer defines time.
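
  One way to read that accuracy, assuming the fractional error of 10⁻¹⁶ simply accumulates, is as the time it would take the clock to drift by a full second:

  \[ \frac{1\ \text{s}}{10^{-16}} = 10^{16}\ \text{s} \approx \frac{10^{16}\ \text{s}}{3.2\times10^{7}\ \text{s/yr}} \approx 3\times10^{8}\ \text{years}, \]

  or roughly 300 million years.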

  With time measured so much better than anything else, it also gives us a new way of setting distance standards. In fact we now define the speed of light by saying that in one second light travels exactly 299,792,458 meters. We will never add more digits to that number. It is 299,792,458.000,000. That means that the meter is the distance light travels in 1/299,792,458 s. This also matches the ideal that the original designers of the metric system had: anyone anywhere with a good clock can create a standard meter.
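
  In numbers, taking the defined value of the speed of light at face value, the time for light to cross one meter is tiny:

  \[ \frac{1\ \text{m}}{299{,}792{,}458\ \text{m/s}} \approx 3.34\times10^{-9}\ \text{s} \approx 3.3\ \text{ns}. \]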

  ***

  Now let us take our clocks and start to measure our world. We will start by looking at things that take about a second and then look at the types of events that take even less time. In the next chapter we will look at timescales greater than a second.

  As I said at the beginning of this book we can think of a second as being about the time between heartbeats. It is also about the time it takes for an old-fashioned clock to “tick-tock.” In playground games it is the time it takes to say “one thousand” or “Mississippi.”

  A bit faster than a second is the timescale of sports. One of the fastest baseball pitches ever recorded was thrown by Aroldis Chapman of the Cincinnati Reds, clocked at 105 mph (46.9 m/s). Since it is 60 feet 6 inches (18.44 m) from the pitcher’s mound to home plate, that means the ball took about 0.39 s to get there.
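
  The arithmetic, using the figures quoted above, is just distance over speed:

  \[ t = \frac{d}{v} = \frac{18.44\ \text{m}}{46.9\ \text{m/s}} \approx 0.39\ \text{s}. \]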

  Soccer goalkeepers face an even greater challenge. In a penalty kick, the ball is typically moving at about 30 m/s over a distance of 11 m, so the goalkeeper has about 0.37 s to respond. One of the fastest soccer shots recorded was nailed by David Hirst, clocked at 114 mph (51 m/s) in 1996. He was playing for Sheffield Wednesday against Arsenal. The shot was taken from about 15 m out, which means the goalkeeper had roughly 0.3 s to respond. Unfortunately for Sheffield Wednesday the ball hit the crossbar and bounced out.
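
  Again, both numbers come straight from distance over speed, using the figures just quoted:

  \[ t_{\text{penalty}} = \frac{11\ \text{m}}{30\ \text{m/s}} \approx 0.37\ \text{s}, \qquad t_{\text{Hirst}} = \frac{15\ \text{m}}{51\ \text{m/s}} \approx 0.29\ \text{s}. \]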

  What makes most ball-based sports exciting is this matching of human response time to the motion of the ball. Human response time for young adults is about 0.2 s, which is well matched to sports. If all shots on goal were from 30 m out, or pitchers threw from second base, goalkeepers and batters would almost always intercept the ball. If much closer, they could rarely stop it.

  ***

  A bit faster than sports times are the frame rates of film and video. The brain can process only about a dozen distinct images a second, so if it is presented with more than that, an illusion of continuous motion is created. Some early movies were filmed at 14 frames per second (FPS) but, over time, the frame rate has generally increased. Presently most films and videos are 24, 25 or 30 FPS, but there is a push to increase this. A lot of sports are recorded at much higher rates so that they can be played back in slow motion. Also, modern screens and projectors are more versatile than the projectors of a century ago, so these numbers can easily change. Still, 25 FPS corresponds to a time of 0.04 s per image.

  Nerve pulses in mammals move at a speed of about 100 m/s. The longest human nerve (from the foot to the center of the back) is about 1 m long, which means a signal will travel the length of a nerve in about 0.01 s. By modern standards that may not seem like such a short period of time, but before Hermann von Helmholtz (1821–1894) made his measurement in 1850 it was thought to be so fast that it was not even worth trying to measure. What Helmholtz did was take frogs’ legs, with the nerves attached and exposed, and mount them next to a swiftly rotating drum. The legs were attached to a stylus such that when they twitched they left a mark on the drum. Helmholtz then stimulated the nerves. By looking at the marks from the stylus on the drum, and how far the drum had rotated, he determined the muscle’s response time. He then changed the point of stimulation on the nerves and was able to measure a nerve-pulse propagation speed of about 30 m/s. Nerves in mammals are a bit different from those in frogs, and the speed can be up to 120 m/s.
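
  In symbols, the travel time quoted above is just length over speed, and Helmholtz’s two-point trick gives the speed from the change in response time when the stimulation point is moved a distance Δx along the nerve:

  \[ t = \frac{1\ \text{m}}{100\ \text{m/s}} = 0.01\ \text{s}, \qquad v_{\text{nerve}} = \frac{\Delta x}{\Delta t}. \]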

  Track events are now recorded to the hundredth of a second. Actually they are timed at an even finer level, but then rounded off and reported to the nearest hundredth. Getting times that accurate requires electronic starts and stops, but the human eye can actually see the difference. For the one hundred meter dash at Olympic speeds, one hundredth of a second corresponds to about 10 cm, a distinction a sharp-eyed finish-line judge can see.
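
  That 10 cm follows from a rough sprint speed of about 10 m/s (100 m in about 10 s):

  \[ \Delta x = v\,\Delta t \approx 10\ \text{m/s} \times 0.01\ \text{s} = 0.1\ \text{m} = 10\ \text{cm}. \]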

  A thousandth of a second is called a millisecond (ms). It is about the time it takes a nerve pulse to jump across the synaptic gap between nerves. A fly can flap its wing in about 3 ms.

  ***

  Lightning is a spark of electricity that passes between the Earth and a thundercloud in about 10⁻⁵ s, or 10 μs. Actually that is just the central step of lightning. Before a strike a leader will form: a channel of ionized air that gives the discharge an easier path. That formation can take tenths of a second. The discharge itself is really too fast for us to see, but in the process it superheats the air. What we see is hot glowing air, which lingers after the discharge itself.

  The muon, a subatomic particle that acts like a heavy electron, lives for about 2 μs (2 × 10⁻⁶ s) if it is standing still. In fact the muon’s lifetime was the subject of one of the first experiments to test special relativity. If a large number of muons were traveling at nearly the speed of light, and there were no relativistic effects, half of them would decay in about 600 m. So experimenters counted the number of muons streaming down from the sky, first on a mountain top. Then they repeated their measurement at a lower elevation and found that far fewer than half of them had decayed over 600 meters. This was exactly what special relativity had predicted. The lifetime of a particle is according to its own measurement of time, and when something is in motion, especially as it approaches the speed of light, its internal clock slows down. According to the muon, it had traveled that 600 m in less than 2 μs (time dilation) or, equivalently, the distance it traveled had shrunk (length contraction).
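
  The 600 m figure follows from the lifetime and the speed of light, and the relativistic version simply stretches it by the Lorentz factor γ (a symbol introduced here for the sketch, not used in the text):

  \[ d \approx c\,\tau = (3\times10^{8}\ \text{m/s})(2\times10^{-6}\ \text{s}) = 600\ \text{m}, \qquad d_{\text{relativistic}} \approx \gamma\,c\,\tau. \]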

  Relativity also affects very accurate clocks that travel at much slower speeds. A GPS satellite travels at about 14,000 km/h, so special relativity slows its clocks by about 7 μs a day. Actually there is a larger effect due to the satellites sitting high in the Earth’s gravitational field: general relativity speeds up the satellites’ clocks by about 45 μs a day. If these effects were not corrected, the positions we compute from the satellites would be off by roughly ten kilometers after a single day, and would get worse as time went on.
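
  A rough estimate, treating the accumulated timing error as a simple ranging error (a simplification; the real error budget is more involved), shows why:

  \[ \Delta t \approx 45 - 7 = 38\ \mu\text{s per day}, \qquad c\,\Delta t \approx (3\times10^{8}\ \text{m/s})(3.8\times10^{-5}\ \text{s}) \approx 11\ \text{km}. \]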

  Quicker than a microsecond is a nanosecond (ns), a billionth of a second. Gigahertz means a billion cycles per second, so a 2-GHz computer has a clock generating two ticks per nanosecond. Light can travel 30 cm, or about a foot, in a nanosecond. Electrical pulses in a wire are slower than light, which means that if I had a computer with a 10-GHz clock, two pulses would be separated by about the width of a chip, which must make for a very interesting design challenge.
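
  Those distances are just the speed of light times the clock period:

  \[ c \times 1\ \text{ns} = (3\times10^{8}\ \text{m/s})(10^{-9}\ \text{s}) = 0.3\ \text{m}, \qquad \frac{c}{10\ \text{GHz}} = \frac{3\times10^{8}\ \text{m/s}}{10^{10}\ \text{s}^{-1}} = 3\ \text{cm}. \]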

  During the building of the first atomic bomb the term shake was coined to refer to 10 ns. The term was inspired by the phrase “two shakes of a lamb’s tail,” but it actually referred to the amount of time between steps of a nuclear chain reaction. In the previous chapter we briefly described the fission of a uranium nucleus into krypton, barium and three neutrons. Those neutrons may travel a few centimeters before they are absorbed into another nucleus to start the next step of the chain reaction. It takes about 10 ns, or one shake, for that to happen. Depending upon the particular fuel and the other materials in the reactor core or bomb, the designer can vary this time a little bit.

  As we look at faster and faster events we are approaching the intrinsic speed of our clocks. The standard transition in cesium-133 produces photons with a frequency of 9,192,631,770 Hz, which means a wave passes by in about 10⁻¹⁰ s. With a wavelength of about 3 cm, this is actually a pretty long wave. We could have used a different transition. For example, the red light from a helium–neon (HeNe) laser has a wavelength of 633 nm and a frequency of 4.7 × 10¹⁴ Hz, so a wave goes by every 2 × 10⁻¹⁵ s. The reason we do not use these transitions for clocks is that they are not as stable or as reproducible as the cesium-133 transition. However, this uncertainty in faster atomic transitions is something we can use to push our measurements to even shorter times.
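
  The periods and wavelengths quoted here all follow from two relations, period = 1/frequency and wavelength = c/frequency:

  \[ T_{\text{Cs}} = \frac{1}{9.19\times10^{9}\ \text{Hz}} \approx 1.1\times10^{-10}\ \text{s}, \qquad \lambda_{\text{Cs}} = \frac{c}{f} \approx 3.3\ \text{cm}; \]

  \[ f_{\text{HeNe}} = \frac{c}{633\ \text{nm}} \approx 4.7\times10^{14}\ \text{Hz}, \qquad T_{\text{HeNe}} = \frac{1}{f_{\text{HeNe}}} \approx 2.1\times10^{-15}\ \text{s}. \]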

  ***

  One of the most discussed, most mysterious and most misunderstood concepts in quantum mechanics is the Heisenberg uncertainty principle. It is a relationship between how well you can know two related quantities such as the position and momentum of a particle. Its most infamous misuse is to declare the unknowability of something: “according to Heisenberg, I can’t know anything.” What Heisenberg really tells us is that if we measure one quantity of a particle with great accuracy and rigor we will have disturbed the system so much that the complementary quantity can only be measured with limited accuracy. So if you measure position well it affects momentum; if you measure energy well it affects the time of the event.

  For example, imagine you want to measure the energy of a ball that has been kicked or thrown. You could hang a block of foam from a rope in front of the ball. When the ball hits the block, the block swings back. The swing is the measurement; the more it swings, the more energy the ball had. But in the process you most certainly have affected the ball. It would now be hard to answer the question of how far the ball would have traveled if the block had not been there. You can try to come up with better methods of measuring energy: a smaller block will affect the ball less, or perhaps you could just scatter light off the ball, which is essentially what a radar gun does. But even these more benign techniques will still affect the trajectory of the ball.

  This is where Heisenberg started, and his principle transcends any particular apparatus or technique and talks about interactions at the level of quantum particles and waves. The most familiar form of the uncertainty principle is a relationship between position and momentum. But the principle also says that there is a relationship between energy and time: the uncertainty in energy multiplied by the uncertainty in time is at least of the order of Planck’s constant.
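
  In symbols, the energy–time relation is usually written with the reduced Planck constant ħ = h/2π; the statement above drops that small numerical factor:

  \[ \Delta E\,\Delta t \gtrsim \frac{\hbar}{2}, \qquad \hbar \approx 1.05\times10^{-34}\ \text{J·s} \approx 6.6\times10^{-16}\ \text{eV·s}. \]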

  Planck’s constant is a very small number so this relationship rarely impedes laboratory measurements. But it is not just a limit of experiments designed by humans. It is also a limit on measurements nature can make upon itself. This may seem odd, but it is true. “Measurement” sounds like such a deliberate act, but in fact measurements are happening all of the time. If a cue ball hits another ball on a pool table, the second ball goes off with a well-determined speed and direction. The second ball’s momentum is set because it “measured” the momentum lost by the cue ball. An interaction or a collision is a measurement, and therefore Heisenberg’s principle is relevant.

  An application of the uncertainty principle is to look at the width of a spectrum line. An atomic spectrum usually looks like a black band with some thin strips of different colors on it. It is created by heating or sparking a gas until it glows. The light from the glow is then passed through a prism or grating that separates the colors into a rainbow, with red at one end, green in the middle and violet at the other. If we have a simple and pure gas and a good set of optics we will see distinct bright lines.

  For example, if we use hydrogen we will see four lines: one red, two blue and one purple. There is also an ultraviolet line that we do not see, as well as lines in the infrared. Photons that contribute to that red line were created when the electron in a hydrogen atom was excited into the third orbit and then fell back into the second orbit. The energy lost in that fall from the third to the second orbit was converted to a photon and that energy determined its wavelength and hence the fact that it is red. But if we look closely at that spectrum line we see that it is not a perfect mathematical line; rather it is a band with a width. If we look across that band the hue of the red slightly changes. That is because the photons that make this band have slightly different energies: there is a spread or uncertainty in their energies. This spread is exactly related to the amount of time the electrons spent in that excited state, that third orbit. The longer an electron dwells in an excited orbit, the more time it has had to settle, the more precise its energy. So we can use the spread in energy or color to calculate the time spent in the excited orbit. By looking at line width, we measure time.

  A well-studied spectral line is the one related to the jump between the first and second orbit of the hydrogen atom. We do not see this one because it is in the ultraviolet. The jump itself has an energy of 10 eV. The line itself is exceedingly narrow, at only 4 × 10⁻⁷ eV. The time related to this line is a bit more than a nanosecond:
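
  Taking the uncertainty relation in the rough form Δt ≈ ħ/ΔE (a sketch of the arithmetic; the exact numerical factor is a matter of convention, and this need not be the author’s own calculation):

  \[ \Delta t \approx \frac{\hbar}{\Delta E} \approx \frac{6.6\times10^{-16}\ \text{eV·s}}{4\times10^{-7}\ \text{eV}} \approx 1.6\times10^{-9}\ \text{s}. \]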

  In fact most atomic transitions are measured in nanoseconds or slower.

  ***

  If we want to find events in nature that happen faster than atomic transitions we need to look at things that are smaller than atoms. So we will look at the decay of exotic particles. In Table 8.1 I include a very short list of particles and their lifetimes. It is by no means exhaustive. I have selected these because they show trends.

  The pions (π⁺, π⁻, π⁰) are related to the nuclear force. The sigma particles (Σ⁺, Σ⁻, Σ⁰) are much like protons or neutrons except that they have a strange quark in them. The omega particle (Ω⁻) is made of just strange quarks. The delta particles (Δ⁻, Δ⁰, Δ⁺, Δ⁺⁺) are like neutrons or protons but with their quark spins aligned. We will talk more about these later.

  Table 8.1 Exotic subnuclear particles: lifetimes and decay forces.

  Particle            Lifetime (s)     Decay force
  π⁺, π⁻              2.6 × 10⁻⁸       weak
  π⁰                  8.5 × 10⁻¹⁷      strong
  Σ⁺, Σ⁻              8.0 × 10⁻¹¹      weak
  Σ⁰                  7.4 × 10⁻²⁰      strong
  Ω⁻                  8.2 × 10⁻¹¹      weak
  Δ⁻, Δ⁰, Δ⁺, Δ⁺⁺     5.0 × 10⁻²⁴      strong

  All three pions are nearly the same in mass and structure, but the π⁰ decays a billion times faster than the other two. All three sigma particles are nearly the same in mass and structure, but the Σ⁰ decays a billion times faster. Clearly there is something special going on. A billion is a huge factor when comparing the lifetimes of such similar particles. It would be as if a germ that lived an hour had a cousin that lived 100,000 years. That is a factor of a billion.
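
  The ratios come straight from the lifetimes in Table 8.1, and the germ analogy checks out in hours:

  \[ \frac{2.6\times10^{-8}}{8.5\times10^{-17}} \approx 3\times10^{8}, \qquad \frac{8.0\times10^{-11}}{7.4\times10^{-20}} \approx 1\times10^{9}, \qquad 100{,}000\ \text{years} \approx 100{,}000 \times 8{,}800\ \text{h} \approx 9\times10^{8}\ \text{hours}. \]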

  Figure 8.2 The discovery of the omega particle. Left: a photograph of a bubble chamber and particle tracks. An accelerator creates a spray of various subnuclear particles. When the particles pass through the chamber they leave a trail of bubbles that we can see. Right: Each track is identified as a particular type of particle. Courtesy of Brookhaven National Laboratory, 1964.

  A picture is said to be worth a thousand words, so I am including a photograph of the discovery of the omega particle (see Figure 8.2). On the left is a photograph taken at Brookhaven National Laboratory. The white lines are really lines of bubbles left by particles passing through liquid hydrogen. On the right are the interesting tracks with the particle names. The whole image is about half a meter wide. Because the energy and speed of each particle was known, its lifetime could be determined. That omega particle traveled only a few centimeters, which translates to a lifetime of about 10⁻¹⁰ seconds. In this way we can measure the lifetimes of fast-moving but short-lived particles. Relativity also helps: much like the muons we talked about a few pages ago, fast-moving particles live longer and travel farther.
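
  To make the track-length logic concrete, here is a minimal sketch of the arithmetic in Python. The speed (as a fraction of c) and the track length are illustrative assumptions, not the actual Brookhaven values; the point is only that length, speed and the Lorentz factor together give a proper lifetime in the 10⁻¹⁰ s ballpark.

```python
import math

def lifetime_from_track(track_length_m, beta):
    """Estimate a particle's proper lifetime from its track length.

    beta is the particle's speed as a fraction of the speed of light.
    The lab-frame flight time is length / speed; dividing by the
    Lorentz factor gamma converts it to the particle's own (proper) time.
    """
    c = 2.998e8  # speed of light, m/s
    gamma = 1.0 / math.sqrt(1.0 - beta ** 2)
    lab_time = track_length_m / (beta * c)
    return lab_time / gamma

# Illustrative numbers only: a 3 cm track at 90% of light speed
print(lifetime_from_track(0.03, 0.90))  # about 5e-11 s, i.e. a few times 10^-11 s
```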

  But why do the π⁰ and Σ⁰ decay quickly when compared to their siblings? The reason is that these decays are due to the strong force, whereas the π± and Σ± decay due to the weak force. We touched upon these forces briefly in the last chapter, on energy. Weak decays are the transitions that set the burn rate for hydrogen in the Sun, as well as the decay of neutrons. They set the reaction times for both nuclear fission and fusion. And as we can see in the table above, weak decay is a billion times slower than strong decay. In fact it was this factor of a billion that helped people recognize that the weak and strong forces are distinct types of interaction.

  Once we identify that the Σ± undergoes a weak decay in which a strange quark decays, it is easy to recognize that the same thing is happening inside the omega particle. The delta particle, however, is something else.

  ***

  We often think of a proton or a neutron as a solid, hard sphere in the center of an atom. We are used to the idea that an atom can be excited, and we know that, related to that excitation, atoms can radiate light. But the idea that a proton or neutron might also be excited seems foreign. Yet that is exactly what a delta particle is.

  Inside of a proton or neutron there are quarks that travel around each other, much as electrons swirl about the nucleus of the atom. The quarks are confined to a region about 1 × 10⁻¹⁵ m across. That is a region 100,000 times smaller than an atom, and the energies are about a billion times greater. Each quark has a number of properties that affect how it interacts with the other quarks. For the moment I want to focus on only one property, called spin. Spin interactions are like the interactions between bar magnets. If I have two magnets next to each other they would like to align themselves with opposite poles together: opposites attract. In a neutron or a proton one quark will have its spin opposite the other two, to minimize this magnetic tension. If you flip the spin direction of that quark so that all three spins are aligned, you need to add a lot of energy. You have effectively “excited” that neutron or proton, creating a new particle called the delta.
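
  For scale, the 100,000 figure assumes the usual 10⁻¹⁰ m for the size of an atom:

  \[ \frac{10^{-10}\ \text{m (atom)}}{10^{-15}\ \text{m (nucleon)}} = 10^{5} = 100{,}000. \]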

 
