Shufflebrain
Page 5
I mention phase-only holograms at this juncture to make a point about hologramic mind. The frequencies and energy levels in the nervous system do not remotely approach those of light. For this reason, we can't make a literal comparison between optical and neural holograms, at least not when applying hologramic theory. Moreover, the existence of phase-only holograms tells us that amplitude variations need play no indispensable role in the storage of information in the brain. Phase is the essence of hologramic mind!
Before I supply more background on holograms, per se, let me raise still another important preliminary question: what do we actually mean when we use the word wave? Let's take an introspective look.
***
Many of us first became acquainted with the word wave when someone hoisted us from the crib or playpen, gently kissed us on the side of the head, and coaxed, "Wave bye-bye to Uncle Hoibie!" Later, we may have thought "wave" as we pressed a nose against the cool windowpane and watched little brother Ben's diapers waving on the clothesline in the autumn breeze. Then one Fourth of July or Saint Patrick's Day, our mother perhaps gave us a whole quarter; we ran to the candy store on the corner, and, instead of baseball cards and bubble gum, we bought a little American flag on a stick. We ran over to the park and waved the little flag to the rhythm of the march; then we began to laugh our heads off when the mounted policemen jiggled by in their saddles, out of time with each other and the beat of the drums and the cadence we were keeping with the little waving flag. Still later, perhaps, we learned to wave Morse code with a wigwag flag, dot to the right, dash to the left. Up early to go fishing, when radio station W-whatever-the-heck-it-was signed on, we may have wondered what "kilocycles" or "megahertz" meant. And it was not until after we began playing rock flute with the Seventh Court that the bearded electronic piano player with the Ph.D. in astronomy said that "cycle" to an engineer is "wavelet" to a sailor, and that the hertz value means cycles per second--in other words, frequency. If we enrolled in physics in high school, we probably carried out experiments with pendulums and tuning forks. An oscillating pendulum scribed a wave on a smoked, revolving drum. A vibrating tuning fork also created waves, but of higher frequency: 256 cycles per second when we used the fork with the pitch of middle C on the piano. Moving down an octave, according to the textbook, would give us 128 hertz.
Are our usages of wave metaphorical? The word metaphor has become overworked in our times. While I certainly wouldn't want to deny waves to poets, I don't think metaphor is at the nexus of our everyday usage of wave. Analog is a better choice: something embodying a principle or a logic that we find in something else. (Notice the stem of analog.)
To and fro, rise and fall, up and down, over and under, in and out, tick and tock, round and round, and so on... Cycles. Periodicities. Recurrences. Undulations. Corrugations. Oscillations. Vibrations. Round-trip excursions along a continuum, like the rise, fall, and return of the contour of a wavelet, the revolutions of a wheel, the journey of a piston, the hands of a clock. These are all analogs of waves.
Do we really mean that pendular motion is a symbolic expression of the rotations of a clock's hands? No. The motion of one translates into continuous displacements of the other. Is the ride on a roller coaster an allegorical reference to the course of the tracks? Of course not. The conduct of the one issues directly from the character of the other, to borrow a phrase from a John Dewey title. And why would we suppose that a pendulum or a tuning fork could scribe a wave? The answer is that the same logic prevails in all periodic events, patterns, circumstances, conditions, motions, surfaces, and so forth.
No, a child's hand isn't the same thing as a fluttering piece of cloth or the ripples on a pond. And yes, there's imprecision and imperfection in our verbal meanings; we wouldn't want it otherwise. Poetry may exist in all of this. Yet by our literal usages of wave we denote what Plato would have called the idea of waviness, the universal logic revealed by all things wavy. And that logic translates, completely, into amplitude and phase. And if the medium stores phase information, we have a species of hologram.
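The claim that the logic of waviness "translates, completely, into amplitude and phase" can be put in concrete terms. The sketch below is purely illustrative (the names and numbers are hypothetical): given a fixed frequency, a simple periodic event is pinned down entirely by two numbers, its amplitude and its phase, and two events sharing those numbers are, wave for wave, the same.

```python
import numpy as np

# Illustrative sketch: any idealized periodic "wavy" event -- a pendulum's
# swing, a ripple on a pond -- reduces, at a given frequency, to just two
# numbers: amplitude A and phase phi.  The variable names are hypothetical.

t = np.linspace(0.0, 1.0, 1000)   # one second of "time"
freq = 2.0                        # cycles per second (hertz)
A, phi = 1.5, np.pi / 3           # amplitude and phase

pendulum_swing = A * np.cos(2 * np.pi * freq * t + phi)
water_ripple   = A * np.cos(2 * np.pi * freq * t + phi)

# Identical amplitude and phase mean identical waveforms, point for point.
assert np.allclose(pendulum_swing, water_ripple)

# A phasor -- the complex number A * e^(i*phi) -- packs both numbers into
# one: its magnitude is the amplitude, its angle is the phase.
phasor = A * np.exp(1j * phi)
print(abs(phasor), np.angle(phasor))  # recovers A and phi
```

The phasor at the end is the standard bookkeeping device physicists use for exactly this reduction: one complex number per wave, magnitude for amplitude, angle for phase.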
***
Not all physics is about waves, of course. The liveliest endeavor in that science today, the pursuit of the quark, is a search for fundamental particles -- discrete entities -- of mass-energy. The photon is a light particle. Light is both particles and waves. The same is true of all mass-energy at the atomic level. The electron microscope, for example, depends on electrons, not as the particles we usually consider them to be but as the electron waves uncovered in the 1920s as the result of de Broglie's theories. And one of the tenets of contemporary physics is that mass-energy is both particulate and wavy. But when we are dealing with particles, the wavy side of mass-energy disappears; and when it is measured as waves, mass-energy doesn't appear as particles. If you want to concentrate on corpuscles of light, or photons, you must witness the transduction of a filament's mass-energy into light, or from light into some other form, as occurs in the quantized chemical reactions in our visual pigment molecules. But if the choice is light on the move between emission and absorption, the techniques must be suitable for waves.
Physics would be a side show in our times if the logic of waves had been left out of science. And waves might have been left out of science, were it not for Galileo's discovery of the rules of the pendulum. The pendulum taught us how to build accurate clocks. Without a reliable clock, astronomy would be out of the question. And how could anybody contemplate timing something such as the speed of light without a good clock? It was in 1656 that a twenty-seven-year-old Dutchman named Christian Huygens invented the pendular clock.
Huygens wasn't just a back-room tinkerer. His work with the pendulum was the result of his preoccupation with waves. The reader may recognize Huygens's name from his famous wave principle. He had studied the question of how waves spread to make an advancing wave front. Have you ever observed ripples diverging in ever-expanding circles from the point where you drop a rock into the water? If not, fill a wash basin halfway, and then let the faucet drip...drip...drip! Huygens explained how one set of ripples gives rise to the next. He postulated that each point in a ripple acts just like the original disturbance, creating tiny new waves. The new waves then expand and combine to make the next line of ripples, the advancing wave front. A diagram in a treatise Huygens published in 1690 is still the prototype for illustrations of his principle in today's physics textbooks.
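Huygens's postulate can even be checked numerically. The sketch below (a rough illustration with made-up dimensions, not a rigorous diffraction calculation) treats each point on a straight wavefront as a tiny new source, sums the circular wavelets, and finds that directly ahead of the old front the wavelets reinforce one another, while far off beyond its edge they largely cancel.

```python
import numpy as np

# Rough numerical sketch of Huygens's principle (illustrative only).
# Each point on an old, straight wavefront emits a circular wavelet;
# we sum the wavelets a short distance downstream.  In 2-D a wavelet's
# amplitude falls off as 1/sqrt(r).

wavelength = 1.0
k = 2 * np.pi / wavelength
sources = np.linspace(-50, 50, 2001)   # points along the old wavefront

def field_at(x, z):
    """Sum the Huygens wavelets arriving at the point (x, z)."""
    r = np.sqrt((x - sources) ** 2 + z ** 2)
    return np.sum(np.exp(1j * k * r) / np.sqrt(r))

z = 5.0                                # how far the front has advanced
on_front = abs(field_at(0.0, z))       # directly ahead of the old front
off_edge = abs(field_at(200.0, z))     # far beyond the wavefront's edge

# The wavelets rebuild the front straight ahead, not off to the side.
assert on_front > 10 * off_edge
```

In other words, the tiny new waves don't scatter the disturbance everywhere; their mutual interference is what keeps the front marching forward, which is just what Huygens's 1690 diagram depicts.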
Nor is it a coincidence that Huygens, "during my sojourn in France in 1678," proposed the wave theory of light.[3] (He didn't publish his Treatise on Light for another twelve years.)
We can't see light waves. Even today, the light wave is a theoretical entity. And to scholars in Huygens's times, nothing seemed sillier or more remote from reality than light waves.
But on November 24, 1803, Thomas Young, M.D., F.R.S., thought it right "to lay before the Royal Society a short statement of the facts which appear so decisive to me..."
"I made a small hole in a window-shutter, and covered it with a piece of thick paper, which I perforated with a fine needle." [sniff!] Outside the shutter "I placed a small looking-glass...to reflect the sun's light, in a direction nearly horizontal, and upon the opposite wall." And with perforated cards in the path of "the sunbeam," Young produced interference patterns and demonstrated, conclusively, the wavy nature of light.
Young's experiment is a laboratory exercise in physics courses today. It involves two baffles, one perforated in the center by a single pinhole, the other with two pinholes in line with each other but off-center. The experimenter places the baffles between a tiny light source and a screen, locating the one with the single hole nearest the light. When light passes through the single pinhole and then through the two pinholes in the far baffle, interference fringes, dark and light stripes, appear on the screen. What happens if we place a finger over one pinhole in the far baffle? Let's let Thomas Young tell us: "One of the two portions [our pinholes] was incapable of producing the fringes alone." Changes in the intensity of the light don't affect the results. Interference fringes require two sets of waves.
Interference patterns guarantee waves. But Young's work wasn't immediately accepted by British scientists. In fact, if he had not been active in many other fields (the range of his intellect is astonishing), he might never have been allowed to lay another thing before the Royal Society. Young's critics, according to his biographer, George Peacock, "diverted public attention from examination of the truth."[4]
[Figure: It's as though a new wavefront starts out at the slit, whether the waves are light or water.]
But across the English Channel, Napoleon notwithstanding, Young's work found an eloquent and persuasive champion in the person of Francois Arago. And by the time Victoria became Queen, even the English believed in light waves.
It wasn't Arago's original research that vindicated and extended Young's theory, however, but that of Augustin Jean Fresnel, with whom Arago collaborated.
Fresnel! When my mind says "Fray-nel!" in poor Ph.D. language-examination French, certain words of Charles Peirce also surface: "These are men whom we see possessed by a passion to learn...Those are the naturally scientific men."[5] Fresnel's work brought him little renown in his lifetime. But optical physicists today use his name as an adjective. For Fresnel demonstrated just what interference is all about.
Interference occurs whenever waves collide. You've probably seen waves of water cancel each other upon impact. This is destructive interference, which occurs when the rising part of one wave meets the falling part of another. Conceptually, destructive interference is like adding a positive number to a negative number. On the other hand, when waves meet as both are moving up together, interference still occurs, but it is constructive, or additive, and the resulting wave crests higher than either parent. In order to have an interference pattern, some definite phase relationship must exist between two sets of colliding waves. A well-defined phase relationship is coherent and is often referred to as "in step." When colliding waves are incoherent, their interaction produces random effects. An interference pattern is not random; a basic requirement for an interference pattern is coherency.
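The arithmetic analogy above can be made literal. In this minimal sketch, two waves of equal amplitude are added point by point; the only thing varied is the phase difference between them.

```python
import numpy as np

# Constructive vs. destructive interference, in miniature.  Adding two
# identical waves in step doubles the crest; adding them exactly out of
# step (a half-cycle, or pi radians, apart) cancels them everywhere.

t = np.linspace(0.0, 1.0, 1000)

def wave(phase):
    return np.cos(2 * np.pi * 5 * t + phase)

in_step     = wave(0.0) + wave(0.0)     # crests meet crests
out_of_step = wave(0.0) + wave(np.pi)   # crests meet troughs

assert np.isclose(in_step.max(), 2.0)   # constructive: twice as high
assert np.allclose(out_of_step, 0.0)    # destructive: total cancellation
```

Everything in between these two extremes is a partial reinforcement or a partial cancellation, fixed entirely by the phase difference, which is why an orderly, coherent phase relationship produces an orderly pattern and an incoherent one produces only random effects.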
Ordinary light is decidedly incoherent, which is why optical interference patterns aren't an everyday observation. Today, Heisenberg's uncertainty principle[6] accounts for this: wicks and filaments emit light in random bursts. Even if we filter light waves -- screen out all but those of the same amplitude, wavelength, and frequency -- we can't put them in step. In other words, we can't generate a definite phase relationship in light waves from two or more sources.
Young's experiment circumvented the uncertainty principle in a remarkably simple way. Recall that his sunbeam first passed through a single pinhole. Therefore, the light that went through both pinholes in the far baffle, having come from the same point source, and being the same light, had the same phase spectrum. And, coming out of the other side of the far baffle, the two new sets of waves had a well-defined phase relationship, and therefore the coherency to make interference fringes.
Here's what Fresnel did. He let light shine through a slit. Then he lined up two mirrors with the beam, aiming them so as to reflect light toward a screen. But he set the two mirrors at unequal distances from the screen. In so doing, he introduced a phase difference between the waves reflected by each mirror. But because the waves came from the same source (the slit), their phase differences were orderly; they were coherent. And when they interfered, they produced fringes in the form of Fresnel rings.
Interference patterns not only depend on an orderly phase difference; they are precisely determined by that difference. If you are ever in a mood to carry out Young's experiment, see what happens when you change the distance between the two holes (create a phase variation, in other words). You'll find that the farther apart the openings, the narrower the fringes (or beats) will be and the greater the number of fringes on the screen; move the openings closer together, and the fringes broaden and grow fewer.
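The standard textbook relation behind this makes the dependence explicit: for small angles the fringe spacing is wavelength times screen distance divided by hole separation. The numbers below are hypothetical, chosen only to illustrate the trend.

```python
import numpy as np

# Fringe spacing in the two-pinhole experiment (small-angle formula):
#     delta_y = wavelength * L / d
# where L is the pinholes-to-screen distance and d the pinhole
# separation.  Hypothetical numbers, for illustration only.

wavelength = 500e-9   # meters
L = 1.0               # distance to the screen, meters

def fringe_spacing(d):
    return wavelength * L / d

far_apart  = fringe_spacing(2e-3)     # pinholes 2 mm apart
close_pair = fringe_spacing(0.5e-3)   # pinholes 0.5 mm apart

# Closer pinholes => broader, fewer fringes; wider separation =>
# finer, more numerous fringes.
assert close_pair > far_apart
print(far_apart, close_pair)
```

Since the separation d sits in the denominator, the geometry of the fringe pattern is a direct, quantitative readout of the phase relationship between the two sets of waves.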
The hologram is an interference pattern. The distinction between what we call hologram and what Young and Fresnel produced is quantitative, not qualitative. Now, in no way am I being simplistic or minimizing the distinction (no more so than between a penny and a dollar). Ordinary interference patterns do not contain the code for a scene, because no scene lies in the waves' paths. Such patterns do record phase variations between waves, though, which is the final test of all things hologramic. Just to keep matters straight, however, unless I explicitly say otherwise, I will reserve the term hologram for interference patterns with actual messages.
***
The hologram was born in London on Easter Sunday morning, 1947. It was just a thought that day, an abstract idea that suddenly came alive in the imagination of a Hungarian refugee, the late Dennis Gabor. The invention itself, the first deliberately constructed hologram, came a little later. But even after Gabor published his experiments in the British journal Nature the following year, the hologram remained virtually unknown for a decade and a half. Gabor's rudimentary holograms had none of the dramatic qualities of holograms today; perhaps as few as two dozen people, worldwide, appreciated their profound implications. Not until 1971, after holography had blossomed into a whole new branch of optics, did Gabor finally receive the Nobel Prize.
Gabor often related his thinking on that fateful Easter morning. He hadn't set out to invent the hologram. His thoughts were on the electron microscope, then a crude and imperfect device. In theory, the electron microscope should have been able to resolve atoms.[7] (Indeed, some instruments do today.) But in 1947, theory was a long way from practice. "Why not take a bad electron picture," Gabor recounted in his Nobel lecture, "but one which contains the whole of the information, and correct it by optical means?"[8]
The entire idea hinged on phase. And Gabor solved the phase problem with the conceptual approach Albert Einstein had taken in theorizing mass-energy and, eventually, the universe itself. No wonder we lose the phase, Gabor thought, if there is nothing to compare it with! He would need a reference. He would have to deal with phase information in relative, not absolute, terms.
The big technical hurdle was coherency. How could he produce a coherent source? Gabor's answer was very similar in principle to Young's and Fresnel's: Let light shine through a pinhole. If he set a tiny transparent object in the path of the beam, some waves--object waves--would pass through it, while others would miss; the waves that missed the object would still collide with the object waves downstream, and that ought to create interference patterns. Those waves that missed the object would become his reference. And the interference patterns would become a record of the phase and amplitude differences between object and reference waves. He'd use a photographic plate to capture that pattern, he decided.
Recall the discussion about objects warping the amplitude and phase of waves? If the interference pattern is completely determined by the amplitude and phase spectra of the interacting sets of waves, then the hologram should retain not only the amplitude changes but also the relative phase variations imposed on the object waves.
Hard as it may be to believe, such records had already been produced by other physicists. As a matter of fact, x-ray crystallographers' diffraction patterns, generically speaking, are holograms. Crystallographers take the information from the x-ray diffraction patterns and use the equations Kraut was talking about to deduce the images of the atoms in crystals. Gabor realized that he could do the same thing with a beam of light: he could physically decode the image. If he passed the original light through the hologram plate, instead of through the object, the shadows in the hologram would put the warp into those waves, reconstructing the image-bearing wave front; the complete image would appear where the object had been. When he tried it, it worked.
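Gabor's record-then-reconstruct scheme can be sketched in a few lines of arithmetic. In this minimal one-dimensional illustration (idealized waves, hypothetical numbers), the plate records only intensity, the squared magnitude of reference plus object wave, yet shining the reference through that record yields a field containing a term proportional to the object wave itself, phase and all.

```python
import numpy as np

# Minimal 1-D sketch of Gabor's idea (illustrative, not a full optical
# model).  The photographic plate records intensity t = |R + O|^2: the
# hologram.  Illuminating the plate with the reference R alone transmits
# t * R, which contains the object wave O among its terms.

n = 256
x = np.linspace(0.0, 1.0, n)

R = np.exp(1j * 2 * np.pi * 10 * x)                  # reference: plane wave
O = 0.1 * np.exp(1j * (2 * np.pi * 13 * x + 0.7))    # weak object wave

hologram = np.abs(R + O) ** 2    # what the plate records: no explicit phase!
reconstruction = hologram * R    # reference shone back through the plate

# Expanding |R + O|^2 * R term by term:
expected = (np.abs(R) ** 2 * R       # reference, passed straight through
            + np.abs(O) ** 2 * R     # negligible for a weak object
            + R ** 2 * np.conj(O)    # the "twin" (conjugate) image
            + np.abs(R) ** 2 * O)    # the reconstructed object wave itself

assert np.allclose(reconstruction, expected)
```

The last term of the expansion is the punch line: although the plate stored nothing but intensity, the relative phase of the object wave survives in the fringes and rides back out when the reference passes through, which is exactly why Gabor's reference beam rescues the phase the uncertainty principle seemed to forbid recording.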
***
The object Gabor used for his very first hologram was a tiny transparent disc with a microscopic but monumental message etched onto it. It read, "Huygens, Young, Fresnel."
Gabor did not succeed with the electron microscope. In fact, his first hologram just barely reconstructed the message. "It was far from perfect," he quipped. But it was not his reconstruction that had made history. It was the idea.
***
Gabor's principle is very simple, in retrospect--so simple that only a genius could have seen through the taboos of the time to find the hologram amid the arcane and abstract properties of waves.
The hologram is an interference pattern, and interference is a subject taught in high school physics courses. To engineers and physicists, the hologram is a straightforward extension of elementary optics. Even the mathematics of the hologram are fairly simple. Crystallographers for some time had been doing the construction step of holography without calling it that. And a color technique developed in 1894 (the Lippman process) suggested even the reconstruction step. How then was it possible for the hologram to escape modern science until 1947?
Science is people. Scientists seldom try out in their laboratories what they do not believe in their guts. Recording the phase of light waves would violate the uncertainty principle. Nothing yet known has withstood the incredible power of the uncertainty principle, including the hologram. There's a subtle distinction, though, between phase and a code resulting from phase. But only an extraordinary mind could see this; and only an extraordinary person had the courage to proceed from there.