Space Chronicles: Facing the Ultimate Frontier


by Neil deGrasse Tyson and Avis Lang


  The act of discovery can take many forms beyond “look what I’ve found!” Historically, discoverers were people who embarked on long ocean voyages to unknown places. When they reached a destination, they could see, hear, smell, feel, and taste up close what was inaccessible from far away. Such was the Age of Exploration through the sixteenth century. But once the world had been explored and the continents mapped, human discovery began to focus not on voyages but on concepts.

  The dawn of the seventeenth century saw the near-simultaneous invention of what are arguably the two most important scientific instruments ever conceived: the microscope and the telescope. (Not that this should be a measure of importance, but among the eighty-eight constellations are star patterns named for each: Microscopium and Telescopium.) The Dutch optician Antoni van Leeuwenhoek subsequently introduced the microscope to the world of biology, while the Italian physicist and astronomer Galileo Galilei turned a telescope of his own design to the sky. Jointly, they heralded a new era of technology-aided discovery, whereby the capacities of the human senses could be extended, revealing the natural world in unprecedented, even heretical, ways. Bacteria and other simple organisms whose existence could be revealed only through a microscope yielded knowledge that transcended the prior limits of human experience. The fact that Galileo revealed the Sun to have spots, the planet Jupiter to have satellites, and Earth not to be the center of all celestial motion was enough to unsettle centuries of Aristotelian teachings by the Catholic Church and to put Galileo under house arrest.

  Telescopic and microscopic discoveries defied “common sense.” They forever changed the nature of discovery and the paths taken to achieve it; no longer would common sense be accepted as an effective tool of intellectual investigation. Our unaided five senses were shown to be not only insufficient but untrustworthy. To understand the world required trustworthy measurements—which might not agree with one’s preconceptions—derived from experiments conducted with care and precision. The scientific method of hypothesis, unbiased testing, and retesting would rise to significance and continue unabated thenceforth, unavoidably shutting out the ill-equipped layperson from modern research and discovery.

  Incentives to Discovery

  Travel was the method of choice for most historic explorers because technology had not yet progressed to permit discovery by other means. Apparently it was so important for European explorers to discover something that the places they found were declared “discovered”—and ceremonially planted with flags—even when indigenous peoples were there in great numbers to greet them on the shores.

  What drives us to explore? In 1969, the Apollo 11 astronauts Neil Armstrong and Buzz Aldrin Jr. landed, walked, and frolicked on the Moon. It was the first time in history that humans had landed on the surface of another world. Being Westerners as well as discoverers, we immediately fell back on our old imperialist ways—the astronaut-emissaries planted a flag—but this time no natives showed up to greet us. And the flag needed to have a stick inserted along its upper edge to simulate the effects of a supportive, photo-friendly breeze on that barren, airless world.

  The lunar missions are generally considered to be humanity’s greatest technological achievement. But I would propose a couple of modifications to our first words and deeds on the Moon. Upon stepping onto the lunar surface, Neil Armstrong said, “That’s one small step for [a] man, one giant leap for mankind” and then proceeded to plant the American flag in lunar soil. If indeed his giant leap was for “mankind,” perhaps the flag should have been that of the United Nations. If he had been politically honest, he would have referred to “one giant leap for the United States of America.”

  The revenue stream that fed America’s era of space-age discovery derived from taxpayers and was motivated by the prospect of military conflict with the Soviet Union. Major funded projects require major motivation. War is a preeminent motivator, and was largely responsible for projects such as the Great Wall of China, the atomic bomb, and the Soviet and American space programs. Indeed, as a result of two world wars within thirty years of each other and the protracted Cold War that followed, scientific and technological discovery in the twentieth century was accelerated in the West.

  A close second in incentives for major funded projects is the prospect of high economic return. Among the most notable examples are the voyages of Columbus, whose funding level was a nontrivial fraction of Spain’s gross national product, and the Panama Canal, which made possible in the twentieth century what Columbus had failed to find in the fifteenth—a shorter trading route to the Far East.

  Space Tweet #13

  Columbus took three months to cross the Atlantic in 1492. The Shuttle takes 15 minutes

  May 16, 2011 9:30 AM

  When major projects are driven primarily by the sheer quest to discover, they stand the greatest chance of achieving major breakthroughs—that’s what they’re designed to do—but the least chance of being adequately funded. The construction of the Superconducting Super Collider in the United States—an enormous (and enormously expensive) underground particle accelerator that was to extend human understanding of the fundamental forces of nature and the conditions in the early universe—never got past a big hole in the ground. Perhaps that shouldn’t surprise us. With a price tag of more than $20 billion, its cost was far out of proportion to the expected economic returns from spin-off technologies, and there was no obvious military benefit.

  When major funded projects are driven primarily by ego or self-promotion, rarely do the achievements extend beyond architecture per se, as in the Hearst Castle in California, the Taj Mahal in India, and the Palace of Versailles in France. Such lavish monuments to individuals, which have always been a luxury of either a successful or an exploitative society, make unsurpassed tourist attractions but do not reach the level of discovery.

  Most individuals cannot afford to build pyramids; a mere handful of us get to be the first on the Moon or the first anywhere. Yet that doesn’t seem to stop the desire to leave one’s mark. Like animals that delineate territory with growls or urine, when flags are unavailable ordinary people leave a carved or painted name instead—no matter how sacred or revered the discovered spot may be. If the Apollo 11 astronauts had forgotten to take along the flag, they just might have chiseled into a nearby boulder “NEIL & BUZZ WERE HERE—7/20/69.” In any case, the space program left behind plenty of evidence on each visit: all manner of hardware and other jetsam, from golf balls to automobiles, is scattered on the Moon’s surface as testament to the six Apollo missions. The litter-strewn lunar soil simultaneously represents the proof and the consequences of discovery.

  Amateur astronomers, who monitor the sky far more thoroughly than anybody else, are especially good at discovering comets. The prospect of getting something named after oneself is strong motivation: to discover a bright comet means the world will be forced to identify it with your name. Well-known examples include Comet Halley, which needs no introduction; Comet Ikeya-Seki, perhaps the most beautiful comet of the twentieth century, with its long and graceful tail; and Comet Shoemaker-Levy 9, which plunged into Jupiter’s atmosphere in July 1994, within a few days of the twenty-fifth anniversary of the Apollo 11 Moon landing. Although among the most famous celestial bodies of our times, these comets endured neither the planting of flags nor the carving of initials.

  If money is the most widely recognized reward for achievement, then the twentieth century was off to a good start. A roll call of the world’s greatest and most influential scientific discoveries can be found among the recipients of the Nobel Prize, endowed in perpetuity by the Swedish chemist Alfred Bernhard Nobel, from wealth accrued through the manufacture of armaments and the invention of dynamite. The impressive size of the prize—currently approaching a million and a half dollars—serves as a carrot for many scientists working in the fields of physics, medicine, and chemistry. The awards began in 1901, five years after Nobel’s death—which is fortunate because scientific discovery was just then attaining
a rate commensurate with an annual reward. But if the volume of published research in, say, astrophysics can be used as a barometer, then as much has been discovered in the past fifteen years as in the entire previous history of the field. Perhaps there will come a day when the Nobel science prizes will be awarded monthly.

  Discovery and the Extension of Human Senses

  If technology extends our muscle and brain power, science extends the power of our senses beyond inborn limits. A primitive way we can do better is to move closer and get a better look; trees can’t walk, but they don’t have eyeballs either. Among humans, the eye is often regarded as an impressive organ. Its capacity to focus near and far, to adjust to a broad range of light levels, and to distinguish colors puts it at the top of most people’s list of desirable features. Yet when we take note of the many bands of light that are invisible to us, we are forced to declare humans to be practically blind—even after walking closer to get a better look. How impressive is our hearing? Bats clearly fly circles around us, given their sensitivity to pitch that exceeds our own by an order of magnitude. And if the human sense of smell were as good as that of dogs, then Fred rather than Fido might be sniffing out the drugs and bombs.

  The history of human discovery is a history of the boundless desire to extend the senses, and it is because of this desire that we have opened new windows to the universe. Beginning in the 1960s with the early Soviet and NASA missions to the Moon and the solar system’s planets, computer-controlled space probes—which we can rightly call robots—became (and still are) the standard tool for space exploration. Robots in space have several clear advantages over astronauts: they are cheaper to launch; they can be designed to perform experiments of very high precision without interference from a cumbersome pressure suit; and since they are not alive in any traditional sense of the word, they cannot be killed in a space accident. Nevertheless, until computers can simulate human curiosity and human sparks of insight, and until computers can synthesize information and recognize a serendipitous discovery when it stares them in the face, robots will remain tools designed to discover what we already expect to find. Unfortunately, profound insights into nature lurk behind questions we have yet to ask.

  The most significant improvement of our feeble senses is the extension of our sight into the invisible bands of what is collectively known as the electromagnetic spectrum. In the late nineteenth century the German physicist Heinrich Hertz performed experiments that helped unify conceptually what had previously been considered unrelated forms of radiation. Radio waves, infrared, visible light, and ultraviolet were all revealed to be cousins in a family of light whose members simply differed in energy. The full spectrum, including all parts discovered after Hertz’s work, begins at the low-energy end with radio waves and extends, in order of increasing energy, through microwaves, infrared, visible light (comprising the “rainbow seven”: red, orange, yellow, green, blue, indigo, and violet), and ultraviolet to X-rays and gamma rays.

  Superman, with his X-ray vision, has few advantages over modern scientists. Yes, he is somewhat stronger than your average astrophysicist, but astrophysicists can now “see” into every major part of the electromagnetic spectrum. Lacking this extended vision, we would be not only blind but ignorant, because many astrophysical phenomena reveal themselves only in certain “windows” within the spectrum.

  Let’s peek at a few discoveries made through each window to the universe, starting with radio waves, which require very different detectors from those found in the human retina.

  In 1931 Karl Jansky, then employed by Bell Telephone Laboratories and armed with a radio antenna he himself built, became the first human to “see” radio signals emanating from somewhere other than Earth. He had, in fact, discovered the center of the Milky Way galaxy. Its radio signal was so intense that if the human eye were sensitive only to radio waves, then the galactic center would be one of the brightest sources in the sky.

  With the help of some cleverly designed electronics, it’s possible to transmit specially encoded radio waves that can then be transformed into sound via an ingenious apparatus known as a radio. So, by virtue of extending our sense of sight, we have also, in effect, managed to extend our sense of hearing. Any source of radio waves—indeed, practically any source of energy at all—can be channeled so as to vibrate the cone of a speaker, a simple fact that is occasionally misunderstood by journalists. When radio emissions from Saturn were discovered, for instance, it was simple enough for astronomers to hook up a radio receiver equipped with a speaker; the signal was then converted to audible sound waves, whereupon more than one journalist reported that “sounds” were coming from Saturn, and that life on Saturn was trying to tell us something.

  With much more sensitive and sophisticated radio detectors than were available to Karl Jansky, astrophysicists now explore not just the Milky Way but the entire universe. As a testament to the human bias toward seeing-is-believing, early detections of radio sources in the universe were often considered untrustworthy until they were confirmed by observations with a conventional telescope. Fortunately, most classes of radio-emitting objects also emit some level of visible light, so blind faith was not always required. Eventually radio telescopes produced a rich parade of discoveries, including quasars (loosely assembled acronym of “quasi-stellar radio source”), which are among the most distant and energetic objects in the known universe.

  Gas-rich galaxies emit radio waves from their abundant hydrogen atoms (more than 90 percent of all atoms in the cosmos are hydrogen). Large arrays of electronically connected radio telescopes can generate very high resolution images of a galaxy’s gas content, revealing intricate features such as twists, blobs, holes, and filaments. In many ways, the task of mapping galaxies is no different from that facing fifteenth- and sixteenth-century cartographers, whose renditions of continents—distorted though they were—represented a noble human attempt to describe worlds beyond one’s physical reach.

  Microwaves have shorter wavelengths and more energy than radio waves. If the human eye were sensitive to microwaves, you could see the radar emitted by the speed gun of a highway patrol officer hiding in the bushes, and microwave-emitting telephone relay towers would be ablaze with light. The inside of your microwave oven, however, would look no different than it does now, because the mesh embedded in the door reflects microwaves back into the cavity to prevent their escape. Your eyeballs’ vitreous humor is thus protected from getting cooked along with your food.

  Microwave telescopes, which were not actively used to study the universe until the late 1960s, enable us to peer into cool, dense clouds of interstellar gas that ultimately collapse to form stars and planets. The heavy elements in these clouds readily assemble into complex molecules whose signature in the microwave part of the spectrum is unmistakable because of their match with identical molecules that exist on Earth. Some of those cosmic molecules, such as NH3 (ammonia) and H2O (water), are household standbys. Others, such as deadly CO (carbon monoxide) and HCN (hydrogen cyanide), are to be avoided at all costs. Some remind us of hospitals—H2CO (formaldehyde) and C2H5OH (ethyl alcohol)—and some don’t remind us of anything: N2H+ (dinitrogen monohydride ion) and HC4CN (cyanodiacetylene). More than 150 molecules have been detected, including glycine, an amino acid that is a building block for protein and thus for life as we know it. We are indeed made of stardust. Antoni van Leeuwenhoek would be proud.

  Without a doubt, the most important single discovery in astrophysics was made with a microwave telescope: the heat left over from the origin of the universe. In 1964 this remnant heat was measured in a Nobel Prize–winning observation conducted at Bell Telephone Laboratories by the physicists Arno Penzias and Robert Wilson. The signal from this heat is an omnipresent, omnidirectional ocean of light—often called the cosmic microwave background—that today registers about 2.7 degrees on the “absolute” temperature scale and is dominated by microwaves (though it radiates at all wavelengths). This discovery was serendipity at its finest. Penzias and
Wilson had humbly set out to find terrestrial sources of interference with microwave communications; what they found was compelling evidence for the Big Bang theory. It’s a little like fishing for a minnow and catching a blue whale.

  Moving further along the electromagnetic spectrum, we get to infrared light. Invisible to humans, it is most familiar to fast-food fanatics, whose French fries are kept lukewarm under infrared lamps for hours before being purchased. Infrared lamps also emit visible light, but their active ingredient is an abundance of invisible infrared photons, which are readily absorbed by food. If the human retina were sensitive to infrared, then a midnight glance at an ordinary household scene, with all the lights turned off, would reveal all the objects that sustain a temperature in excess of room temperature: the metal that surrounds the pilot lights of a gas stove, the hot water pipes, the iron that somebody had forgotten to turn off after pressing crumpled shirt collars, and the exposed skin of any humans passing by. Clearly that picture is no more enlightening than what you would see with visible light, but it’s easy to imagine one or two creative uses of such amplified vision, such as examining your home in winter to spot heat leaks from the window panes or roof.

  As a child, I was aware that, at night, infrared vision would reveal monsters hiding in the bedroom closet only if they were warm-blooded. But everybody knows that your average bedroom monster is reptilian and cold-blooded. Thus, infrared vision would completely miss a bedroom monster, because it would simply blend in with the walls and door.
