Borderlands of Science
After Luis Alvarez ruled out a couple of other candidate ideas, including a nearby supernova explosion, a colleague of the Alvarezes, Chris McKee, suggested that an asteroid, maybe ten kilometers across, might have hit the Earth. An asteroid of that size was consistent with the observed iridium content of the clay layer. The energy release of such an impact would equal that of a hundred million medium-sized hydrogen bombs. The impact would have disintegrated the asteroid, throwing it, along with many cubic kilometers of the Earth's crust, high into the atmosphere. The dust would have stayed there for six months or so, halting photosynthesis, preventing plant growth, and thereby starving to death all the larger animals. When the dust finally settled, it would have formed the thin layer of clay seen between the red and white limestones. The same phenomenon of atmospheric dust had been observed on a much smaller scale following the huge volcanic explosion of Krakatoa in 1883. It is also the mechanism behind the idea of "nuclear winter," a palling of the whole Earth with atmospheric dust which some scientists worry could be one consequence of a large-scale nuclear war.
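The comparison with hydrogen bombs is easy to check with back-of-the-envelope arithmetic. The sketch below assumes a rocky density of about 3,000 kg/m³, a typical impact speed of about 20 km/s, and a "medium-sized" bomb of one megaton of TNT; all three values are illustrative choices, not figures from the text.

```python
import math

# Order-of-magnitude estimate of the K/T impactor's kinetic energy.
radius_m = 5_000.0   # 10 km diameter asteroid
density = 3_000.0    # kg/m^3, ordinary rock (assumed)
speed = 20_000.0     # m/s, typical asteroid impact speed (assumed)

volume = (4.0 / 3.0) * math.pi * radius_m ** 3
mass = density * volume
energy_j = 0.5 * mass * speed ** 2

megaton_j = 4.184e15  # joules per megaton of TNT
print(f"Impact energy: {energy_j:.2e} J")
print(f"Roughly {energy_j / megaton_j:.1e} one-megaton bombs")
```

The result comes out near 10^23 joules, which is indeed on the order of a hundred million one-megaton explosions.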
Twenty years ago, however, the idea of nuclear winter had not been taken seriously. Thus the theory presented by Luis and Walter Alvarez for the vanishing of the dinosaurs was pure scientific heresy. It was widely criticized by paleontologists, and even ridiculed. Today it is widely accepted as the most plausible extinction mechanism, and the same idea has been used to examine other major disappearances of many life forms from Earth. The greatest of these, known as the Permian extinction, occurred about 250 million years ago, when nine-tenths of all Earth's species vanished. In that case, the search for evidence of an asteroid impact has been less persuasive.
Also unresolved in the case of the K/T extinction is the question of where the incoming destroyer hit the Earth. The most popular theory at the moment is that it struck in what is now the Gulf of Mexico, but that is not fully proved.
Many other details of the asteroid impact theory remain to be defined. However, those who do not believe the idea at all must face one inevitable and awkward question: If it was not the impact of a huge asteroid that suddenly and swiftly killed off all the dinosaurs, then what was it?
If you think it would be a nice idea to write a story in which a large asteroid descends on Earth today and causes all sorts of problems, be warned. The book has already been written, several times. I'll mention just two examples: Lucifer's Hammer (Niven and Pournelle, 1977), and Shiva Descending (Benford and Rotsler, 1980). If you want to read about a large object hitting the Moon, and what that can do to Earth, read Jack McDevitt's splendid Moonfall (1998).
13.3 Gaia: the Whole Earth Mother. This, too, borders on scientific respectability, though scientists as well-known as Stephen Jay Gould and Richard Dawkins have dismissed it as pseudoscience.
It began in the late 1970s, when James Lovelock published a controversial book, Gaia: A New Look at Life on Earth (Lovelock, 1979). In it he set forth his idea, long gestating, that the whole of Earth's biosphere should be thought of as a single, giant, self-regulating organism, which keeps the general global environment close to constant and in a state appropriate to sustain life. In Lovelock's own words, Gaia is "the model, in which the Earth's living matter, air, oceans and land surface form a complex system which can be seen as a single organism and which has the capacity to keep our planet a fit place for life."
Lovelock says that the notion is an old one, dating back at least to a lecture by James Hutton delivered in 1785. However, the modern incarnation of that idea is all Lovelock's, although the name Gaia as a descriptor for such an interdependent global entity was provided by the late William Golding (a Nobel laureate for literature, Lovelock's neighbor in England, and author of the classic Lord of the Flies).
Something like Gaia seems to be needed from the following simple physical argument: Life has existed on Earth for about three and a half billion years. In that time, the Sun's energy output has increased by at least thirty percent. If Earth's temperature simply responded directly to the Sun's output, based on today's global situation we would expect that two billion years ago the whole Earth would have been frozen over. Conversely, if Earth was habitable then, it should today be too hot to support life.
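The mismatch can be made concrete with a simple radiative-balance estimate using the Stefan-Boltzmann law. The sketch below uses today's solar constant and a planetary albedo of 0.3 (both illustrative round values) and deliberately ignores the greenhouse effect, so it shows only how strongly equilibrium temperature responds to solar output.

```python
SIGMA = 5.670e-8  # Stefan-Boltzmann constant, W m^-2 K^-4

def equilibrium_temp(solar_constant, albedo=0.3):
    """Blackbody equilibrium temperature of a planet, no greenhouse effect."""
    return (solar_constant * (1 - albedo) / (4 * SIGMA)) ** 0.25

today = equilibrium_temp(1361.0)        # present-day solar constant, W/m^2
early = equilibrium_temp(1361.0 / 1.3)  # Sun ~30% fainter, billions of years ago

print(f"Today:       {today:.0f} K")
print(f"Early Earth: {early:.0f} K (a drop of {today - early:.0f} K)")
```

On this naive accounting, a thirty percent fainter Sun lowers the equilibrium temperature by roughly fifteen degrees, pushing an already-cold blackbody Earth even further below freezing; something else must have kept the early surface warm enough for liquid water.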
But in fact, the response of Earth's biosphere to temperature changes is complex, apparently adapting to minimize the effects of change. For example, as the amount of solar energy delivered to Earth increases, the rate of transpiration of plants increases, so the amount of atmospheric water vapor goes up. That means more clouds—and clouds reflect sunlight, and shield the surface, which tends to bring surface temperatures down. In addition, increased amounts of vegetation reduce the amount of carbon dioxide in the air, and that in turn reduces the greenhouse effect by which solar radiation is trapped within the atmosphere. Again, the surface temperature goes down. There are many other processes, involving other atmospheric gases, and the net effect is to hold the status quo for the benefit of living organisms. According to Lovelock, it is more than a matter of convenience. Only the presence of life has enabled Earth to remain habitable. If life had not appeared on this planet when it did, over three billion years ago, then by this time the surface of Earth would be beyond the range of temperatures at which life could exist.
Why, then, does the Gaia idea qualify as a scientific heresy? It sounds eminently reasonable, and something like it seems necessary to explain the long continuity of life on the planet.
Part of the problem is that at first thought it seems as though the whole Earth must be engaged in some sort of activist role. Many readers have assumed that intention is a necessary part of the Gaia idea, that the biosphere itself somehow knows what it is doing, and acts deliberately to preserve life. A number of nonscientific writers have embraced this "Earth as Ur-mother" thought in a way and with an enthusiasm that Lovelock neither intended nor agrees with. At the other extreme, two biologists, Doolittle and Dawkins, have offered the rational scientific criticism that the Gaia idea seems to call for global altruism, i.e. some organisms must be sacrificing themselves for the general good. That runs contrary to everything we believe to be true about genetics and the process of evolution.
Lovelock seemed at first to encourage such a viewpoint, when he wrote, "But if Gaia does exist, then we may find ourselves and all other living things to be parts and partners of a vast being who in her entirety has the power to maintain our planet as a fit and comfortable habitat for life." There is more than a suggestion here of a being which acts by design. However, Lovelock later showed through simplified models that neither global intention nor global altruism is needed. The standard theory of evolution, in which each species responds in such a way as to assure its own survival and increase its own numbers, is sufficient to create a self-stabilizing total system.
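The best known of those simplified models is Daisyworld, devised by Lovelock with Andrew Watson: dark daisies warm their surroundings, light daisies cool them, and each species simply grows fastest near its own optimum temperature. Below is a minimal sketch of that idea, with parameter values chosen here for illustration rather than taken from any particular published version.

```python
# Minimal Daisyworld-style sketch. No daisy "intends" anything; each just
# grows according to its local temperature, yet the planet self-regulates.
SIGMA = 5.67e-8                          # Stefan-Boltzmann constant
FLUX = 917.0                             # stellar flux scale, W/m^2 (assumed)
ALB_GROUND, ALB_WHITE, ALB_BLACK = 0.5, 0.75, 0.25
DEATH = 0.3                              # daisy death rate
Q = 20.0                                 # local-temperature coupling, K
OPT = 295.5                              # growth optimum, K

def growth(temp_k):
    """Parabolic growth rate, zero outside roughly 278-313 K."""
    return max(0.0, 1.0 - 0.003265 * (OPT - temp_k) ** 2)

def settle(lum, white=0.2, black=0.2):
    """Iterate daisy cover to steady state for a given stellar luminosity."""
    for _ in range(2000):
        bare = max(0.0, 1.0 - white - black)
        albedo = bare * ALB_GROUND + white * ALB_WHITE + black * ALB_BLACK
        t_planet = (FLUX * lum * (1 - albedo) / SIGMA) ** 0.25
        t_white = Q * (albedo - ALB_WHITE) + t_planet   # white patches cooler
        t_black = Q * (albedo - ALB_BLACK) + t_planet   # black patches warmer
        white += 0.1 * white * (bare * growth(t_white) - DEATH)
        black += 0.1 * black * (bare * growth(t_black) - DEATH)
        white, black = max(white, 0.001), max(black, 0.001)  # tiny seed stock
    return t_planet

for lum in (0.8, 1.0, 1.2):
    print(f"luminosity {lum:.1f}: planet settles near {settle(lum):.1f} K")
```

Over this range of luminosity a bare planet's temperature would swing by some thirty kelvin, but the daisy-covered planet stays within a few degrees of the growth optimum: as the star brightens, light daisies simply outgrow dark ones, raising the albedo. No intention and no altruism are required, only differential growth.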
Today the Gaia hypothesis, that the whole Earth biosphere forms a single, self-regulating organism, is still outside the scientific mainstream. However, over the past fifteen years it has gained some formidable supporters, notably the biologist Lynn Margulis, who has championed Gaia more actively than Lovelock ever did. The theory also provides a useful predictive framework for studying the way in which different parts of the biosphere interact, and particular chemicals propagate among them. Nonetheless, if it is not today outright heresy, to many scientists Gaia remains close to it.
Lovelock ironically comments that we may have come " . . . the full circle from Galileo's famous struggle with the theological establishment. It is the scientific establishment that now forbids heresy. I had a faint hope that Gaia might be denounced from the pulpit; instead I was asked to deliver a sermon on Gaia at the Cathedral of St. John the Divine in New York."
The Gaia concept sometimes permits Lovelock to take an unusually detached attitude to other global events. Some years ago I was driving him from suburban Maryland to the Museum of Natural History in Washington, D.C. On the way we somehow got onto the subject of all-out nuclear war. Lovelock surprised me very much by remarking that it would have very little effect. I said, "But it could kill off every human!"
He replied, "Well, yes, it might do that; but I was thinking of the effects on the general biosphere."
I leave the subject of Gaia with this story idea: suppose that the biosphere did know what it was doing, and acted deliberately to preserve life. How do you think it would deal with humans?
13.4 Dr. Pauling and Vitamin C. When sea voyagers of the fifteenth and sixteenth centuries began to undertake long journeys out of sight of land, and later when Arctic explorers were spending long winters locked in the ice, they found themselves afflicted by a strange and unpleasant disease. Joints ached, gums blackened, teeth became loose and fell out, and bodies showed dark, bruise-like patches. Eventually the sufferers died after a long and painful illness. No one was immune, and as the trip went on more and more people were affected. Thus Vasco da Gama, sailing round the Cape of Good Hope in 1498, lost a hundred of his hundred and sixty crew members. Travelers gave the disease a name, scurvy, but they had no idea what caused it.
After many years of trial and error, sea captains and physicians learned that scurvy could be held at bay by including regular fresh fruit and vegetables in the diet. In 1753, the Scottish physician James Lind showed that the same beneficial effect could be produced by the use of concentrated orange and lemon juice. However, no one knew quite what these dietary additives were doing. That understanding had to wait for almost two more centuries, until 1932, when a substance called Vitamin C, or ascorbic acid, was isolated.
Vitamins are part of our necessary diet, but unlike proteins, carbohydrates, or fats, they are needed only in minute quantities. A daily intake of one thousandth of an ounce of Vitamin C is enough to keep us free from scurvy. Most animals can manufacture for themselves all the Vitamin C that they need; just a few species—humans, monkeys, and guinea pigs—rely on their food to provide it (humans, monkeys, and guinea pigs?! A story here, perhaps). Certain foods, such as broccoli and black currants, are especially rich in this vitamin, but almost all fresh fruit and vegetables contain enough to supply human needs. Without it in our diet, however, people sicken and die. Fortunately, Vitamin C is a simple molecule, and by 1933 chemists had learned how to produce it synthetically. It can be made in large quantities and at low cost. No one today needs to suffer from scurvy.
That might seem to be the end of the story of Vitamin C, except that in 1970, the scientist Linus Pauling came forward with an extraordinary claim. In his book Vitamin C and the Common Cold (Pauling, 1970), Pauling stated that large doses of Vitamin C, thirty to a hundred times the normal daily requirement, would help to ward off the common cold, or would reduce the time needed for a sufferer to recover.
Most people coming forward with such a notion would have been brushed aside by the medical profession as either a harmless crank, or some charlatan peddling his own patent nostrum or clinic.
There was just one problem. Linus Pauling was a recognized scientific genius. During the 1930s he had, almost single-handed, used quantum theory to explain how atoms bond together to form molecules. For this work he received the 1954 Nobel Prize for Chemistry. Rather than resting on his laurels, he had then gone on to study the most important molecules of biochemistry, in particular hemoglobin and DNA, and was the first person to propose a form of helical structure for DNA.
James Watson and Francis Crick, whom we met earlier in Chapter 6, elucidated the structure of DNA. What did they worry about as they worked? As Watson said in his book, The Double Helix (Watson, 1968), they knew that "the prodigious mind" of Linus Pauling was working on the problem at the California Institute of Technology. In the early spring of 1953 they believed that he would discover the correct form of the molecule within a few weeks if they failed to do so. With a little change in timing, or with better experimental data, Linus Pauling might well have won or shared the 1962 Nobel Prize that went to Crick, Watson, and Maurice Wilkins.
However, Pauling had no reason to feel too disappointed in that year. For he was in fact awarded a 1962 Nobel Prize—for Peace, acknowledging his work toward the treaty banning the atmospheric testing of nuclear weapons.
Faced with a two-time Nobel Laureate who was close to being a three-time Laureate, a man still intellectually vigorous at age 69, the medical profession could not in 1970 dismiss Pauling's claims out of hand. Instead they investigated them, performing their own controlled experiments on the use of Vitamin C to treat the common cold. Their results were negative, or at best inconclusive.
That should have quieted Pauling. Instead it had just the opposite effect. In a new book, Vitamin C and the Common Cold and the Flu (Pauling, 1976), he claimed that the medical tests had used totally inadequate amounts of Vitamin C. Massive doses, a gram or more per day, were needed. And he went further. He asserted that Vitamin C in such large doses helps with the treatment of hepatitis, mumps, measles, polio, viral pneumonia, viral orchitis, herpes, and influenza. He proposed mechanisms by which Vitamin C does its job, both as a substance that mops up free chemical radicals in cells and as a component of a cancer-cell inhibiting chemical called PHI. He also pointed out that there was no danger of a vitamin overdose, since excess Vitamin C is harmlessly excreted from the body.
Again, the medical control experiments were done. Again, Pauling's claims were denied, and dismissed. That is where the question stands today. Books have been written, proposing Vitamin C as a practical panacea for all ailments. Others have totally rejected all its beneficial effects. The use of large doses of Vitamin C remains a scientific heresy.
However, in discussing this subject with scientists, I find that a remarkably high percentage of them take regular large doses of Vitamin C. Perhaps it is no more than a vote of solidarity for a fellow-scientist. Perhaps it is a gesture of respect toward Linus Pauling, who died in August 1994 in his ninety-fourth year.
Or perhaps it is more the attitude of the famous physicist Niels Bohr. He had a horseshoe nailed up over the doorway of his country cottage at Tisvilde, for good luck. A visitor asked if Bohr, a rational person and a scientist, really believed in such nonsense. "No," said Bohr, "but they say it works even if you don't believe in it."
13.5 Minds and machines. In Chapter 10, we described the extraordinary advance of computers. The first ones, in the 1940s, were used for straightforward calculations, of tables and payrolls and scientific functions. Since then the applications have spread far beyond those original uses. Computers today perform complex algebra, play chess and checkers better than any human, control power generating plants, keep track of everything from taxes to library loans to airplane reservations, check our spelling and the accuracy of our typing, and even accept vocal inputs that may soon make typing unnecessary.
Given a suitable program, no human effort of calculation and record-keeping seems to be beyond computer duplication. This raises natural questions: Is every function of the human mind really some form of computer program? And at some time in the future, will computers be able to "think" as well as humans?
To most of the scientists represented in Chapter 10, the answer to these questions is an unequivocal "Yes." Our thought processes operate with just the same sort of logic as computers. Our brains are, as Marvin Minsky said, "computers made of meat." The field of Artificial Intelligence, usually abbreviated as AI, seeks to extend the range of functions, once thought to be powers unique to the human mind, that computing machines are able to perform. The ultimate goal is a thinking and "self-conscious" computer, aware of its own existence exactly as we are aware of ours.
That ultimate goal seems far off, but not unattainable—unless a distinguished mathematician, Roger Penrose, is right. In 1989, he offered a radically different proposal. This is the same Penrose that we met in Chapter 2. He is the Rouse Ball Professor of Mathematics at Oxford University, and a man with a reputation for profound originality. Over the past thirty years he has made major contributions to general relativity theory, to numerical analysis, to the global geometry of space-time, and to the problem of tiling the plane with simple shapes. His work is highly diverse, and it is characterized by ingenuity and great geometrical insight. More important, many of his results are surprising, finding solutions to problems that no one else had suspected might exist, and stimulating much further work by other investigators. Even his harshest critics admit that Roger Penrose is one of the world's great problem solvers. He cannot be dismissed outright as a crank, or as an intellectual lightweight.
What, then, does he propose?
In a book that was a surprising best-seller, The Emperor's New Mind (Penrose, 1989), he claimed that some functions of the human brain will never be duplicated by computers that develop along today's lines. The brain, he asserts, is "non-algorithmic," which means that it performs some functions for which no computer program can be written.
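The canonical example of a task for which no program can be written is Turing's halting problem: no program can decide, for every program, whether it eventually halts. (Penrose's claim goes much further, asserting that human brains actually perform such non-algorithmic feats.) The diagonal argument behind the halting problem can be sketched in a few lines of Python, where `halts` stands in for any claimed halting-decider:

```python
def make_diagonal(halts):
    """Given any claimed halting-decider, build a program it must misjudge."""
    def diag():
        if halts(diag):
            while True:       # the decider said "halts", so loop forever
                pass
        return None           # the decider said "loops", so halt at once
    return diag

# Refute one candidate decider: the one that always answers "halts".
always_yes = lambda prog: True
diag = make_diagonal(always_yes)
verdict = always_yes(diag)    # the decider claims diag halts...
print(verdict)                # ...but calling diag() would loop forever
```

Whatever a candidate decider answers about its own diagonal program, the program does the opposite, so no decider can be correct on every input. Note that this shows only that some well-defined functions are uncomputable; whether human brains compute any of them is exactly the point in dispute.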
This idea seems like perfect scientific heresy, and it was received with skepticism and even outrage by many workers in the field of AI and computer science (for a brief summary, see How the Mind Works [Pinker, 1997]). For one thing, prior to this book, Penrose was very much one of their own kind. Now he seemed like a traitor. Marvin Minsky even called Penrose a "coward," which is a perplexing term since it takes a lot of nerve to propose something so far out of the scientific mainstream.
What does Penrose say that is so upsetting to so many? In The Emperor's New Mind, he argues that human thought employs physics and procedures quite outside the purview of today's AI and machine operations. The necessary physics is drawn from the world of quantum theory. In Penrose's words, "Might a quantum world be required so that thinking, perceiving creatures, such as ourselves, can be constructed from its substance?" (Penrose, 1989).