The Music of Pythagoras


by Kitty Ferguson


  Scientists are not the only ones who adopt a Pythagorean view of numbers as the strongest vehicles on the avenue to truth and progress. Pythagorean faith in mathematics shows up at nearly all school curriculum meetings. Though no one proposes resurrecting the quadrivium, educators seem to have decided that a child who can talk and read and calculate holds the essential keys to all knowledge, and many would argue that the third—“calculate”—is potentially the most powerful by far. Music has, however, tended to fall by the wayside.

  When Hawking wrote in the late twentieth century about his high hopes that he and others would find the Theory of Everything that would unify all of physics, and when he brought that quest into the public mind in his Brief History of Time—even for those who only read about that book—he was expressing another Pythagorean theme. Many physicists were hoping, indeed expecting, complete knowledge of the universe to turn out, ultimately, to be unified, harmonious, and simple. This hope was not based only on wishful thinking. Listen, for example, to the way the physicist Richard Feynman traced its history.

  There was a time, wrote Feynman, when we had something we called motion and something else called heat and something else again called sound,

  but it was soon discovered, after Sir Isaac Newton explained the laws of motion, that some of these apparently different things were aspects of the same thing. For example, the phenomena of sound could be completely understood as the motion of atoms in the air. So sound was no longer considered something in addition to motion. It was also discovered that heat phenomena are easily understandable from the laws of motion. In this way, great globs of physics theory were synthesized into a simplified theory.3

  In the early twentieth century, physics seemed to be coming together in a thoroughly Pythagorean unity. Einstein unified space and time and explained gravity in a way that the physicist John Archibald Wheeler could encapsulate in one short sentence: “Spacetime grips mass, telling it how to move; mass grips spacetime, telling it how to curve.”4 Einstein’s theory of special relativity could be summarized in an equation on a T-shirt: E = mc². The Russian mathematician Alexander Friedmann predicted that anywhere we might stand in the universe we would see the other galaxies receding from us, just as we do from Earth, and better understanding of the expansion of the universe has shown he was undoubtedly right, although no one has been able to try it yet. Just as Nicholas of Cusa thought in the fifteenth century, the universe is homogeneous.
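  To get a feel for what that T-shirt equation says, here is a back-of-the-envelope calculation (my illustration, not Ferguson’s): with the speed of light c ≈ 3 × 10⁸ meters per second, a single gram of mass corresponds to an enormous amount of energy.

```latex
E = mc^{2} = (10^{-3}\,\mathrm{kg}) \times (3\times 10^{8}\,\mathrm{m/s})^{2}
           = 9\times 10^{13}\,\mathrm{J}
```

  That is roughly the energy released by an early fission bomb—latent in one gram of matter.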

  Two of the four forces of nature known to underlie everything that happens in the universe—the electromagnetic force (itself already a unification) and the weak nuclear force—were combined by the “electroweak theory,” formulated in the 1960s and confirmed experimentally in the early 1980s. There was also work going on that promised to show that, if we could observe the extremely early universe, it would be obvious that all four forces were originally united and that nature was composed of symmetries well concealed in our own era of the universe’s history. James Watson and Francis Crick and their colleagues discovered the simple pattern of the structure of DNA, the double helix. Those who were insisting that Darwin’s nineteenth-century theory of evolution was no threat to religious faith were pointing out that it was difficult to imagine anything that could more eloquently support the conviction that there was a brilliant and unified (and some would add, pitiless) rationality behind the universe. John Archibald Wheeler wrote his essentially Pythagorean poem:

  Behind it all

  is surely an idea so simple,

  so beautiful,

  so compelling that when—

  in a decade, a century,

  or a millennium—

  we grasp it,

  we will all say to each other,

  how could it have been otherwise?

  How could we have been so stupid

  for so long?

  All was not, however, a story of undiluted success for the Pythagorean vision of a “unity of all being.” Einstein, a firm believer in the unity of nature, spent thirty years trying to construct a theory that would explain electromagnetism in terms of space-time, as he had explained gravity. He never succeeded, and many physicists would blame his failure in part on the fact that he so stubbornly refused to admit quantum mechanics into the picture. But a new theory, string theory, which pictured the elementary particles as tiny strings or loops of string and certainly had no qualms about accepting quantum mechanics, was gaining supporters in the 1980s. It offered hope of doing what Einstein had failed to do: gathering into the fold the most rebellious of the four forces (when it came to unification)—gravity. As the first decade of the twenty-first century progressed, however, physicists were becoming impatient with string theory. It had produced no prediction that could be tested in a way that would show whether the theory was correct. Aristotle would have been happier with this development than Pythagoras or Plato, not because Aristotle wanted to tear down theories, but because twenty-first-century mathematical physicists were clearly not out of touch with the need for truth to be linked with the perceptible world. However, even with string theory looking less promising than it had, no one really questioned the essential unity of the universe.

  Such faith is hard to lose, especially when no evidence definitively shows that it is wrong. However, some serious mathematical and scientific blows to Pythagorean convictions have occurred during the past one hundred years. Humans seem fated to discover again and again that the universe is not so rational after all—at least, not by the best current human standards of rationality. Such discoveries have challenged and stretched scientists to dig deeper in search of a level of reality where the Pythagorean principles still hold. One of the greatest manifestations of symmetry, harmony, unity, and rationality in the universe is the fact that, although drastic changes do occur over time and from situation to situation, and although things can look dramatically different in different parts of the universe—and act in what even seem contradictory ways—the underlying laws that govern how change occurs apparently do not change. Maybe this is convincing evidence that our Pythagorean assumption of unity is correct, or it might be that our assumption is leading us to a false impression. We can only answer by pointing to past experience.

  The search for a more fundamental law often begins with the discovery that something that has seemed fundamental and unchanging fails to hold under some circumstances. When that happens, the Pythagorean assumption of unity and symmetry kicks in and compels everyone to conclude that whatever they have been regarding as bedrock is not bedrock at all. It is merely an approximation. Researchers put their noses back to the grindstone and search for a deeper underlying law that does not change.

  There have been many examples of this process of discovery. Newton’s law of gravity holds true except when movement approaches the speed of light or when gravity becomes enormously strong, as it does near a black hole. Einstein’s newer, more fundamental description in terms of space-time does not break down, as Newton’s law does, in these extreme circumstances. But Einstein’s description also presents problems that challenge the assumption of unity and harmony. It predicts that there will be singularities—points of infinite density—at the origin of the universe and at the centers of black holes. At a singularity, all the laws of physics break down. And so the search must go on for a more fundamental set of laws, on the Pythagorean assumption that at absolute bedrock there are laws that break down in no situation whatsoever. The underlying unchanging laws, whatever they are, and the nearest approaches to them that have been found, obviously do allow a vast range of changes and events to occur, a vast range of behavior and experience. How far we have come from the early Pythagoreans, as they hurriedly and superficially applied this same faith in numbers! How unfathomably deep beyond their imagination the true connections lie! Beyond ours, too, perhaps.
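  One conventional way to put a number on “enormously strong” (my gloss, using standard textbook formulas rather than anything in the text above): Newton’s inverse-square law serves until distances shrink toward the Schwarzschild radius, the scale at which only Einstein’s description survives.

```latex
F = \frac{G M m}{r^{2}}, \qquad r_{s} = \frac{2 G M}{c^{2}}
```

  For the Sun, r_s comes to about three kilometers; since the Sun’s actual radius is hundreds of thousands of times larger, Newton’s law works almost perfectly throughout the solar system. A black hole, by contrast, is an object compressed inside its own Schwarzschild radius, which is why Einstein’s equations, and their singularities, take over there.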

  The first challenge to the Pythagorean assumption of rationality in the universe to occur in the twentieth century was Russell’s paradox, the discovery of Bertrand Russell that was discussed in Chapter 18. That happened early, in 1901. Another, in 1931, was the Austrian Kurt Gödel’s “incompleteness theorem.” Gödel was then a young man working in Vienna; he would later join Einstein at the Institute for Advanced Study in Princeton. Gödel’s discovery was that in any mathematical system complex enough to include the addition and multiplication of whole numbers—hardly fringe territory; any schoolchild is familiar with that—there are propositions that can be stated, that we can see are true, but that cannot be proved or disproved mathematically within the system. This means that all significant mathematical systems are open and incomplete. Truth goes beyond the ability to prove that it is true. Gödel also showed that no system rich enough to include the addition and multiplication of whole numbers can prove its own consistency.
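  In the schematic notation logicians now use, the two results can be compressed as follows (a summary sketch, not Gödel’s own formulation): for any consistent, effectively axiomatized theory T rich enough to express the addition and multiplication of whole numbers, there is a sentence G_T such that

```latex
\begin{aligned}
&T \nvdash G_T \quad\text{and}\quad T \nvdash \neg G_T
  &&\text{(first incompleteness theorem)}\\[2pt]
&T \nvdash \operatorname{Con}(T)
  &&\text{(second incompleteness theorem)}
\end{aligned}
```

  where ⊬ means “does not prove” and Con(T) is the arithmetical sentence asserting T’s own consistency.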

  These discoveries constituted a serious reversal of hopes for some, and a serious undermining of assumptions for others. The great mathematician David Hilbert and his colleagues had previously been able to demonstrate that logical systems less complex than arithmetic were consistent, and it seemed certain that they would be able to go on to demonstrate the same for all of arithmetic. Not so. With Gödel, the soaring Pythagorean staircase to sure knowledge, built of numbers, became something more resembling a staircase in an Escher drawing, and it is no wonder that the most famous book about Gödel is Douglas R. Hofstadter’s Gödel, Escher, Bach. The Bach is Johann Sebastian Bach. Bertrand Russell was one of those who were badly shaken by Gödel’s theorems—particularly so because he misread Gödel and thought he had proved that arithmetic was not incomplete but inconsistent. Instead, Gödel had demonstrated that arithmetic could never prove its own consistency. David Hilbert was not so discouraged as Russell: Until his death in 1943, he refused to recognize that Gödel had put paid to his hopes. The influence of Gödel’s discoveries was profound, and yet, on one level, rather inconsequential. As John Barrow wrote in 1992, “It loomed over the subject of mathematics in an ambiguous fashion, casting a shadow over the whole enterprise, but never emerging to make the slightest difference to any truly practical application of mathematics.”5

  Though Gödel’s discoveries may have undermined some forms of faith in mathematics, in a manner that seemed to resemble the Pythagorean discovery of incommensurability, Gödel’s view of mathematics was, in fact, Pythagorean. He believed that mathematical truth is something that actually exists apart from any invention by human minds—that his theorems were “discoveries” about objective truth, not his own creations.

  This was not a popular idea in the 1930s. Many mathematicians disagreed. In fact, the concept of anything existing in an objective sense—waiting out there to be discovered and not in any way influenced by the actions of the investigator—had been called into question by a development in physics. A far more dramatic and far-reaching crisis than the one caused by Gödel’s incompleteness theorem had occurred in the 1920s and was having a profound effect on the way scientists and others viewed the world. It was the discovery of the uncertainty principle of quantum mechanics.

  The way cause and effect work had long seemed good evidence that the universe is rational. It also seemed that if cause and effect operate as they do on levels humans can perceive, they surely must operate with equal dependability in regions of the universe, or at levels of the universe, that are more difficult—or even impossible—to observe directly. Cause and effect could be used as a guide in deciding what happened in the very early universe and what conditions will be like in the far distant future. No one was thinking of belief in cause and effect as a “belief” at all, though, in fact, there was nothing to prove that cause and effect would not cease to operate in an hour or so, or somewhere else in the universe. Then, in the 1920s, came developments that required reconsideration of the assumption that every event has an unbroken history of cause and effect leading up to it.

  The quantum level of the universe is the level of the very small: molecules, atoms, and elementary particles. It is on that level that a commonsense description breaks down. Here there are uncaused events, happenings without a history of the sort it is normally assumed any event must have. Atoms are not miniature solar systems. You cannot observe the position of an electron orbiting the nucleus and predict where it will be at a later given moment and what path it will take to get there or say where it was an hour ago—as you could with fair accuracy for the planet Mars in the solar system. An electron never has a definite position and a definite momentum at the same time. If you measure precisely the position of a particle, you cannot at the same time measure its momentum precisely. The reverse is also true. It is as though the two measurements—position and momentum—are sitting at opposite ends of a seesaw. The more precisely you pin down one, the more up-in-the-air and imprecise the other becomes. This is the Heisenberg uncertainty principle of quantum physics—the twentieth century’s “incommensurability.” It was first articulated by Werner Heisenberg in 1927. Not only did it undermine faith in a rational universe, it also seemed to undermine the notion that truth was something objective, something waiting out there to be discovered. On the quantum level, your measurement affects what you find.
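  The seesaw has a precise quantitative form, first written down by Heisenberg and later sharpened by others (the modern statement, not the book’s): the product of the two uncertainties can never fall below a fixed quantum of action.

```latex
\Delta x \,\Delta p \;\ge\; \frac{\hbar}{2}
```

  Here Δx is the uncertainty in position, Δp the uncertainty in momentum, and ħ is Planck’s constant divided by 2π—a number so small that the seesaw is invisible on the scale of planets and billiard balls, but decisive on the scale of electrons.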

  On the other hand, the existence of quantum uncertainty itself was apparently a very unwelcome piece of objective truth waiting out there that no physicist could change, as much as he or she might wish to, no matter what observational methods he or she used. Einstein in particular rebelled at the notion that no future advance in science and no improvement in measuring equipment was ever going to resolve this uncertainty. Until his death, he went on trying to devise thought experiments to get around it. He never succeeded, nor has he succeeded posthumously, as others have found ways to carry out in the laboratory the experiments he invented in his head. “God does not play dice!” Einstein famously protested to Niels Bohr, who was far more ready to accept quantum uncertainty than Einstein. “Albert, don’t tell God what he can do!” Bohr is said to have answered. The Bohr–Einstein debate about how to interpret the quantum level of the universe continued and became famous.

  It is easy to sympathize with Einstein. The quantum world and the paradoxes implicit in it did not seem to be the work of a rational mind. Einstein might have rephrased the complaint Kepler registered when faced with a similar problem: “Heretofore we have not found such an ungeometrical conception in His other works!” How could what happened to one particle affect another across time and space with no link between them? How could a cat be both dead and alive at the same time—as one had to accept in the famous example of “Schrödinger’s cat”? How could something be a wave at some times and a particle at others, depending on the experimental situation? It was a Through the Looking Glass world—and still is, in spite of the reassurance that it is possible to predict things on the quantum level of the universe, if one can be satisfied with probabilities. It does seem that the staircase to knowledge about the universe can have a firm footing on the quantum level, with probabilities forming a sort of superstructure above the quagmire. All is far from lost for the Pythagorean climb.

  The dawning awareness of a new aspect of the universe, in the chaos and complexity theories developed later in the twentieth century, was not nearly so great a shock as quantum uncertainty. However, it did seem to hint that science had been discovering one orderly, predictable system after another only because it was impossible, or at least terribly discouraging, to try to study any other kind of system in a meaningful fashion. The relatively easy-to-study predictable systems actually turned out to be the exception rather than the rule. But for those of a Pythagorean cast of mind, it was the discoveries of the repeating patterns in chaos—the pictures deep in the Mandelbrot and Julia sets, and also in nature itself—that gloriously seemed to uphold, as never before, the ancient conviction that beauty and harmony are hidden everywhere in the universe and have nothing to do with any invention of humans. Less immediately mind-boggling, but no less impressive, was the realization in the study of chaos and complexity that there seem to be mysterious organizing principles at work. By some calculations the probability was vanishingly low that the universe would organize itself into galaxies, stars, and planets, or that life on this earth would organize itself into ecosystems and animal and human societies. Yet that is what has happened. Thus, as with the other challenges to faith in the Pythagorean assumptions underlying science, when scientists began to get a handle on chaos and complexity, the theories having to do with them became not threats but new avenues in the search for better understanding of nature and the universe.
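  The “pictures deep in the Mandelbrot set” come from iterating a single quadratic rule, z → z² + c, and asking which values of c keep the sequence bounded. A minimal sketch in Python (the escape test, viewing window, and iteration cap are illustrative choices, not anything specified in the text):

```python
# Sketch of the Mandelbrot iteration z -> z^2 + c.
# Window and iteration limit are arbitrary illustrative choices.

def escape_time(c: complex, max_iter: int = 100) -> int:
    """Count iterations of z -> z^2 + c before |z| exceeds 2;
    return max_iter if the orbit stays bounded (c is in the set)."""
    z = 0j
    for n in range(max_iter):
        z = z * z + c
        if abs(z) > 2:  # once |z| > 2 the orbit is known to diverge
            return n
    return max_iter

# Crude character-grid rendering of the set's familiar shape.
for im in range(20, -21, -2):          # imaginary part, 1.0 down to -1.0
    print("".join(
        "*" if escape_time(complex(re / 20, im / 20)) == 100 else " "
        for re in range(-40, 21)       # real part, -2.0 to 1.0
    ))
```

  Zooming the window in on the set’s boundary reveals the endlessly repeating filigree the author describes—structure generated, Pythagorean fashion, by nothing more than numbers.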

  Twentieth-century “postmodern” thinking, combined with suspicions raised by the discovery of quantum uncertainty and our inability to examine the quantum world without affecting it, led to fresh doubts about other Pythagorean pillars of science. Is there really such a thing as objective reality? Is anything real, waiting to be discovered? Does the fact that science continues to discover things that make sense, and suspects or dismisses anything that does not, mean that we are finding out more and more about a rational universe . . . or only that we are selecting the information and discoveries that fit our very Pythagorean expectations?

  The assumption of rationality lies at the root of modern arguments about “intelligent design.” It is true that the world’s design, as the Pythagoreans found out, is intelligent to a degree that would send any discoverer of a new manifestation to his or her knees—but before what, or whom? Does discovering rationality necessarily mean one has glimpsed the Mind of God? On the other hand, does a good scientist have to repress the strong impression that it does? Those who attack belief in God do so from several directions. One is rather old-fashioned now, but still heard: Everything is so perfectly laid out, in so tight and orderly a design, that there is no room for God to act at any point. It all goes like clockwork. Or, a newer argument: Everything happens—and has always happened—entirely by chance. The impression of any underlying rationality in nature is an illusion. The “anthropic principle” says that if things had not fallen out just the way they have, we could not be here to observe them—and that is the only reason we find a universe that is amenable to our existence. Or . . . our entire picture of the universe is created, by us, in the self-centered image of our own minds, and we are discovering something not far different from the ten heavenly bodies of the Pythagoreans. Plato might have enjoyed the late-twentieth-century discussions about whether mathematical rationality might be powerful enough to create the universe, without any need for God. Quantum theory made possible the suggestion that “nothingness” might have been unstable in a way that made it statistically probable that “nothingness” would decay into “something.”

 
