Dancing With Myself


by Charles Sheffield


  They had isolated examples already. For example, the chemical systems that rejoice in the names of the Belousov-Zhabotinsky reaction and the Brusselator exhibit a two-state cyclic behavior. So does the life cycle of the slime mold, Dictyostelium discoideum. However, such systems are very tricky to study for the occurrence of such things as bifurcations, and involve all the messiness of real-world experiments. Iterated function theory was something that could be explored in the precise and austere world of computer logic, unhindered by the intrusion of the external world.

  We must get to that external and real world eventually, but before we do so, let’s take a look at another element of iterated function theory. This one has become very famous in its own right (rather more famous, in my opinion, than its physical significance deserves, though perhaps justifiably so for its artistic significance).

  The subject is fractals, and the contribution to art is called the Mandelbrot Set.

  5. SICK CURVES AND FRACTALS

  Compare the system we have just been studying with the case of the pendulum. There we had a critical curve, rather than a critical value. On the other hand, the behavior on both sides of the critical curve was not chaotic. Also, the curve itself was well-behaved, meaning that it was “smooth” and predictable in its shape.

  Is there a simple system that on the one hand exhibits a critical curve, and on the other hand shows chaotic behavior?

  There is. It is one studied in detail by Benoit Mandelbrot, and it gives rise to a series of amazing objects (one hesitates to call them curves, or areas).

  We just looked at a case of an iterated function where only one variable was involved. We used x to compute y, then replaced x with y, and calculated a new y, and so on. It is no more difficult to do this, at least in principle, if there are two starting values, used to compute two new values. For example, we could have:

  y = (w² - x²) + a

  z = 2wx + b

  and when we had computed a pair (y,z) we could use them to replace the pair (w,x). (Readers familiar with complex variable theory will see that I am simply writing the relation z = z² + c, where z and c are complex numbers, in a less elegant form.)

  What happens if we take a pair of constants, (a,b), plug in zero starting values for w and x, and let our computers run out lots of pairs, (y,z)? This is a kind of two-dimensional equivalent to what we did with the function y = rx(1-x), and we might think that we will find similar behavior, with a critical curve replacing the critical value.

  What happens is much more surprising. We can plot our (y,z) values in two dimensions, just as we plotted out speeds and positions for the case of the pendulum. And, just as was the case with the pendulum, we will find that the whole plane is divided up into separate regions, with boundaries between them. The boundaries are the boundary curves of the “Mandelbrot set,” as it is called. If, when we start with an (a,b) pair and iterate for (y,z) values, one or both of y and z run off towards infinity, then the point (a,b) is not a member of the Mandelbrot set. If the (y,z) pairs settle down to some value, or if they cycle around a series of values without ever diverging off towards infinity, then the point (a,b) is a member of the Mandelbrot set. The tricky case is for points on the boundary, since convergence is slowest there for the (y,z) sequence. However, those boundaries can be mapped. And they are as far as can be imagined from the simple, well-behaved curve that divided the two types of behavior of the pendulum. Instead of being smooth, they are intensely spiky; instead of just one curve, there is an infinite number of them.

  The results of plotting the Mandelbrot set can be found in many articles, because they have a strange beauty unlike anything else in mathematics. Rather than drawing them here, I will refer you to James Gleick’s book, Chaos: Making a New Science, which shows some beautiful color examples of parts of the set. All this, remember, comes from the simple function we defined, iterated over and over to produce pairs of (y,z) values corresponding to a particular choice of a and b. The colors seen in so many art shows, by the way, while not exactly a cheat, are not fundamental to the Mandelbrot set itself. They are assigned according to how quickly the iteration reveals a point’s fate: how many steps it takes for the (y,z) values to be seen diverging, or to settle toward a stable value or repeating pattern.
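  For readers who would like to experiment, here is a minimal sketch, in Python, of the iteration just described. It follows the real-pair form (w,x) used above rather than complex arithmetic; the function name, the iteration cap, and the sample points are my own illustrative choices, not anything taken from the article.

```python
# Escape-time test for the iteration y = w^2 - x^2 + a, z = 2*w*x + b,
# starting from w = x = 0.  If the (y, z) values stay bounded, the point
# (a, b) is taken to be in the Mandelbrot set; otherwise the number of
# steps before they "escape" is the count that gets mapped to a color
# in the familiar pictures.

def escape_count(a, b, max_iter=200, bound=2.0):
    w, x = 0.0, 0.0
    for n in range(max_iter):
        y = (w * w - x * x) + a
        z = 2.0 * w * x + b
        if y * y + z * z > bound * bound:   # running off towards infinity
            return n                        # not in the set
        w, x = y, z
    return None                             # still bounded: call it a member

# (0, 0) and (-1, 0) stay bounded; (0.5, 0.5) and (1, 1) escape quickly.
for point in [(0.0, 0.0), (-1.0, 0.0), (0.5, 0.5), (1.0, 1.0)]:
    print(point, escape_count(*point))
```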

  The Mandelbrot set also exhibits a feature known as scaling, which is very important in many areas of physics. It says, in its simplest terms, that you cannot tell the absolute scale of the phenomenon you are examining from the structure of the phenomenon itself.

  That needs some explanation. Suppose that you want to know the size of a given object—say, a snowflake. One absolute measure, although a rather difficult one to put into practice, would be to count the number of atoms in that snowflake. Atoms are fundamental units, and they do not change in their size.

  But suppose that instead of the number of atoms, you tried to use a different measure, say, the total area of the snowflake. That sounds much easier than looking at the individual atoms. But you would run into a problem, because as you look at the surface of the snowflake more and more closely, it becomes more and more detailed. A little piece of a snowflake has a surface that looks very much like a little piece of a little piece of a snowflake; a little piece of a little piece resembles a little piece of a little piece of a little piece, and so on. It stays that way until you are actually seeing the atoms. Then you at last have the basis for an absolute scale.

  Mathematical entities, unlike snowflakes, are not made up of atoms. There are many mathematical objects that “scale forever,” meaning that each level of more detailed structure resembles the one before it. The observer has no way of assigning any absolute scale to the structure. The sequence doubling phenomenon that we looked at earlier is rather like that. There is a constant ratio between the distances at which the doublings take place, and that information alone is not enough to tell you how close you are to the critical value in absolute terms.
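  As a concrete reminder of that doubling behavior, here is a short Python sketch that iterates the earlier function y = rx(1-x) at a handful of r values and reports the period of the cycle it settles into. The particular r values, transient length, and tolerance are my own illustrative choices; the parameter spacings between successive doublings shrink by a roughly constant factor (Feigenbaum’s ratio, about 4.669), which is why the ratio alone cannot tell you how far you are from the critical value.

```python
# Iterate y = r*x*(1 - x), discard a long transient, then look for the
# smallest period at which the surviving values repeat.  As r increases,
# the settled behavior doubles: period 1, then 2, then 4, then 8, ...
# until it becomes chaotic (no short period found).

def settled_period(r, transients=2000, max_period=32, tol=1e-6):
    x = 0.5
    for _ in range(transients):
        x = r * x * (1.0 - x)
    orbit = []
    for _ in range(4 * max_period):
        x = r * x * (1.0 - x)
        orbit.append(x)
    for p in range(1, max_period + 1):
        if all(abs(orbit[i] - orbit[i + p]) < tol
               for i in range(len(orbit) - p)):
            return p
    return None   # no period up to max_period: chaotic for our purposes

for r in [2.8, 3.2, 3.5, 3.55, 3.9]:
    print(f"r = {r:4.2f}  period = {settled_period(r)}")
```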

  Similarly, by examining a single piece of the Mandelbrot set it is impossible to tell at what level of detail the set is being examined. The set can be examined more and more closely, forever, and simply continues to exhibit more and more detail.

  There is never a place where we arrive at the individual “atoms” that make up the set. In this respect, the set differs from anything encountered in nature, where the fundamental particles provide a final absolute scaling. Even so, there are in nature things that exhibit scaling over many orders of magnitude. One of the most famous examples is a coastline. If you ask “How long is the coastline of the United States?” a first thought is that you can go to a map and measure it. Then it’s obvious that the map has smoothed the real coastline. You need to go to larger scale maps, and larger scale maps. A coastline “scales,” like the surface of a snowflake, all the way down to the individual rocks and grains of sand. You find larger and larger numbers for the length of the coast. Another natural phenomenon that exhibits scaling is—significantly—turbulent flow. Ripples ride on whirls that ride on vortices that sit on swirls that are made up of eddies, on and on.

  There are classes of mathematical curves that, like coastlines, do not have a length that one can measure in the usual way. A very famous one, the “Koch curve,” is sketched in Figure 4. The area enclosed by the Koch curve is clearly finite; but when we set out to compute the length of its boundary, we find that it is 3 × 4/3 × 4/3 × 4/3 . . . , which diverges to infinity. Curves like this are known as pathological curves. The word “pathological” means diseased, or sick. It is a good name for them.
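  A few lines of Python make the divergence concrete. The starting figure is assumed to be a triangle with sides of unit length (perimeter 3); everything else follows from the 4/3 rule just stated.

```python
# Each Koch subdivision step replaces every edge with four edges, each
# one-third as long, so the total boundary length is multiplied by 4/3
# every time.  The perimeter grows without bound while the enclosed
# area stays finite.

for n in range(9):
    perimeter = 3.0 * (4.0 / 3.0) ** n
    print(f"step {n}: perimeter = {perimeter:.3f}")
```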

  There is a special term reserved for the boundary dimension of such finite/infinite objects, and it is called the Hausdorff-Besicovitch dimension. That’s a bit of a mouthful. The boundaries of the Mandelbrot set have a fractional Hausdorff-Besicovitch dimension, rather than the usual dimension (1) of an ordinary, well-behaved plane curve, and most people now prefer to use the term coined by Mandelbrot, and speak of fractal dimension rather than Hausdorff-Besicovitch dimension. Objects that exhibit such properties, and other features such as scaling, were named fractals by Mandelbrot.

  Any discussion of chaos has to include the Mandelbrot set, scaling, and fractals, because they offer by far the most visually attractive part of the theory. I am less convinced that they are as important as Feigenbaum’s universality. However, the set is certainly beautiful to look at, highly suggestive of shapes found in Nature and—most important of all—it tends to show up in the study of systems that physicists are happy with and impressed by, since they represent the result of solving systems of differential equations.

  Figure 4: A sick curve

  6. STRANGE ATTRACTORS

  This is all very interesting, but in our discussion so far there is a big missing piece. We have talked of iterated functions, and seen that even very simple cases can exhibit “chaotic” behavior. And we have also remarked that physical systems also often exhibit chaotic behavior. However, such systems are usually described in science by differential equations, not by iterated functions. We need to show that the iterated functions and the differential equations are close relatives, at some fundamental level, before we can be persuaded that the results we have obtained so far in iterated functions can be used to describe events in the real world.

  Let us return to one simple system described by a differential equation, namely, the pendulum, and examine it in a little more detail. First let’s recognize that the phase space diagram that we looked at in Figure 2 applies only to an idealized pendulum, not a real one. In the real world, every pendulum is gradually slowed by friction, until it sits at the bottom of the swing, unmoving. This is a single point in phase space, corresponding to zero angle and zero speed. That point in phase space is an attractor for pendulum motion, and it is a stable attractor. All pendulums, unless given a periodic kick by a clockwork or electric motor, will settle down to the zero angle/zero speed point. No matter with what value of angle or speed a pendulum is started swinging, it will finish up at the stable attractor. In mathematical terms, all points of phase space, neighbors or not, will approach each other as time goes on.
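  A quick numerical sketch shows that settling in action. The Python below integrates a damped pendulum with a crude Euler step; the constants (pendulum length, friction coefficient, time step, starting points) are illustrative choices of mine, not values from the article.

```python
import math

# Damped pendulum: d(theta)/dt = omega,
#                  d(omega)/dt = -(g/L) * sin(theta) - damping * omega.
# Whatever (angle, speed) pair we start from, the trajectory in the
# phase plane winds in toward the single stable attractor at (0, 0).

g_over_L = 9.81        # gravity / pendulum length (one-metre pendulum)
damping = 0.5          # friction coefficient
dt = 0.001             # time step, seconds

for theta0, omega0 in [(1.0, 0.0), (-2.0, 1.0), (0.1, 3.0)]:
    theta, omega = theta0, omega0
    for _ in range(int(60.0 / dt)):          # sixty seconds of swinging
        accel = -g_over_L * math.sin(theta) - damping * omega
        theta += omega * dt
        omega += accel * dt
    print(f"start ({theta0:5.2f}, {omega0:5.2f})"
          f" -> end ({theta:+.4f}, {omega:+.4f})")
```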

  A friction-free pendulum, or one that is given a small constant boost each swing, will behave like the idealized one, swinging and swinging, steadily and forever. Points in phase space neither tend to be drawn towards each other, nor repelled from each other.

  But suppose that we had a physical system in which points that began close together tended to diverge from each other. That is the very opposite of the real-world pendulum, and we must first ask if such a system could exist.

  It can, as we shall shortly see. It is a case of something that we have already encountered, a strong dependence on initial conditions, since later states of the system differ from each other a great deal, though they began infinitesimally separated. In such a case, the attractor is not a stable attractor, or even a periodic attractor. Instead it is called a strange attractor.

  This is an inspired piece of naming, comparable with John Archibald Wheeler’s introduction of the term “black hole.” Even people who have never heard of chaos theory pick up on it. It is also an appropriate name. The paths traced out in phase space in the region of a strange attractor are infinitely complex, bounded in extent, never repeating; chaotic, yet chaotic in some deeply controlled way. If there can be such a thing as controlled chaos, it is seen around strange attractors.

  We now address the basic question: Can strange attractors exist mathematically? The simple pendulum cannot possess a strange attractor; so far we have offered no proof that any system can exhibit one. However, it can be proved that strange attractors do exist in mathematically specified systems, although a certain minimal complexity is needed in order for a system to possess a strange attractor. We have this situation: simple equations can exhibit complicated solutions, but for the particular type of complexity represented by the existence of strange attractors, the system of equations can’t be too simple. To be specific, a system of three or more nonlinear differential equations can possess a strange attractor; fewer than three equations, or linear equations in any number, cannot. (The mathematical statement of this fact is simpler but more abstruse: a system can exhibit a strange attractor if at least one Lyapunov exponent is positive.)
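  The article does not name a particular three-equation system at this point, but the Lorenz equations are the standard textbook example of three coupled nonlinear differential equations with a strange attractor, and a short Python sketch of them (with my own choice of step size, starting points, and run length) shows the sensitive dependence described above: two trajectories that start a hair’s breadth apart stay bounded, yet drift far away from each other.

```python
# Lorenz system: dx/dt = sigma*(y - x)
#                dy/dt = x*(rho - z) - y
#                dz/dt = x*y - beta*z
# Crude Euler integration of two copies whose starting points differ by
# one part in a billion.  The separation between them grows enormously,
# while each trajectory stays bounded on the strange attractor.

def lorenz_step(state, dt=0.001, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    x, y, z = state
    dx = sigma * (y - x)
    dy = x * (rho - z) - y
    dz = x * y - beta * z
    return (x + dx * dt, y + dy * dt, z + dz * dt)

a = (1.0, 1.0, 1.0)
b = (1.0, 1.0, 1.0 + 1e-9)

for step in range(1, 40001):                  # forty units of model time
    a = lorenz_step(a)
    b = lorenz_step(b)
    if step % 10000 == 0:
        sep = sum((p - q) ** 2 for p, q in zip(a, b)) ** 0.5
        print(f"t = {step * 0.001:5.1f}   separation = {sep:.3e}")
```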

  If we invert the logic, it is tempting to make another statement: Any physical system that shows an ultrasensitive dependence on initial conditions has a strange attractor buried somewhere in its structure.

  This is a plausible but not a proven result. I am tempted to call it the most important unsolved problem of chaos theory. If it turns out to be true, it will have a profound unifying influence on numerous branches of science. Systems whose controlling equations bear no resemblance to each other will share a structural resemblance, and there will be the possibility of developing universal techniques that apply to the solution of complicated problems in a host of different areas. The one thing that every problem we have been discussing has in common is nonlinearity. Nonlinear systems are notoriously difficult to solve, and seem to defy intuition. Few general techniques exist today for tackling nonlinear problems, and some new insight is desperately needed.

  If chaos theory can provide that insight, it will have moved from being a baffling grab-bag of half results, interesting conjectures, and faintly seen relationships, to become a real “new science.” We are not there yet. But if we can go that far, then our old common sense gut instinct, that told us simple equations must have simple solutions, will have proved no more reliable than our ancestors’ common sense instinctive knowledge that told them the Earth was flat. And the long-term implications of that new thought pattern may be just as revolutionary to science.

  7. SOME READING

  Gleick, J., Chaos: Making A New Science, Viking, 1987. You will read this book, love it, be swept up and carried along by it, and thoroughly enjoy the exciting ride. And when you are done, I defy you to tell me, on the basis of what you have read there, what chaos theory is, or how its underlying ideas serve as an integrating influence in science.

  Prigogine, I., and I. Stengers, Order Out of Chaos, Bantam, 1984. This book is not primarily a discussion of chaos theory, but in three chapters (5, 6, and 9) it provides more meat on the subject than Gleick’s whole book. From this work I drew the opening quotation of the article, “Classical Nightmares…and Quantum Paradoxes.” Also, Prigogine’s book From Being to Becoming (Freeman, 1980) tackles the tough question of irreversible processes and quantum theory at a more advanced level than the Prigogine and Stengers text. Its third part should make your head ache. Both Prigogine books are harder reading than Gleick, and expect a lot more of the reader.

  Hofstadter, D., Scientific American, November 1981, reprinted as Chapter 16 of Metamagical Themas, Basic Books, 1985. Hofstadter, as always, looks for elegance. He finds it in the results of chaos theory, and describes what he finds as clearly as one could ask. I am a fan of Hofstadter. His Gödel, Escher, Bach (Vintage Books, 1980) examines the same question of the nature of the human brain as Roger Penrose’s The Emperor’s New Mind—and reaches exactly opposite conclusions.

  ——————————————————————————————————

  story: the seventeen-year locusts

  The Seventeen-Year Locusts occupy a special place in the Washington, D.C., scene. As a natural event, these tree-grasshoppers induce in even the least imaginative visitor a sense of wonder at the strangeness of Nature.

  For sixteen years, the U.S. capital city has its usual swarms of cicadas. In July and August they fill the warm and muggy summer nights with their chiming and chittering. Then comes the seventeenth year. The locusts awaken in their hiding places underneath the tree roots and begin to crawl up out of the damp ground. For a few weeks they cover every tree and every shrub, so that you cannot walk in the garden without trampling them underfoot. At night they make a noise loud enough to drown the sound of low-flying aircraft on their runs along the Potomac to Washington’s National Airport.

  Why seventeen years? Is it some unfathomable confluence of natural events, a resonance of the planetary orbits? Or is it a precise biological chronometer, ticking away inside the pinhead brains, to bring them out in the exact week of that seventeenth summer? No one knows. But once in position on their twigs, the locusts seem quite happy to sit, green bodies motionless and bright red eyes unwinking.

  The Seventeen-Year Locusts seem to serve no useful purpose. If they did not exist, it would be by no means necessary to invent them. They do not bite or sting, and although they suck the sap from tender young twigs, all evidence of that is gone in a couple of years. They do not scourge the earth, for the name “locust” is only a popular alternative to the more academic “periodical cicadas.” Perhaps their only function is to separate the old-timers in Washington from the new hands.

  “You don’t know about the locusts? Why, weren’t you here in 1986? They’ll be back again in 2003, just hang around awhile.”

  And then the arguments start, as other natives insist that they were last here in 1985, or 1987. For unlike the locusts, the Washingtonians seem to lack that precise memory and timing.

  The insects themselves take no interest in the arguments, or indeed in much else. As one might expect, they seem confused about everything. They popped down for a nap when Ronald Reagan was running the show, when gasoline was under a dollar a gallon, and when “Dallas” was a top TV show. Now in 2003 some total unknown has appeared as President, gas is up to nine dollars a pint, and TV sets are used only as neural interfaces.

  “What’s going on here?” they say to each other. “Close your eyes for one minute in this darn town, and next thing you know you feel like a complete outsider.”

  Confusion is probably their dominant emotion. But I have my own concerns—ones that may explain another of Nature’s great mysteries.

 
