
The Age of Spiritual Machines: When Computers Exceed Human Intelligence


by Ray Kurzweil


  This is usually a short-lived victory for the aging technology. Shortly thereafter, another new technology typically does succeed in pushing the original technology into the stage of obsolescence. In this part of the life cycle, the technology lives out its senior years in gradual decline, its original purpose and functionality now subsumed by a more spry competitor. This stage, which may comprise 5 to 10 percent of the life cycle, finally yields to antiquity (examples today: the horse and buggy, the harpsichord, the manual typewriter, and the electromechanical calculator).

  To illustrate this, consider the phonograph record. In the mid-nineteenth century, there were several precursors, including Édouard-Léon Scott de Martinville’s phonautograph, a device that recorded sound vibrations as a printed pattern. It was Thomas Edison, however, who in 1877 brought all of the elements together and invented the first device that could record and reproduce sound. Further refinements were necessary for the phonograph to become commercially viable. It became a fully mature technology in 1948 when Columbia introduced the 33 revolutions-per-minute (rpm) long-playing record (LP) and RCA Victor introduced the 45-rpm small disc. The pretender was the cassette tape, introduced in the 1960s and popularized during the 1970s. Early enthusiasts predicted that its small size and ability to be rerecorded would make the relatively bulky and scratchable record obsolete.

  Despite these obvious benefits, cassettes lack random access (the ability to play selections in a desired order) and are prone to their own forms of distortion and lack of fidelity. In the late 1980s and early 1990s, the digital compact disc (CD) did deliver the mortal blow. With the CD providing both random access and a level of quality close to the limits of the human auditory system, the phonograph record entered the stage of obsolescence in the first half of the 1990s. Although still produced in small quantities, the technology that Edison gave birth to more than a century ago is now approaching antiquity.

  Another example is the print book, a rather mature technology today. It is now in the stage of the pretenders, with the software-based “virtual” book as the pretender. Lacking the resolution, contrast, freedom from flicker, and other visual qualities of paper and ink, the current generation of virtual book does not have the capability of displacing paper-based publications. Yet this victory of the paper-based book will be short-lived as future generations of computer displays succeed in providing a fully satisfactory alternative to paper.

  The Emergence of Moore’s Law

  Gordon Moore, an inventor of the integrated circuit and then chairman of Intel, noted in 1965 that the surface area of a transistor (as etched on an integrated circuit) was being reduced by approximately 50 percent every twelve months. In 1975, he was widely reported to have revised this observation to eighteen months. Moore claims that his 1975 update was to twenty-four months, and that does appear to be a better fit to the data.

  MOORE’S LAW AT WORK

  The result is that every two years, you can pack twice as many transistors on an integrated circuit. This doubles both the number of components on a chip as well as its speed. Since the cost of an integrated circuit is fairly constant, the implication is that every two years you can get twice as much circuitry running at twice the speed for the same price. For many applications, that’s an effective quadrupling of the value. The observation holds true for every type of circuit, from memory chips to computer processors.
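  As a rough illustration of this arithmetic (my own sketch, not from the text), the snippet below assumes a steady twenty-four-month doubling and tabulates how transistor count, speed, and the combined value grow over a decade:

    transistors, speed = 1.0, 1.0   # normalized starting values
    for year in range(0, 11, 2):
        value = transistors * speed  # circuitry times speed, at constant chip cost
        print(f"year {year:2d}: {transistors:4.0f}x transistors, "
              f"{speed:4.0f}x speed, {value:6.0f}x value")
        transistors *= 2             # twice as many components every two years
        speed *= 2                   # each running at twice the speed

  By year ten the value column has multiplied roughly a thousandfold, which is simply the quadrupling-every-two-years compounding described above.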

  This insightful observation has become known as Moore’s Law on Integrated Circuits, and the remarkable phenomenon of the law has been driving the acceleration of computing for the past forty years. But how much longer can this go on? The chip companies have expressed confidence in another fifteen to twenty years of Moore’s Law by continuing their practice of using increasingly higher resolutions of optical lithography (an electronic process similar to photographic printing) to reduce the feature size—measured today in millionths of a meter—of transistors and other key components.18 But then—after almost sixty years—this paradigm will break down. The transistor insulators will then be just a few atoms thick, and the conventional approach of shrinking them won’t work.

  What then?

  We first note that the exponential growth of computing did not start with Moore’s Law on Integrated Circuits. In the accompanying figure, “The Exponential Growth of Computing, 1900-1998,”19 I plotted forty-nine notable computing machines spanning the twentieth century on an exponential chart, in which the vertical axis represents powers of ten in computer speed per unit cost (as measured in the number of “calculations per second” that can be purchased for $1,000). Each point on the graph represents one of the machines. The first five machines used mechanical technology, followed by three electromechanical (relay based) computers, followed by eleven vacuum-tube machines, followed by twelve machines using discrete transistors. Only the last eighteen computers used integrated circuits.

  I then fit a curve to the points, called a fourth-order polynomial, which allows for up to four bends. In other words, I did not try to fit a straight line to the points, just the closest fourth-order curve. Yet a straight line is close to what I got. A straight line on an exponential graph means exponential growth. A careful examination of the trend shows that the curve is actually bending slightly upward, indicating a small exponential growth in the rate of exponential growth. This may result from the interaction of two different exponential trends, as I will discuss in chapter 6, “Building New Brains.” Or there may indeed be two levels of exponential growth. Yet even if we take the more conservative view that there is just one level of acceleration, we can see that the exponential growth of computing did not start with Moore’s Law on Integrated Circuits, but dates back to the advent of electrical computing at the beginning of the twentieth century.
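  For readers who want to see the mechanics of such a fit, here is a minimal sketch; the data points below are placeholders of my own rather than the forty-nine machines themselves, and numpy is my choice of tool. The vertical axis is the base-ten logarithm of calculations per second per $1,000, and a fourth-order polynomial is fit to those logarithms:

    import numpy as np

    # (year, calculations per second per $1,000) -- placeholder values only
    machines = [(1900, 1e-2), (1940, 1e-1), (1960, 1e2), (1980, 1e5), (1998, 1e8)]
    years = np.array([y for y, _ in machines], dtype=float)
    log_speed = np.log10([s for _, s in machines])   # powers of ten, as on the chart

    x = years - years.mean()                   # center the years to keep the fit stable
    coeffs = np.polyfit(x, log_speed, deg=4)   # the fourth-order polynomial
    print("fitted log10 values:", np.round(np.polyval(coeffs, x), 2))

  Centering the years before fitting is only numerical hygiene; it does not change the shape of the fitted curve.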

  [Figure: The Exponential Growth of Computing, 1900-1998. Legend: Mechanical Computing Devices; Electromechanical (Relay Based) Computers; Vacuum-Tube Computers; Discrete Transistor Computers; Integrated Circuit Computers.]

  In the 1980s, a number of observers, including Carnegie Mellon University professor Hans Moravec, Nippon Electric Company’s David Waltz, and myself, noticed that computers had been growing exponentially in power since long before the invention of the integrated circuit in 1958 or even the transistor in 1947.20 The speed and density of computation have been doubling every three years (at the beginning of the twentieth century) to one year (at the end of the twentieth century), regardless of the type of hardware used. Remarkably, this “Exponential Law of Computing” has held true for at least a century, from the mechanical card-based electrical computing technology used in the 1890 U.S. census, to the relay-based computers that cracked the Nazi Enigma code, to the vacuum-tube-based computers of the 1950s, to the transistor-based machines of the 1960s, and to all of the generations of integrated circuits of the past four decades. Computers are about one hundred million times more powerful for the same unit cost than they were a half century ago. If the automobile industry had made as much progress in the past fifty years, a car today would cost a hundredth of a cent and go faster than the speed of light.
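  That hundred-million-fold figure is easy to check with a line or two of arithmetic (my own back-of-the-envelope calculation, not the book's):

    import math

    gain, years = 1e8, 50                  # a hundred-million-fold gain in fifty years
    doublings = math.log2(gain)            # about 26.6 doublings
    print(f"{doublings:.1f} doublings, roughly one every {years / doublings:.1f} years")

  A doubling roughly every two years over the half century is consistent with the one-to-three-year doubling times quoted above.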

  As with any phenomenon of exponential growth, the increases are so slow at first as to be virtually unnoticeable. Despite many decades of progress since the first electrical calculating equipment was used in the 1890 census, it was not until the mid-1960s that this phenomenon was even noticed (although Alan Turing had an inkling of it in 1950). Even then, it was appreciated only by a small community of computer engineers and scientists. Today, you have only to scan the personal computer ads—or the toy ads—in your local newspaper to see the dramatic improvements in the price performance of computation that now arrive on a monthly basis.

  So Moore’s Law on Integrated Circuits was not the first, but the fifth paradigm to continue the now one-century-long exponential growth of computing. Each new paradigm came along just when needed. This suggests that exponential growth won’t stop with the end of Moore’s Law. But the answer to our question on the continuation of the exponential growth of computing is critical to our understanding of the twenty-first century. So to gain a deeper understanding of the true nature of this trend, we need to go back to our earlier questions on the exponential nature of time.

  THE LAW OF TIME AND CHAOS

  Is the flow of time something real, or might our sense of time passing be just an illusion that hides the fact that what is real is only a vast collection of moments?

  —Lee Smolin

  Time is nature’s way of preventing everything from happening at once.

  —Graffito

  Things are more like they are now than they ever were before.

  —Dwight Eisenhower

  Consider these diverse exponential trends:

  • The exponentially slowing pace of the Universe, with three epochs occurring in the first billionth of a second and later salient events taking billions of years.

  • The exponentially slowing pace in the development of an organism. In the first month after conception, we grow a body, a head, even a tail. We grow a brain in the first couple of months. After leaving our maternal confines, our maturation both physically and mentally is rapid at first. In the first year, we learn basic forms of mobility and communication. We experience milestones every month or so. Later on, key events march ever more slowly, taking years and then decades.

  • The exponentially quickening pace of the evolution of life-forms on Earth.

  • The exponentially quickening pace of the evolution of human-created technology, which picked up the pace from the evolution of life-forms.

  • The exponential growth of computing. Note that exponential growth of a process over time is just another way of expressing an exponentially quickening pace. For example, it took about ninety years to achieve the first MIP (Million Instructions per Second) for a thousand dollars. Now we add an additional MIP per thousand dollars every day. The overall innovation rate is clearly accelerating as well. (A brief sketch of this arithmetic follows the list below.)

  • Moore’s Law on Integrated Circuits. As I noted, this was the fifth paradigm to achieve the exponential growth of computing.
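  To make the equivalence between exponential growth and a quickening pace concrete, here is a small sketch; the two-year doubling time and the one-MIP starting point are my assumptions, chosen only for illustration. It asks how long, under steady doubling, until a full additional MIP per thousand dollars arrives every single day:

    DOUBLING_YEARS = 2.0                          # assumed doubling time
    def mips(years):                              # MIPs per $1,000 after `years`
        return 2 ** (years / DOUBLING_YEARS)

    year = 0.0                                    # year 0: the first MIP per $1,000
    while mips(year + 1 / 365) - mips(year) < 1.0:   # daily gain still under one MIP
        year += 1 / 365

    print(f"after about {year:.0f} years, an extra MIP per $1,000 arrives daily")

  Under these assumptions the answer is roughly twenty years: the first MIP took about ninety years to accumulate, while each later MIP arrives in an ever-shorter interval, which is exactly the quickening pace described above.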

  Many questions come to mind:

  What is the common thread between these varied exponential trends? Why do some of these processes speed up while others slow down? And what does this tell us about the continuation of the exponential growth of computing when Moore’s Law dies?

  Is Moore’s Law just a set of industry expectations and goals, as Randy Isaac, head of basic science at IBM, contends? Or is it part of a deeper phenomenon that goes far beyond the photolithography of integrated circuits?

  After I had thought about the relationship between these apparently diverse trends for several years, the surprising common theme became apparent to me.

  What determines whether time speeds up or slows down? The consistent answer is that time moves in relation to the amount of chaos. We can state the Law of Time and Chaos as follows:

  The Law of Time and Chaos: In a process, the time interval between salient events (that is, events that change the nature of the process, or significantly affect the future of the process) expands or contracts along with the amount of chaos.

  When there is a lot of chaos in a process, it takes more time for significant events to occur. Conversely, as order increases, the time periods between salient events decrease.

  We have to be careful here in our definition of chaos. It refers to the quantity of disordered (that is, random) events that are relevant to the process. If we’re dealing with the random movement of atoms and molecules in a gas or liquid, then heat is an appropriate measure. If we’re dealing with the process of evolution of life-forms, then chaos represents the unpredictable events encountered by organisms, and the random mutations that are introduced in the genetic code.

  Let’s see how the Law of Time and Chaos applies to our examples. If chaos is increasing, the Law of Time and Chaos implies the following sublaw:

  The Law of Increasing Chaos: As chaos exponentially increases, time exponentially slows down (that is, the time interval between salient events grows longer as time passes).

  This fits the Universe rather well. When the entire Universe was just a “naked” singularity—a perfectly orderly single point in space and time—there was no chaos and conspicuous events took almost no time at all. As the Universe grew in size, chaos increased exponentially, and so did the timescale for epochal changes. Now, with billions of galaxies sprawled out over trillions of light-years of space, the Universe contains vast reaches of chaos, and indeed requires billions of years to get everything organized for a paradigm shift to take place.

  We see a similar phenomenon in the progression of an organism’s life. We start out as a single fertilized cell, so there’s only rather limited chaos there. As we end up with trillions of cells, chaos greatly expands. Finally, at the end of our lives, our designs deteriorate, engendering even greater randomness. So the time period between salient biological events grows longer as we grow older. And that is indeed what we experience.

  But it is the opposite spiral of the Law of Time and Chaos that is the most important and relevant for our purposes. Consider the inverse sublaw, which I call the Law of Accelerating Returns:

  The Law of Accelerating Returns: As order exponentially increases, time exponentially speeds up (that is, the time interval between salient events grows shorter as time passes).

  The Law of Accelerating Returns (to distinguish it from a better-known law in which returns diminish) applies specifically to evolutionary processes. In an evolutionary process, it is order—the opposite of chaos—that is increasing. And, as we have seen, time speeds up.
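  One possible way to put the two sublaws in symbols (my own notation, offered only as a sketch, not Kurzweil's formulation): write \( \Delta t \) for the time interval between salient events, \( C(t) \) for the relevant chaos, and \( O(t) \) for the order in the process. Then

    \Delta t \;\propto\; C(t) \qquad \text{and} \qquad \Delta t \;\propto\; \frac{1}{O(t)},

  so that if order grows exponentially, \( O(t) = O_0 e^{kt} \), the interval shrinks exponentially, \( \Delta t \propto e^{-kt} \), which is the Law of Accelerating Returns restated.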

  Disorder and Order

  I noted above that the concept of chaos in the Law of Time and Chaos is tricky. Chaos alone is not sufficient—disorder for our purposes requires randomness that is relevant to the process we are concerned with. The opposite of disorder—which I called “order” in the above Law of Accelerating Returns—is even trickier.

  Let’s start with our definition of disorder and work backward. If disorder represents a random sequence of events, then the opposite of disorder should imply “not random.” And if random means unpredictable, then we might conclude that order means predictable. But that would be wrong.

  Borrowing a page from information theory,21 consider the difference between information and noise. Information is a sequence of data that is meaningful in a process, such as the DNA code of an organism, or the bits in a computer program. Noise, on the other hand, is a random sequence. Neither noise nor information is predictable. Noise is inherently unpredictable, but carries no information. Information, however, is also unpredictable. If we can predict future data from past data, then that future data stops being information. For example, consider a sequence which simply alternates between zero and one (01010101 ...). Such a sequence is certainly orderly, and very predictable. Specifically because it is so predictable, we do not consider it information bearing, beyond the first couple of bits.
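  A quick way to make this distinction tangible (my own illustration; compressed size is used here as a crude stand-in for information content) is to compare how well the two kinds of sequence compress: the perfectly predictable alternating string shrinks to almost nothing, while random noise of the same length barely compresses at all.

    import random
    import zlib

    n = 10_000
    alternating = ("01" * (n // 2)).encode()                 # 0101...: orderly, predictable
    noise = bytes(random.getrandbits(8) for _ in range(n))   # random: unpredictable

    print("alternating sequence compresses to", len(zlib.compress(alternating)), "bytes")
    print("random noise compresses to", len(zlib.compress(noise)), "bytes")

  Note that the compressor measures only predictability; whether unpredictable data is meaningful information or mere noise is exactly the distinction drawn above.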

  Thus orderliness does not constitute order because order requires information. So, perhaps I should use the word information instead of order. However, information alone is not sufficient for our purposes either. Consider a phone book. It certainly represents a lot of information, and some order as well. Yet if we double the size of the phone book, we have increased the amount of data, but we have not achieved a deeper level of order.

  Order, then, is information that fits a purpose. The measure of order is the measure of how well the information fits the purpose. In the evolution of life-forms, the purpose is to survive. In an evolutionary algorithm (a computer program that simulates evolution to solve a problem) applied to, say, investing in the stock market, the purpose is to make money. Simply having more information does not necessarily result in a better fit. A superior solution for a purpose may very well involve less data.
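  As a sketch of what an evolutionary algorithm means here (everything below, including the bit-string designs, the population size, and the toy fitness function, is my own illustrative choice rather than a method from the book), the program keeps the candidates that best fit a stated purpose, varies them randomly, and repeats; the rising fitness of the population is the order being created.

    import random

    def fitness(bits):                       # the "purpose": here, simply count the ones
        return sum(bits)

    def mutate(bits, rate=0.02):             # random variation supplies the options for diversity
        return [b ^ (random.random() < rate) for b in bits]

    population = [[random.randint(0, 1) for _ in range(32)] for _ in range(50)]
    for generation in range(100):
        population.sort(key=fitness, reverse=True)
        survivors = population[:10]                          # selection prunes the choices
        population = [mutate(random.choice(survivors)) for _ in range(50)]

    print("best fitness after 100 generations:", max(map(fitness, population)))

  A real application would replace the toy fitness function with the actual purpose, such as the trading profit in the stock-market example above.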

  The concept of “complexity” has been used recently to describe the nature of the information created by an evolutionary process. Complexity is a reasonably close fit to the concept of order that I am describing. After all, the designs created by the evolution of life-forms on Earth appear to have become more complex over time. However, complexity is not a perfect fit, either. Sometimes, a deeper order—a better fit to a purpose—is achieved through simplification rather than further increases in complexity. As Einstein said, “Everything should be made as simple as possible, but no simpler.” For example, a new theory that ties together apparently disparate ideas into one broader, more coherent theory reduces complexity but nonetheless may increase the “order for a purpose” that I am describing. Evolution has shown, however, that the general trend toward greater order does generally result in greater complexity.22

  Thus improving a solution to a problem—which may increase or decrease complexity—increases order. Now that just leaves the issue of defining the problem. And as we will see, defining a problem well is often the key to finding its solution.

  The Law of Increasing Entropy Versus the Growth of Order

  Another consideration is how the Law of Time and Chaos relates to the second law of thermodynamics. Unlike the second law, the Law of Time and Chaos is not necessarily concerned with a closed system. It deals instead with a process. The Universe is a closed system (not subject to outside influence, since there is nothing outside the Universe), so in accordance with the second law of thermodynamics, chaos increases and time slows down. In contrast, evolution is precisely not a closed system. It takes place amid great chaos, and indeed depends on the disorder in its midst, from which it draws its options for diversity. And from these options, an evolutionary process continually prunes its choices to create ever greater order. Even a crisis that appears to introduce a significant new source of chaos is likely to end up increasing—deepening—the order created by an evolutionary process. For example, consider the asteroid that is thought to have killed off big organisms such as the dinosaurs 65 million years ago. The crash of that asteroid suddenly created a vast increase in chaos (and lots of dust, too). Yet it appears to have hastened the rise of mammals in the niche previously dominated by large reptiles and ultimately led to the emergence of a technology-creating species. When the dust settled (literally), the crisis of the asteroid had increased order.

 
