
The Singularity Is Near: When Humans Transcend Biology


by Ray Kurzweil


  In other words, computer power is a linear function of the knowledge of how to build computers. This is actually a conservative assumption. In general, innovations improve V by a multiple, not in an additive way. Independent innovations (each representing a linear increment to knowledge) multiply one another’s effects. For example, a circuit advance such as CMOS (complementary metal oxide semiconductor), a more efficient IC wiring methodology, a processor innovation such as pipelining, or an algorithmic improvement such as the fast Fourier transform, all increase V by independent multiples.

  As noted, our initial observations are:

  The velocity of computation is proportional to world knowledge:

  (1) V = c1W

  The rate of change of world knowledge is proportional to the velocity of computation:

  (2) dW/dt = c2V

  Substituting (1) into (2) gives:

  (3) dW/dt = c1c2W

  The solution to this is:

  (4) W = W0 e^(c1c2t)

  and W grows exponentially with time (e is the base of the natural logarithms).
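
  The step from (3) to (4) is the standard separation-of-variables integration for a linear first-order equation; for readers who want it spelled out, here is a minimal sketch in LaTeX (my notation, using the constants defined above):

```latex
% requires \usepackage{amsmath}
% Solving (3), dW/dt = c1*c2*W, by separation of variables:
\begin{align*}
\frac{dW}{W} &= c_1 c_2 \, dt \\
\ln W        &= c_1 c_2 \, t + \text{const} \\
W            &= W_0 \, e^{c_1 c_2 t} \qquad \text{(equation 4)}
\end{align*}
```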

  The data that I’ve gathered shows that there is exponential growth in the rate of (exponent for) exponential growth (we doubled computer power every three years early in the twentieth century and every two years in the middle of the century, and are doubling it every one year now). The exponentially growing power of technology results in exponential growth of the economy. This can be observed going back at least a century. Interestingly, recessions, including the Great Depression, can be modeled as a fairly weak cycle on top of the underlying exponential growth. In each case, the economy “snaps back” to where it would have been had the recession/depression never existed in the first place. We can see even more rapid exponential growth in specific industries tied to the exponentially growing technologies, such as the computer industry.
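
  The second level of exponential growth is easy to see numerically. The short Python sketch below is my own toy model, not data from the book: it simply assumes a doubling time that shrinks from three years in 1900 to one year in 2000 and accumulates the resulting doublings; the cumulative curve bends upward over time, which is the growth in the rate of exponential growth described above.

```python
# Toy model (illustrative numbers only, not data from the book): assume the
# doubling time of computing power shrinks smoothly from 3 years in 1900
# to 1 year in 2000, and accumulate the resulting number of doublings.

def doubling_time(year):
    """Assumed doubling time in years: 3.0 at 1900, decaying to 1.0 at 2000."""
    return 3.0 * (1.0 / 3.0) ** ((year - 1900) / 100.0)

log2_power = 0.0  # log2 of relative computing power = doublings since 1900
for year in range(1900, 2001):
    log2_power += 1.0 / doubling_time(year)  # doublings accumulated this year
    if year % 25 == 0:
        print(year, round(log2_power, 1), "doublings so far")
```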

  If we factor in the exponentially growing resources for computation, we can see the source for the second level of exponential growth.

  Once again we have:

  (5) V = c1W

  But now we include the fact that the resources deployed for computation, N, are also growing exponentially:

  (6) N = c3e^(c4t)

  The rate of change of world knowledge is now proportional to the product of the velocity of computation and the deployed resources:

  (7) dW/dt = c5NV

  Substituting (5) and (6) into (7) we get:

  (8) dW/dt = c1c3c5e^(c4t)W

  The solution to this is:

  (9) W = W0 exp((c1c3c5/c4)e^(c4t))

  and world knowledge accumulates at a double exponential rate.
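
  The step from (8) to (9) is the same separation-of-variables integration as before; a brief sketch in LaTeX, with the constants as labeled above:

```latex
% requires \usepackage{amsmath}
% Solving (8), dW/dt = c1*c3*c5*e^(c4 t)*W:
\begin{align*}
\frac{dW}{W} &= c_1 c_3 c_5 \, e^{c_4 t}\, dt \\
\ln W        &= \frac{c_1 c_3 c_5}{c_4}\, e^{c_4 t} + \text{const} \\
W            &= W_0 \exp\!\left(\frac{c_1 c_3 c_5}{c_4}\, e^{c_4 t}\right) \qquad \text{(equation 9)}
\end{align*}
```

  The result is an exponential of an exponential, which is the double exponential the data exhibit.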

  Now let’s consider some real-world data. In chapter 3, I estimated the computational capacity of the human brain, based on the requirements for functional simulation of all brain regions, to be approximately 10^16 cps. Simulating the salient nonlinearities in every neuron and interneuronal connection would require a higher level of computing: 10^11 neurons times an average 10^3 connections per neuron (with the calculations taking place primarily in the connections) times 10^2 transactions per second times 10^3 calculations per transaction—a total of about 10^19 cps. The analysis below assumes the level for functional simulation (10^16 cps).

  If we factor in the exponentially growing economy, particularly with regard to the resources available for computation (already about one trillion dollars per year), we can see that nonbiological intelligence will be billions of times more powerful than biological intelligence before the middle of the century.

  We can derive the double exponential growth in another way. I noted above that the rate of adding knowledge (dW/dt) was at least proportional to the knowledge at each point in time. This is clearly conservative given that many innovations (increments to knowledge) have a multiplicative rather than additive impact on the ongoing rate.

  However, if we have an exponential growth rate of the form:

  (10) dW/dt = C^W

  where C > 1, this has the solution:

  (11) W = (1/ln C) × ln(1/(1 − t ln C))

  which has a slow logarithmic growth while t < 1/ln C but then explodes close to the singularity at t = 1/ln C.

  Even the modest dW/dt = W^2 results in a singularity.
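
  To make the finite-time blowup concrete, here is the W^2 case worked out (my example, with an arbitrary starting value W0 > 0):

```latex
% dW/dt = W^2 with W(0) = W_0 > 0: integrate W^{-2} dW = dt
\[
\frac{1}{W_0} - \frac{1}{W(t)} = t
\quad\Longrightarrow\quad
W(t) = \frac{W_0}{1 - W_0\, t},
\]
```

  which blows up at the finite time t = 1/W0.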

  Indeed, any formula with a power law growth rate of the form:

  (12) dW/dt = W^a

  where a > 1, leads to a solution with a singularity:

  (13) W = 1/[(a − 1)(T − t)]^(1/(a − 1))

  at the time T. The higher the value of a, the closer the singularity.
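
  The general power-law case follows the same pattern; a short sketch (T is the constant of integration set by the initial condition):

```latex
% requires \usepackage{amsmath}
% dW/dt = W^a with a > 1, by separation of variables:
\begin{align*}
W^{-a}\, dW         &= dt \\
\frac{W^{1-a}}{1-a} &= t - T \\
W(t)                &= \big[(a-1)(T - t)\big]^{-1/(a-1)} \qquad \text{(equation 13)}
\end{align*}
```

  For a starting value W0 > 1, the blowup time works out to T = W0^(1−a)/(a − 1), which shrinks as a grows, matching the remark that a higher value of a brings the singularity closer.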

  My view is that it is hard to imagine infinite knowledge, given apparently finite resources of matter and energy, and the trends to date match a double exponential process. The additional term (to W) appears to be of the form W × log(W). This term describes a network effect. If we have a network such as the Internet, its effect or value can reasonably be shown to be proportional to n × log(n) where n is the number of nodes. Each node (each user) benefits, so this accounts for the n multiplier. The value to each user (to each node) = log(n). Bob Metcalfe (inventor of Ethernet) has postulated the value of a network of n nodes = c × n^2, but this is overstated. If the Internet doubles in size, its value to me does increase but it does not double. It can be shown that a reasonable estimate is that a network’s value to each user is proportional to the log of the size of the network. Thus, its overall value is proportional to n × log(n).
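
  As a rough numerical check of the n × log(n) claim (my own illustration, not a calculation from the text), the short Python sketch below compares the two formulas when a million-node network doubles: the per-user value log(n) rises by only about 5 percent, so the total value roughly doubles rather than quadrupling as the n^2 rule would imply.

```python
# Compare a value proportional to n*log(n) with Metcalfe's n^2 estimate
# when a million-node network doubles in size. Illustrative numbers only.
import math

for n in (10**6, 2 * 10**6):
    per_user = math.log(n)        # value to each user ~ log(n)
    total_nlogn = n * per_user    # total value ~ n * log(n)
    metcalfe = n**2               # Metcalfe's estimate ~ n^2
    print(f"n = {n:>9,}: per-user {per_user:.2f}, "
          f"n*log(n) = {total_nlogn:.3e}, n^2 = {metcalfe:.3e}")
```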

  If the growth rate instead includes a logarithmic network effect, we get an equation for the rate of change that is given by:

  (14) dW/dt = W × ln(W)

  The solution to this is a double exponential, which we have seen before in the data:

  (15) W = exp(e^t)
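
  With the constants in (14) normalized away as above, it is a one-line check that the double exponential in (15) satisfies the network-effect equation; sketched in LaTeX:

```latex
% requires \usepackage{amsmath}
% Check: W = exp(e^t) satisfies dW/dt = W ln W
\begin{align*}
W             &= \exp(e^{t}) \\
\frac{dW}{dt} &= e^{t}\exp(e^{t}) = e^{t}\,W = W \ln W
                 \qquad \text{(since } \ln W = e^{t}\text{)}
\end{align*}
```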

  * * *

  Notes

  Prologue: The Power of Ideas

  1. My mother is a talented artist specializing in watercolor paintings. My father was a noted musician, conductor of the Bell Symphony, founder and former chairman of the Queensborough College Music Department.

  2. The Tom Swift Jr. series, which was launched in 1954 by Grosset and Dunlap and written by a series of authors under the pseudonym Victor Appleton, continued until 1971. The teenage Tom Swift, along with his pal Bud Barclay, raced around the universe exploring strange places, conquering bad guys, and using exotic gadgets such as house-sized spacecraft, a space station, a flying lab, a cycloplane, an electric hydrolung, a diving seacopter, and a repellatron (which repelled things; underwater, for example, it would repel water, thus forming a bubble in which the boys could live).

  The first nine books in the series are Tom Swift and His Flying Lab (1954), Tom Swift and His Jetmarine (1954), Tom Swift and His Rocket Ship (1954), Tom Swift and His Giant Robot (1954), Tom Swift and His Atomic Earth Blaster (1954), Tom Swift and His Outpost in Space (1955), Tom Swift and His Diving Seacopter (1956), Tom Swift in the Caves of Nuclear Fire (1956), and Tom Swift on the Phantom Satellite (1956).

  3. The program was called Select. Students filled out a three-hundred-item questionnaire. The computer software, which contained a database of about two million pieces of information on three thousand colleges, selected six to fifteen schools that matched the student’s interests, background, and academic standing. We processed about ten thousand students on our own and then sold the program to the publishing company Harcourt, Brace, and World.

  4. The Age of Intelligent Machines, published in 1990 by MIT Press, was named Best Computer Science Book by the Association of American Publishers. The book explores the development of artificial intelligence and predicts a range of philosophic, social, and economic impacts of intelligent machines. The narrative is complemented by twenty-three articles on AI from thinkers such as Sherry Turkle, Douglas Hofstadter, Marvin Minsky, Seymour Papert, and George Gilder. For the entire text of the book, see http://www.KurzweilAI.net/aim.

  5. Key measures of capability (such as price-performance, bandwidth, and capacity) increase by multiples (that is, the measures are multiplied by a factor for each increment of time) rather than being added to linearly.

  6. Douglas R. Hofstadter, Gödel, Escher, Bach: An Eternal Golden Braid (New York: Basic Books, 1979).

  Chapter One: The Six Epochs

  1. According to the Transtopia site (http://transtopia.org/faq.html#1.11), “Singularitarian” was “originally defined by Mark Plus (’91) to mean ‘one who believes the concept of a Singularity.’ ” Another definition of this term is “ ‘Singularity activist’ or ‘friend of the Singularity’; that is, one who acts so as to bring about a Singularity [Mark Plus, 1991; Singularitarian Principles, Eliezer Yudkowsky, 2000].” There is not universal agreement on this definition, and many Transhumanists are still Singularitarians in the original sense—that is, “believers in the Singularity concept” rather than “activists” or “friends.”

  Eliezer S. Yudkowsky, in The Singularitarian Principles, version 1.0.2 (January 1, 2000), http://yudkowsky.net/sing/principles.ext.html, proposed an alternate definition: “A Singularitarian is someone who believes that technologically creating a greater-than-human intelligence is desirable, and who works to that end. A Singularitarian is friend, advocate, defender, and agent of the future known as the Singularity.”

  My view: one can advance the Singularity and in particular make it more likely to represent a constructive advance of knowledge in many ways and in many spheres of human discourse—for example, advancing democracy, combating totalitarian and fundamentalist belief systems and ideologies, and creating knowledge in all of its diverse forms: music, art, literature, science, and technology. I regard a Singularitarian as someone who understands the transformations that are coming in this century and who has reflected on their implications for his or her own life.

  2. We will examine the doubling rates of computation in the next chapter. Although the number of transistors per unit cost has doubled every two years, transistors have been getting progressively faster, and there have been many other levels of innovation and improvement. The overall power of computation per unit cost has recently been doubling every year. In particular, the amount of computation (in computations per second) that can be brought to bear on a computer chess machine doubled every year during the 1990s.

  3. John von Neumann, paraphrased by Stanislaw Ulam, “Tribute to John von Neumann,” Bulletin of the American Mathematical Society 64.3, pt. 2 (May 1958): 1–49. Von Neumann (1903–1957) was born in Budapest into a Jewish banking family and came to Princeton University to teach mathematics in 1930. In 1933 he became one of the six original professors in the new Institute for Advanced Study in Princeton, where he stayed until the end of his life. His interests were far ranging: he was the primary force in defining the new field of quantum mechanics; along with coauthor Oskar Morgenstern, he wrote Theory of Games and Economic Behavior, a text that transformed the study of economics; and he made significant contributions to the logical design of early computers, including the building of MANIAC (Mathematical Analyzer, Numerical Integrator, and Computer) in the early 1950s.

  Here is how Oskar Morgenstern described von Neumann in the obituary “John von Neumann, 1903–1957,” in the Economic Journal (March 1958: 174): “Von Neumann exercised an unusually large influence upon the thought of other men in his personal relations. . . . His stupendous knowledge, the immediate response, the unparalleled intuition held visitors in awe. He would often solve their problems before they had finished stating them. His mind was so unique that some people have asked themselves—they too eminent scientists—whether he did not represent a new stage in human mental development.”

  4. See notes 20 and 21 in chapter 2.

  5. The conference was held February 19–21, 2003, in Monterey, California. Among the topics covered were stem-cell research, biotechnology, nanotechnology, cloning, and genetically modified food. For a list of books recommended by conference speakers, see http://www.thefutureoflife.com/books.htm.

  6. The Internet, as measured by the number of nodes (servers), was doubling every year during the 1980s but was only tens of thousands of nodes in 1985. This grew to tens of millions of nodes by 1995. By January 2003, the Internet Software Consortium (http://www.isc.org/ds/host-count-history.html) counted 172 million Web hosts, which are the servers hosting Web sites. That number represents only a subset of the total number of nodes.

  7. At the broadest level, the anthropic principle states that the fundamental constants of physics must be compatible with our existence; if they were not, we would not be here to observe them. One of the catalysts for the development of the principle is the study of constants, such as the gravitational constant and the electromagnetic-coupling constant. If the values of these constants were to stray beyond a very narrow range, intelligent life would not be possible in our universe. For example, if the electromagnetic-coupling constant were stronger, there would be no bonding between electrons and other atoms. If it were weaker, electrons could not be held in orbit. In other words, if this single constant strayed outside an extremely narrow range, molecules would not form. Our universe, then, appears to proponents of the anthropic principle to be fine-tuned for the evolution of intelligent life. (Detractors such as Victor Stenger claim the fine-tuning is not so fine after all; there are compensatory mechanisms that would support a wider window for life to form under different conditions.)

  The anthropic principle comes up again in the context of contemporary cosmology theories that posit multiple universes (see notes 8 and 9, below), each with its own set of laws. Only in a universe in which the laws allowed thinking beings to exist could we be here asking these questions.

  One of the seminal texts in the discussion is John Barrow and Frank Tipler, The Anthropic Cosmological Principle (New York: Oxford University Press, 1988). See also Steven Weinberg, “A Designer Universe?” at http://www.physlink.com/Education/essay_weinberg.cfm.

  8. According to some cosmological theories, there were multiple big bangs, not one, leading to multiple universes (parallel multiverses or “bubbles”). Different physical constants and forces apply in the different bubbles; conditions in some (or at least one) of these bubbles support carbon-based life. See Max Tegmark, “Parallel Universes,” Scientific American (May 2003): 41–53; Martin Rees, “Exploring Our Universe and Others,” Scientific American (December 1999): 78–83; Andrei Linde, “The Self-Reproducing Inflationary Universe,” Scientific American (November 1994): 48–55.

  9. The “many worlds” or multiverse theory as an interpretation of quantum mechanics was developed to solve a problem presented by quantum mechanics and then has been combined with the anthropic principle. As summarized by Quentin Smith:

  A serious difficulty associated with the conventional or Copenhagen interpretation of quantum mechanics is that it cannot be applied to the general relativity space-time geometry of a closed universe. A quantum state of such a universe is describable as a wave function with varying spatial-temporal amplitude; the probability of the state of the universe being found at any given point is the square of the amplitude of the wave function at that point. In order for the universe to make the transition from the superposition of many points of varying probabilities to one of these points—the one in which it actually is—a measuring apparatus must be introduced that collapses the wave function and determines the universe to be at that point. But this is impossible, for there is nothing outside the universe, no external measuring apparatus, that can collapse the wave function.

  A possible solution is to develop an interpretation of quantum mechanics that does not rely on the notion of external observation or measurement that is central to the Copenhagen interpretation. A quantum mechanics can be formulated that is internal to a closed system.

  It is such an interpretation that Hugh Everett developed in his 1957 paper, “Relative State Formulation of Quantum Mechanics.” Each point in the superposition represented by the wave function is regarded as actually containing one state of the observer (or measuring apparatus) and one state of the system being observed. Thus “with each succeeding observation (or interaction), the observer state ‘branches’ into a number of different states. Each branch represents a different outcome of the measurement and the corresponding eigenstate for the object-system state. All branches exist simultaneously in the superposition after any given sequence of observations.”

  Each branch is causally independent of each other branch, and consequently no observer will ever be aware of any “splitting” process. The world will seem to each observer as it does in fact seem.

  Applied to the universe as a whole, this means that the universe is regularly dividing into numerous different and causally independent branches, consequent upon the measurement-like interactions among its various parts. Each branch can be regarded as a separate world, with each world constantly splitting into further worlds.

  Given that these branches—the set of universes—will include ones both suitable and unsuitable for life, Smith continues, “At this point it can be stated how the strong anthropic principle in combination with the many-worlds interpretation of quantum mechanics can be used in an attempt to resolve the apparent problem mentioned at the beginning of this essay. The seemingly problematic fact that a world with intelligent life is actual, rather than one of the many lifeless worlds, is found not to be a fact at all. If worlds with life and without life are both actual, then it is not surprising that this world is actual but is something to be expected.”

 
