The Singularity Is Near: When Humans Transcend Biology


by Ray Kurzweil


  38. See note 34 above.

  Chapter Two: A Theory of Technology Evolution: The Law of Accelerating Returns

  1. John Smart, Abstract to “Understanding Evolutionary Development: A Challenge for Futurists,” presentation to World Future Society annual meeting, Washington, D.C., August 3, 2004.

  2. That epochal events in evolution represent increases in complexity is Theodore Modis’s view. See Theodore Modis, “Forecasting the Growth of Complexity and Change,” Technological Forecasting and Social Change 69.4 (2002), http://ourworld.compuserve.com/homepages/tmodis/TedWEB.htm.

  3. Compressing files is a key aspect of both data transmission (such as a music or text file over the Internet) and data storage. The smaller the file is, the less time it will take to transmit and the less space it will require. The mathematician Claude Shannon, often called the father of information theory, defined the basic theory of data compression in his paper “A Mathematical Theory of Communication,” The Bell System Technical Journal 27 (July–October 1948): 379–423, 623–56. Data compression is possible because of factors such as redundancy (repetition) and probability of appearance of character combinations in data. For example, silence in an audio file could be replaced by a value that indicates the duration of the silence, and letter combinations in a text file could be replaced with coded identifiers in the compressed file.

  Redundancy can be removed by lossless compression, as Shannon explained, which means there is no loss of information. There is a limit to lossless compression, defined by what Shannon called the entropy rate (compression increases the “entropy” of the data, which is the amount of actual information in it as opposed to predetermined and thus predictable data structures). Data compression removes redundancy from data; lossless compression does it without losing data (meaning that the exact original data can be restored). Alternatively, lossy compression, which is used for graphics files or streaming video and audio files, does result in information loss, though that loss is often imperceptible to our senses.
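
  To make the entropy-rate limit concrete, the following is a minimal Python sketch (my illustration, not part of the original note) that estimates the Shannon entropy of a byte string under a simple independent-symbol model; the result is an approximate lower bound, in bits per symbol, on how compactly that data can be represented losslessly under that model.

```python
import math
from collections import Counter

def entropy_bits_per_symbol(data: bytes) -> float:
    """Shannon entropy H = -sum(p * log2(p)) over observed symbol frequencies."""
    counts = Counter(data)
    total = len(data)
    return -sum((c / total) * math.log2(c / total) for c in counts.values())

sample = b"aaaaaabbbc"  # highly redundant data compresses well
h = entropy_bits_per_symbol(sample)
print(f"{h:.2f} bits/symbol; ~{h * len(sample) / 8:.1f} bytes ideal vs. {len(sample)} bytes raw")
```

  Real compressors exploit structure beyond single-symbol frequencies (repeated phrases, context), so they can do better than this per-symbol estimate on structured data, but no lossless scheme can beat the true entropy rate of the source.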

  Most data-compression techniques use a code, which is a mapping of the basic units (or symbols) in the source to a code alphabet. For example, all the spaces in a text file could be replaced by a single code word and the number of spaces. A compression algorithm is used to set up the mapping and then create a new file using the code alphabet; the compressed file will be smaller than the original and thus easier to transmit or store. Here are some of the categories into which common lossless-compression techniques fall (a short run-length coding sketch follows the list):

  Run-length compression, which replaces repeating characters with a code and a value representing the number of repetitions of that character (examples: Pack-Bits and PCX).

  Minimum redundancy coding or simple entropy coding, which assigns codes on the basis of probability, with the most frequent symbols receiving the shortest codes (examples: Huffman coding and arithmetic coding).

  Dictionary coders, which use a dynamically updated symbol dictionary to represent patterns (examples: Lempel-Ziv, Lempel-Ziv-Welch, and DEFLATE).

  Block-sorting compression, which reorganizes characters rather than using a code alphabet; run-length compression can then be used to compress the repeating strings (example: Burrows-Wheeler transform).

  Prediction by partial matching, which uses the preceding symbols (the context) in the uncompressed data to predict the probability of the next symbol (example: PPM).
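
  As mentioned above, here is a minimal Python sketch of run-length coding and its inverse (my illustration; real formats such as PackBits add details like literal runs and maximum run lengths that are omitted here).

```python
from itertools import groupby

def rle_encode(data: str) -> list[tuple[str, int]]:
    """Replace each run of a repeated character with a (character, run length) pair."""
    return [(char, len(list(run))) for char, run in groupby(data)]

def rle_decode(pairs: list[tuple[str, int]]) -> str:
    """Rebuild the original string exactly, so the scheme is lossless."""
    return "".join(char * count for char, count in pairs)

original = "WWWWWWWWWWBBBWWWW"
encoded = rle_encode(original)          # [('W', 10), ('B', 3), ('W', 4)]
assert rle_decode(encoded) == original  # no information is lost
print(encoded)
```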

  4. Murray Gell-Mann, “What Is Complexity?” in Complexity, vol. 1 (New York: John Wiley and Sons, 1995).

  5. The human genetic code has approximately six billion (about 10^10) bits, not considering the possibility of compression. So the 10^27 bits that theoretically can be stored in a one-kilogram rock is greater than the genetic code by a factor of 10^17. See note 57 below for a discussion of genome compression.
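
  As a quick check of the arithmetic in this note (my addition, using the figures cited above), the ratio of the rock's theoretical capacity to the genome's size is 10^27 / 10^10 = 10^17:

```python
import math

genome_bits = 6e9   # roughly six billion bits, uncompressed (about 10^10)
rock_bits = 1e27    # theoretical storage of a one-kilogram rock, per the text
print(f"ratio ~ 10^{math.log10(rock_bits / genome_bits):.0f}")  # prints: ratio ~ 10^17
```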

  6. Of course, a human, who is also composed of an enormous number of particles, contains an amount of information comparable to a rock of similar weight when we consider the properties of all the particles. As with the rock, the bulk of this information is not needed to characterize the state of the person. On the other hand, much more information is needed to characterize a person than a rock.

  7. See note 175 in chapter 5 for an algorithmic description of genetic algorithms.

  8. Humans, chimpanzees, gorillas, and orangutans are all included in the scientific classification of hominids (family Hominidae). The human lineage is thought to have diverged from its great ape relatives five to seven million years ago. The human genus Homo within the Hominidae includes extinct species such as H. erectus as well as modern man (H. sapiens).

  In chimpanzee hands, the fingers are much longer and less straight than in humans, and the thumb is shorter, weaker, and not as mobile. Chimps can flail with a stick but tend to lose their grip. They cannot pinch hard because their thumbs do not overlap their index fingers. In the modern human, the thumb is longer, and the fingers rotate toward a central axis, so you can touch all the tips of your fingers to the tip of your thumb, a quality that is called full opposability. These and other changes gave humans two new grips: the precision and power grips. Even prehuman hominids such as the Australopithecine from Ethiopia called Lucy, who is thought to have lived around three million years ago, could throw rocks with speed and accuracy. Since then, scientists claim, continual improvements in the hand’s capacity to throw and club, along with associated changes in other parts of the body, have resulted in distinct advantages over other animals of similar size and weight. See Richard Young, “Evolution of the Human Hand: The Role of Throwing and Clubbing,” Journal of Anatomy 202 (2003): 165–74; Frank Wilson, The Hand: How Its Use Shapes the Brain, Language, and Human Culture (New York: Pantheon, 1998).

  9. The Santa Fe Institute has played a pioneering role in developing concepts and technology related to complexity and emergent systems. One of the principal developers of paradigms associated with chaos and complexity is Stuart Kauffman. Kauffman’s At Home in the Universe: The Search for the Laws of Self-Organization and Complexity (Oxford: Oxford University Press, 1995) looks “at the forces for order that lie at the edge of chaos.”

  In his book Evolution of Complexity by Means of Natural Selection (Princeton: Princeton University Press, 1988), John Tyler Bonner asks the questions “How is it that an egg turns into an elaborate adult? How is it that a bacterium, given many millions of years, could have evolved into an elephant?”

  John Holland is another leading thinker from the Santa Fe Institute in the emerging field of complexity. His book Hidden Order: How Adaptation Builds Complexity (Reading, Mass.: Addison-Wesley, 1996) includes a series of lectures that he presented at the Santa Fe Institute in 1994. See also John H. Holland, Emergence: From Chaos to Order (Reading, Mass.: Addison-Wesley, 1998) and Mitchell Waldrop, Complexity: The Emerging Science at the Edge of Order and Chaos (New York: Simon & Schuster, 1992).

  10. The second law of thermodynamics explains why there is no such thing as a perfect engine that uses all the heat (energy) produced by burning fuel to do work: some heat will inevitably be lost to the environment. This same principle of nature holds that heat will flow from a hot pan to cold air rather than in reverse. It also posits that closed (“isolated”) systems will spontaneously become more disordered over time—that is, they tend to move from order to disorder. Molecules in ice chips, for example, are limited in their possible arrangements. So a cup of ice chips has less entropy (disorder) than the cup of water the ice chips become when left at room temperature. There are many more possible molecular arrangements in the glass of water than in the ice; greater freedom of movement equals higher entropy. Another way to think of entropy is as multiplicity. The more ways that a state could be achieved, the higher the multiplicity. Thus, for example, a jumbled pile of bricks has a higher multiplicity (and higher entropy) than a neat stack.
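
  The multiplicity picture corresponds to Boltzmann’s statistical definition of entropy, which I add here for reference (it is not in the original note):

```latex
S = k_B \ln W
```

  Here S is the entropy, k_B is Boltzmann’s constant, and W is the multiplicity: the number of microscopic arrangements consistent with the macroscopic state. A neat stack of bricks corresponds to a small W, a jumbled pile to a much larger one.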

  11. Max More articulates the view that “advancing technologies are combining and cross-fertilizing to accelerate progress even faster.” Max More, “Track 7 Tech Vectors to Take Advantage of Technological Acceleration,” ManyWorlds, August 1, 2003.

  12. For more information, see J. J. Emerson et al., “Extensive Gene Traffic on the Mammalian X Chromosome,” Science 303.5657 (January 23, 2004): 537–40, http://www3.uta.edu/faculty/betran/science2004.pdf; Nicholas Wade, “Y Chromosome Depends on Itself to Survive,” New York Times, June 19, 2003; and Bruce T. Lahn and David C. Page, “Four Evolutionary Strata on the Human X Chromosome,” Science 286.5441 (October 29, 1999): 964–67, http://inside.wi.mit.edu/page/Site/Page%20PDFs/Lahn_and_Page_strata_1999.pdf.

  Interestingly, the second X chromosome in girls is turned off in a process called X inactivation so that the genes on only one X chromosome are expressed. Research has shown that the X chromosome from the father is turned off in some cells and the X chromosome from the mother in other cells.

  13. Human Genome Project, “Insights Learned from the Sequence,” http://www.ornl.gov/sci/techresources/Human_Genome/project/journals/insights.html. Even though the human genome has been sequenced, most of it does not code for proteins (the so-called junk DNA), so researchers are still debating how many genes will be identified among the three billion base pairs in human DNA. Current estimates suggest fewer than thirty thousand, though during the Human Genome Project estimates ranged as high as one hundred thousand. See “How Many Genes Are in the Human Genome?” (http://www.ornl.gov/sci/techresources/Human_Genome/faq/genenumber.shtml) and Elizabeth Pennisi, “A Low Number Wins the GeneSweep Pool,” Science 300.5625 (June 6, 2003): 1484.

  14. Niles Eldredge and the late Stephen Jay Gould proposed this theory in 1972 (N. Eldredge and S. J. Gould, “Punctuated Equilibria: An Alternative to Phyletic Gradualism,” in T. J. M. Schopf, ed., Models in Paleobiology [San Francisco: Freeman, Cooper], pp. 82–115). It has sparked heated discussions among paleontologists and evolutionary biologists ever since, though it has gradually gained acceptance. According to this theory, millions of years may pass with species in relative stability. This stasis is then followed by a burst of change, resulting in new species and the extinction of old (called a “turnover pulse” by Elisabeth Vrba). The effect is ecosystemwide, affecting many unrelated species. Eldredge and Gould’s proposed pattern required a new perspective: “For no bias can be more constricting than invisibility—and stasis, inevitably read as absence of evolution, had always been treated as a non-subject. How odd, though, to define the most common of all palaeontological phenomena as beyond interest or notice!” S. J. Gould and N. Eldredge, “Punctuated Equilibrium Comes of Age,” Nature 366 (November 18, 1993): 223–27.

  See also K. Sneppen et al., “Evolution As a Self-Organized Critical Phenomenon,” Proceedings of the National Academy of Sciences 92.11 (May 23, 1995): 5209–13; Elisabeth S. Vrba, “Environment and Evolution: Alternative Causes of the Temporal Distribution of Evolutionary Events,” South African Journal of Science 81 (1985): 229–36.

  15. As I will discuss in chapter 6, if the speed of light is not a fundamental limit to rapid transmission of information to remote portions of the universe, then intelligence and computation will continue to expand exponentially until they saturate the potential of matter and energy to support computation throughout the entire universe.

  16. Biological evolution continues to be of relevance to humans, however, in that disease processes such as cancer and viral diseases use evolution against us (that is, cancer cells and viruses evolve to counteract specific countermeasures such as chemotherapy drugs and antiviral medications respectively). But we can use our human intelligence to outwit the intelligence of biological evolution by attacking disease processes at sufficiently fundamental levels and by using “cocktail” approaches that attack a disease in several orthogonal (independent) ways at once.

  17. Andrew Odlyzko, “Internet Pricing and the History of Communications,” AT&T Labs Research, revised version February 8, 2001, http://www.dtc.umn.edu/~odlyzko/doc/history.communications1b.pdf.

  18. Cellular Telecommunications and Internet Association, Semi-Annual Wireless Industry Survey, June 2004, http://www.ctia.org/research_statistics/index.cfm/AID/10030.

  19. Electricity, telephone, radio, television, mobile phones: FCC, www.fcc.gov/Bureaus/Common_Carrier/Notices/2000/fc00057a.xls. Home computers and Internet use: Eric C. Newburger, U.S. Census Bureau, “Home Computers and Internet Use in the United States: August 2000” (September 2001), http://www.census.gov/prod/2001pubs/p23-207.pdf. See also “The Millennium Notebook,” Newsweek, April 13, 1998, p. 14.

  20. The paradigm-shift rate, as measured by the amount of time required to adopt new communications technologies, is currently doubling (that is, the amount of time for mass adoption—defined as being used by a quarter of the U.S. population—is being cut in half) every nine years. See also note 21.

  21. The “Mass Use of Inventions” chart in this chapter on p. 50 shows that the time required for adoption by 25 percent of the U.S. population steadily declined over the past 130 years. For the telephone, 35 years were required compared to 31 for the radio—a reduction of 11 percent, or 0.58 percent per year in the 21 years between these two inventions. The time required to adopt an invention dropped 0.60 percent per year between the radio and television, 1.0 percent per year between television and the PC, 2.6 percent per year between the PC and the mobile phone, and 7.4 percent per year between the mobile phone and the World Wide Web. Mass adoption of the radio beginning in 1897 required 31 years, while the Web required a mere 7 years after it was introduced in 1991—a reduction of 77 percent over 94 years, or an average rate of 1.6 percent reduction in adoption time per year. Extrapolating this rate for the entire twentieth century results in an overall reduction of 79 percent for the century. At the current rate of reducing adoption time of 7.4 percent each year, it would take only 20 years at today’s rate of progress to achieve the same reduction of 79 percent that was achieved in the twentieth century. At this rate, the paradigm-shift rate doubles (that is, adoption times are reduced by 50 percent) in about 9 years. Over the twenty-first century, eleven doublings of the rate will result in multiplying the rate by 2^11, to about 2,000 times the rate in 2000. The increase in rate will actually be greater than this because the current rate will continue to increase as it steadily did over the twentieth century.
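
  As a check on the arithmetic in this note, here is a small Python sketch (my addition, using the adoption times cited above) that recomputes the annualized reductions and the factor of 2^11:

```python
# Years to reach mass adoption (25 percent of the U.S. population), per the chart.
telephone, radio, web = 35, 31, 7

# Telephone to radio: 21 years apart, 35 -> 31 years to adopt.
print(f"telephone -> radio: {(1 - (radio / telephone) ** (1 / 21)) * 100:.2f}% per year")  # ~0.58%

# Radio (1897) to the Web (1991): 94 years apart, 31 -> 7 years to adopt.
print(f"radio -> Web: {(1 - web / radio) * 100:.0f}% overall, "
      f"{(1 - (web / radio) ** (1 / 94)) * 100:.1f}% per year")  # ~77%, ~1.6%

# Eleven doublings of the paradigm-shift rate over the twenty-first century.
print(f"2**11 = {2 ** 11}, i.e., about 2,000 times the rate in 2000")
```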

  22. Data from 1967–1999, Intel data, see Gordon E. Moore, “Our Revolution,” http://www.sia-online.org/downloads/Moore.pdf. Data from 2000–2016, International Technology Roadmap for Semiconductors (ITRS) 2002 Update and 2004 Update, http://public.itrs.net/Files/2002Update/2002Update.pdf and http://www.itrs.net/Common/2004Update/2004_00_Overview.pdf.

  23. The ITRS DRAM cost is the cost per bit (packaged microcents) at production. Data from 1971–2000: VLSI Research Inc. Data from 2001–2002: ITRS, 2002 Update, Table 7a, Cost-Near-Term Years, p. 172. Data from 2003–2018: ITRS, 2004 Update, Tables 7a and 7b, Cost-Near-Term Years, pp. 20–21.

  24. Intel and Dataquest reports (December 2002), see Gordon E. Moore, “Our Revolution,” http://www.sia-online.org/downloads/Moore.pdf.

  25. Randall Goodall, D. Fandel, and H. Huff, “Long-Term Productivity Mechanisms of the Semiconductor Industry,” Ninth International Symposium on Silicon Materials Science and Technology, May 12–17, 2002, Philadelphia, sponsored by the Electrochemical Society (ECS) and International Sematech.

  26. Data from 1976–1999: E. R. Berndt, E. R. Dulberger, and N. J. Rappaport, “Price and Quality of Desktop and Mobile Personal Computers: A Quarter Century of History,” July 17, 2000, http://www.nber.org/~confer/2000/si2000/berndt.pdf. Data from 2001–2016: ITRS, 2002 Update, On-Chip Local Clock in Table 4c: Performance and Package Chips: Frequency On-Chip Wiring Levels—Near-Term Years, p. 167.

  27. See note 26 for clock speed (cycle times) and note 24 for cost per transistor.

  28. Intel transistors on microprocessors: Microprocessor Quick Reference Guide, Intel Research, http://www.intel.com/pressroom/kits/quickrefyr.htm. See also Silicon Research Areas, Intel Research, http://www.intel.com/research/silicon/mooreslaw.htm.

  29. Data from Intel Corporation. See also Gordon Moore, “No Exponential Is Forever . . . but We Can Delay ‘Forever,’” presented at the International Solid State Circuits Conference (ISSCC), February 10, 2003, ftp://download.intel.com/research/silicon/Gordon_Moore_ISSCC_021003.pdf.

  30. Steve Cullen, “Semiconductor Industry Outlook,” InStat/MDR, report no. IN0401550SI, April 2004, http://www.instat.com/abstract.asp?id=68&SKU=IN0401550SI.

  31. World Semiconductor Trade Statistics, http://wsts.www5.kcom.at.

  32. Bureau of Economic Analysis, U.S. Department of Commerce, http://www.bea.gov/bea/dn/home/gdp.htm.

  33. See notes 22–24 and 26–30.

  34. International Technology Roadmap for Semiconductors, 2002 update, International Sematech.

  35. “25 Years of Computer History,” http://www.compros.com/timeline.html; Linley Gwennap, “Birth of a Chip,” BYTE (December 1996), http://www.byte.com/art/9612/sec6/art2.htm; “The CDC 6000 Series Computer,” http://www.moorecad.com/standardpascal/cdc6400.html; “A Chronology of Computer History,” http://www.cyberstreet.com/hcs/museum/chron.htm; Mark Brader, “A Chronology of Digital Computing Machines (to 1952),” http://www.davros.org/misc/chronology.html; Karl Kempf, “Electronic Computers Within the Ordnance Corps,” November 1961, http://ftp.arl.mil/~mike/comphist/61ordnance/index.html; Ken Polsson, “Chronology of Personal Computers,” http://www.islandnet.com/~kpolsson/comphist; “The History of Computing at Los Alamos,” http://bang.lanl.gov/video/sunedu/computer/comphist.html (requires password); the Machine Room, http://www.machine-room.org; Mind Machine Web Museum, http://www.userwww.sfsu.edu/~hl/mmm.html; Hans Moravec, computer data, http://www.frc.ri.cmu.edu/~hpm/book97/ch3/processor.list; “PC Magazine Online: Fifteen Years of PC Magazine,” http://www.pcmag.com/article2/0,1759,23390,00.asp; Stan Augarten, Bit by Bit: An Illustrated History of Computers (New York: Ticknor and Fields, 1984); Institute of Electrical and Electronics Engineers (IEEE), Annals of the History of Computing 9.2 (1987): 150–53 and 16.3 (1994): 20; Hans Moravec, Mind Children: The Future of Robot and Human Intelligence (Cambridge, Mass.: Harvard University Press, 1988); René Moreau, The Computer Comes of Age (Cambridge, Mass.: MIT Press, 1984).

 
