Chaitin’s Law: Gregory Chaitin, inventor of algorithmic information theory, ordains that you cannot use static, eternal, perfect mathematics to model dynamic creative life. Determinist math traps the mathematician in a mechanical process that cannot yield innovation or surprise, learning or life. You need to transcend the Newtonian mathematics of physics and adopt post-modern mathematics—the mathematics that follows Gödel (1931) and Turing (1936), the mathematics of creativity.
Economic growth: Learning tested by falsifiability or possible bankruptcy. This understanding of economic growth follows from Karl Popper’s insight that a scientific proposition must be framed in terms that are falsifiable or refutable. Government guarantees prevent learning and thus thwart economic growth.
All expanding businesses and industries follow a learning curve that ordains a 20 to 30 percent decrease in costs with every doubling of total units sold. Classical learning curves are Moore’s Law in microchips and Metcalfe’s Law in networking. Raymond Kurzweil generalized the idea as a “law of accelerating returns,” a concept that Henry Adams introduced in a learning curve chart in The Education of Henry Adams and applied to the increase of energy output.
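A minimal sketch of the arithmetic behind such a learning curve, assuming an illustrative 25 percent cost decline per doubling of cumulative volume; the function name and starting cost are hypothetical, not taken from the text.

```python
import math

def unit_cost(cumulative_units, first_unit_cost=100.0, decline_per_doubling=0.25):
    """Learning-curve arithmetic: cost falls by a fixed fraction (here 25%)
    every time cumulative output doubles (illustrative parameters)."""
    doublings = math.log2(cumulative_units)   # doublings since the first unit
    return first_unit_cost * (1 - decline_per_doubling) ** doublings

for n in (1, 2, 4, 8, 16, 32):
    print(f"{n:>3} cumulative units: cost per unit = {unit_cost(n):6.2f}")
# 100.00, 75.00, 56.25, 42.19, 31.64, 23.73 -- each doubling cuts cost by a quarter
```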
Economic growth as a learning process does not directly gain from “machine-learning,” unless the symbols processed are interpreted by human beings.
Expansionary fiscal and monetary policy: The attempt by governments and central banks to stimulate economic activity, whether by selling government securities to pay for a governmental deficit or by creating new money to buy them.
Keynesians, mostly on the left, believe that the sale of securities imparts a fiscal stimulus by enabling more government spending.
Monetarists, mostly on the right, believe that central banks stimulate economic activity by creating money to buy government securities.
This new money goes to the previous owners of the purchased securities, chiefly banks, which in recent years have used the funds to purchase more securities from the Treasury. Keynesianism and monetarism thus converge in expanding the government’s power to spend.
In an information economy, both measures attempt to use government power to force growth. But economic growth is learning (accumulating tested knowledge). Learning cannot be forced.
Gödel’s Incompleteness Theorem: Kurt Gödel’s discovery in mathematical logic that any consistent formal system powerful enough to express the truths of arithmetic will be incomplete and dependent on axioms not reducible to the system—truths that cannot be proved within the system itself. In developing his proof, Gödel (1906–1978) invented a mathematical machine that used numbers to embody axioms and thus anticipated the discoveries of computer science. By showing that mathematics could not be hermetically sealed or physically determinist, Gödel opened the way to postmodern mathematics: a mathematics of software and creativity. John von Neumann (1903–1957) was the first person to appreciate and publicize the importance of Gödel’s demonstration in 1931 that mathematical statements can be true but unprovable.
As von Neumann saw, Gödel’s proof depended on his invention of a mathematical “machine” that used numbers to encode and prove algorithms also expressed in numbers. This invention, absorbed by von Neumann and Alan Turing, launched computer science and information theory and enabled the development of the Internet and the blockchain.
Gold: The monetary element, atomic number 79, tested over centuries and found uniquely suitable as money. The five precious metals in the Periodic Table are rhodium, palladium, silver, platinum, and gold. Rhodium and palladium are rare elements that were not discovered until the early nineteenth century. Platinum’s melting point is more than three thousand degrees Fahrenheit, making it unworkable without advanced technology. Silver tarnishes and corrodes, and its reactivity makes it more useful than gold for most industrial purposes. Only gold can function as a durable and unchanging measuring stick for value. Usually thought to be money because it is a useful commodity—pretty, shiny, divisible, portable, scarce, and convertible into jewelry—gold is in fact the monetary element because it is useless. Money is not valuable because it is really jewelry; jewelry is valuable because it is really money. Gold is a metric of valuation based on the time to extract an incremental ounce, which has changed little over the centuries, while gold has become more difficult to extract from deeper and more attenuated lodes. The gold metric is therefore not a function of technology and industrial progress, part of what it measures, but a pure gauge of value.
Hash: Conversion of a digital file of variable length into a string of characters of a fixed length—in the Secure Hash Algorithm SHA-256, used in Bitcoin’s blockchain cryptography, the output is always thirty-two bytes (256 bits). Hashes are prohibitively hard to invert; knowledge of the hash does not convey knowledge of the file, but knowledge of the file is readily converted into the hash. Any change to the file drastically changes the hash result. Hashes therefore reveal any tampering with the hashed data. The most common hash is the checksum at the end of every Internet packet. Hashes are the enabling technology of blockchains and hashgraphs.
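A minimal sketch of these properties, using Python’s standard hashlib library: the SHA-256 digest is always thirty-two bytes, and changing a single character of the input produces an unrelated digest. The sample strings are illustrative.

```python
import hashlib

def sha256_hex(data: bytes) -> str:
    """Return the SHA-256 digest of the data as a hex string (32 bytes, 256 bits)."""
    return hashlib.sha256(data).hexdigest()

digest_a = sha256_hex(b"Life after Google")
digest_b = sha256_hex(b"Life after google")   # one character changed

print(len(bytes.fromhex(digest_a)))   # 32 bytes, whatever the length of the input
print(digest_a)
print(digest_b)                       # bears no resemblance to the first digest
```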
Hashgraph: Use of chained blocks (called “rounds”) of hashes in a tree-like structure, with an ingenious algorithm called “virtual voting,” which achieves consensus without actual voting or proof of work, a complex and laborious process good to avoid whenever possible. This fast and efficient system may well prevail as the bottom layer of blockchains.
Hypertrophy of finance: The growth of finance beyond the rate of growth of the commerce it measures and intermediates. For example, international currency trading is roughly seventy-three times more voluminous than all global trading in goods and services and an estimated one hundred times as voluminous as all stock market transactions. Oil-futures trading has risen by a factor of one hundred in some three decades, from 10 percent of oil output in 1984 to ten times oil output in 2015. Derivatives on real estate are now nine times global GDP. That’s not capitalism, that’s hypertrophy of finance.
Information Theory: Begun by Kurt Gödel when he made logic into functional mathematics and algorithms. Information theory evolved through the minds of Claude Shannon (1916–2001) and Alan Turing (1912–1954) into its current role as mathematical philosophy. It depicts human creations and communications as transmissions across a channel, whether a wire or the world, in the face of the power of noise, with the outcome measured by its “news” or surprise, defined as entropy and consummated as knowledge.
Entropy is higher or lower depending on the freedom of choice of the sender. It is a libertarian index. The larger the available alphabet of symbols—that is, the larger the set of possible messages—the greater the composer’s choice and the higher the entropy and information of the message.
Information theory both enables and describes our digital and analog world.
Main Street: The symbol of the real economy of workers paid hourly or monthly and sealed off from the accelerated circular loops of Wall Street moneymaking. Perhaps the street where you live, Main Street is the site of local businesses and jobs.
Metcalfe’s Law: The value and power of a network grow as the square of the number of compatible nodes it links. Named for the engineer Robert Metcalfe (1946–), a co-inventor of Ethernet, this law is a rough index and deeply counterintuitive. (The Internet is worth less than the square of its six billion connected devices.) But the law applies to smaller networks, and it explains the vectors of value creation of companies such as Facebook, Apple, Google, and Amazon, which now dominate stock market capitalization. Metcalfe’s Law may well apply to the promise of new digital currencies and ultimately assure the success of a new transactions layer for the Internet software stack.
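A small worked comparison, with illustrative node counts, of linear value against the n-squared value Metcalfe’s Law assigns to a network of compatible nodes.

```python
def metcalfe_value(nodes: int) -> int:
    """Metcalfe's Law: network value grows as the square of the number of nodes."""
    return nodes * nodes

for n in (10, 100, 1_000, 10_000):   # illustrative node counts
    print(f"{n:>6} nodes: linear value {n:>6}, Metcalfe value {metcalfe_value(n):>12,}")
# Doubling the number of compatible nodes quadruples the Metcalfe value.
```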
Moore’s Law: Cost-effectiveness in the computer industry doubles every two years. This pace corresponds closely to a faster pace in the number of transistors produced, signifying a learning curve. Formulated by Intel founder Gordon Moore (1929–) and inspired by Caltech professor Carver Mead’s research, Moore’s Law was originally based on the biennial doubling of the density of transistors on a silicon chip. It now chiefly relies on other vectors of learning, such as parallel processing, multi-threading, lower voltages, and three-dimensional chip architectures. As a learning curve, Moore’s Law is an important principle of information theory.
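A brief sketch of the biennial doubling the entry describes; the baseline and the spans of years are illustrative, not data from the text.

```python
def cost_effectiveness(years_elapsed: float, baseline: float = 1.0,
                       doubling_period_years: float = 2.0) -> float:
    """Moore's-Law-style growth: capability per dollar doubles every two years."""
    return baseline * 2 ** (years_elapsed / doubling_period_years)

for years in (0, 2, 4, 10, 20):
    print(f"after {years:>2} years: {cost_effectiveness(years):>8,.0f}x the baseline")
# 1x, 2x, 4x, 32x, 1,024x
```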
Noise: Interference in a message; any undesired influence of the conduit on the content it carries through a communications channel. A high-entropy message (full of surprise) requires a low-entropy channel (with no surprises). Surprises in the signal are information; surprises in the channel are noise.
Peirce’s Triad: The theorem of the mathematician and philosopher Charles Sanders Peirce (1839–1914) holding that all symbol and sign systems (such as software and mathematics) are meaningless without an interpreter. The triad consists of a sign (or symbol), an object, and a human interpreter. Removing the interpreter empties the triad, leaving it to be filled by ideology and artifice (e.g., “machine-learning” and “artificial intelligence”).
Public Key Cryptography: Most cryptography is symmetrical: the same key (or string of digital numbers) both encrypts and decrypts the message. This is fine if you can personally give the key to the recipient. But the Internet economy depends on continual transactions with people you never see. The answer to this problem is asymmetrical pairs of keys, generated together, with the key that encrypts the message—the public key—unable to decrypt it, and with a private key for decryption. Blockchains all depend on public keys as addresses for transactions that can be consummated by their private keys.
An important payoff for private keys is using them to encrypt files to be decrypted by the related public key. This process enables digital signatures that authenticate the source of a message. You know that the message originated with a unique private key that was generated in a pair with the public key that you hold. This means money can be signed, like a check, assuring authentication without necessarily revealing the source of the signature.
This technique reconciles two apparently conflicting goals of cryptocurrencies: privacy and attestation, the need for seamless trusted transactions without the exposure of personal data and the need to access and demonstrate reliable records of property and history for legal purposes. Thus we can have cash-like transactions (with no exposed secrets) together with robust and reliable and immutable records when demanded by the courts or the Internal Revenue Service. Identity and property can be concealed when appropriate and proved when required.
Contrast this system with the current system in which identity and property are constantly exposed to untrusted outsiders but cannot be proved without reliance on possibly corrupt or mendacious third parties or prosecutors.
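A minimal sketch of the sign-and-verify flow this entry describes, assuming the third-party Python cryptography package and an Ed25519 key pair; the algorithm and the message are illustrative choices, not ones named in the text.

```python
# pip install cryptography   (third-party package assumed for this sketch)
from cryptography.hazmat.primitives.asymmetric import ed25519
from cryptography.exceptions import InvalidSignature

# Asymmetric pair generated together: the private key signs, the public key verifies.
private_key = ed25519.Ed25519PrivateKey.generate()
public_key = private_key.public_key()

message = b"an illustrative transaction record"
signature = private_key.sign(message)        # only the holder of the private key can produce this

try:
    public_key.verify(signature, message)    # anyone holding the public key can check it
    print("Signature valid: the message came from the holder of the private key.")
except InvalidSignature:
    print("Signature invalid: the message was altered or signed by someone else.")
```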
Real money: A measuring stick, a metric of value, reflecting the scarcity and irreversible passage of time—entropy-based, equally distributed, and founded on the physical limits of the speed of light and the span of life. Bitcoin and gold are both real money in this sense. Government monopoly money is not.
Sand Hill Road: The arboreal abode of California venture capitalists and their “unicorns,” stretching from El Camino Real near Stanford to Interstate 280 and into the clouds and wealth of Woodside and Silicon Valley. Losing its leadership in entrepreneurial capital to China, Israel, Initial Coin Offerings (ICOs) around the world, and other fund-raising sites, this seat of luxury is filling up with lawyers and politicians.
Shannon Entropy: Most simply measured by the number of binary digits needed to encode a message, it is calculated as the sum of the base two logarithms of the probabilities of the components of the message. The logarithms of probabilities between one and zero are always negative quantities; entropy is rendered positive by a minus sign in front of this sum. This minus sign prompted some eminent theorists to blunder into the idea of negentropy, which is an oxymoron—more than 100 percent probability. Counterintuitively, surprising information is a kind of disorder. The alphabet is ordered; crystals are ordered; snowflakes are ordered. Hamlet and Google are beautifully disordered alphabets conveying surprising information.
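A minimal sketch of the calculation described above, computing entropy in bits per symbol from the probabilities of a message’s components; multiplying by the message length gives the total number of binary digits the entry describes. The sample strings are illustrative.

```python
import math
from collections import Counter

def shannon_entropy(message: str) -> float:
    """Entropy in bits per symbol: the probability-weighted sum of -log2(p).
    log2(p) <= 0 for p in (0, 1], so the minus sign makes each term positive."""
    counts = Counter(message)
    total = len(message)
    return sum(-(c / total) * math.log2(c / total) for c in counts.values())

print(shannon_entropy("aaaaaaaa"))            # 0.0 -- no surprise, no information
print(shannon_entropy("abababab"))            # 1.0 -- one bit per symbol
print(shannon_entropy("to be or not to be"))  # about 2.6 -- a larger, less predictable alphabet
```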
Turing Machine: Inspired by Gödel’s proof, Turing conceived an abstract universal computer model consisting of a control unit administering a set of instructions reading, writing, and moving one space at a time back and forth along an infinitely long tape divided into squares along its length. He proved that this hypothetical machine could perform any computable function. Silicon Valley has been cheering ever since, despite his further proof that most numbers cannot be generated by a computational process. Turing’s universal computer could not calculate whether any particular program would ever halt. Turing’s machine was a general-purpose computer because it commanded infinite time and space. Necessarily restricted to specific purposes, real computers are not minds.
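A minimal sketch of such a machine, with a Python dictionary standing in for the infinite tape and an illustrative rule table that flips the bits of a binary string and halts at the first blank square.

```python
def run_turing_machine(rules, tape, state="start", head=0, max_steps=1000):
    """A tiny Turing machine: rules maps (state, symbol) -> (write, move, next_state).
    The dict `tape` stands in for the infinitely long tape of blank squares."""
    for _ in range(max_steps):
        if state == "halt":
            break
        symbol = tape.get(head, " ")                  # a never-written square reads as blank
        write, move, state = rules[(state, symbol)]   # look up the instruction
        tape[head] = write                            # write one symbol
        head += 1 if move == "R" else -1              # move one square left or right
    return "".join(tape[i] for i in sorted(tape)).strip()

# Illustrative rule table: walk right, flipping 0s and 1s, and halt at the first blank.
invert_bits = {
    ("start", "0"): ("1", "R", "start"),
    ("start", "1"): ("0", "R", "start"),
    ("start", " "): (" ", "R", "halt"),
}

tape = dict(enumerate("10110"))
print(run_turing_machine(invert_bits, tape))   # prints "01001"
```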
Wealth: Tested knowledge. Physical law dictates that matter is conserved: material resources have not changed since the Stone Age. All enduring economic advances come from the increase of knowledge through learning.
About the Author
GEORGE GILDER, one of the leading economic and technological thinkers of the past forty years, is the author of nineteen books, including Wealth and Poverty, Life after Television, Knowledge and Power, and The Scandal of Money. A founding fellow of the Discovery Institute, where he began his study of information theory, and an influential venture investor, he lives with his wife in western Massachusetts.
Notes
Prologue: Back to the Future—The Ride
1. Josiah Lee Auspitz, “The Wasp Leaves the Bottle,” The American Scholar, 2002, 602–19; Charles Sanders Peirce, Chance, Love, and Logic: Philosophical Essays, edited with an introduction by Morris Cohen and with an essay by John Dewey (New York: Barnes & Noble, 1923).
2. Neal Stephenson, Snow Crash (New York: Bantam Books, 1992), 24. “So Hiro’s not actually here at all. He’s in a computer-generated universe. . . . In the lingo, this imaginary place is known as the Metaverse.”
3. C. S. Lewis, The Weight of Glory and Other Addresses (New York: The Macmillan Company, 1949), 16–29. “Transposition” is the second essay.
4. George Gilder, Life after Television: The Coming Transformation of Media and American Life (Knoxville, Tenn.: Whittle Communications, 1990); revised edition (New York: Norton, 1994). “We will discover that television was a technology with supreme powers but deadly flaws. In the beginning the powers were ascendant; now the flaws dominate. The cultural limitations of television, tolerable when there was no alternative, are unendurable in the face of the new computer technologies now on the horizon—technologies in which, happily, the U.S. leads the world.” At the time, I called the new device a “telecomputer.” My friend Bruce Chapman, the founder of the Discovery Institute, suggested I change the word to the more elegant “teleputer” in subsequent editions, and so I did.
5. Gilder, Life after Television (1994), 20.
6. Ibid., 20 and passim. Earlier editions lack this specific riff on the teleputer, though I used it in speeches regularly in the early 1990s.
Chapter 1: Don’t Steal This Book
1. W. Brian Arthur, “Where is Technology Taking the Economy?” McKinsey Quarterly, October 2017.
2. Yuval Noah Harari, Homo Deus: A Brief History of Tomorrow (New York: HarperCollins, 2017).
3. Jeff John Roberts and Adam Lashinsky, “Hacked: How companies fight back,” Fortune, June 22, 2017. “At the end of the day, though, humans are as much to blame as software. ‘The weak underbelly of security is not tech failure but poor process implementation or social engineering,’ says Asheem Chandna.” An investor with Greylock Partners and a Palo Alto Networks director, “Chandna notes that most hacking attacks come about in two ways, neither of which involves a high level of technical sophistication: An employee clicks on a booby-trapped link or attachment—perhaps in an email that appears to be from her boss—or someone steals an employee’s log-in credentials and gets access to the company network.”
4. Karl Marx, The German Ideology (Amherst, NY: Prometheus Books, 1998), 53.
5. William F. Buckley Jr., Cancel Your Own Goddam Subscription (New York: Basic Books, 2007), 21–24. J. R. Nyquist, “How to Immanentize the Eschaton,” Financial Sense, Applying Common Sense to Markets, July 6, 2009. “In 1969 a sixteen-year-old boy wrote to conservative columnist William F. Buckley, Jr., ‘to discover just what, in God’s name, the phrase “to immanentize the eschaton” means.’ Buckley replied: ‘Eschaton means, roughly, the final things in the order of time; immanentize means, roughly, to cause to inhere in time. So that to immanentize the eschaton is to cause to inhere in the worldly experience and subject to human dominion that which is beyond time and therefore extraworldly. To attempt such a thing is to deny the transcendence of God; to assume that Utopia is for this world.’
“Buckley’s answer strikes me as humorous because ‘to immanentize the eschaton’ basically means to bring about the end of the world (i.e., ‘the final things in the order of time’). It may be said, that those who deny the transcendence of God carry within themselves the apocalypse as they attempt to build their silly Utopia. For what better way is there for bringing about the ‘end times’? Those who would free us from racism, sexism and classism, seeking to make ‘the world as one,’ fail to realize that humanity isn’t perfectible; that any attempt to make men perfect is likely to confuse essential instincts, breaking up whatever workable order we’ve managed to achieve.”
6. Ronald H. Coase, The Firm, The Market, and the Law (Chicago: University of Chicago Press, 1988), 7. “The limit to the size of the firm is set where its costs of organizing a transaction become equal to the cost of carrying it out through the market.”