The Internet Is Not the Answer

by Andrew Keen


  Bob Kahn and Vint Cerf met at UCLA in 1970 while working on the ARPANET project. In 1974 they published “A Protocol for Packet Network Intercommunication,” which laid out their vision of two complementary internetworking protocols that they called the Transmission Control Protocol (TCP) and the Internet Protocol (IP)—TCP guaranteeing the reliable, ordered delivery of the data stream, and IP handling the addressing and routing of the individual packets.
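
  That division of labor is still visible in any network program today. The sketch below is a minimal Python illustration of the concept, not anything from Kahn and Cerf’s paper; the hostname and the request it sends are placeholder assumptions. It asks IP to carry packets to a destination while TCP presents them to the program as a single reliable stream:

```python
# A minimal sketch of the TCP/IP division of labor. The hostname and the
# HTTP request below are illustrative placeholders, not from the text.
import socket

# AF_INET: IP handles the addressing and delivery of individual packets.
# SOCK_STREAM: TCP turns those packets into a guaranteed, ordered byte stream.
with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as conn:
    conn.settimeout(5)
    conn.connect(("example.com", 80))
    conn.sendall(b"HEAD / HTTP/1.0\r\nHost: example.com\r\n\r\n")
    reply = conn.recv(1024)  # TCP reassembles whatever IP delivered, in order
    print(reply.decode("ascii", errors="replace"))
```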

  Just as Paul Baran designed his survivable network to have a distributed structure, so the same was true of Kahn and Cerf’s TCP/IP. “We wanted as little as possible at the center,” they wrote about the unerringly open architecture of these new universal standards that treated all network traffic equally.32 The addition of these protocols to the ARPANET in January 1983 was, according to the Internet historians Katie Hafner and Matthew Lyon, “probably the most important event that would take place in the development of the Internet for years to come.”33 TCP/IP created a network of networks, allowing users of every network—from ARPANET, SATNET, and PRNET to TELENET and CYCLADES—to communicate with one another.

  Kahn and Cerf’s universal rulebook for digital communications fueled the meteoric growth of the Internet. In 1985, there were around 2,000 computers able to access the Internet. By 1987 this had risen to almost 30,000 computers and by October 1989 to 159,000.34 Many of these computers were attached to local area networks as well as early commercial dial-up services like CompuServe, Prodigy, and America Online. The first so-called killer app, a term popularized by Larry Downes and Chunka Mui in their bestseller about the revolutionary impact of digital technology on traditional business,35 was electronic mail. A 1982 ARPANET report reviewing the network’s first decade noted that email had come to eclipse all other applications in the volume of its traffic and described it as a “smashing success.”36 Thirty years later, email had become, if anything, an even bigger hit. By 2012 there were more than 3 billion email accounts around the world sending 294 billion emails a day, of which around 78% were spam.37

  Another popular feature was the Bulletin Board System (BBS), which enabled users with similar interests to connect and collectively share information and opinions. Among the best known of these was the Whole Earth ’Lectronic Link (the WELL), begun in 1985 by the Whole Earth Catalog founder Stewart Brand. The WELL captured much of the countercultural utopianism of early online users who believed that the distributed structure of the technology created by Internet architects like Paul Baran, with its absence of a central dot, represented the end of traditional government power and authority. This was most memorably articulated by John Perry Barlow, an early WELL member and lyricist for the Grateful Dead, in his 1996 libertarian manifesto “A Declaration of the Independence of Cyberspace.”

  “Governments of the Industrial World, you weary giants of flesh and steel, I come from Cyberspace, the new home of mind,” Barlow announced from, of all places, Davos, the little town in the Swiss Alps where the wealthiest and most powerful people meet at the World Economic Forum each year. “I ask you of the past to leave us alone. You are not welcome among us. You have no sovereignty where we gather.”38

  The real explanation of the Internet’s early popularity was, however, more prosaic. Much of it was both the cause and the effect of a profound revolution in computer hardware. The invention of the transistor by a Bell Labs team in 1947—“the very substructure of the future,”39 in the words of the technology writer David Kaplan—meant that computers no longer had to be “giant brains” that one “walked into,” like the 1,800-square-foot ENIAC; instead, they became simultaneously smaller and more powerful. “Few scientific achievements this century were as momentous,” Kaplan suggests about this breakthrough. Between 1967 and 1995, the capacity of computer hard drives rose an average of 35% every year, with Intel’s annual sales growing from under $3,000 in 1968 to $135 million six years later. Intel’s success in developing faster and faster microprocessors confirmed the prescient 1965 observation of its cofounder Gordon Moore—“Moore’s law”—which predicted that the number of transistors on a chip, and with it computing power, would double roughly every eighteen months to two years. And so, by the early 1980s, hardware manufacturers like IBM and Apple were able to build “personal computers”—relatively affordable desktop devices that, with a modem, allowed anyone access to the Internet.
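
  To see what that doubling rhythm implies, here is a back-of-the-envelope Python sketch. The starting figure of roughly 2,300 transistors for Intel’s 1971 4004 chip is a well-known benchmark, used here purely for illustration:

```python
# A rough compound-doubling model of Moore's law. The 4004's ~2,300
# transistors (1971) is a standard benchmark, used only to illustrate.
def moores_law(start_count, start_year, year, doubling_period=2.0):
    """Projected transistor count if the total doubles every `doubling_period` years."""
    return start_count * 2 ** ((year - start_year) / doubling_period)

for year in (1971, 1981, 1991, 2001):
    print(year, round(moores_law(2300, 1971, year)))
# Prints roughly 2,300 -> 73,600 -> 2.4 million -> 75 million: the same
# order-of-magnitude climb that real microprocessors actually made.
```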

  By the end of the 1980s, the Internet had connected 800 networks, 150,000 registered addresses, and several million computers. But this project to network the world wasn’t quite complete. There was one thing still missing—Vannevar Bush’s Memex. There were no trails yet on the Internet, no network of intelligent links, no process of tying two items together on the network.

  The World Wide Web

  In 1960, a “discombobulated genius” named Ted Nelson came up with the idea of “nonsequential writing,” for which he coined the term “hypertext.”40 Riffing off Vannevar Bush’s notion of “information trails,” Nelson replaced Bush’s reliance on analog devices like levers and microfilm with his own faith in the power of digital technology to make these nonlinear connections. Like Bush, who believed that the trails on his Memex “do not fade,”41 the highly eccentric Nelson saw himself as a “rebel against forgetting.”42 His lifelong quest to create hypertext, a project he code-named Xanadu, was indeed a kind of rebellion against forgetfulness. In Nelson’s Xanadu system, there was no “concept of deletion.” Everything would be remembered.

  In 1980, twenty years after Nelson’s invention of the hypertext idea, a much less eccentric genius, Tim Berners-Lee, arrived as a consultant at the European Particle Physics Laboratory (CERN) in Geneva. Like Nelson, Berners-Lee, who had earned a degree in physics from Oxford University’s Queen’s College in 1976, was preoccupied with guarding against his own forgetfulness. The problem, Berners-Lee wrote in his autobiography, Weaving the Web, was remembering “the connections among the various people, computers, and projects at the lab.”43 This interest in memory inspired Berners-Lee to build Enquire, his first hypertext program and a forerunner of the Web. But it also planted what he called a “larger vision” in his “consciousness”:

  Suppose all the information stored on computers everywhere were linked, I thought. Suppose I could program my computer to create a space in which anything could be linked to anything. All the bits of information in every computer at CERN, and on the planet, would be available to me and to anyone else. There would be a single global information space.44

  In 1984, when Berners-Lee returned to CERN and discovered the Internet, he also returned to his larger vision of a single global information space. By this time, he’d discovered the work of Vannevar Bush and Ted Nelson and become familiar with what he called “the advances” of technology giants like Donald Davies, Paul Baran, Bob Kahn, and Vint Cerf.

  “I happened to come along with time, and the right interest and inclination, after hypertext and the Internet had come of age,” Berners-Lee modestly acknowledged. “The task left to me was to marry them together.”45

  The fruit of that marriage was the World Wide Web, the information management system so integral to the Internet that many people think that the Web actually is the Internet. “If I have seen further it is by standing on the shoulders of giants,” Isaac Newton once said. Berners-Lee not only built upon the achievements of the Internet’s founding fathers but also designed the Web to ride on top of the Internet, creating what the Sussex University economist Mariana Mazzucato calls a “foundational technology.”46

  His program leveraged the Internet’s preexisting packet-switching technology, its TCP/IP protocols, and, above all, its completely decentralized structure and commitment to treating all data equally. The Web’s architecture was made up of three elements: first, a computer language for marking up hypertext files, which he called Hypertext Markup Language (HTML); second, a protocol for traveling between these hypertext files, which he called Hypertext Transfer Protocol (HTTP); third, a special address code assigned to each hypertext file that allowed it to be instantly called up from anywhere on the Web, which he called a Universal Resource Locator (URL).47 By labeling files and by using hypertext as a link between these files, Berners-Lee radically simplified Internet usage. His great achievement was to begin the process of taking the Internet out of the university and into the world.
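
  Those three elements still map directly onto a few lines of code. Here is a minimal, purely illustrative Python sketch—not Berners-Lee’s own software—that uses a URL to name a hypertext file, HTTP to fetch it, and an HTML parser to collect the links it contains, with info.cern.ch, the first Web address, as the example:

```python
# A minimal sketch of the Web's three elements: URL (the address),
# HTTP (the transfer protocol), and HTML (the markup carrying the links).
from urllib.request import urlopen   # standard-library HTTP client
from html.parser import HTMLParser   # standard-library HTML parser

class LinkCollector(HTMLParser):
    """Collect the href targets of <a> tags—the hypertext links."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            self.links += [value for name, value in attrs
                           if name == "href" and value]

url = "http://info.cern.ch/"          # a URL: the address of a hypertext file
with urlopen(url) as response:        # HTTP: the protocol that fetches it
    html = response.read().decode("utf-8", errors="replace")

collector = LinkCollector()           # HTML: the markup that embeds the links
collector.feed(html)
print(collector.links)                # every file this page ties itself to
```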

  Berners-Lee wrote his initial proposal for the Web in March 1989, revising the proposal and building the first Web browser, named WorldWideWeb, in 1990. In January 1991 the Web went public and in November 1991 the first website, an information resource about CERN with the address info.cern.ch, was launched. Even more than email, the Web has been the Internet’s ultimate killer app over the last quarter of a century. With the creation of the Web, concludes John Naughton, the Internet achieved “liftoff.”48 Without Berners-Lee’s brilliantly simple innovation there would be no Google, Amazon, Facebook, or the millions of other websites and online businesses that we use on a daily basis. Without the Web, we wouldn’t all be living in Ericsson’s Networked Society.

  Tim Berners-Lee wrote his initial Web proposal in March 1989 at CERN. Eight months later, a few hundred miles to the northeast of Geneva, the Berlin Wall fell and the Cold War came to an end. Back then, with the dramatic destruction of the Wall in November, it was thought that 1989 would be remembered as a watershed year that marked the end of the Cold War and the victory of free-market liberalism. The Stanford University political scientist Francis Fukuyama, assuming that the great debate between capitalists and socialists over the best way to organize industrial society had finally been settled, described the moment that the Wall came down as the “End of History.”

  But the opposite turned out to be true. Nineteen eighty-nine in fact marked the birth of a new period of history, the Networked Computer Age. The Internet has created new values, new wealth, new debates, new elites, new scarcities, new markets, and above all, a new kind of economy. Well-intentioned technologists like Vannevar Bush, Norbert Wiener, J. C. R. Licklider, Paul Baran, Robert Kahn, and Tim Berners-Lee had little interest in money, but one of the most significant consequences of their creation has been the radical reshaping of economic life. Yes, the Internet may, as one historian suggests, be the “greatest co-operative enterprise in the history of mankind.”49 But distributed technology doesn’t necessarily lead to distributed economics, and the cooperative nature of the technology isn’t reflected in its impact on the economy. No, with the creation of the Web came the creation of a new kind of capitalism. And it has been anything but a cooperative venture.

  CHAPTER TWO

  THE MONEY

  The One Percent Economy

  San Francisco’s venerable Commonwealth Club, standing at the southern end of Battery Street, a few blocks from the Battery social club, rarely sells out of tickets for its speaking events. But in February 2014, the club hosted a controversial eighty-two-year-old multibillionaire speaker who gave a sold-out speech titled “The War on the One Percent,” requiring the presence of three police officers to protect him from a bellicose, standing-room-only crowd.1

  A month earlier, Tom Perkins, the cofounder of the Kleiner Perkins Caufield & Byers (KPCB) venture capital firm and “the man most responsible for creating Silicon Valley,” according to his biographer,2 had written an angry letter of complaint to the Wall Street Journal about what he described as San Francisco’s “Progressive Kristallnacht.” The letter was a defense of Silicon Valley’s technological elite—the venture capitalists, entrepreneurs, programmers, and Internet executives of KPCB-backed local Internet companies like Google, Twitter, and Facebook, identified by Perkins as “the successful one percent.”3 It turned out to be the most commented-upon letter ever published in the Journal, sparking an intense debate about the nature of the new digital economy.

  “From the Occupy movement to the demonization of the rich embedded in virtually every word of our local newspaper, the San Francisco Chronicle, I perceive a rising tide of hatred of the successful one percent. This is a very dangerous drift in our American thinking. Kristallnacht was unthinkable in 1930; is its descendant ‘progressive’ radicalism unthinkable now?” Perkins wrote about the growing popular resentment in the Bay Area toward dominant Internet companies like Google and Facebook.

  Tom Perkins’s February 2014 speech at the Commonwealth Club also made news around the world. While Perkins apologized for his incendiary Kristallnacht analogy, he nonetheless defended the main premise of his Journal letter, telling the audience that “the one percent are not causing inequality—they’re the job creators.”4

  Many of those in the audience disagreed, seeing local Internet companies like Google, Facebook, and Twitter as the cause of rather than the solution to the exorbitant real estate prices and the high levels of poverty and unemployment in the Bay Area. “We’ve never seen anything remotely like this before,” explained the San Francisco cultural historian Gary Kamiya. “Techies used to seem endearing geeks, who made money and cute little products but couldn’t get the girls. Now they’re the lords and masters.”5

  Perkins, a former Hewlett-Packard executive who cofounded KPCB in 1972, made the same argument about the value of the one percent in his 2007 autobiography, Valley Boy. Describing some of his venture capital firm’s greatest triumphs, including the financing of Netscape, Amazon, and Google, he boasted that KPCB investments have created $300 billion in market value, an annual revenue stream of $100 billion, and more than 250,000 jobs.6 It’s a win-win, he wrote in Valley Boy, insisting that the new digital economy is a cooperative venture. It’s resulting in more jobs, more revenue, more wealth, and more general prosperity.

  KPCB’s successful bets on Netscape, Amazon, and Google certainly have been a personal win-win for Perkins. These lucrative investments enabled the self-styled “Valley Boy” to build the Maltese Falcon, a $130 million yacht as long as a football field, made out of the same militarized carbon-fiber material as a B-1 bomber,7 and Dr No, his current “adventure yacht,” which carries his own private submarine to explore the South Pole. They financed the purchase of his Richard Mille watch, which he claims is worth as much as a “6-pack of Rolexes,”8 his 5,500-square-foot apartment on the sixtieth floor of San Francisco’s Millennium Tower with its spectacular views of the Bay, and his multimillion-dollar mansion in exclusive Marin County, just over the Golden Gate Bridge from San Francisco.

  But Perkins was wrong about the broader benefits of the network economy that KPCB played such an important role in creating. A quarter century after Tim Berners-Lee’s invention of the Web, it’s becoming increasingly clear that the Internet economy is anything but a cooperative venture. The structure of this economy is the reverse of the open technological architecture created by the Internet’s pioneers. Instead, it’s a top-down system that concentrates wealth rather than spreading it. Unfortunately, the supposed “new rules” for this new economy aren’t very new. Rather than producing more jobs or prosperity, the Internet is dominated by winner-take-all companies like Amazon and Google that are now monopolizing vast swaths of our information economy.

  But why has this happened? How has a network designed to have neither a heart, a hierarchy, nor a central dot created such a top-down, winner-take-all economy run by a plutocracy of new lords and masters?

  Monetization

  In The Everything Store, his definitive 2013 biography of Amazon founder and CEO Jeff Bezos, Brad Stone recounts a conversation he had with Bezos about the writing of his book. “How do you plan to handle the narrative fallacy?” the Internet entrepreneur asked, leaning forward on his elbows and staring in his bug-eyed way at Stone.9

  There was a nervous silence as Stone looked at Bezos blankly.

  The “narrative fallacy,” Bezos explained to Stone, is the tendency, particularly of authors, “to turn complex realities” into “easily understandable narratives.” As a fan of Nassim Nicholas Taleb’s The Black Swan, a book that introduced the concept, Jeff Bezos believes that the world—like that map on the wall of Ericsson’s Stockholm office—is so random and chaotic that it can’t be easily summarized (except, of course, as being randomly chaotic). The history of Amazon is too complicated and fortuitous to be squeezed into an understandable narrative, Bezos was warning Stone. And he would, no doubt, argue that the history of the Internet, in which he and his Everything Store have played such a central role since Amazon.com went live on July 16, 1995, is equally complex and incomprehensible.

  But Bezos, the founder and CEO of the largest bookstore in the world, is wrong to be so skeptical of easily understandable stories. The narrative fallacy is actually a fallacy. Sometimes a story that appears to be complex is, in reality, quite simple. Sometimes it can be summarized in a sentence. Or even a single word.

  The history of the Internet, which certainly appears to be as random and chaotic as any story ever told, is actually two simple stories. The first story—from World War II through the end of the Cold War in the early nineties—is a narrative of public-spirited technologists and academics like Vannevar Bush, Paul Baran, and Tim Berners-Lee, and of publicly funded institutions like NDRC, ARPA, and NSFNET. This is primarily a story of how the Internet was invented for national security and civic goals. It’s a story of how public money—like the million-dollar ARPA investment in Bob Taylor’s project to link computers—paid to build a global electronic network. And it’s a story of these well-meaning pioneers’ relative indifference and occasional hostility to the lucrative economic opportunities offered by their creation. Berners-Lee, who fought hard to make his World Wide Web technology available to anyone for free, even argued that charging licensing fees for Web browser technology “was an act of treason in the academic community and the Internet community.”10 Indeed, up until 1991, Internet commerce was an “oxymoron,” since the US government maintained legal control of the Internet and required companies that sought access to NSFNET, the Net’s backbone, to sign an “Acceptable Use Policy” that limited such use to “research and education.”11

 
