How the Internet Happened


by Brian McCullough


  From the days of the Netscape browser, users had used bookmarks and “favorites” to keep track of their favorite web pages. But what if you wanted to see what other people had bookmarked? Del.icio.us (launched in September of 2003) let you do just that, allowing users to discover cool new things on the web by sharing their bookmarks with each other, just as Napster had allowed them to exchange songs.

  The new postbubble web was about the users and the content in equal measure. It was about spontaneous impulses like “sharing” and self-organizing schemes like “tagging” and taxonomies. It was about how the content created by and for the hoi polloi often ended up being more engaging and exciting than the content that was prepackaged or professionally produced. And increasingly, the new web was about the collective “wisdom” of the crowd to create and organize the anarchy.

  The idea of collaborative effort and collective organization had long been a common practice in hacker and software development circles. Just as each of the hackers on w00w00 had pitched in to help Shawn Fanning refine Napster, groups of programmers often came together and formed communities around the development of “open source” projects like the Linux operating system. Far from being a case of “too many cooks in the kitchen” creating a muddled fiasco, open-source development proved that complete strangers could independently, and without much centralized coordination, come together to collectively produce things in an orderly, sublime way.

  A veteran software developer named Ward Cunningham brought this practice to the web for the first time on his Portland Pattern Repository, a website for other programmers to contribute and share programming ideas. On March 25, 1995, Cunningham installed a subpage on the site called WikiWikiWeb. The “wiki” (the term came from the Hawaiian word for “quick”) constituted a series of pages that could be edited by any user. So, a given user might post some code patterns to the wiki, and another user might come behind him and add to those patterns, change them, even completely replace them. But all edits were stored, and the page could revert to previous versions if any user chose to do so. It seems counterintuitive that such a system could work, but Cunningham learned that, given enough input from enough interested users, his Wiki system worked quite well. Cunningham is famous for coining “Cunningham’s Law,” which finds that “the best way to get the right answer on the Internet is not to ask a question, it’s to post the wrong answer.”4 If a user contributed code patterns to his site that other users found wrong or merely objectionable, Cunningham found that, almost inevitably, another user would come along and right the wrong.
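  The mechanism Cunningham described is, at bottom, a versioned document: every edit is kept, and any earlier version can be restored. The following is only a minimal illustrative sketch in Python, with invented names; it is not Cunningham's actual software, just a toy model of the keep-every-edit, revert-anytime idea.

```python
class WikiPage:
    """Toy model of a wiki page: every edit is kept, so any user
    can restore an earlier version at any time."""

    def __init__(self, title, text=""):
        self.title = title
        self.revisions = [text]  # full history, oldest first

    def edit(self, new_text):
        """Any user may replace the text; the old version is preserved."""
        self.revisions.append(new_text)

    @property
    def current(self):
        return self.revisions[-1]

    def revert(self, version_index):
        """Restore an earlier revision by appending it as a new edit,
        so the revert itself is also recorded in the history."""
        self.revisions.append(self.revisions[version_index])


# One user posts a pattern, another rewrites it, a third reverts.
page = WikiPage("SingletonPattern", "Use a global variable.")
page.edit("Use a class with a single shared instance.")
page.revert(0)                # back to the first version
print(page.current)           # -> "Use a global variable."
print(len(page.revisions))    # -> 3 revisions kept in total
```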

  Wikis tapped into a powerful impulse of collective action. A few years later, an obscure entrepreneur would make use of this impulse to save his own struggling creation. Jimmy Wales was a serial dot-com entrepreneur who had found a modest degree of success by creating more sophisticated web directories—sites like Yahoo, but more focused. Wales also had a lifelong passion for encyclopedias and was obsessed with the notion that the web could create the largest encyclopedia conceivable. “Imagine a world in which every single person is given free access to the sum of all human knowledge,” Wales would write later.5 In early 2000, he launched what he called his Nupedia project, soliciting experts in a wide range of fields to contribute articles for what he hoped would eventually become an infinite encyclopedia. Contributors to the project were required to be knowledgeable in a given topic, and they would have to submit their articles to a rigid system of peer review by vetted editors. The editors themselves had to be credentialed as well. “We wish editors to be true experts in their fields and (with few exceptions) possess Ph.D.s.,” the Nupedia policy stated.

  But Nupedia’s rigid quality control apparatus proved inefficient. It wasn’t until September 2000 that the first article made it through the layers of editors, and by the end of the year, fewer than two dozen had been published on Nupedia’s website. In frustration, on January 10, 2001, Wales installed a descendant of Cunningham’s original wiki software on Nupedia’s server. This “Wikipedia” was merely intended as a separate feeder service to speed up the Nupedia submissions process. Articles would be collectively written and edited on Wikipedia, then fed into the existing peer-review editing process. Almost immediately, however, Wikipedia overtook Nupedia not just in the quantity of articles that were created, but in the quality as well. The first article created, on January 15, was on the letter “U” and investigated the origins and usage of the twenty-first letter of the English alphabet.6 It was comprehensive, it was well written, and it was—to the surprise of Wales and his team of editors—accurate. The few thousand users who had shown up to test out Wikipedia had, through their collective input and edits, gotten the article polished to near-authoritative quality.

  Within a month, Wikipedia had around 600 articles, achieving in a matter of weeks more than Nupedia had achieved in a year. The experiment was promoted on Slashdot, and soon Wikipedia was flooded with Slashdot’s passionate users, members of a community who were already comfortable with collective editorial action. Within a year, Wikipedia had grown to 20,000 articles. By 2003, the English-language Wikipedia had more than 100,000 articles, and versions of the service were springing up in every language imaginable. By that point, Nupedia and its rigorous system of editors and peer review had long been abandoned.

  What confounded everyone who learned of the success of Wikipedia was that it actually worked! “Couldn’t total idiots put up blatantly false or biased descriptions of things, to advance their ideological agendas?” asked one of the leads of the original Nupedia project on internal Wikipedia message forums. “Yes,” replied a Wikipedia partisan, “and other idiots could delete those changes or edit them into something better.”7 It turned out that the “infinite monkey theorem,” the idea that enough monkeys at typewriters would eventually produce Shakespeare, was not exactly fanciful. Enough self-interested strangers could achieve a fair degree of accuracy on a wide range of topics. In 2006, there were 45,000 active editors of the English-language version of Wikipedia alone.8

  And Wikipedia had unique advantages that the web made possible. In the coming years, as any news or historical event occurred, Wikipedia contributors would post an up-to-the-minute factual summation of these events, and then amend the entries to reflect changing circumstances or new information. Wikipedia was often accurate and authoritative in near-real time, and it had the infinite space and resources of the Internet to play with, so it could serve what became known as the “long-tail” of content. Any encyclopedia worth its salt might have an article on World War II. But Wikipedia could produce a 418-word entry on, say, the Compton railway station, an abandoned stop on the Didcot, Newbury & Southampton Railway in England. Or, it could produce a detailed plot and development synopsis on Season 8, episode 14 of the TV show Cheers, the one where Cliff Clavin goes on Jeopardy. No other encyclopedia in history was capable of that sort of breadth of topics.

  Wikipedia was a modern miracle and soon became one of the most trafficked websites in the world. Wales had originally intended the project to be a commercial one, supported by advertising. But when the contributors and editors revolted at the very suggestion of putting ads up on Wikipedia, Wales instead made the site into a nonprofit enterprise. To this day, it is supported by contributions from the public and is thereby an open-source counterweight to the proprietary “answer engine” that is Google.

  ■

  GRADUALLY, PEOPLE BEGAN to notice that there was a new energy on the web and it shared several characteristics. The long tail. The wisdom of crowds. Users creating content of and to their own design. In 2004, this new Internet energy gained the name Web 2.0, after a similarly named conference held in October of that year. If Web 1.0 was about browsing stuff created by others, Web 2.0 was about creating stuff yourself. If Web 1.0 was about connecting all the computers in the world together, then Web 2.0 was about connecting all the people in the world together, via those interlaced computers. If the clarion call of Web 1.0 was the Netscape IPO, then the coming of age of Web 2.0 was Google’s IPO. “Web 2.0 means using the web the way it’s meant to be used,” wrote Paul Graham, a veteran entrepreneur of the Web 1.0 era who would soon become a key driver of Web 2.0 as an investor. “The ‘trends’ we’re seeing now are simply the inherent nature of the web emerging from under the broken models that got imposed on it during the Bubble.”9

  Within the technology industry itself, the sense that the Internet revolution was back in gear came via the promotional efforts of—what else?—a blog. On June 10, 2005, Michael Arrington, a thirty-five-year-old former Silicon Valley lawyer who was active during the dot-com years, started posting to a personal blog at TechCrunch.com. Arrington’s entries were mostly musings about the new services, websites and companies he saw bubbling up through the Web 2.0 scene. But he soon branched out to covering the actual news of Web 2.0: what new companies were being founded and by whom; what startups were raising an investment round and with whom; what hot new websites had been acquired, and by whom. TechCrunch became not only the cheerleader of the Web 2.0 movement, but, in a sense, proof that the movement even existed. Arrington simultaneously became a power player in his own right, since his site became a PR bonanza for whatever new service or company he deigned to blog about. As Wired magazine put it, “A positive 400-word write-up on TechCrunch usually means a sudden bump in traffic and a major uptick in credibility among potential investors.” When TechCrunch gave a glowing write-up to a startup named Scribd, as Wired reported, “CEO and cofounder Trip Adler says he had 10 calls from venture capitalists within 48 hours.”10

  Indeed, the startup scene was back in full swing, in no small part thanks to TechCrunch and the hype around Web 2.0. Usage of the Internet had never dipped and indeed was finally reaching critical mass in the developed world. In 2003 alone, the percentage of Americans with broadband Internet connections in their homes increased from 15% to 25%.11 A new technology called WiFi arrived on the scene to make the notion of surfing the web something that felt ubiquitous and commodified. Even online advertising was coming back, providing that same old business model (but with different tools and greater numbers) to new online efforts. Between 2002 and 2006, U.S. advertisers increased their online ad spend from $6 billion to $16.9 billion.12

  The venture capital machine started to lurch back into life to fund this new activity. VC investments in U.S. startups bottomed out at $19.7 billion in 2003, a far cry from the dot-com–era peak of more than $100 billion in the year 2000.13 In the coming years, VC investment would rise—modestly but steadily—reaching $29.4 billion in 2007.14 A slew of new companies were funded, but the renewed interest in Internet startups was not a replay of the late-nineties frenzy. Both investors and entrepreneurs had been chastened by the bubble’s aftermath. Get Big Fast was no longer the strategic mantra; multimillion-dollar advertising campaigns and gaudy launch parties were out. Instead, Web 2.0 companies aimed at refining their products and services, carefully cultivating a user base through feature innovation and word-of-mouth discovery, all while focusing like a laser on issues such as reliability and scalability.

  VC investment didn’t roar back in huge numbers because it didn’t have to. In the Web 2.0 era, you could create a service used by millions in a matter of months, and you could do so for pennies on the dollar—at least, compared to the dot-com era. The hangover from the bubble fallout meant that talented programmers could be hired on the cheap; the infrastructure glut left over from the global fiber buildout meant that bandwidth, storage and data costs were lower; and the tools developed during the bubble meant that you didn’t have to build a company from scratch anymore—you could cobble one together using free and open-source tools to assemble the building blocks of a minimum viable product for next to nothing. By some estimates, the cost of starting a web company had fallen by 90% in the few short years of the nuclear winter.15

  The website Digg was perhaps the poster-child company of the Web 2.0 era, and it illustrates this change in startup economics perfectly. In 2004, twenty-seven-year-old Kevin Rose had an idea for a new website that would help plugged-in geeks like himself discover the news of the day: whatever was hot on the blogs or even mainstream sites like the New York Times. His vision was of a site that took the community-voting aspects of Slashdot, but gave the power to surface news to anybody. On Digg.com, any user could submit a story and other users could “digg” it. If enough users dugg it, the story would rise to the front page. Conversely, if users didn’t like a story, they could vote to “bury” it. Rose registered the Digg.com domain name (that was the biggest expense, actually; he had to buy the domain from an existing owner), paid a programmer in Canada $12 an hour to code up the site, and paid $90 a month to have a company host it. The site launched on December 5, 2004.16 Rose’s total outlay was around $10,000.
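  The submit/digg/bury mechanic is simple enough to model in a few lines. The sketch below is purely illustrative, with assumed names and a made-up promotion threshold; Digg's real ranking logic was more elaborate and is not described in the source.

```python
from dataclasses import dataclass


@dataclass
class Story:
    """Toy model of a submitted story: users 'digg' it up or 'bury' it."""
    title: str
    url: str
    diggs: int = 0
    buries: int = 0

    @property
    def score(self):
        return self.diggs - self.buries


def front_page(stories, threshold=50, limit=15):
    """Hypothetical rule: stories whose net diggs clear a threshold
    are promoted to the front page, highest score first."""
    promoted = [s for s in stories if s.score >= threshold]
    return sorted(promoted, key=lambda s: s.score, reverse=True)[:limit]


# Example: one popular story gets promoted, one buried story does not.
stories = [
    Story("New Firefox release", "example.com/firefox", diggs=120, buries=10),
    Story("Obscure rant", "example.com/rant", diggs=3, buries=40),
]
for s in front_page(stories):
    print(s.title, s.score)   # -> New Firefox release 110
```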

  For that investment, Rose soon had the hottest site on the Internet. Within a year, Digg passed Slashdot in traffic.17 Making it to the front page of Digg could drive scads of traffic to a website, so publishers all around the web began to add “Digg This” buttons to their websites. Within two years, Digg had nearly as much web traffic as the New York Times and more than 1 million people came to the site daily, “digging” thousands of stories.18 Digg was nominally profitable from day one, thanks to AdSense ads from Google, and later, banner ads from more traditional marketing networks. In 2007, Digg landed a $100 million ad deal with Microsoft. By that point, Rose had appeared on the cover of BusinessWeek under the headline “How This Kid Made $60 Million in 18 Months.” That estimation of Rose’s paper wealth came from the valuation given to Digg by venture capitalists. But the truth was, Digg had only raised money reluctantly. As Rose and his cofounder Jay Adelson made the rounds on Sand Hill Road, home to the most powerful Silicon Valley VCs, they were shocked by what they saw as the outdated thinking among the money men. “They are still back in the 1998 belief system that it’s all about the portals,” Adelson marveled.19 The VCs wanted to throw tens of millions of dollars at them in order to build the next Yahoo or AOL. Rose and Adelson were content to raise a paltry $2 million. They didn’t really need the funding, and besides, raising less money meant keeping more equity for themselves.

  The new Web 2.0 companies didn’t need to raise as much money and, unlike just a few years previously, none of them were in any hurry to go public. In the wake of the bubble bursting, a wave of scandals involving companies such as Enron and WorldCom had ushered in a new era of financial regulations. The Sarbanes-Oxley legislation especially meant that there were fewer advantages to going public and more incentives to stay private for as long as possible. Without the venture capitalists breathing down their necks for a financial “exit,” the Web 2.0 companies were more in control of their own destinies and wary of the pressures that a blockbuster IPO would impose upon them. The lesson of the bubble had been learned: you can go for broke, but try to build a real company first.

  That didn’t mean the money men were denied their “exits.” As the survivors of the dot-com bubble began to see their balance sheets return to health, there was an entire group of deep-pocketed acquirers that would begin to pick off the most promising members of the Web 2.0 class. Yahoo swallowed up Flickr and Del.icio.us in 2005, for around $40 million and $20 million, respectively. Scandinavians Niklas Zennström and Janus Friis created the second-generation peer-to-peer networking platform Kazaa before turning to that same P2P technology in order to make phone calls over the web. They founded Skype, enabled hundreds of millions of users worldwide to call and chat with each other for free, and sold the company to eBay for $2.6 billion in September 2005.

  But the acquisition saga everyone followed in those early Web 2.0 days was that of YouTube. Late in 2004, three former PayPal employees, Chad Hurley, Steve Chen and Jawed Karim, were mulling over a problem: why wasn’t it as easy to post a video to the web as it was to post a photo to Flickr or a blog post to a blog? YouTube was the site they launched to solve that problem, and from the very beginning, the overriding idea was for dead-simple, push-button video uploading.

  But what, exactly, should people be encouraged to upload? Should YouTube encourage people to create original, dramatic videos with near–television-production quality? Or maybe YouTube would just host videos for eBay auctions and use the thriving auction economy to jumpstart growth just as PayPal had (they were card-carrying members of the PayPal Mafia, remember). There was even some early discussion about copying HotorNot.com, a popular Web 2.0 site where users uploaded profile pictures, and other users voted the portraits up or down based on attractiveness. “In the end, we just sat back,” said Hurley, meaning they just let the users upload whatever they wanted no matter how silly, or inane, or personal, or whatever.20 It was the Web 2.0 way.

  The first video posted to YouTube exemplified this attitude. Me at the Zoo is a nineteen-second video of Jawed Karim at the San Diego Zoo in front of the elephant exhibit. In the clip, uploaded on April 23, 2005, Karim offers the following pithy narration:

  Alright, so here we are in front of the, uh, elephants. Uh. The cool thing about these guys is that they have really, really, really long, um, trunks, and that’s, that’s cool. And that’s pretty much all there is to say.

  Not exactly “one small step for man” stuff, but credit to the YouTube guys for understanding that this was exactly the sort of video YouTube was good for.

  YouTube was fortunate in its timing. By 2005, broadband Internet adoption had continued to increase, and consumer video cameras were becoming common. Even some cell phones allowed you to shoot video by the time YouTube launched. In August of 2005, YouTube got favorable coverage from TechCrunch as well as Slashdot. The number of videos posted started to increase. And then, the post-anything spirit of blogging that YouTube was mimicking helped traffic ramp up even more. In fact, it was the blogs themselves that really helped YouTube explode in popularity. The blogs—and social networks like Myspace.

 
