Aside from push-button-easy uploading, the true brilliance of YouTube was the site’s second important focus: dead-simple sharing. After you posted a video to YouTube, you could simply share a link to your uploaded video, just like with Flickr. But you could just as easily cut and paste a few lines of code and your video would play, embedded, wherever you wanted it to: on your website, your blog, or your Myspace page. You didn’t ever have to send people to YouTube if you didn’t want to. Suddenly, videos were popping up all around the web at a time when web video was still a relatively rare phenomenon. Every time someone embedded a video on a random website, there was that little YouTube logo at the bottom that encouraged people to visit YouTube and try posting videos themselves.
YouTube was incredibly popular on Myspace, but it was the combination of Myspace and the blogs that really caused YouTube to take off. It was the “share-yourself, share-anything!” ethos of the moment combined with the ubiquitous distribution platform of the web that led to what we now call “virality.” This was proven by the smash online success of the “Lazy Sunday” video. In 2005, Saturday Night Live aired a roughly two-minute musical skit chronicling the antics of a couple of young white dudes in Manhattan hitting up Magnolia Bakery on a Sunday morning and then catching a matinee showing of the recent Chronicles of Narnia movie—all set to hard-core rap stylings. It was goofy and catchy, though it probably registered as little more than a throwaway segment on the show’s first airing. But as fate would have it, shortly after the original broadcast, someone posted a video capture of the skit to YouTube, where it quickly racked up 5 million views.21 NBC’s lawyers had it taken down in a matter of days, but not before word of mouth around the video increased YouTube traffic by 83%.
After the early months of indifferent traffic, YouTube’s audience exploded faster than that of any previous website in history (including Google, Myspace and Facebook). By the beginning of 2006, the site was serving 3 million video views a day. Six months later, that number had grown to 100 million views a day. Like most good Web 2.0 companies, YouTube achieved this success on a shockingly small amount of money. The company only ever raised $11.5 million, in two investment rounds. The fact that YouTube could serve video to the world from just a handful of servers (and some helpful content delivery networks in the background) was a powerful testament to the infrastructure the dot-com bubble had bequeathed to this new generation of startups.
Today we’re used to popular “memes” bouncing around the world in an instant and have come to expect that social media can make superstars of teenagers from Canada (I’m thinking specifically of Justin Bieber, of course, who would be discovered thanks to videos his mother posted to YouTube). YouTube was ground zero for things like that, for the birth of modern meme culture as well as the social media–celebrity ecosystem. The idea that random events or random people could “go viral” really entered the mainstream thanks to YouTube. “We are providing a stage where everyone can participate and everyone can be seen,” Hurley told the Associated Press in April of 2006.22 There was no greater Web 2.0 manifesto than that.
But the “Lazy Sunday” phenomenon also pointed to one looming issue that concerned a lot of people about YouTube: there was a ton of copyrighted material illegally uploaded to the site. Sure, there were user-created home movies by the barrelful; but just as common were copies of last night’s episode of Survivor or even clips from first-run movies still in theaters. In short, there was plenty of piracy going on. Just as with Napster, users came to expect that they could watch anything and everything on YouTube—from the latest Justin Timberlake video to obscure Japanese films from the 1960s.
But that was the issue: how was YouTube anything but Napster 2.0, with all the inevitable liability headaches that would imply? That was why people were obsessed with the who-will-buy-YouTube guessing game in 2006. Even though YouTube was exploding in popularity, it wasn’t making any money, and in the postbubble era, an IPO was out of the question without meaningful revenue on the bottom line. So, unless YouTube was able to sell out to a deep-pocketed patron before the lawsuits started flying, it ran the very real risk of being pushed into an early grave.
As would come out in subsequent litigation, the YouTube guys knew perfectly well that there was a ton of pirated material on their site. But they had learned the lessons of Napster. Napster had attempted to make the argument that it enjoyed legal immunity under the Digital Millennium Copyright Act as a neutral platform. Service providers and platforms were protected as “safe harbors” under the law, provided they quickly and efficiently remove copyrighted material when notified. That was what had ultimately doomed Napster: it had never been able to take down 100% of the pirated files on its service. Five years on from Napster, might YouTube be able to find someone who could create a better system to remove illegally uploaded material—someone who had a mastery of algorithms, perhaps?
On October 9, 2006, Google announced that it was purchasing YouTube for $1.65 billion in stock. For the YouTube guys, selling to Google was logical: for all of YouTube’s frugality, the cost of serving hundreds of millions of videos would eventually become prohibitive. Bandwidth might have been cheaper by then, but who could hope to manage data at the scale YouTube was achieving? Google was a perfect fit because its enormous infrastructure gave YouTube a chance to handle that scale.
But Google’s decision to take on YouTube’s burden seemed downright crazy to a lot of people. Wasn’t Google paying a lot of money to basically assume a huge liability risk? It turned out that Google made one simple calculation when it purchased YouTube: in the broadband era, video was likely to become as ubiquitous on the web as text and pictures had always been. YouTube was already, in essence, the world’s largest search engine for video. In fact, it would eventually become the second-most-used search engine, period. With its stated mission to organize all the world’s information, Google simply couldn’t let video search fall outside its purview.
Google was able to come up with sophisticated automated systems that quickly and efficiently took down copyrighted videos when rights holders flagged them. Lawsuits from aggrieved rights holders did eventually come, most notably a billion-dollar lawsuit from Viacom. But because Google could prove that it was effective in policing content, in 2010 the judge in the Viacom case ruled in Google/YouTube’s favor, finding that Google’s takedown system was efficient enough to comply with the Digital Millennium Copyright Act.
Google was the savior Napster never had. It had the infrastructure to allow YouTube to scale up; it had the technical sophistication to keep YouTube on the right side of the law; it had the money to contest the legal battles; and—most important—it provided YouTube with the business model that would allow it to thrive. Those little text ads that Google had put all over the Internet? They could monetize the videos on YouTube just as they monetized every other type of content. As the years went by, the text ads could even morph into actual video ads—but algorithmically targeted and effective ads, as Google’s ads always were.
And this was the last way in which YouTube’s timing was impeccable. The movie and television studios had watched the Napster debacle with dread. They knew their industries were next in line for disruption from the Internet. When that disruption arrived, in the form of YouTube, Hollywood was at least willing to weigh its options this time. Going scorched earth against Napster had not saved the music industry. And so, once Google came to the table with a willingness to share advertising revenue with rights holders, a lot of them (Viacom notwithstanding) were willing to play ball. At least Google/YouTube was offering Hollywood some kind of revenue stream. Digital revenue might not be as lucrative as the old analog revenue streams but, well, that was the Napster lesson, right? Better to take what you could get and embrace new distribution models rather than fight them. The entertainment industry was even now willing to buy into one of the key arguments Napster had tried to make only half a decade before: giving users a taste of your content online was actually great promotion! The phenomenon of “Lazy Sunday” had shown that. By 2008, when YouTube was streaming 4.3 billion videos per month (in the United States alone), many people—young people especially—were beginning to watch more video online than they were watching on traditional TV.23 For the first time, Hollywood stopped fighting disruption and followed the changing tastes of its audience into a digital future.
■
WEB 2.0 WAS ABOUT PEOPLE expressing themselves—actually being themselves, actually living—online. The last piece of the puzzle was simply to make the threads of all this social activity explicit.
Online chat clients like IRC, through which the Napster hackers had met each other and collaborated, had a technological cousin at AOL. In the days when AOL was still the dominant ISP with more than 20 million users, its internal messaging program allowed you to chat with your friends and family in real time. AOL’s chat had an extra feature called the “Buddy List” that alerted you as to which of your friends were online at the same time you were, so you could hit them up for a quick conversation. The system also allowed you to leave an away message so that your friends could know when they might expect you to be online again.
Instant messaging was originally intended only for internal use by members of AOL’s walled garden. But in 1997, the company did something completely out of character: it released the messenger program online as a stand-alone web client. It was known as AOL Instant Messenger, or AIM, and it allowed people to stay in touch with their AOL friends when they were away from AOL. It proved especially popular among people who were at work, where they couldn’t log on to AOL, and among teenagers, allowing them to keep up with all of their friends, whether those friends were AOL users or not. Soon, AIM had hundreds of millions of users, many times the number of actual AOL subscribers even at AOL’s height. Even as AOL the company began to crumble after the disastrous merger with Time Warner, AIM continued as a breakout success for one simple reason: it was a literal social graph, a tangible map of your online connections and relationships. Chatting on AIM became more popular than email, and your AIM screen name eventually gave you the ability to customize a rudimentary profile, turning it into a valuable online marker of identity. These features, combined with the away messages and status updates, came to reflect a user’s daily circumstances. Add to this the emoticons and buddy icons that allowed AIM users to project their mood, and AIM became a fully functional and real-time representation of the digital self. There was even an abortive project to create “Aimster,” which would add the ability to search a friend’s hard drive and trade files (AOL management, of course, killed that before it could see the light of day).
And that was the problem, of course. AOL had no idea what it was sitting on. AIM was a fully fleshed-out social network. True, it was free to use; but it was making a limited amount of money thanks to traditional banner ads. Had anyone at AOL been able to predict the future, AIM could have been the perfect platform to transition AOL users into the post–dial-up world. Before we were all sending SMS texts, before we all reconnected on Facebook, a great many of us were connected on AIM. The social graph was actually the great prize of Web 2.0. Others were only able to seize this prize because AOL dropped the ball. AIM eventually lost its relevance through benign neglect. “If AOL had 20/20 hindsight, maybe the story [of social networking] would have had a different ending,” says Barry Appelman, one of the AOL engineers who invented AIM.24
■
SOCIAL NETWORKING MIGHT SEEM like a dead-obvious concept in retrospect, but that’s only because we’ve gone through the looking glass into a modern world where the boundaries between our online lives and “real life” have been broken down almost completely. The roots of social networking go all the way back to the early web. The earliest dating sites like Match.com and the message boards on sites like iVillage allowed users to create an online “profile,” a representation of their real-world selves. And sites like GeoCities and Angelfire allowed users to construct personal webpages so intricate as to serve as virtual avatars in cyberspace.
The first modern social-networking site as we would recognize it today was SixDegrees.com. In 1996, a former lawyer and Wall Street analyst named Andrew Weinreich had an idea inspired by the popular notion that any single person on the planet can be connected to anyone else by around six steps of personal connections—“six degrees” of separation. If that was true, then the web was the perfect tool for mapping those connections.
Launched in early 1997, SixDegrees took off in about a month, in the usual viral way we’re now familiar with: users sent their friends invitations to link up on the site. At its peak, the site had 3.5 million members, and in 1999, Weinreich wisely sold the company for $125 million to another Internet startup.25
At the time, many viewed SixDegrees as a newfangled Rolodex at best, a creepy dating site at worst. But Weinreich had been convinced there was something more powerful to the idea of networking online. “We envisioned Six Degrees being something of an OS—of an operating system—and we thought about it in the context of when you’re buying a watch at eBay you should be able to filter the watches based on people’s proximity to you,” Weinreich said. “You should be able to filter movie reviews in the future by who’s reviewing them.”26 It was the right idea, but as Weinreich would ruefully admit, “We were early. Timing is everything.”27 The site was expensive to operate in the dot-com days, and of course, there were no photos on the profiles. “We had board meetings where we would discuss how to get people to send in their pictures and scan them in,” Weinreich says.28 After the dot-com crash, the site was shuttered.
In 2002, a former Netscape employee named Jonathan Abrams launched a site called Friendster. Abrams wanted to rekindle SixDegrees’ original notion of real identities and real personal connections. Within a few months, the site had 3 million users from word-of-mouth marketing alone.29 The media seized upon Friendster as a more sophisticated version of online dating, and certainly, the digital profile pictures that could now easily be uploaded to your Friendster profile helped shape this impression of the site. Once connected to someone else, you could browse their friends to see who among them was attractive (and single), and then the idea was that your friend would put in a good word for you. But this was just as the notion of the Web 2.0 renaissance was taking hold in Silicon Valley, so, dating site or no, Friendster was able to raise $12 million from blue-chip VCs including Kleiner Perkins and Benchmark Capital. In 2003, Google offered to buy Friendster for $30 million in pre-IPO Google stock, but the venture capitalists encouraged Abrams to spurn the offer and instead shoot for the moon.
Friendster ended up missing the moon by some distance. It turned out that hosting blogs or even serving portal pages to millions of users was one thing, but scaling a social network to millions of users was another thing entirely. On a social network, the content was ever-changing, and what was served to each user was often unique to that user and to that particular moment in time. Friendster had to dynamically propagate each new update, each new post—and each new picture. The engineering challenge of delivering what was quickly becoming a deluge of content was at a whole new scale, and Friendster simply wasn’t up to the task.
“When it grew as fast as it did, we absolutely weren’t prepared for it,” Abrams said later. “Throughout 2004, 2005, Friendster barely worked. The site was really slow; it was buggy. That, unsurprisingly, caused an exodus of users to leave.”30 When Friendster users grew frustrated waiting thirty or more seconds for pages to load, they had a throng of Friendster copycat sites to turn to instead. Like any good idea, the rebirth of social networks inspired dozens of people to try their hand at the concept. Many of the Friendster copycats tried to create social networks that targeted specific niches: college students, high school students, even, in the case of Dogster.com, pet owners.
One of the copycat sites that rushed in to tempt away disillusioned Friendster users was called Myspace. Myspace was owned by eUniverse, a dot-com survivor that made a lot of money peddling wrinkle cream (“Better than Botox”) via online ads that purported to offer the cream for free despite expensive built-in automatic refills, and that made advertising claims the FDA asserted “were not supported by reliable scientific evidence.” An eUniverse employee named Tom Anderson became obsessed with Friendster and convinced his boss, Chris DeWolfe, that creating a Friendster clone might be a cheap and easy way to amass more people for eUniverse’s marketing lists. On August 15, 2003, Myspace was launched as a nearly feature-for-feature clone of Friendster. Users had a profile page where they could post pictures, share their interests and hobbies, and link to the profiles of their friends and family. But Myspace also added a kitchen sink’s worth of features, such as blogs, horoscopes, games and more.
One of the things that was driving users away from Friendster (aside from the slow performance) was the fact that Abrams had insisted on a strict fidelity to identity. Anytime users created a Friendster account under a pseudonym, started a parody account, or pretended to be a celebrity, Friendster would delete it. Myspace had no such regulations. If you wanted to sign up as Leonardo DiCaprio or Bugs Bunny, Myspace let you do it. Furthermore, you could follow anyone you wanted, whether you truly knew them or not. Myspace was the first to hit on a key concept in social networking: linking to others could be a way of mapping your personal connections, but it could also highlight your personal tastes. Friending, or “following” another profile, could be a powerful vote of interest and engagement. When this was combined with the ability to host MP3 files on your profile, Myspace became a potent venue for promotion, especially among musicians. Now that Napster was gone, an entire generation of unknown musical acts ranging from Fall Out Boy and My Chemical Romance to Arctic Monkeys would rise to prominence by engaging with their thousands of fans, promoting tour dates and even releasing new songs on their Myspace pages.