
The Attention Merchants


by Tim Wu


  In 2006, Time magazine, struggling to stay hip to it all, named “YOU” as its person of the year. “Yes you. You control the information age. Welcome to your world.”12 The journalist Jon Pareles wrote that “ ‘user-generated content’ [is] the paramount cultural buzz phrase of 2006….I prefer something a little more old-fashioned: self-expression. Terminology aside, this will be remembered as the year that the old-line media mogul, the online media titan and millions of individual Web users agreed: It demands attention.”13

  No one quite knew how much or what it all meant, but user-generated content had clearly seized its beachhead. The erasure of barriers to entry in markets for speech had, as predicted, released an outpouring. The quality was perhaps uneven (one critic called it “the cult of the amateur”),14 but that wasn’t the point—it attracted millions, perhaps billions, of hours of attention anyhow, attention no one had the chance to resell.

  Let us now take account of what was happening economically. The attention merchants had developed a business model based on directing the public mind toward commercial, well-packaged media products on television. But as the web grew in popularity, people started to pay more attention to one another instead, with no money changing hands. Bloggers did not, at first anyway, advertise, just as friends in the course of conversation do not usually plan to resell the attention they’ve gained by shilling for a product. Not that commerce had ground to a halt: everyone was using Google to find things they needed, and perhaps a few they didn’t. The ones suffering for this happy state of affairs were those industries that had spent the past century devising how best to get people to look at them and listen, to enjoy their diversion and tolerate a word from their sponsors.

  In that way the early web was exactly like the 1960s counterculture: it both encouraged a Great Refusal of what had always been handed down from on high and asked people to spend more time with each other. It asserted that money need not be involved in attentional barter, and that everyone had an inherent potential to be a creator. In the early days at some companies, like Google, the link to the counterculture was more explicit, with much of the company retreating to the Burning Man festival every year and management espousing a practical, pragmatic implementation of the counterculture’s ideas. Perhaps that’s why in the early 1990s, Timothy Leary advised people to “turn on, boot up, jack in”; he even wrote a computer game.15

  As in the 1960s, this great turning away was the cause of no little consternation, if not degrees of panic, in the old attention industries. As the columnist Dave Barry put it, “We can no longer compel people to pay attention. We used to be able to say, there’s this really important story in Poland. You should read this. Now people say, I just look up what I’m interested in on the Internet.”16 Also as before, the change was so strong and apparent that seriously questioning it became mainly the province of cynics, naysayers, and Luddites. The reasonable pundit’s challenge was to capture in adequately epic terms what was happening. Yochai Benkler explained that “the removal of the physical constraints on effective information production has made human creativity and the economics of information itself the core structuring facts in the new networked information economy.”17 Clay Shirky, for his part, compared the “radical spread of expressive capabilities” to “the one that gave birth to the modern world: the spread of the printing press five centuries ago.”18 But few could compete with Jeff Jarvis’s penchant for proclamation. “We are undergoing a millennial transformation from the industrial, mass economy to what comes next” is how he once put it. “Disruption and destruction are inevitable.”19

  Lawrence Lessig, the darkest of the bunch, would turn out to be the one asking the most pertinent question: how long can this last? For even in the golden age of the user-driven web, there was reason to wonder whether the noncommercial model could persist. Bloggers and other creators of content were not Renaissance aristocrats; they faced the material constraints of most individuals or small-scale enterprises—most still needed to make a living, and as things progressed and improved, and expectations rose, it took ever more work to keep a blog up to date and engaging. A few would make a decent living, some through advertising, others through acquisition by newspapers. But for most, the effort would remain a hobby, and a time-consuming one at that. Burnout and attrition were perhaps inevitable.

  As so often before, and as in the 1960s, the triumphalism would prove premature. Far from being unstoppable, both the blogosphere and the amateur were in fact quite vulnerable, but not, as Lessig predicted, to the established powers, devouring their upstart progeny to prevent the inevitable future, as the Titan Kronos did. Instead, the commercial forces that would overgrow this paradise came from the web itself. Indeed, we can now see that there was nothing about the web’s code that would keep it open, free, and noncommercial, as its architects intended. Where attention is paid, the attention merchant lurks patiently to reap his due. Et in Arcadia ego. The fall of the web to this force was virtually preordained.

  In retrospect, the first wave of bloggers and their fellow travelers can be likened to the first visitors to some desert island, who erect crude, charming hostels, serve whatever customers come their way, and marvel at the paradise they’ve discovered. As in nature, so, too, on the web: the tourist traps high and low are soon to follow; commercial exploitation is on its way. Such, unfortunately, is the nature of things.

  * * *

  * The micro-fragmentation represented by blogging audiences caused panic among some thinkers like Noam Chomsky and Cass Sunstein. Chomsky argued that blogs lacked the power to constrain powerful actors. “There’s plenty to criticize about the mass media, but they are the source of regular information about a wide range of topics. You can’t duplicate that on blogs.” Natasha Lennard, “Noam Chomsky, the Salon Interview: Governments Are Power Systems, Trying to Sustain Power,” Salon, December 29, 2013. Sunstein, at the height of blogging’s popularity, wrote a rare academic attack on the choices made possible by technologies like cable or the Internet. He argued that blogs and other technologies were dividing the country into informational factions who pay attention only to what they care to hear. “In a Democracy,” wrote Sunstein, “people do not live in echo chambers or information cocoons. They see and hear a wide range of topics and ideas.” This vision of democracy, says Sunstein, “raise[s] serious doubts about certain uses of new technologies, above all the Internet, about the astonishing growth in the power to choose—to screen in and to screen out.” Cass Sunstein, Republic.com 2.0 (Princeton: Princeton University Press, 2007). Both he and Chomsky preferred an environment where the nation regularly tuned in, together, to something like NBC or CBS or perhaps a public broadcaster.

  CHAPTER 22

  THE RISE OF CLICKBAIT

  Back in 2001, at MIT’s Media Laboratory in Cambridge, Massachusetts, a former schoolteacher named Jonah Peretti was sitting at his desk and, like so many graduate students, not doing his work. Peretti was meant to be plugging away at his master’s thesis; instead he was playing around with what was already the greatest procrastination aid ever devised, the World Wide Web.

  Born in California to a Jewish mother and Italian American father, Peretti was a pretty ordinary and sober-looking young guy. But there was always a slight smile on his lips, a clue to the inveterate prankster beneath the facade of the typical grad student. In fact, he was fascinated by the line between the serious and the absurd, which in his mind also often delineated art from commerce, if not always respectively. Most of his ventures, even those that would prove important, seem to have been conceived as a kind of inside joke, and a test of the limits of possibility.

  While goofing off—“surfing the web” in the vernacular of the time—Peretti went to Nike’s website and noticed a feature allowing customers to order shoes personalized with any word they might like. On a whim, he placed an order for Nike Zoom XC USA running shoes with the following embroidered on them:

  SWEATSHOP

  He thought nothing more of it until the next day, when he received the following email:

  From: Personalize, NIKE iD

  To: Jonah H. Peretti

  Subject: RE: Your NIKE iD order o16468000

  Your NIKE iD order was cancelled for one or more of the following reasons.

  1) Your Personal iD contains another party’s trademark or other intellectual property.

  2) Your Personal iD contains the name of an athlete or team we do not have the legal right to use.

  3) Your Personal iD was left blank. Did you not want any personalization?

  4) Your Personal iD contains profanity or inappropriate slang. If you wish to reorder your NIKE iD product with a new personalization please visit us again at www.nike.com

  Thank you, NIKE iD

  Seeing comic potential, Peretti wrote back asking just which rule he had broken. A Nike customer service representative replied: “Your NIKE iD order was cancelled because the iD you have chosen contains, as stated in the previous e-mail correspondence, ‘inappropriate slang.’ ”

  Peretti, just warming up, wrote back:

  Dear NIKE iD,

  Thank you for your quick response to my inquiry about my custom ZOOM XC USA running shoes. Although I commend you for your prompt customer service, I disagree with the claim that my personal iD was inappropriate slang. After consulting Webster’s Dictionary, I discovered that “sweatshop” is in fact part of standard English, and not slang. The word means: “a shop or factory in which workers are employed for long hours at low wages and under unhealthy conditions” and its origin dates from 1892. So my personal iD does meet the criteria detailed in your first email. Your web site advertises that the NIKE iD program is “about freedom to choose and freedom to express who you are.” I share Nike’s love of freedom and personal expression. The site also says that “If you want it done right…build it yourself.” I was thrilled to be able to build my own shoes, and my personal iD was offered as a small token of appreciation for the sweatshop workers poised to help me realize my vision. I hope that you will value my freedom of expression and reconsider your decision to reject my order.

  Thank you,

  Jonah Peretti

  In response, Nike simply canceled the order. Peretti wrote one last email:

  From: Jonah H. Peretti

  To: Personalize, NIKE iD

  Subject: RE: Your NIKE iD order o16468000

  Dear NIKE iD,

  Thank you for the time and energy you have spent on my request. I have decided to order the shoes with a different iD, but I would like to make one small request. Could you please send me a color snapshot of the ten-year-old Vietnamese girl who makes my shoes? Thanks, Jonah Peretti

  [no response]1

  Amused, Peretti sent a copy of the email chain to twelve or so friends, including one who posted it on his personal website. Within a week, Peretti’s exchange had been shared far and wide: first by thousands, and within a few weeks, by millions. Along the way, it was picked up by mainstream media outlets around the world. To use a phrase that did not exist in 2001, the email “went viral.”

  “So then I found myself on the Today show talking with Katie Couric about sweatshop labor,” says Peretti. “It was, like, what do I know about sweatshops?” Here was that mind-warping rush of unexpected renown, of reaching an audience way beyond your wildest expectations. Peretti would later remember it simply: “Something very small became something very big.” Unbeknownst to him at the time, the experience would end up changing his career and his life.

  Popular email chains have been around nearly as long as email itself, but back in 2001 words like “viral,” “Internet meme,” and “clickbait” were as yet unknown. What Peretti naively experienced was an early version of what would become a pervasive means of harvesting attention in the early twenty-first century. Peretti, who has a curious, scientific mind, started to consider the phenomenon carefully and systematically. “I looked at stories like the Nike shoe story, and there were actually plenty of other ones. Someone did something, it went big for a while, but then that’s where the story ends.” Having done it once, by accident, he wanted to see if he could make it happen at will. He wanted to see if he could understand what it took to make something “go viral.” For he realized that his weird experience had a deeper significance; it marked a change, made possible by the Internet, in how attention might be captured, and from whom.2

  A few months later, Peretti left MIT to take a job at the Eyebeam art and technology center, a giant space in New York’s West Chelsea neighborhood; from the outside it looked like many of the art galleries that surrounded it. Here, in his “contagious media laboratory,” he tried to figure out if he could make lightning in a bottle.

  Peretti started throwing stuff against the web to see if anything might stick. It turned out he was not alone; there were others who shared his fascination with creating zany stuff that might, almost magically, erupt across the Internet. He got to know a man who called himself “Ze Frank,” a sort of self-styled web-jester. Ze Frank’s own road to Damascus had involved a web-based birthday invitation that featured him performing funny dance moves; it had earned millions of hits and won him a Webby Award. Peretti also met Cory Arcangel, a conceptual artist who, among other things, had made art that required him to hack Nintendo’s Super Mario Brothers. Then there was a social scientist, Duncan Watts, who tried to understand media contagion with mathematical models. In these guys, Peretti found a posse.

  His sister, Chelsea Peretti, also got in on the act. Together, they launched www.blackpeopleloveus.com, a fake website featuring a white couple inordinately proud of having made black friends; it attracted 600,000 hits. There was also the “Rejection Line,” a service for those who found it too inconvenient to reject people themselves. As the site said:

  Someone won’t leave you alone?

  Give them “your” number: 212-479-7990

  The official New York Rejection Line!

  (operators are standing by!)

  It was as if stories from The Onion had actually been put into production.3

  —

  Peretti and his pals definitely had some fun at Eyebeam. They held how-to workshops for the public with titles like “The Mass Hoax.” In 2005 they hosted something called the “Contagious Media Showdown,” giving contestants three weeks to get as much traffic as they could. Entries included “hire-a-killer.com,” “Crying While Eating,” “email god,” and “change your race”; the winner (possibly as a result of cheating) was “Forget-me-not-panties.com,” a prank site purporting to sell women’s underwear that broadcasts the wearer’s location to possessive fathers and husbands. “Unlike the cumbersome and uncomfortable chastity belts of the past, these panties are 100% cotton, and use cutting-edge technology to help you protect what matters most.” The site suckered both bloggers and mainstream press, remaining operational for quite some time, albeit with a notice that stock was currently sold out.4

  Peretti may not have been able to create anything while at Eyebeam on the scale of his Nike experience, but he would author a twenty-three-point manifesto that he called “Notes on Contagious Media,” expounding just what distinguished that variety from others. Some of it was obvious: “Contagious media is the kind of media you immediately want to share with all your friends. This requires that you take pleasure in consuming the media but also pleasure in the social process of passing it on.” Some more theoretical: “Contagious media is a form of pop conceptual art” in which “the idea is the machine that makes the art (LeWitt, 1967) and the idea is interesting to ordinary people.” For that reason, “a contagious media project should represent the simplest form of an idea. Fancy design or extraneous content made media less contagious. Anything inessential constituted a ‘payload’ that the contagion must drag along as it spreads. The bigger the payload, the more slowly the entire project spreads.” Peretti had more or less made himself the world’s expert on contagious media, but the recognition of peers was not enough; the measure of his success would be the ability to generate traffic. “For the artist, a work can be celebrated even if the only people who like it are a small group of curators and collectors,” he wrote. “For the contagious media designer, all that matters is how other people see the work. If people don’t share the work with their friends, it is a failure regardless of the opinion of the creator, critics, or other elites.”

  In 2004, Peretti was still puttering around at Eyebeam, teaching and throwing stuff on the web, when he was approached by Ken Lerer, a former communications executive at AOL and a committed political activist. A journalist by training and adept at raising money, Lerer presented what he considered an urgent project. Despite every kind of blunder in office, President George W. Bush seemed likely to be reelected. This was incomprehensible to Lerer and other Democrats, who considered Bush an obvious incompetent; in their view, the Internet was in part to blame. The right-wing blogs—above all, the Drudge Report, the most widely read aggregator of news links—just captured more attention than all the left-wing ones. “You know the Internet, let’s build something,” Lerer cajoled Peretti, who would later explain, “I’m the son of a public defender and a public school teacher” and “it seemed pretty important.”5

  “Something” was at first quite vague. As it evolved, the idea seemed to be this: leverage the left’s superior hold on Hollywood celebrities, as well as Peretti’s knack for driving traffic and Lerer’s touch for fundraising, to create a counterweight to conservative online media. The celebrity part, they decided, was best handled by the third partner, and by far the best connected, Arianna Huffington.

  On May 10, 2005, just two days after the contagious media contest, The Huffington Post, an online aggregator of news, blogs, and other content, debuted to widespread mainstream coverage. The first issue featured blog posts by Huffington herself and, as planned, various famous people: the renowned historian and Kennedy confidant Arthur M. Schlesinger Jr.; the actor John Cusack; Larry David, producer of Seinfeld; the husband-and-wife acting pair Julia Louis-Dreyfus (also of Seinfeld) and Brad Hall, jointly posting on the issue of gun violence.

