
The Attention Merchants


by Tim Wu


  Technology doesn’t follow culture so much as culture follows technology. New forms of expression naturally arise from new media, but so do new sensibilities and new behaviors. All desire, the philosopher and critic René Girard wrote, is essentially mimetic; beyond our elemental needs, we are led to seek after things by the example of others, those whom we may know personally or through their fame. When our desires go beyond the elemental, they enter into their metaphysical dimension, in which, as Girard wrote, “All desire is a desire to be,” to enjoy an image of fulfillment such as we have observed in others. This is the essential problem with the preening self unbound by social media, and the democratization of fame. By presenting us with example upon example, it legitimates self-aggrandizement as an objective for ever more of us. By encouraging anyone to capture the attention of others with the spectacle of one’s self—in some cases, even to the point of earning a living by it—it warps our understanding of our own existence and its relation to others. That this should become the manner of being for us all is surely the definitive dystopic vision of late modernity. But perhaps it was foretold by the metastatic proliferation of the attention merchants’ model throughout our culture.

  In the fall of 2015, an Australian teenager, Essena O’Neill, quit Instagram in utter despair. A natural beauty and part-time model, she had become an Instagram celebrity, thanks to her pictures, which had drawn half a million followers. But her Instagram career, she explained, had made her life a torment.

  “I had the dream life. I had half a million people interested in me on Instagram. I had over a hundred thousand views on most of my videos on YouTube. To a lot of people, I made it,” she confessed in a video. But suddenly it had all become too much.

  Everything I was doing was edited and contrived and to get more views….Everything I did was for views, for likes, for followers….Social media, especially how I used it, isn’t real. It’s contrived images and edited clips ranked against each other. It’s a system based on social approval, likes, validation in views, success in followers. It’s perfectly orchestrated self-absorbed judgement….I met people that are far more successful online than I am, and they are just as miserable and lonely and scared and lost. We all are.6

  A survey of Instagram and other social media users by the London Guardian yielded similar responses, suggesting that even among those with relatively few followers the commitment is grim. “I feel anxiety over how many likes I get after I post a picture. If I get two likes, I feel like, what’s wrong with me?” wrote one woman.7 “I do feel insecure if I see girls who look prettier than me,” wrote another, “or if they post really pretty pictures, and I know I won’t look as good in any that I post. I do feel pressure to look good in the photos I put up. I don’t feel anxious about not getting enough likes on a photo but if it doesn’t get enough likes, I will take it down.”

  In April 2012, a mere eighteen months after its debut, Instagram was purchased by Facebook for $1 billion. The high-flying start-up’s founders had cashed out without ever having devised a business model. No matter: by November of the following year, the first ads would run in Instagram’s feed, following Facebook’s principles of limited targeting. The acquisition would prove astute. In April 2012 Instagram had 30 million users, but by the fall of 2015 it had 400 million, more than Twitter. And so Facebook joined the ranks of the hoary behemoths with war chests, for whom a transfusion of young blood would preserve their status in the uppermost echelon of attention merchants.

  As for Instagram, its upward glide portended a future in which the line between the watcher and the watched, the buyer and the seller, was more blurred than ever. The once highly ordered attention economy had seemingly devolved into a chaotic mutual admiration society, full of enterprising Narcissi, surely an arrangement of affairs without real precedent in human history.

  * * *

  * The pager was a portable device used in the 1980s and 1990s that allowed the bearer to receive notifications that a return call was desired.

  CHAPTER 26

  THE WEB HITS BOTTOM

  “Which Ousted Arab Spring Ruler Are You?” “You Might Be Cleaning Your Penis Wrong,” “37 Things Conservatives Would Rather Do Than Watch Obama’s State of the Union Speech,” “29 Cats Who Failed So Hard They Won.”

  Here was BuzzFeed, at its height in the 2010s, undisputed king of clickbait, and the grandmaster of virality. As a cofounder of The Huffington Post, Jonah Peretti had gained a measure of success, recognition, and personal wealth. But it wouldn’t be long before he lost interest in the operation, which had begun to run itself, and felt compelled to return to his original passion: the pure art and science of harvesting attention with “contagious” or “viral” media. He was still at The Huffington Post when he began to conceive the endpoint, or perhaps the punch line, to his long obsession: a site whose mission would be nothing but to build pure contagion and launch it into the ether.

  BuzzFeed billed itself as the “first true social news organization,” which meant it was designed for a post-Facebook and post-Twitter world, where news gained currency by being shared on social networks, through newsfeeds, Twitter feeds, and the like. It was also designed to be read on the now ubiquitous mobile platforms; by 2015, 60 percent of its traffic was via phones and other wireless devices (including 21 percent from Snapchat)—the key to success was now getting people to share stuff socially from mobile.

  By the time Peretti built BuzzFeed, viral media were no longer an occasional phenomenon but were reaching the public like successive waves crashing on a metaphorical shore; they thus both rivaled and complemented (depending on the context) existing means of capturing attention. It was a time when a random picture of a grumpy-looking cat (Grumpy Cat) posted on the online bulletin board Reddit made a viable career for its owners; when a ridiculous dance video like “Gangnam Style” amassed more than 2.4 billion online views (while the 2014 World Cup, the most watched event in human history, reached about 1 billion).

  As nothing but a pure embodiment of Peretti’s techniques, BuzzFeed did without even the pretense of a public mission, the only goal being to amuse viewers enough to trigger their sharing. With content often nearly devoid of any meaningful communication, the medium truly was the message. And while this might sound like unprecedented cynicism vis-à-vis the audience, the idea was to transfer creative intention to them; they alone would “decide if the project reaches 10 people or 10 million people.”1 To help them decide, BuzzFeed pioneered techniques like “headline optimization,” which was meant to make the piece irresistible and clicking on it virtually involuntary. In the hands of the headline doctors, a video like “Zach Wahls Speaks About Family” became “Two Lesbians Raised a Baby and This Is What They Got”—and earned 18 million views. BuzzFeed’s lead data scientist, Ky Harlin, once crisply explained the paradoxical logic of headlining: “You can usually get somebody to click on something just based on their own curiosity or something like that, but it doesn’t mean that they’re actually going to end up liking the content.”

  BuzzFeed also developed the statistical analysis of sharing, keeping detailed information on various metrics, especially the one they called “viral lift.” Take, for example, a story entitled “48 Pictures That Capture the 1990s,” which garnered over 1.2 million views. BuzzFeed would measure how many people read it (views), and of those, how many went on to share it, whether on Twitter, Facebook, or other sites. If, say, every reader who encountered the story on BuzzFeed itself led, through sharing, to twenty-two more readers clicking on the link, the story would be said to have a viral lift of 22x. Such data would help BuzzFeed’s experts refine their understanding of what gets shared, and what doesn’t.
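  To make the arithmetic concrete, here is a minimal sketch of how such a figure might be computed, assuming that “viral lift” is the ratio of share-driven views to views originating on the site itself; that definition, the field names, and the numbers below are illustrative assumptions, not BuzzFeed’s actual formula or data.

```python
# Hypothetical "viral lift" calculation (a sketch, not BuzzFeed's code).
# Assumption: lift = views arriving via shared links / views arriving
# directly from the site's own pages ("seed" views).
from dataclasses import dataclass

@dataclass
class StoryStats:
    title: str
    seed_views: int    # views from the site's own pages or promotion
    shared_views: int  # views that arrived via links shared elsewhere

    def viral_lift(self) -> float:
        """Share-driven views per seed view; 22.0 reads as '22x'."""
        return self.shared_views / self.seed_views if self.seed_views else 0.0

# Invented numbers, chosen so the totals echo the example in the text:
# roughly 1.2 million views overall and a lift of about 22x.
story = StoryStats(
    title="48 Pictures That Capture the 1990s",
    seed_views=52_000,
    shared_views=1_148_000,
)
print(f"{story.title}: viral lift {story.viral_lift():.0f}x")
```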

  Collectively BuzzFeed and its rivals—Mashable, Upworthy, and in time parts of the mainstream media—began to crack the code; eventually they could consistently make content go viral. Much of what they discovered validated Peretti’s original theories—particularly about the necessity of stimulating “pleasure in the social process of passing” something along and of ensuring that the contagion “represent[s] the simplest form of an idea.”2 But the “pleasure” of sharing did not necessarily mean that viewing the content had been pleasurable. The urge to share was activated by a spectrum of “high-arousal” emotions, like awe, outrage, and anxiety. A story with a headline like “When All Else Fails, Blaming the Patient Often Comes Next,” or “What Red Ink? Wall Street Paid Hefty Bonuses,” or “Rare Treatment Is Reported to Cure AIDS Patient” would trigger one of these emotions—or even better, several at once.

  Naked plays for attention always draw scorn, and as its fortunes rose in the 2010s, BuzzFeed was no exception. As Ben Cohen, founder of the journalism site The Daily Banter, wrote: “I loathe BuzzFeed and pretty much everything they do….It could well trump Fox News as the single biggest threat to journalism ever created.”3 When BuzzFeed presented the Egyptian democratic revolution as a series of GIFs from the film Jurassic Park, Cohen fulminated: “To say this is childish, puerile bullshit would be a massive understatement….Doing funny GIF posts about cats and hangovers is one thing, but reducing a highly complex political crisis into 2 second moving screen shots of a children’s dinosaur movie is something completely different. If BuzzFeed really is the future of journalism, we’re completely and utterly fucked.”4 Indeed, by 2012, the scramble for eyeballs against forces like BuzzFeed seemed to have brought news media to a new low. When Fox News broadcast a video of a man committing suicide and BuzzFeed reposted the link, the Columbia Journalism Review was compelled to ask, “Who’s worse? @FoxNews for airing the suicide, or @BuzzFeed for re-posting the video just in case you missed it the first time?”5

  BuzzFeed was indeed proving the envy of all other online attention merchants, in traffic at least. By 2015, its 200+ million monthly unique visitors exceeded the audiences of most of its competitors, and 75 percent of its traffic was coming from social media. Ultimately its techniques were widely copied, not just by direct competitors like the Daily Mail or Cracked.com but by The Huffington Post, Peretti’s previous venture, and, more obliquely, by magazines like Slate as well as newspapers like The Washington Post. Even literary magazines like The Atlantic and The New Yorker got in on the act. BuzzFeed thus became the reference point, the gold standard, for attention capture on the web.

  Not that BuzzFeed was terribly profitable. It lost money for most of its early years, only began to turn a profit in 2013, and its annual profits never exceeded $10 million (while hardly a fair comparison, Apple’s iTunes store alone, also in the content business and not considered highly profitable, has been estimated to clear $1 billion in profit per year). Its fortunes reflected the still-low price of digital ads; BuzzFeed’s annual ad revenues of roughly $100 million were still far less than, say, People magazine’s (about $1 billion). Nonetheless, BuzzFeed was still growing, and as the decade reached its midpoint, it was pegged at $850 million in value; then, over the summer of 2015, the cable giant Comcast bought a stake that valued the company at $1.5 billion.

  Comcast’s investment in BuzzFeed was at last a consummation of the union between the old and the new media such as Microsoft and AOL–Time Warner had once contemplated, though now involving far less money than in those headier days. For comparison’s sake, though, it is worth remarking that The Washington Post, with its forty-seven Pulitzer Prizes, was purchased by Amazon for $250 million in 2013—old media valuations clearly weren’t what they used to be, either. And yet even if BuzzFeed had attracted real dollars, the deal with Comcast nonetheless seemed to diminish the new media in some way. Blogging and the other forces that Jeff Jarvis and others had predicted would demolish the establishment had eventually yielded to BuzzFeed, which old media had now bought into for what amounted to chump change. So much for all of that.

  Peretti had never been less than forthright and consistent about the objectives of his work: it was attention capture for its own sake. But the entry of contagions and clickbait and even social networks into the ecosystem of content-driven media inevitably had a degrading influence on the latter. Mark Manson well described the state of the web in the 2010s:

  Last week, I logged onto Facebook to see a story about a man who got drunk, cut off his friend’s penis and then fed it to a dog. This was followed by a story of a 100-year-old woman who had never seen the ocean before. Then eight ways I can totally know I’m a 90’s kid. Then 11 steps to make me a “smarter Black Friday shopper,” an oxymoron if I ever saw one. This is life now: one constant, never-ending stream of non sequiturs and self-referential garbage that passes in through our eyes and out of our brains at the speed of a touchscreen.6

  Within twenty years of having been declared king, content seemed to be on the road to serfdom.

  Once a commons that fostered the amateur eccentric in every area of interest, the web, by 2015, was thoroughly overrun by commercial junk, much of it directed at the very basest human impulses of voyeurism and titillation. To be sure, there were exceptions, like Wikipedia, a healthy nonprofit; Reddit, still a haven for some of the spirit of the old Internet; small magazines like The Verge, Vox, Quartz, and The Awl; even some efforts to reboot blogging, like Medium. Likewise, faced with an existential crisis of relevancy, the traditional news media, so long allergic to the Internet, dramatically improved their online content over the decade. But these bright spots were engulfed by vast areas of darkness, the lands of the cajoling listicles and the celebrity nonstories, engineered for no purpose but to keep a public mindlessly clicking and sharing away, spreading the accompanying ads like a bad cold. As the clicks added up, what was perhaps most depressing was that all this was for the sake of amassing no great fortune, but in fact relatively paltry commercial gains, rounding errors in the larger scheme of commerce. The idealists had hoped the web would be different, and it certainly was for a time, but over the long term it would become something of a 99-cent store, if not an outright cesspool. As with the demolition of Penn Station, a great architectural feat had been defaced for little in return. But as so often in the history of attention merchants, when competition mounts, the unseemliness soars and the stakes plummet.

  And that was just the content; the advertising, meanwhile, was epically worse. By the mid-2010s the average reader on news sites like the Boston Globe’s boston.com would be subjected to extraordinary surveillance methods, with only the barest degree of consent. Such operations would be invisible to the user, except for the delays and solicitations they triggered. Online tracking technologies had evolved to a point that would have made a Soviet-era spy blush. Arrival at NYPost.com would trigger twenty or more urgent “tracking” messages to online ad agencies, or “ad networks,” advising them of any available intelligence on the user, in addition to specifying what stories he or she was reading. Attention merchants had always been ravenous for attention, but now they were gobbling up personal data as well. Perhaps the oversharing on social media had simply lowered the standard of privacy. Perhaps the Internet, with its potential to capture every turn of our attention, made this inevitable. Whatever the case, several commercial entities were now compiling ever more detailed dossiers on every man, woman, and child. It was a more thoroughly invasive effort than any NSA data collection ever disclosed—and one of even more dubious utility.

  The automation of customized advertising was intended, in theory, to present you with things more likely to seize your attention and make you click. It must be seen as a continuation of the search we have described for advertising’s Holy Grail: pitches so aptly keyed to one’s interests that they would be as welcome as morning sunshine. The idealists foresaw a day when ad platforms would be like a loyal valet who detected his master’s needs before he was aware of them, who suggested a new pair of shoes as a reasonably priced replacement for those you hadn’t noticed were wearing out. Perhaps he would remind you of your mother-in-law’s birthday while offering to send an appropriate gift at a one-day discount.

  But the gap between this theory and its execution was wide enough to march Kitchener’s Army through it. Google’s CEO Eric Schmidt had once said that the ideal was to “get right up to the creepy line and not cross it.”7 Unfortunately, by the mid-2010s, that line was being crossed constantly. While promising to be “helpful” or “thoughtful,” what was delivered was often experienced as “intrusive” and worse. Some ads seemed more like stalkers than valets: if, say, you’d been looking at a pair of shoes on Amazon, an ad for just those shoes would begin following you around the web, prodding you to take another look at them. What was meant to be “relevant” to your wishes and interests turned out to be more a studied exploitation of your weaknesses. The overweight were presented with diet aids; the gadget-obsessed were plied with the latest doodads; gamblers were encouraged to bet; and so on. One man, after receiving a diagnosis of pancreatic cancer, found himself followed everywhere by “insensitive and tasteless” ads for funeral services. The theoretical idea that customers might welcome or enjoy such solicitations increasingly seemed like a bad joke.

  To make matters worse, the technology of behavioral advertising added layers of complexity to the code of any website, causing pages to slow or freeze, and sometimes preventing them from loading altogether. According to a 2015 New York Times study, even as nearly every other digital technology had grown faster, some websites were now taking five seconds or more to load; the situation was even worse on mobile phones, with their slower connections.8 Videos had a way of popping up and starting to play unbidden; the user looking for the stop button would find it was the tiniest of all, and often oddly located. And something of a ruse as well: if you missed hitting it directly, yet another website would open, with yet more ads.

 
