The Attention Merchants


by Tim Wu


  For television, this change was radical. Let us remember that back in the 1950s, early programs—variety shows mainly—lacked plots altogether. The first plotted shows were, in the language of programming, “episodic.” And so a viewer could tune in to any episode of I Love Lucy, or Gilligan’s Island, knowing that the only plot was (in the first case) breaking into show business or (in the second) getting off the island; any particular episode was merely a freestanding enactment of that story line, the principal aim being to showcase the characters, who, oddly, never learned. At the end of every episode (or, in an exceptional case, every few), the show would reset to its eternal starting point, like a video game returning to its starting screen for a new player.

  In a hyperserial, the plot, like that of a long film or novel, has more of the flow of life. Each installment has its consequences; characters might disappear or reappear; a larger story is actually told. The press of characters and incident can vary; some hyperserials, like some lives, are easier to drop in to than others. But like any good story they are nearly impossible to understand from the middle.

  That good storytelling should have emerged as a priority is perhaps the most dramatic statement one could make about what was happening to television, although its appearance attests to a potential that the medium always possessed. That the expression of this potential was first undertaken by shows that had eliminated commercials gives some indication of what the attention merchants have to answer for. That it took so long makes one wonder why we put up with anything else for so many years.

  * * *

  * The first uses of “binge watching,” “binge viewing,” and other variants date to the late 1990s on online user forums, perhaps in reference to The X-Files. In 1996, a fan looking for VHS tapes of the show wrote: “There are three of us who all got hooked at the same time, so I’d predict that there’d be some MASSIVE binge watching right away! :-).” The phrase, however, only began to show up in mainstream sources in 2003, when Emily Nussbaum, of The New York Times, wrote an article on the growing popularity of TV show DVDs: “DVD’s are perfect for fast-paced arc shows like ‘24,’ increasing the intensity of the action and introducing the sickly pleasures of binge-viewing.” But the word became widely used in 2013, after Netflix began releasing full seasons at once, prompting Oxford Dictionaries to add it to the language and also short-list it as “Word of the Year” (the ultimate winner was “selfie”).

  CHAPTER 28

  WHO’S BOSS HERE?

  On June 1, 2015, Tim Cook, CEO of Apple, the world’s most valuable company, gave a speech at the annual dinner of a small Washington nonprofit named EPIC, the Electronic Privacy Information Center. Apple’s leaders almost never give speeches in Washington, making Cook’s appearance something of a surprise; though to be fair, he didn’t actually show up at the dinner, but rather videoconferenced his way in. All the same, even more surprising was what he had to say. “I’m speaking to you from Silicon Valley, where some of the most prominent and successful companies have built their businesses by lulling their customers into complacency about their personal information,” said Cook. “They’re gobbling up everything they can learn about you and trying to monetize it. We think that’s wrong.”1

  Cook proceeded to lay into the very foundation of the attention merchant’s model, which, since the early 2000s, had become dominant on the web. “You might like these so-called free services, but we don’t think they’re worth having your email, your search history and now even your family photos data mined and sold off for god knows what advertising purpose. And we think someday, customers will see this for what it is.”2 There was reason to suppose so, since “[a] few years ago, users of Internet services began to realize that when an online service is free, you’re not the customer. You’re the product.”3

  That phrase was, nearly verbatim, what critics had been saying about all advertising-supported media for decades. “In selling commercial time to advertisers to sell products,” wrote Robin Andersen in the 1990s, “broadcasters are also selling a product—their audiences.”4 Richard Serra, the artist and cultural critic, put it this way in 1973: “Television delivers people to an advertiser….It is the consumer who is consumed….You are delivered to the advertiser who is the customer. He consumes you.”5

  At the dinner, Cook’s remarks were roundly applauded by the privacy advocates in attendance. They elicited less positive responses elsewhere. The New York Times accused Cook of ignoring “the substantial benefits that free, ad-supported services have brought to consumers worldwide.”6 Mark Zuckerberg, CEO of Facebook, shot back, too. “What, you think because you’re paying Apple that you’re somehow in alignment with them? If you were in alignment with them, then they’d make their products a lot cheaper!”7 Cook was not claiming that Apple maximized value for what it charged, but then he had not appeared there to praise his own business model, only to bury someone else’s. It didn’t take an MBA to notice that Apple’s defense of individual privacy was also an assault on the principal revenue scheme of its competitors.

  Although it was hardly a subtle indictment of Facebook and Google, Cook’s speech was also aimed at the entire web and what it had become. For behind the scenes, Apple had grown impatient with the practices of those publishing content on the web; they were loading up their sites with bloated advertisements, complex tracking technology, and other junk that was making the mobile web an unattractive and unpleasant experience. Sites froze, or were so crowded with come-ons that one could hardly find what one came to them for. None of it enhanced the use of Apple’s devices.

  The web, in theory, belongs to everyone, so no one likes an 800-pound gorilla telling them what to do there. But while Apple could be constitutionally controlling to a fault, in this case the wish to control the experience of its iPhone and iPad was a popular one. For as Apple had noticed, thanks to out-of-control advertising the mobile web was burning through the data plans, the battery life, not to mention the attention that all rightly belonged to its users, and violating their privacy for good measure. As the owner of the platform, and a company with no qualms about throwing its weight around, Apple was in a position to do something about the mobile web. Few who weren’t attention merchants had cause to complain.

  —

  Just a few days after Cook’s speech, the company, quietly and wholly without ceremony, released what one analyst called its “atomic bomb.” Buried in the documentation of its upcoming release of iOS 9, its latest operating system for the iPhone and iPad, Apple had included the following:

  The new Safari release brings Content Blocking Safari Extensions to iOS. Content Blocking gives your extensions a fast and efficient way to block cookies, images, resources, pop-ups, and other content.8

  The nonprofit Nieman Lab was the first to realize what this really meant: “Adblocking is coming to the iPhone.”9 An adblocker is a program that detects that a web page carries advertisements—say, pop-ups, embedded audio and video, etc.—and prevents them from loading; it also usually blocks “tracking,” that is, the sending of information to attention merchants like Google or Facebook, who build profiles of you based on where you go online. An iPhone that blocks ads works better and faster, consuming less power and bandwidth. It also chokes off the business model that has dominated the web ever since the year 2000 or so.

  When September arrived, and with it, Apple’s new operating system, it was worse than the web publishers and advertisers could have imagined. Adblockers, overnight, became far and away the most popular of apps. Millions of copies were downloaded, turning what had been a trend into a torrent. By the end of the year, an estimated 100 to 200 million Americans were using some kind of adblocker at least some of the time. Apple’s plot, if such it was, had paid off masterfully.

  Attention merchants, advertisers, every company that hoped to make money reselling attention on the mobile web reacted to Apple’s move with a mixture of anger, fear, and moral indignation. “Ad blocking is robbery, plain and simple,” opined Ad Age.10 “Ad-blocking threatens democracy,” pronounced the president of the Newspaper Association of America; the industry also released a study estimating that $22 billion of revenue was being destroyed and warned of far worse.11 “Every time you block an ad,” wrote another editor, “what you’re really blocking is food from entering a child’s mouth.”12

  Mostly, the developers and users of adblockers were unfazed. One developer wrote that “web advertising and behavioral tracking are out of control. They’re unacceptably creepy, bloated, annoying, and insecure, and they’re getting worse at an alarming pace.”13 Others called it the last chance to fight what the web had become. James Williams, the Oxford ethicist, wrote that “ad blockers are one of the few tools that we as users have if we want to push back against the perverse design logic that has cannibalized the soul of the Web.”14

  The problem for the attention merchants is that once people get used to avoiding advertisements, it is hard for them to accept the terms of the old arrangement. The scales fall from one’s eyes, and sitting there staring at product pitches doesn’t make sense any longer. As Michael Wolff put it, “If people can avoid advertising, they do. And further: as soon as they do figure out how to circumvent advertising, they don’t go back to it.”15

  Apple’s maneuvers can be understood as a policing of its platform, which depends on payment. Hence, it could be seen taking the moral high ground on behalf of users, while also damaging its greatest rival, Google, at its point of greatest vulnerability. The adblocking campaign was, in the lingo, a twofer.

  As for Google, it was left fighting with one hand tied behind its back; for it was now being haunted by the choice it had made at that fork in the road in 2000, when it went the way of the attention merchant. Android is, of course, a competitor of the iPhone, and thus Google’s prime channel for accessing the minds of mobile users, a must-win cohort. But being wedded to an advertising model, Google was in no position to follow Apple and make Android the best experience it possibly could be. For the company was in a bind of its own making, stuck serving two masters, trying to enact the attention merchant’s eternal balancing act between advertisers and users, at a time when the latter were losing patience.

  In 1998, Larry Page and Sergey Brin had written that reliance on advertising would inevitably make it difficult to create the best possible product. Since the death of its founder, Steve Jobs, Apple had softened somewhat in its opposition to open platforms, and was able to use its enormous profits to build better products. Google had bested rivals like Yahoo!, who were hamstrung by their own excessive reliance on advertising, but over the late 2010s, in competition with Apple, Google faced its founders’ own prophecy and got a taste of its own medicine.

  Whether Apple was truly serving its users and business model and only incidentally exploiting Google’s Achilles’ heel is a matter of debate. But there is no doubt that the sentiment of revolt against the shabbiness of the mobile web was coming from an even deeper and broader place. Writing in the 1960s, Marshall McLuhan described the media as “technological extensions of man”; in the 2010s it had become more obvious than ever that technologies like the smartphone are not merely extensions but technological prosthetics, enhancements of our own capacities, which, by virtue of being constantly attached to us or present on our bodies, become a part of us. Whether called a “phone” or a “watch,” the fact remains that “wearables” take the check-in habit begun with email and turn it into something akin to a bodily function, even as they measure the bodily functions that naturally occur. The ambitious efforts over the mid-2010s to cover the eyes, and create virtual realities, only served as a further example; a well-known photo of Mark Zuckerberg grinning as he strolled past hundreds of begoggled users caused no little alarm. It seems only natural that the closer a technology feels to being part of us, the more important it is that we trust it; the same goes for someone who is creating a virtual reality for you to inhabit. And so, in the coming decade, the attention merchants will need to tread very lightly as they come as close as one can to the human body. Nonetheless, adaptation is a remarkable thing, and if our history has shown anything, it is that what seems shocking to one generation is soon taken for granted by the next.

  EPILOGUE

  THE TEMENOS

  Was it, maybe, all a dream? By the late 2010s, for the rich or tech-savvy cord-cutters, it could feel that way, as they watched commercial-free television on Netflix or Amazon, read eBooks or browsed the web on an ad-blocked phone or computer. It was quite possible to think that the reign of the attention merchants had been an aberration, a sordid interval on the way to a better world, albeit a spell that lasted a century. Perhaps the long, dark night of attention arbitrage, even advertising itself—by which our very awareness was bought cheap and sold at a markup—was finally coming to an end. Certainly among its most desired target demographics, the young and the affluent, advertising seemed to become one more avoidable toxin in the healthy lifestyle, another twentieth-century invention mistakenly assumed to be harmless, like sugary soft drinks, processed foods, and tanning beds.

  An overstatement, perhaps. Still, the new millennium’s growing distaste for advertising and unprecedented willingness to pay for peace and quiet were hardly good news for the attention merchants or their brokers in the advertising industry. As Michael Wolff points out, television, as a whole, was now reliant on subscription charges for 50 percent of its revenue, an unheard-of portion; meanwhile, the mobile web was under siege, and the tethered web was being forgotten. These trends, coinciding with a growing sense that media had overtaxed our attentions to the point of crisis, certainly made it look as if the attention merchant had nowhere left to go.

  But taking the long view, as our story does, such revolts against advertising must be seen as part of a larger dynamic. We are speaking, after all, of an industry left for dead at least four separate times over the past hundred years. Again and again, it has seemed as if the party was over, that consumers had fled once and for all, and yet the attention merchants have always found a way to overgrow the bright new machines that seemed to be hacking through the old-growth foliage. The 1960s, the very zenith of anticommercialism, remarkably left the attention merchants stronger than they’d ever been. The World Wide Web, designed by research scientists, was supposed to strike a fatal blow against commercialism in communications, but these things have a logic of their own: Advertising always becomes less annoying and intrusive, and people rediscover their taste for “free” stuff. In this long view, it is hard to imagine that a business with such a marvelously simple premise—capture people’s attention in exchange for a little fun and then resell it to firms sponsoring the amusement—might simply wither away.

  What the cord-cutters and ad-avoiders of the 2010s were doing was important but not new; rather, it was part of the general and continuous effort to police our deal with the attention merchants, whether the content is the CBS Evening News or hamster videos on YouTube. Since the attention industry, like any other, demands constant growth, the terms of the deal are constantly evolving, usually to our disadvantage, with more attention seized for less diversion in return. Periodic revolts against the arrangement are therefore not just predictable but necessary. For if the attention economy is to work to our benefit (and not merely exploit us), we need to be vigilant about its operation and active in expressing our displeasure at its degrading tendencies. In some cases, as we’ve seen, its worst excesses may have no remedy but law.

  The most urgent question raised by this book does not, however, relate to the eternal debate over whether advertising is good, bad, or a necessary evil. The most pressing question in our times is not how the attention merchant should conduct business, but where and when. Our society has been woefully negligent about what in other contexts we would call the rules of zoning, the regulation of commercial activity where we live, figuratively and literally. It is a question that goes to the heart of how we value what used to be called our private lives.

  This book begins with a story about the growth of advertising in public schools, a new phenomenon based on the unstated premise that every sliver of our attention is fair game for commercial exploitation. That norm, as we’ve seen, has spread slowly but inexorably over the past century, until it has become a default position respecting virtually every bit of time and space we occupy. It is shocking how little it has been necessary to defend the sheer reach of the attention merchant into the entirety of our lived experience. Formerly the state of technology imposed its own limits, but at a time when these limits have been effectively eliminated, it is for us to ask some fundamental questions: Do we draw any lines between the private and the commercial? If so, what times and spaces should we consider as too valuable, personal, or sacrosanct for the usual onslaught?

  Custom answered these questions in previous times, but just as technology has transcended its former limitations, we seem less moved by the imperatives of tradition. At one time, tradition set limits on where people could be intruded upon and when. Even with the necessary technology, it was not always so easy to reach people in their homes, let alone while walking or in a taxi. For the majority, religious practice used to define certain inviolable spaces and moments. Less formal norms, like the time reserved for family meals, exerted considerable force as well. In this world, privacy was the default, commercial intrusions the exception. And while there was much about the old reality that could be inconvenient or frustrating, it had the advantage of automatically creating protected spaces, with their salutary effects.

 
