The Attention Merchants


by Tim Wu


  PRIZM was put to one of its first great tests in 1982 with the introduction of Diet Coke. When Coca-Cola launched the new product (slogan “Just for the taste of it!”), its objective was to capture customers new to the growing market for diet soda, while not cannibalizing sales of TaB, its existing diet drink. Introduced in 1963, TaB was a healthy profit center and the country’s leading diet beverage. But using Robbin’s cluster analysis, Coca-Cola was able to determine that, in fact, just six types of people actually drank it, among them “Money & Brains,” “Furs & Station Wagons,” “Young Influentials,” “Pools & Patios,” “Black Enterprise,” and “Young Suburbia.” And so the company came up with a marketing plan that aimed for everyone else, essentially by inverting the pitch. “We positioned it as a great-tasting soft drink that happens to have one calorie, rather than as a diet drink that tastes great,” said its adman, Steve Norcia. “We thought this would broaden its appeal as the first diet soft drink to emphasize sheer pleasure and great taste—not just part of a diet regimen.”11

  So far as it could, Coca-Cola avoided advertising Diet Coke in TaB clusters, and even began mailing TaB drinkers coupons for their preferred cola, so as to neutralize any collateral damage. It was entirely in keeping with the ultimate claim of PRIZM that you could say different things to different people and win them all. And it goes a long way toward explaining the system’s later importance in politics.

  To this point, with some experimental exceptions, the contest for human attention had been mostly approached as if everyone were roughly the same. But PRIZM revealed how the nation was actually a knowable mosaic of sensibilities and tastes—or even vulnerabilities and deepest desires. Here, at last, was a quantitative version of the shamanistic insight offered by Bernays and Dichter. For the advertising industry, it proposed a vast new horizon of targeted campaigns and additional billings. For the advertisers, it promised a more potent kind of messaging than the old scattershot method. Now, firms could not only tailor ads to their desired consumer but also fine-tune the product itself, so as to be more alluring to his or her specific attention. Here, too, was a way around the perennial attention industry problem of conditioned indifference; it is far more taxing to learn to ignore messages that seem to speak to you specifically.

  And so, with the launch of Claritas, the liberating politics of recognition would attain its commercial correlative. Born of a wish for greater empathy and greater understanding of people from all backgrounds, the new system industrialized the project of finding out as much as possible about each and every one of us, not out of regard for anyone’s dignity, but so as to know precisely what would catch our attention and make us want things we never knew we wanted.

  —

  Where marketing had led, media would follow. And so when cable at last scaled the walls that the networks had erected around television, its offerings would be as varied as the new map of America. And the efforts of those like Fred Silverman to make television more vital and interesting would give way to the commercial will to make it all things to all people, if not in all places.

  “It seems that America’s youth is about to be served,” reported The New York Times in 1981. “This week Warner-Amex brings the Top 40 to television, when it inaugurates MTV, a 24-hour all music cable network service.” The new service, analysts predicted, “will be popular with companies looking for ways to tap teen purchasing power.” And if the “video disco channel” were to succeed in gaining five million subscribers, it might “creat[e] additional demand for video discs, and reduc[e] the need for record companies to send groups on costly national concert tours.”12

  Not for you?

  How about sixty and a half hours of sports programming per week on one channel? “ ‘That,’ one man said, ‘is the ghastliest threat to the social fabric of America since the invention of the automobile.’ ‘No,’ he was told, ‘it is ESPN, the Entertainment and Sports Programming Network for sports junkies who have to have a fix every time they touch the dial.’ ”13 Into the late 1980s, a dozen cable networks launched, each targeted not at the middle, like CBS and NBC, but at some demographic fragment.

  Bravo launched in 1980, explicitly targeting lovers of the arts, assumed by its backers to be mostly women (but later to include gay men, when the appeal to two groups would be dubbed “dualcasting”).14 Two years on came BET, Black Entertainment Television, targeting blacks; and the Playboy Channel, targeting not only playboys but all straight men; as the Times reported, “Playboy Enterprises Inc. is now trying to transfer its 30-year-old formula—either sexual sophistication, or soft-core pornography, depending on the reader—to video.” The network did make an effort to reach women as well, with shows like Dr. Yes: The Hyannis Affair, which was described as “a sort of topless Dynasty” following “the idle rich, with time on their hands and sex on their minds.”15

  “Audience fragmentation” was the industry term for what the new cable networks were molding themselves to, and it was of a piece with the idea of appealing to clusters through PRIZM. Of course it was never entirely clear whether “fragment” was being used more as a verb or a noun: Were the networks reacting to fragmented audiences, or were they in fact fragmenting them? In retrospect, they were doing both.

  There is no doubt that the targeting of audience fragments was a reaction to the programming style of the 1950s, when men like William Paley and Pat Weaver went for the broad middle, when the supreme attention grabber was I Love Lucy, and one could capture more than 60 million viewers every week with the same show. This had happened at a time of relatively homogeneous national sensibilities following the great collective effort of winning the Second World War. But as we’ve seen, broadcasters used their monopoly over the most powerful attention capture technology yet invented to mold that relative sameness into a single, national consumer mass, unified in schedule, attentional habits, and information diet. At the time, nothing could have been better for their business or American business generally.

  But what happened in the 1980s was a departure also from 1970s programming, which had turned the countercultural spirit into palatable viewing, mostly for the giant new generational block. Antiestablishment in tone, it nevertheless preserved the network status quo. To be sure, it would take the sort of new paradigm that usually follows advances in technology—in this case, the improvement of signal transmission over coaxial cable, among others—to make possible a truly radical break with the broadcast model.

  Technology always embodies ideology, and the ideology in question was one of difference, recognition, and individuality. But commerce bows to none, taking its opportunities wherever they may lie. The logic of cable ultimately did not flow from the zeitgeist but rather from the opportunity it created for a host of upstarts to feast on the attentional harvest once the manna of the networks alone. And once all those newcomers began to contest what had been the broad middle, it began to fall apart, not at once, but from this point onward.

  —

  One upstart would even be well funded enough to challenge broadcast on its own terms. In 1986 the Fox Broadcasting Company suddenly appeared on televisions across America, its unstated mission to serve the un-served, at least in terms of entertainment. Its owner was an Australian newspaper magnate, Rupert Murdoch; its chairman and CEO, Barry Diller, a self-styled programming revolutionary himself. Out came a show conceived to be the “anti–Cosby Show” (in response to the hugely popular NBC sitcom starring Bill Cosby as a prosperous physician and occasionally befuddled father). Married…with Children was the chronicle of Al Bundy, an unhappy salesman of women’s shoes; his lazy, lubricious, and self-indulgent wife, Peggy; their promiscuous and blazingly stupid daughter, Kelly; and their alienated, sarcastic son, Bud. Two years later, the network was out with another outside-the-box concept: a prime-time cartoon for adults called The Simpsons. With mostly yellow characters based on members of the cartoonist Matt Groening’s own family, The Simpsons was a biting satire of middle-class life in middle America.

  In 1996, Fox would make perhaps its most momentous move against the legacy broadcasters when it launched Fox News. Promising to be more “fair and balanced” than its chief cable rival, CNN, as well as network news, Fox in practice catered to conservative viewers who considered other news outlets liberal in bias, to the point of contempt for conservatives’ views.16

  In each case, whether broadcast or cable, Murdoch and Diller chased audiences believed, one way or another, to have fallen through the cracks, who had somehow disdained, or been disdained by, the mainstream. Not every new program worked, needless to say: Babes, a sitcom about three overweight sisters sharing an apartment, didn’t last long, nor did Get a Life, about a thirty-year-old paperboy who moves back in with his parents. But hits like The Simpsons, together with the Fox News Channel, with its steady rise of market share, more than made up for the duds.

  It was billed as a win-win, an alignment of commercial interests with those of a society now diverse enough to require a variety of choices to meet its full range of interests and sensibilities. Underlying the public-spiritedness was a flattering idea that the viewer should have more sovereignty over his mind and what came into it—should have the right to decide exactly how his attention would be spent, among a variety of real choices represented in the free-market gaggle of cable networks, good old broadcast television, and the new Fox shows. The exercise of that choice was facilitated by the now ubiquitous and reliable remote control, the scepter by which the new sovereign decreed his destiny.

  As befits a story of unintended consequences, however, no one fully appreciated where “more choice” would lead. The idealists of the 1960s and 1970s envisioned a thoughtful public who knew what they wanted and would judiciously select programs that precisely matched their preferences. The ordinary viewer, who formerly had his tastes dictated to him, would now be elevated by democratically inspired offerings. The leading visionaries of cable, Ralph Lee Smith (author of The Wired Nation) and Fred Friendly, both foresaw a new age of true media democracy, when one’s attention was truly his own.

  In actuality, the spectacle of a fragmented audience would prove nightmarish to many of those well-meaning liberals and progressives who had originally welcomed the idea. If they imagined a paradise of shows like Sesame Street or the countercultural Public Broadcasting Laboratory, the reality would look more like Fox News, MTV, Married…with Children, and nonstop sports coverage. While never radically progressive, the great middle, which the networks had held together with semi-coercive prime-time rituals, was at least more easily led toward moderate mainstream values. In some sense what did come to pass fulfilled an earlier prophecy of Friendly’s concerning broadcast handled poorly; in 1970 he’d warned the system might “give way to a new Tower of Babel, in which a half-hundred voices scream in a cacophonous attempt to attract the largest audience.”17

  Even more unexpected, in retrospect, was the channel surfing, or put another way, the rise of a far more inattentive, scattered habit of viewing. A generation had passed since the darkened rooms of the 1950s, in which audiences had watched shows, one after another, in hushed awe. By the mid-1980s, Channels magazine was among the first to notice a new way of watching TV, which it called “grazing.” “People, rather than viewing specific shows or even specific types of shows,” the magazine reported, “like to sample in an un-patterned way a wide variety of what the medium offers. Their television diet comes from throughout the full buffet of shows.”18

  When you think of it, channel surfing, or grazing, is a bizarre way to spend your time and attention. It is hard to imagine someone saying to himself, “I think I’ll watch TV for three hours, divided into five-to-ten-minute segments of various shows, never really getting to the end of anything.” It hardly seems the kind of control Zenith could have had in mind when it first introduced the remote, to say nothing of the sovereign choice that cable’s more optimistic backers had dreamed of. However fragmented, attention was still being harvested, to be sure, but the captivity was not a pleasant experience.

  The profusion of channels and ubiquity of the remote control, followed up by the VCR and its fast-forward function, also meant that, for the first time, television commercials were well and truly avoidable by means easier than the old expedient of getting up and going to the kitchen. As New York magazine announced: “1985 is the age of zapping—channel switching on remote controls or fast-forwarding through commercials.”19 For advertisers, this provoked a minor crisis, as their business and rates had long hinged on the fundamental precept of unavoidability. “Conventional wisdom in the advertising industry holds that a certain amount of irritation helps make advertising effective,” wrote Rena Bartos of the J. Walter Thompson agency in 1981. “But the erosion of advertising credibility,” she warned, may “be undermining consumers’ trust in brand name advertised products.”20 Consequently, over the 1980s, the major advertising firms (after introducing fifteen-second commercials) began to suggest, not for the first or last time, that advertising would henceforth need to be more entertaining and engaging—something that people wanted to watch.

  Of course, people have always wanted accurate information about products, but that’s not advertising. The goal was something at once persuasive and entertaining, something that, somehow, would keep the channel surfer’s finger still. Or as New York explained, the 1980s called for “zap-proof” advertisements, which were, in Madison Avenue’s thinking: “animation, sixties-type musical takeoffs, soft-sell patriotism, [and] MTV-style rock videos.”21

  Pepsi changed its slogan to “The choice of a new generation” and paid Michael Jackson an unprecedented $5 million to dance with children to a song set awkwardly to his “Billie Jean” (lyrics: “You’re the Pepsi Generation / guzzle down and taste the thrill of the day / And feel the Pepsi way”).*

  Print advertising has always been less unpopular than television or radio, for it is more under the control of the reader, who can simply avert the eyes. It can also be beautiful. Some of it became much harder to ignore during the 1980s, like a Calvin Klein campaign starring a fifteen-year-old Brooke Shields, or another picturing two men and a woman sleeping in their (Calvin Klein) underwear after a threesome. And on television, viewers (other than parents) might be less disposed to zap Shields posing provocatively in her Calvin Klein jeans and intoning, “You want to know what comes between me and my Calvins? Nothing.” These were a far cry from the old, extremely irritating Anacin advertisements depicting hammers pounding the skull. Whether, in fact, they prevented zapping, however, is hard to know for sure. Fortunately for the advertising industry, it remained impossible to accurately measure whether people were watching commercials or not, saving the enterprise from a true and full accounting.

  It was also during this era that the Super Bowl became a showcase for advertising’s greatest talents, seeming to prove the point that there were, indeed, commercials that people truly wanted to see. A much lauded advertisement for Coca-Cola that ran during the 1979 Super Bowl featured an enormous African American football player, Mean Joe Greene, being offered a Coke by a young white boy.22 And in 1984, Apple Computer ran its “Big Brother” advertisement during the Super Bowl to great acclaim. Directed by Blade Runner auteur Ridley Scott, it portrayed a young woman running and smashing a giant screen to save society from a totalitarian overlord. “On January 24th Apple Computer will introduce Macintosh,” the advertisement proclaimed. “And you’ll see why 1984 won’t be like ‘1984.’ ” The publicity created by its advertising, at least according to Apple, sold $3.5 million of its new Macintoshes.23

  In retrospect, the term “remote control” was ultimately a misnomer. What it finally did was to empower the more impulsive circuits of the brain in their conflict with the executive faculties, the parts with which we think we control ourselves and act rationally. It did this by making it almost effortless, practically nonvolitional, to redirect our attention—the brain had only to send one simple command to the finger in response to a cascade of involuntary cues. In fact, in the course of sustained channel surfing, the voluntary aspect of attention control may disappear entirely. The channel surfer is then in a mental state not unlike that of a newborn or a reptile. Having thus surrendered, the mind is simply jumping about and following whatever grabs it.

  All this leads to a highly counterintuitive point: technologies designed to increase our control over our attention will sometimes have the very opposite effect. They open us up to a stream of instinctive selections, and tiny rewards, the sum of which may be no reward at all. And despite the complaints of the advertising industry, a state of distracted wandering was not really a bad one for the attention merchants; it was far better than being ignored.

  This is ultimately where the Great Refusal had led, not with a bang but a whimper. Faced with a new abundance of choice and a frictionless system of choosing, we individuals, in our natural weak-mindedness, could not resist frittering away our attention, which once had been harvested from us so ceremoniously. And the choices would continue to proliferate. As they did, and the attention merchants’ work grew more challenging, the strategies for getting the job done would grow only more various and desperate.

  * * *

  * Michael Jackson’s agent approached Coca-Cola first, but the market leader was not interested. “They [Coke] saw anything they would do with Michael,” recalls the agent, “as a more targeted, ethnic campaign.” Monica Herrera, “Michael Jackson, Pepsi Made Marketing History,” Adweek, July 6, 2009.

 
