What could possibly go wrong?
* * *
* In The ‘Hitler Myth,’ Ian Kershaw discusses the Kirchenkampf, or “church struggle,” the Reich’s effort to Nazify the German church, purging it of all individuals who resisted the new ideology of blood and soil in favor of traditional Christian teachings.
CHAPTER 10
PEAK ATTENTION, AMERICAN STYLE
Back in the United States, with the war over, all eyes turned to television, the newest and most exciting invention since radio. And in 1950, Mr. Nielsen’s new ratings made it official: the National Broadcasting Company ruled the new medium, with exclusive ownership of the five top-rated shows:
Texaco Star Theater (NBC)
Fireside Theatre (NBC)
Philco Television Playhouse (NBC)
Your Show of Shows (NBC)
Colgate Comedy Hour (NBC)
On Tuesdays at 8 p.m., 61.6 percent of the 10 million families with a television were tuned to Texaco Star Theater. It began with the four “merry Texaco men” (as they called themselves) singing their catchy refrain, “Tonight we may be showmen, / Tomorrow we’ll be servicing your cars.” The host was a cross-dressing former vaudevillian named Milton Berle; he called himself “Mr. Television,” and his brand of humor did not aim high.1 Following the Texaco men, Berle burst through the curtains outlandishly costumed, usually in drag, whether as Carmen Miranda, or as Cinderella, or as a bride in white escorted by Fatso Marco, Berle’s shorter sidekick. The setup would then give way to Berle’s famous torrent of one-liners.
NBC and Berle had not set a particularly high bar for television. But the ratings were a triumph for NBC’s overlord, David Sarnoff, who as early as 1923 had prophesied the medium’s importance. A firm believer in what would later be called “first-mover advantage,” he had decided during the 1930s that dominating television would be essential to both RCA (which made the sets) and NBC. The world war had ended, but the television war had just begun. And so Sarnoff insisted (via internal memo) that he be referred to as “General” David Sarnoff, the “Father of Television.”*1 Bill Paley, meanwhile, entered the postwar period wishfully thinking that radio would remain the main action; sure enough, CBS’s lackluster TV ratings reflected this halfheartedness. But even if he were proved wrong, he would keep the mind-set of a middle-distance runner, caring not so much about the fast start as about finishing first. Going second and waiting for the lay of the land to reveal itself was always less risky. And still believing that Sarnoff basically misunderstood broadcasting, Paley was confident that he and his deep team of programmers could wipe out NBC’s lead when the opportunity came. The subsequent race, over the 1950s, would define for decades what the “second screen”—television—would become.
The migration of the screen into the home of every family was an event of unparalleled significance in the history of the attention industries and their influence over people’s lives. The poster had always been a sort of prototype. But the first real screens, those in the motion picture theaters, had a seemingly magical power to capture attention. The filmgoer “feels as if he were passing through life with a sharper accent which stirs his personal energies,” wrote the psychologist Hugo Münsterberg in 1916. “Love and hate, gratitude and envy, hope and fear, pity and jealousy, repentance and sinfulness, and all the similar crude emotions…help greatly to excite and to intensify the personal feeling of life and to stir the depths of the human mind.”2
A neuroscientist might say that cinema had always shown the ability to activate our mirror neurons, the brain cells known to light up identically whether we observe someone perform an act or do it ourselves.3 It happens, too, when an image is close enough to reality to make our brain identify with what it sees. Perhaps this partly explains why, even by contemporary standards, television was adopted quickly.
Early models were too expensive for most families: between $5,000 and $20,000 in present dollars for a picture tube encased in wooden cabinetry. So most people first saw TV in a saloon. But from a mere 9 percent of American homes in 1950, television spread to 72 percent by 1956. Once ensconced in private space, it immediately began to devour time—nearly five hours a day by the end of the decade—causing a mixture of excitement, fascination, and fear.
Given TV’s novelty and limited offerings, early audiences were not fickle channel surfers. Contemporary accounts suggest something much more like the deeply immersive experience of motion pictures. Indeed, the lights were usually turned off for viewing, and there was little or no conversation. One only got up to change the channel. “We ate our suppers in silence, spilling food, gaping in awe,” said one woman in 1950. “We thought nothing of sitting in the darkness for hours at a stretch without exchanging a word except ‘who’s going to answer that confounded telephone?’ ”4
We have, to this point, treated all attention paid as more or less the same, which is a reasonable way to broach a complex subject. And it is all the same, insofar as everything that captures it enters our mind the same way. But of course there are differences in quality of attention; watching an airplane fly overhead is not as involving as being engrossed in a film. We might half listen to a professor babble, but our ears prick up at the sound of a marriage proposal. The most basic dividing line is likely between transitory and sustained attention, the former quick, superficial, and often involuntarily provoked; the latter, deep, long-lasting, and voluntary. What matters for present purposes is that selling us things relies mainly on the former—on which the attention merchant thrives—but our happiness depends on balancing the two.
As the immersive power of the movie screen was brought into the home, some could already see that it might be a force for greatness or nothingness. “It is also almost like a giant eye on life itself,” wrote the novelist Calder Willingham, “and can easily become the vehicle for masterpieces of a magnitude and power never achieved before in the arts, given the artists to create them and the audience to support them. For this very reason, it would also become the worst cultural opiate in history, buy and corrupt all talent, and completely degrade the sensibility of the country. Everything depends on the use to which television is put.”5
Just what uses would the attention gathered by television be put to? The next decade would begin to answer that question.
Though dominant in ratings, NBC still clearly suffered from the same weakness in programming that had limited it in radio, mainly because of Sarnoff’s indifference to content. When Paley and his team of programmers launched their attack, the strategy would be one that had worked before, that of promoting CBS as the higher-quality alternative—the Tiffany Network—purveyor of the best of the best.
NBC’s lackadaisical approach was epitomized by the Camel News Caravan, its television news show. The fifteen-minute Caravan was hosted by a former actor, John Cameron Swayze, and consisted mainly of his reading out headlines and playing newsreels designed for movie theaters, until delivering his signature sign-off, “That’s the story, folks,” almost the same one used by Porky Pig.6 The show was not only superficial but also subject to onerous censorship and direction by Camel’s owner, the R. J. Reynolds Company. The sponsor preferred upbeat news and mandated coverage of football (for men) and fashion (for women). It also set out a surprisingly detailed speech code that barred any display of competing brands, pipes, cigars, and “no-smoking signs,” as well as actual, living camels. When, in 1952, Reader’s Digest published a report linking, for the first time, cigarettes and cancer, the ensuing media sensation somehow never reached the Camel News Caravan. As one writer put it, “What Camel wanted, Camel got…because they paid so much.”7
Anyone could do better than that, and CBS soon established itself as the leader with CBS Television News (later renamed the CBS Evening News). In 1951, CBS radio’s star, Edward Murrow, appeared on television, perhaps surprising viewers, who had only ever heard his voice. His first show, See It Now, was produced by another legend, Fred Friendly, and offered something new—a weekly news analysis, critical in nature, accompanied by clips and delivered by the charismatic Murrow, languorously smoking cigarettes. The New York Times generously praised the new competitor, perhaps not yet fully seeing it as such: “A striking and compelling demonstration of the power of television as a journalistic tool…in its emotional impact, sensitivity and drama, the commentator’s thirty-minute review of the week’s news was in all respects a magnificent achievement.”8
See It Now became deservedly famous in the history of journalism for a series of episodes broadcast in 1954, wherein Murrow decided to challenge American senator Joseph McCarthy’s Red Scare—his ceaseless investigations of alleged communists in the U.S. government and other institutions. It was widely and correctly taken as an act of bravery, for McCarthy was vindictive and indeed would try to destroy CBS. But Murrow succeeded in exposing him as a witch-hunter and a bully, and this was partly owing to the power of the medium: the same screen that made Murrow appear serene and dependable made McCarthy, with his strange and craven mannerisms, look like a monster.9
There is another dimension to this story that has become clearer with time. McCarthy was basically a typical twentieth-century government propagandist; like Creel, Mussolini, or Goebbels, he used the looming foreign threat to inflame hatred for marginal groups (communists mainly, but also gays) for the sake of amassing power. By exposing him, Murrow and CBS demonstrated not only courage but also the power of the private sector and, in particular, the attention industries, to defeat official propaganda. That had never happened before. And it coincided with the Supreme Court’s gradual rediscovery of the First Amendment as a tool to check government controls on free speech.*2
Meanwhile, NBC answered its rivals by investing in an unsponsored news program of its own, this one called The Huntley-Brinkley Report. Paley, however, was never interested in running a public broadcasting service; he still wanted only to wrest away the claim NBC had on audiences. In 1951, he struck gold.
Lucille Ball was a forty-year-old radio and B-film actress of modest success, and also a friend of Paley’s. Her radio show My Favorite Husband became television’s I Love Lucy, with Philip Morris as its sponsor. Playing on Mondays at 9 p.m., it was an instant hit, within a year soaring past Milton Berle to take the top spot in Nielsen’s ratings. In 1953, the show attracted an astonishing average of 71.3 percent of television audiences, a figure that, as a season-long average, remains unsurpassed.
What attracted audiences—sometimes over 50 million—to the show was, as everyone agrees, Lucille Ball herself. The show revolved around serial failure: Lucy desperately wished for a career in show business and would do anything, no matter how ridiculous, to get her foot in the door. Invariably, for one reason or another, her plots would fall apart. But if her schemes failed, Ball’s performance succeeded brilliantly as spectacle. The medium was made for personalities.
An even more unlikely rejoinder to the manic Milton Berle appeared when CBS’s programmers found a stone-faced and awkward New York Daily News gossip columnist named Ed Sullivan, whose show (originally titled The Toast of the Town) had a shaky start. On first viewing it, even the unerring Paley insisted that Sullivan be temporary. (His evident “anti-talent” would inspire the classic remark that “Ed does nothing, but he does it better than anyone else on television.”) But Sullivan’s talent wasn’t his presence on camera; rather, as with Paley, it was his connections and his eye for the talent of others, and his show gradually took off until it became the most watched on Sunday evenings by a wide margin.
And so it was that CBS and Paley made inexorable inroads, until, by 1955, the rising power had broken NBC’s dominance. Not that Paley’s lifestyle suffered a whit in the process: in 1947, he had married his second wife, the Vogue writer and fashion icon “Babe” Cushing Mortimer, of whom the designer Billy Baldwin said, “So great is her beauty, each time I see her, it is as if for the first time.” (Today Babe Paley is perhaps best remembered for her remark that “a woman can never be too thin or too rich.”) The two moved into the St. Regis Hotel and spent their weekends on Long Island with friends, who grew to include authors, actresses, and other bon vivants, drawn mainly from the creative classes.
Paley and CBS had done it again, overtaking NBC in five years without breaking a sweat. This time, however, Sarnoff wasn’t content to lose gracefully. In 1953, he would appoint his own star programmer, giving him license to wage total war. As David Halberstam would later write, never had there been a “competition…so fierce…with weapons so inane.”10
We can credit these years, the mid-1950s, with the achievement of “peak attention.” By that, I mean something quite specific. A historic anomaly, unprecedented and eventually to dissolve, this was the moment when more regular attention was paid to the same set of messages at the same time than ever before or since. The phenomenon resulted from a confluence of the prime-time ritual, the novelty of television, and industry concentration—all combining to create within the Free World the routinely massive audiences previously possible only in fascist and communist regimes. Peak attention would continue through the 1970s, with CBS, NBC, and eventually ABC taking turns winning the big prize of 60 or 70 million viewers at a time. Nevertheless, the peak of peak attention can be assigned an exact date: Sunday, September 9, 1956, when Elvis Presley made his first appearance on television, on CBS’s Ed Sullivan Show. Its 82.6 percent share of viewers (out of a U.S. population roughly half of today’s) has never been equaled or bettered.
With a great many Americans, sometimes even a majority, watching the same programs, exposed to the same information, every day—or if not exactly the same, the same theme with slight variations—a kind of convergence was inevitable. Sitting in silence, everyone would “receive the same impulses and impressions, find themselves focused on the same centers of interest, experience the same feelings, have generally the same order of reactions or ideas, participate in the same myths—and all this at the same time.”11 That there were three channels to choose from hardly mattered. As the advertising-executive-turned-activist Jerry Mander would write:
It was as if the whole nation had gathered at a gigantic three-ring circus. Those who watched the bicycle act believed their experience was different from that of those who watched the gorillas or the flame eater, but everyone was at the circus.
What was missing was the exaltation of the rally, the thrill of losing oneself in the common experience, for as we all watched from our separate living rooms, it was as if we sat in isolation booths, unable to exchange any responses about what we were all going through together. Everybody was engaged in the same act at the same time, but we were doing it alone. What a bizarre situation!12
The 1950s would be remembered as a decade of conformity, and while the reasons may be many and complex, one cannot exclude the most salient: the historically anomalous scale of attention capture effected by television, together with the homogeneity of the stuff created to do the capturing. This was, of course, not enforced viewing; there were, as mentioned, three channels, and no fascist “Television Guard”—though given the primacy of the ratings champions, most Americans were indeed watching the same shows and the same advertisements most of the time.
Was this propaganda? Certainly, it wasn’t experienced that way, and that difference does matter. After all, watching television was voluntary (even if everyone made the same choice); the medium was run not by the State but by private companies, whose goals were commercial, not political. Some, of course, have contended that shows like I Love Lucy did have an underlying social agenda, or at least an ideology. What must be remembered, however, is that Lucy and the other attractions were merely the bait; the effort to persuade came in a different form: a form new to the world, known as the “television commercial.”
“The customer is not a moron, she’s your wife,” wrote David Ogilvy in 1955, expressing, in the vernacular of the times, a buoyant new mentality among American advertising executives.13 Having nearly collapsed over the 1930s, advertising now found itself, much as it did after World War I, the prime beneficiary of a new medium and a postwar return of consumer spending. Ultimately, the 1950s, if not as individually enriching as the 1920s had been, would be the industry’s golden age, as advertising spending more than quadrupled from 1950 through 1960, from $1.3 billion to $6 billion, or approximately from $11.5 billion to $54 billion in present value. Advertising was back, ready to feed on television’s bountiful harvest of America’s attention. It must have seemed only fair, since every other business was doing the same.
In New York, the embodiment of the new industry was Rosser Reeves, a hard-drinking, hard-sell Virginian, and yet another son of a preacher, who’d made his name as a pioneer of television advertising. The first commercials were primitive—cartoons were common, and it was enough to import the best techniques from print advertising. Reeves soon proved himself the Claude Hopkins of the screen, selling products with claims that they would do something for you, thanks to some distinguishing factor. The approach once called “reason-why” advertising was rebranded by Reeves as the “unique selling proposition.”*3 Unsurprisingly, it worked best for medicines. Reeves’s television spot for the pain reliever Anacin depicted a hammer striking inside someone’s head and promised, based on the drug’s unique formula, “fast, fast, fast relief.” Reeves even found a way to sell candy as a solution to a problem—M&M’s: they “melt in your mouth, not in your hand.”14
The softer, brand-driven side of advertising made its own comeback, personified by Leo Burnett, a Chicago adman who, working at Cadillac under Theodore MacManus, learned the art of warm associations. Burnett’s personal specialty was developing the brand mascot, at which he had no peer. He transformed the Jolly Green Giant, formerly a terrifying ogre, into a beneficent friend, a deity of abundance and protection. “None of us can underestimate the glacier-like power of friendly familiarity” was how Burnett put it, and the Pillsbury Doughboy was his creation as well.15 Promoting Kellogg’s cereals, Tony the Tiger emerged as a sort of eater’s id incarnate, announcing with a roar, “They’re GRRREAT!” But Burnett’s most famous makeover was performed on a human. The Marlboro cigarette had originally been conceived to appeal to women in the late 1920s, with the slogan “Mild as May.” When Philip Morris decided to target men instead, Burnett thought that smoking cowboys would efficiently convey masculinity.*4 The Economist would later describe the Marlboro Man as “a mysterious wanderer, a modern Odysseus journeying who knew where; or perhaps a Jungian archetype, ranging the primeval savannah as man had done for most of the past 10,000 years. He was alone by choice in the vastness of the hills and plains, running his cattle and closely encountering wild white horses: alone save for that manly cigarette lodged in his thin, grim lips. Flinty and unconcerned, he would light the next smoke from a flaming stick plunged into his campfire.”16