The Attention Merchants

by Tim Wu


  Since its invention at midcentury, the electronic computer had been an object of enduring fascination. But by the 1970s, it remained a machine for serious business, institutional and industrial, not a consumer item, let alone one that might be used to harvest the public’s attention in any way comparable to how radio or television did. But when it was shown to have the potential for personal entertainment, the computer’s fate—and our story—were irrevocably changed.

  The breakthrough game would come in a form largely lost to our times. Over the 1970s, a California company, Atari, built a cabinet with a computer inside. Atari’s founder, Nolan Bushnell, had seen the Odyssey and, like Baer, was enough of a tech geek to know that computers could, in fact, be great fun. He and other computer scientists with access to the mainframes had already been playing a game named Spacewar! So Bushnell designed a cabinet, about the height of a man, configured somewhat like a phone booth (another obsolete receptacle); for a quarter, one could play the game inside like a pinball machine. Soon cabinets with various games—beginning with Pong, another tennislike challenge—started cropping up in various public places, sometimes collected in amusement arcades, alongside pinball machines. But it wasn’t until Japanese developers got involved that computer games really hit the mainstream.

  In 1977, Tomohiro Nishikado’s boss at the Tokyo firm Taito, a manufacturer of jukeboxes and arcade games, asked him to make a Japanese version of the American video game Breakout.* Earlier in his career Nishikado had done the same with Atari’s Pong, resulting in Elepong, which proved a modest success, making him the natural choice. Nishikado took the assignment, and for some reason decided to carry it out entirely alone. Working long hours, he made substantial changes to the original. The opponents now moved across and down the screen, dropping bombs on the player’s paddle, now transformed into a crude artillery unit, capable of firing back laser beams in response.2

  In an early prototype, the opponents were tanks and soldiers but Nishikado’s supervisors rejected it. “You must not create the image of war,” he was told, perhaps out of Japanese sensitivity about past imperial aggression. As it happened, Nishikado had around the same time been reading about the American hit film Star Wars. So he decided to name his game Space Monsters. In 1978, Taito released it in Japan as Supēsu Inbēdā—and in America as Space Invaders.

  In both markets Space Invaders was a sudden and unexpected success—nothing quite like it had ever been seen. “Outer Space Invaders are taking over the U.S.,” reported the Youngstown Vindicator. The Washington Post reporter assigned to try the game described his experience this way: “I dropped in a quarter and saw 55 rectangles waving little arms and dropping laser bombs on earth, which is at the bottom of the screen. I fired back with my three laser bases, which got bombed out in about 30 seconds….I was still pounding on the FIRE button at end of game. End of quarter. Start of addiction.”

  The themes of addiction and engrossment could be found in writing about video games from their debut. “It’s like drugs,” a Space Invaders distributor told The Washington Post in 1980. “They’ll tell you: ‘I got a $4-a-day habit.’ ” The first thing everyone noticed about Space Invaders was just how captivated its players were. It was a hard game—seemingly hopeless—yet something kept players coming back, trying to conquer it.3

  “What we are dealing with is a global addiction,” the novelist Martin Amis would write in his own meditation on video games in 1982. “I mean, this might all turn out to be a bit of a problem. Let me adduce my own symptoms, withdrawals, dry outs, crack-ups, benders.” Psychologists and other experts were perplexed and disturbed by the appeal of the games, especially to children. “Most of the kids who play them are too young for sex and drugs,” a Dr. Robert Millman told The New York Times in 1981. He proceeded to compare playing video games to sniffing glue. “The games present a seductive world….[Young people want] to be totally absorbed in an activity where they are out on an edge and can’t think of anything else. That’s why they try everything from gambling to glue sniffing.” Others seemed to think Space Invaders’s success had to do with the recent national experience. “It’s really Vietnam,” wrote Ted Nelson, a magazine editor. “It’s a body count war. You do it and you never ask why.”

  With Space Invaders, computers broke through as an indisputable part of the entertainment industry. Indeed, in 1982, the game would be the highest-grossing entertainment product in the United States; outperforming even its inspiration, Star Wars, it earned more than $2 billion, one quarter at a time. But it was perhaps no surprise: by 1980, in the United States alone, video games were consuming 11.2 billion quarters annually, yielding $2.8 billion in revenue; by the early 1980s, the estimate was $5 billion, exceeding, for a while, the income of the film industry.

  Video games also consumed something else, human attention, in a way that was both old and new at the same time. As in any real game—be it tennis, pinball, or blackjack—the fast-flowing stimuli constantly engage the visual cortex, which reacts automatically to movement. No intentional focus is required, which explains why children and adults with Attention Deficit Disorder find the action of video games as engrossing as anyone else. Unlike games in reality, however, video games are not constrained by the laws of physics, making possible a more incremental calibration of the challenges involved, duration, and related factors in the attempt to keep players coming back.4
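  To make the calibration point concrete, here is a minimal, purely illustrative sketch (not taken from any actual arcade code) of how a game unbound by physics might nudge its difficulty up or down to keep a player at the edge of their ability; every name and threshold below is invented for the example.

```python
# Hypothetical difficulty loop, for illustration only.
# The idea: raise the challenge when the player is cruising and ease it
# when they are failing, so the game stays just barely winnable.

def recalibrate(speed: float, waves_cleared: int, lives_lost: int) -> float:
    """Return an adjusted enemy speed based on recent performance."""
    if waves_cleared > lives_lost:      # doing well: tighten the screws
        speed *= 1.10
    elif lives_lost > waves_cleared:    # struggling: ease off slightly
        speed *= 0.95
    return max(1.0, min(speed, 10.0))   # keep the speed in a playable band

# A player who clears two waves while losing one life sees the game speed up.
print(recalibrate(speed=3.0, waves_cleared=2, lives_lost=1))  # prints about 3.3
```

The exact numbers are arbitrary; the point is only that software, unlike pinball or tennis, can adjust such dials continuously and invisibly.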

  But there were big differences between the new games and other things we have discussed that get us to pay attention, whether it be listening to Amos ’n’ Andy, watching a sitcom, or reading email. For one, the business model: it was cash for an experience, rather like seeing a play or reading a book. The attention merchant model would not be contemplated until much later. Second, playing a game like Space Invaders was, as we’ve said, challenging—to the point of utter frustration. Most players couldn’t last longer than a minute. The games, at this stage at least, aimed for something completely different from the effortless diversions of broadcast entertainment. The early players of Space Invaders were captured not by the dazzling graphics and sounds of today’s narrative-based games, but by the urge to match their skills against the machine in the hope of finding, if only for a moment, that perfect balance between their abilities and the ghosts or space monsters chasing them. In this way, you might say they effected another form of operant conditioning.

  Only a few games seem to have successfully achieved that balance. Some were simply too hard, others too easy, and those aren’t really the only variables anyhow. But the best could sustain excitement, or even induce a “flow state,” that form of contentment, of “optimal experience,” described by the cognitive scientist Mihaly Csikszentmihalyi, in which people feel “strong, alert, in effortless control, unselfconscious, and at the peak of their abilities.” It was more than enough to keep people coming back, in their billions, across the world, and parting with their hard-earned money for a chance at such transcendence.

  As Space Invaders took off, Japanese and California companies rushed in to replicate its success, producing games like Asteroids, Galaga, Centipede, and the epic Donkey Kong. Genre aficionados would later describe the period between 1978, the release of Space Invaders, and 1985 or so as a golden age. Martin Amis described the English arcades of the early 1980s: “Zonked glueys, swearing skinheads with childish faces full of ageless evil, mohican punks sporting scalplocks in violet verticals and a nappy-pin through the nose.” The places, meanwhile, were run by “queasy spivs, living out a teen-dream movie with faggot overtones.”5

  As with any business, initial success in the new video game industry stirred thoughts of expansion. Moving beyond male teenagers would be a start.

  Namco, another Japanese company, had set its sights on winning over girls and women. “Back then,” said Namco designer Toru Iwatani, the arcade “was a playground for boys. It was dirty and smelly. So we wanted to include female players, so it would become cleaner and brighter.” Iwatani, then twenty-five, was the creator of Gee Bee and its two sequels, Bomb Bee and Cutie Q; he was tasked with producing something suitable. As he describes it, “My aim was to come up with a game that had an endearing charm, was easy to play, involved lots of light-hearted fun, and that women and couples could enjoy.” He originated a concept centered on eating, because he noticed his girlfriend liked desserts. Then, “the idea occurred to me of constructing a maze in which movement was restricted to the four basic directions—up and down, left and right.” For the game play, he decided on a “chase” inspired by Tom & Jerry, the cat and mouse duo of cartoon fame. He then created, as lead character, the great eater, who was nothing more than a moving mouth named “Pakku Man,” based on the onomatopoeia “paku-paku,” the sound Japanese speakers would say the mouth makes while it is eating. Visually, the eater was inspired, in part, by a stylized version of the Japanese character for mouth, kuchi (口), with a part removed, like a pizza missing a slice.

  In the United States Pakku Man became Puck Man; and later, to remove any temptation for vandals, Pac-Man. Finally, out of nowhere really, Iwatani had the idea of making the antagonists ghosts with distinct personalities. They were, in the original game, the “Urchin,” “Romp,” “Stylist,” and “Crybaby,” and each had a different way of chasing Pac-Man. In English their names were Blinky, Pinky, Inky, and Clyde.

  Pac-Man soon became an even more lasting and profitable hit than Space Invaders, as its appeal went beyond its demographic target. Some early video game critics were dismissive. Omni magazine decried “the latest craze,” which it called “a silly ‘gooble or be goobled’ game.” “Those cute little PacMen with their special nicknames, that dinky signature tune, the dot-munching Lemon that goes whackawhackawhackawhacka: the machine has an air of childish whimsicality,” wrote Amis. Nonetheless, 400,000 of the arcade cabinets were sold worldwide, grossing billions of quarters, 100-yen coins, and other denominations. Over its entire run, by one estimate, Pac-Man earned more than $2.5 billion by the 1990s. To his credit, Amis gave solid advice on how to play the game. “PacMan player, be not proud, nor too macho, and you will prosper on the dotted screen.”6

  Having conquered, and then diversified, the arcade, there was really only one place for gaming to go: back to the future, where the Magnavox Odyssey had started. And so the computer would enter the home to resume the contest whose seeds Ralph Baer had planted: that between the second screen as broadcast platform and as gaming peripheral. Atari, still run by its founder Bushnell, thought a home video game system could succeed by allowing people to play the most popular arcade games, like Space Invaders, on their televisions. The effort gained heft when Warner Communications (forerunner of Time Warner) bought Atari and started pushing the consoles. But real success wouldn’t come until Atari licensed Space Invaders in 1979; it went on to sell a million units that year, two million in 1980, and 10 million by 1982, making Atari at that point the fastest-growing company in U.S. history.

  The significance of those little Atari 2600 consoles appearing in homes would only become apparent later. For many people, here were the first computers to come into the home, and the first new screens, after the television, to breach those walls. They made easier the entry into the home of not just more consoles, but also home computers, like the Apple II or the Commodore 64, for it was one thing to buy an expensive machine that supposedly would be used for work or programming; it was another to get all that along with the spoonful of sugar, namely, a machine that also came with even better games than the Atari had. In this way video games were arguably the killer app—the application that justifies the investment—of many computers in the home. As game machines, sometimes used for other purposes, computers had gained their foothold. There they would lie for some time, a sleeping giant.7

  * * *

  * Breakout was written by Apple’s cofounders, Steve Wozniak and Steve Jobs, as a side project, as described in The Master Switch, chapter 20.

  CHAPTER 16

  AOL PULLS ’EM IN

  In 1991, when Steve Case, just thirty-three years old, was promoted to CEO of AOL, there were four companies, all but one lost to history, that shared the goal of trying to get Americans to spend more leisure time within an abstraction known as an “online computer network.” Their names were CompuServe, Prodigy, AOL, and GEnie, and merely to state their mission is to make clear why theirs was no easy sell.

  Despite a certain whiff of excitement and novelty, neither personal computers nor online networks were seen as particularly entertaining, except to those who used them to play video games. The machines had a devoted cult following, and there was a mystique to them as portals into the mysterious virtual world named “Cyberspace” as depicted in novels like William Gibson’s Neuromancer or Neal Stephenson’s Snow Crash. But most who had a computer kept it in the den or basement; the machine itself was unwieldy and ugly, consisting of a large, boxy body and a screen smaller than today’s laptop displays. In an age before widespread use of graphical interfaces like Windows, glowing text, orange or green, was still what one faced; it had been that way since the first fully assembled home computers with dedicated screens, the Apple II and the Commodore PET, were marketed in 1977.*1 As for a mouse, that was still a creature known to reside in a small hole.

  Meanwhile, just as today, the television remained prominent in the living room, with its large attractive screen and dozens of channels; to use it required little expertise, nothing of the sort of arcane knowledge needed to operate a modem. Thus even to speak of the computer as a competitor to television in 1991 would have been a laughable proposition.

  So if they were going to get people to sit in front of a computer screen and “dial-in” to networks (the commercial predecessors of the Internet), the online networking companies needed to do something to lure Americans away from television (as well as family, magazines, and other lesser draws). Over the 1990s each of our four companies tried something different. The story of what finally worked is in many ways the story of how networking and the Internet—and the third screen—would come to win for itself such an amazing share of the nation’s and the world’s attention. It’s also the story of how a new breed of attention merchants came into being.

  Before going further, let’s observe that, with one exception, the business model of these “computer information services,” as they called themselves, was not that of attention merchant but solely subscription-based. At the time, there was no easy way to reach the Internet, which was still a government-run network devoted mainly to research. The four services offered customers access to proprietary networks, primitive little Internet-like spaces, with attractions such as news, discussion forums, and online games. To go online—connect to a network—also required the purchase of a modem, into which one had to enter a string of somewhat byzantine commands, while also occupying the family phone line.*2 The firms charged a fixed monthly fee, plus an hourly rate for time spent online and various extras. AOL, for instance, charged $9.95 for up to five hours a month and $3.50 for every additional hour. That it expected roughly 90 percent of its customers to be satisfied with five hours a month gives some indication of usage levels, and attractions, at the time.
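  Using the rates just quoted, the arithmetic of a monthly bill looked roughly like the following back-of-the-envelope sketch; the function and constant names are my own illustration, not AOL’s actual billing system, and fractional hours are assumed to be billed proportionally.

```python
# Back-of-the-envelope sketch of the 1991-era AOL pricing described above:
# $9.95 covers the first five hours each month, $3.50 for every hour beyond.

BASE_FEE = 9.95        # flat monthly charge, includes up to five hours
INCLUDED_HOURS = 5
EXTRA_RATE = 3.50      # charge per additional hour

def monthly_bill(hours_online: float) -> float:
    """Estimate a subscriber's monthly charge for a given number of hours online."""
    extra_hours = max(0.0, hours_online - INCLUDED_HOURS)
    return round(BASE_FEE + extra_hours * EXTRA_RATE, 2)

print(monthly_bill(4))    # 9.95  -- the roughly 90 percent who stayed under five hours
print(monthly_bill(10))   # 27.45 -- a heavier user paying for five extra hours
```

At five hours a month for most subscribers, the service was selling convenience in small doses, not yet competing with television for whole evenings.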

  The market leader in 1991 was the oldest of the four firms, the solid and serious CompuServe, owned by H&R Block. Its subscribers, numbering roughly 600,000, were largely male, a mix of computer hobbyists and some businessmen. CompuServe took the way of substance over style, betting that pure data was what mattered to its customers, who seemed to appreciate the no-nonsense, text-only interface. Since it originally used idle time on commercial mainframes, CompuServe gave its users numerical identifications (example: “70014,2316”), and sold them on utility: “When you join CompuServe,” its ads promised, “your computer becomes a time-saving, money-making, life-enhancing tool.”1

  Just behind CompuServe—even ahead of it by some measures—was the most ambitious and audacious of the four. Prodigy was a fast-growing operation founded by the odd combination of Sears, CBS, and IBM. It styled itself as the visionary alternative, the kind of place where managers would say things like “The future is here.” Aiming for mainstream consumers, Prodigy was betting that, for most users, online would be a world of shopping and entertainment, and that whoever had the most and best of both to offer would win. It also foresaw the importance of being an attention merchant, by betting on online advertising as its business model. That approach certainly put them ahead of GEnie, or the “General Electric Network for Information Exchange”; run by GE on CompuServe principles, GEnie was another text-only service with about 350,000 users.

  Lingering in fourth place was the D.C.-based America Online (previously called Quantum Link). With just thirty thousand subscribers and no rich owners, it had an uneven track record, including several flirtations with bankruptcy. Its founder, William von Meister, had once been described as “a cackling Amadeus…the ringmaster at the circus.”2 He was long gone, but the zany spirit remained. To the degree it had a strategy, AOL was catering to people who knew nothing about computers, the sort of shopping and entertainment users that Prodigy had in greater numbers.

  Of the four, it was Prodigy that, in many ways, would prove most interesting, particularly considering the gap between its soaring vision and what it ultimately offered. The big corporate backing bred wild overconfidence. “The issue really isn’t success or failure,” the chairman of Sears, Edward Brennan, said in 1989. “We’re not going to fail. It’s really a question of how big the success is going to be.”3

  Prodigy initially spent an enormous sum (about $650 million) to develop and advertise its network, and built a veritable virtual palace, at least by the standards of the early 1990s. Seeing the text-only approach as too dreary or intimidating, its designers created a primitive graphical user interface, making the service as easy to use as possible. Then came more heavy investment, now in branded content from partnerships with CNN, expensive commissions of well-known columnists, paid interviews with movie stars and athletes, and even a full-time newsroom staffed 24/7 with human reporters in its White Plains, New York, headquarters. To draw in new users, it turned to advertising, and in particular to J. Walter Thompson, which developed the slogan “You gotta get this thing!” for magazines, newspapers, and even national broadcast spots.

 
