The Code


by Margaret O'Mara


  Companies, cluing in to an overstretched workforce and anxious to retain valuable employees, attempted a little behavioral modification. AMD paid for multiple psychotherapy sessions for any employee who wanted them. Hard-charging Intel had a moment of concern for its employees’ work-life balance and timed all the lights in the building to switch off at 7:00 p.m., forcing everyone to go home for dinner. Yet the lines between professional and personal lives continued to blur as each twelve-hour workday passed. “You are terribly afraid to stop working,” noted another local therapist, “because somebody will get ahead of you.”26

  As if its big blue shadow over Silicon Valley didn’t already loom long enough, in December 1982 IBM bought a 12 percent stake in Intel, whose 8088 chips powered the PC. Six months later, it bought up 15 percent of ROLM, signaling its ambitions to make the IBM desktop a node in a fully wired “electronic office” of the future. The Valley soared higher than ever before, but it seemed to many that it had flown too close to the sun.27

  THE PLATFORM

  Yet the thing that really made IBM’s PC into the ultimate disrupter wasn’t the hardware. It was Microsoft’s software, coupled with Intel’s chips, that broke apart the full-stack regime and upended the economics of the computer hardware and software business.

  The “PC platform”—an entirely different OS from the ones used by Apple and the other early microcomputer makers—rapidly became the industry standard, shoving nearly every other contender out of the way, because, unlike Apple’s closed system, the MS-DOS used by IBM wasn’t restricted to IBM machines. The Intel chips that powered the IBM PC could power other computers too. From Texas to Tokyo, companies sprang up building “clone” machines that offered the same platform and apps as the IBM, for a fraction of the price. It was a boon for consumer and small-business markets, where customers were reluctant to drop thousands of dollars on fancy machines.

  As the PC platform scaled up, software companies finally found their market opportunity. The market now teemed with little operations that had been busily trying to monetize something computer people had long thought should just come for free. Now they had an obvious destination for their products. After so many years of drought, the programs started multiplying like rabbits: video games, spreadsheets, word processors, educational software. The growth far outpaced the number of programs being created to run on Apples. Now, the big start-up success stories were software companies.

  Silicon Valley might have been the cradle of the first microcomputers, but now some of the biggest hits started happening in other places. The ever-astute Ben Rosen, now a venture capitalist in Texas, sniffed out some of the best of the new breed, and he made a killing in the process. In 1982, Rosen backed an IBM-clone maker in Houston called Compaq, whose sales topped $100 million in its first year of business. The same year, Rosen took a bet on a young software developer in Boston, who was building programs for the PC platform. There were plenty of software entrepreneurs trying to do the same thing in those days, but Rosen sensed something was different about this one. His name was Mitch Kapor, and his company was called Lotus Development Corporation.

  Tech had been blowing Mitch Kapor’s mind ever since he’d picked up a copy of Ted Nelson’s Computer Lib soon after graduating from college in the early 1970s. It wasn’t the programming that drew him in—by his own admission, he was only an “ok” programmer—but something largely unappreciated at the time: software design. “The software designer leads a guerilla existence,” he would write several years later, “formally unrecognized and often unappreciated.” Yet good design—a lack of bugs, ease of use, and an interface that delighted the user—was fundamental to good software. “One of the main reasons most computer software is so abysmal is that it’s not designed at all, but merely engineered.” Throughout his varied career, Kapor was determined to change that.

  After a few years of post-college drift that included a stint as a DJ and a six-month-long advanced training in Transcendental Meditation, he scraped up money to buy his first personal computer. It changed his entire trajectory. Soon he was writing and selling software for Apple IIs out of his Boston-area apartment. Before long, Kapor moved over to Personal Software, publisher of VisiCalc, the app that turned the Apple into a business machine. There he kept writing more programs and, because he was a consultant, keeping a nice chunk of the royalties.

  Drawn in by the “micro revolution” buzz about the Valley, he transferred out to Personal’s Sunnyvale office in 1980. He hated it. “It was a monoculture,” he remembered, where people “could only think one big thought at any time.” Amid the suburban blandness and California aridity, he ached to return to the bookstores of Harvard Square. After six months, he did.

  Kapor’s time in Sunnyvale may not have been an enlivening one, but it implanted the start-up bug within him. Everyone out there was starting companies; why not him? The IBM PC needed software; why not build some apps as good as VisiCalc for the new platform? Venture capitalists intimidated Kapor—he’d once met the steely Arthur Rock, and found him “very scary”—but he knew Ben Rosen, who had been a user of Kapor’s first software product. So he wrote Rosen a seventeen-page letter asking for money to launch his software business, which he had named Lotus. Rosen had been a VC for mere months by that point, but he said yes. It turned out to be a great bet. Kapor’s company had $53 million in sales in 1983, its first year of operation. The next year, revenues tripled.28

  Lotus Software was on its way to becoming one of the biggest success stories of the 1980s, the crest of a new wave of software companies building apps for the ever-expanding empire of the PC. The micro’s spiritual home may have been the Bay Area, but many of these software upstarts—Lotus in Boston, WordPerfect in Utah, Ashton-Tate in Los Angeles, Aldus in Seattle—were just like Microsoft: outsiders, ready to give the Valley a run for its money.29

  1984

  As 1984 began, the men and women of Apple were standing outside looking in at the growing empire of the PC. Education had become a huge success story for Apple, aided greatly by California’s adoption of “Kids Can’t Wait.” Yet the education numbers were minuscule compared to the massive office market. Same for home computing. The market was vastly larger than it had been, but the number of North American households owning a personal computer still stood at about 8 percent. The percentage of people using computers at work was three times that, and it was spiking up rapidly.30

  Chinks appeared in Steve Jobs’s supremely confident armor. “IBM wants to wipe us off the face of the Earth,” he admitted to a reporter. Apple—and Jobs—needed a very big win. The man who embodied Apple in the media sensed that his celebrity was slipping along with the company’s market share, and it annoyed him no end that the business press didn’t take him as seriously as they did IBM’s John Opel. When a member of Regis McKenna’s PR team informed Jobs that he wouldn’t be appearing on the cover of Fortune, the Apple chief became so enraged that he took the glass of water he had been sipping and threw it in the associate’s face. Dripping and equally furious, the latest victim of Jobsian wrath left the meeting and drove the twenty minutes to RMI. “Look what your ‘son’ did to me,” the associate told McKenna in disbelief. “Regis, the guy is insane.” A chastened Jobs called with an apology soon after. The dampened exec suspected that Regis had cajoled Steve into it, but agreed to return. After all, the pressure to beat IBM could make anyone go nuts.31

  John Sculley’s years in the cola wars made him more bullish. Just like Coke could exist alongside Pepsi, Apple could survive, and thrive, in the same universe as IBM. After all, IBM’s big Christmas 1983 release had been a dud: an adaptation of the PC for the home market called the PCjr. The dumpy little brother of the mighty PC hadn’t been developed via another Manhattan Project, but through IBM’s regular development channels—and it showed. The PCjr reminded the market that “Apple is definitely the people’s computer,” observed one software distributor. “This is the year Apple fights back,” announced Sculley in January 1984. “We are betting the entire company.” And the big bet was the Macintosh.32

  After a couple of years of being outspent by its bigger competitors, Apple beefed up its marketing budget and hired Los Angeles–based ad agency Chiat/Day, a longtime computer-company favorite that had recently made a splash with its ads for Honda motorcycles and Nike running shoes. Psychographic research from Arnold Mitchell at SRI helped Apple home in on its target market: “achievers” who “would rather be individuals, not part of a group.” But the Mac, for all its pirate-flag cred, “must unequivocally be positioned/featured as a business product,” cautioned Chiat/Day’s planners.33

  The Mac had a killer market advantage: you didn’t need to know much about computers to use it. “People are intimidated by choosing” the right computer, Chiat/Day reminded Apple executives, and worried about whether what they buy might become obsolete too quickly. IBM’s Charlie Chaplin ads had successfully deployed “easy-to-use” messaging. Apple’s Macintosh was actually easy to use. It was stripped-down and simple compared to the Alto and the Lisa, but it still had graphics, icons, and a friendly little mouse. Yet Sculley kept reminding everyone of his Coke-versus-Pepsi model: Apple shouldn’t imitate, it should present itself as something completely different. Steve Jobs agreed. “We need ads that hit you in the face,” he said. “It’s like it’s so good we don’t have to show photographs of computers.” “Macintosh advertising,” the agency brass concluded, “must be distinctive and mirror the radical, revolutionary nature of the product.”34

  And distinctive it was. On January 22, 1984, the Mac debuted on the world stage in a $1.3 million television commercial aired during the Super Bowl. Directed by Hollywood science-fiction auteur Ridley Scott, the ad was sixty seconds of jaw-dropping visuals riffing off George Orwell’s 1984 and all the computational allusions made to it ever since. The tagline: “On January 24, Apple Computer will introduce the Macintosh. And you’ll see why 1984 won’t be like ‘1984.’” The computer itself never appeared.

  Macintosh ads blanketed the nation’s television airwaves in the months that followed, through the Winter and Summer Olympics, through a presidential campaign season of ads celebrating the Republicans’ “Morning in America” and the Democrats’ first female vice-presidential nominee. No other Mac ad was quite as memorable as that Super Bowl spot—few ads in history ever matched the buzz it created—but the message of all of them was the same. We are not IBM. We are not the establishment. Our computers will set you free. Over video of an elegantly manicured female finger clicking on a mouse, the tagline for all of them read: “Macintosh. The computer for the rest of us.”

  MORNING IN AMERICA

  By October, as the economy sped up to a gallop and the Mondale campaign limped toward an expected drubbing on Election Day, Regis McKenna had gotten philosophical. “The good news is that Mondale is going to lose and we’ll see the end of the traditional Democratic Party we have known,” he told Haynes Johnson of the Washington Post. “I see that as good because there’s a whole generation of young Democratic politicians coming up that are different. The bad news is that Reagan is terrifying, and I really mean that.”35

  But McKenna’s worries couldn’t dim the fact that the Valley was enjoying very good times in the autumn of 1984. Johnson’s election-eve piece on the Valley bore the title “Silicon Valley’s Satisfied Society,” and there was plenty of evidence of that satisfaction. Japan’s onslaught of cheap chips and electronics continued, but the personal-computer boom had vastly enlarged and diversified the market. The name of Silicon Valley’s game was no longer only semiconductors, but computer hardware and software too. And while plenty of IBM clones rolled off East Asian assembly lines, America was #1 when it came to building new generations of personal computers and building the software to run on them. Worries about global competition hadn’t disappeared, but they were softened by billowing blankets of money.

  That Christmas, Apple threw nineteen holiday parties, including one with a $110,000 price tag that featured the odd pairing of a Dickensian village theme—complete with thirty strolling performers in period costume—and a concert by Chuck Berry. Jerry Sanders went one better with AMD’s $700,000 black-tie bash featuring performances by a boys’ choir, a full orchestra, and arena-rock heroes Chicago. “Sure, I get letters about the starving children in Ethiopia,” said Sanders breezily. “But our people worked hard for this, and they earned it.”36

  The over-the-top company parties set a new high-water mark for conspicuous consumption in the Valley. But this was not the last time that outrageous holiday shindigs would signal bumpier times just ahead. As 1985 began, Wall Street’s mania for personal computers cooled. Even Microsoft hung back for another year before having its IPO. The young market shook out further, matured, and some of its pioneers found themselves rather unceremoniously out of work. Others became more powerful than ever. The semiconductor makers continued to battle for market share, their fortunes finally stabilizing only when Japan’s economic miracle proved to be not so miraculous after all. New technologies and influential new players emerged in fields that not only built thinking machines but connected them to others: workstations and relational database software and computer networking.

  And amid all of this disruption was something that had been there from the start, had never really gone away, and that in the 1980s had become more influential—and, to some in the Valley, much more ominous—than it had been in several decades. It was the tech counterculture’s original Big Brother: the computer-powered federal government, and its very high-tech ways of making war.

  CHAPTER 17

  War Games

  “SHALL WE PLAY A GAME?” the computer asked David Lightman. “Love to,” the teenage geek responded. “How about Global Thermonuclear War?” With two text commands, the 1983 summer blockbuster WarGames shifted into high gear.

  Armed with nothing but an IMSAI computer and a modem in his bedroom, a kid in suburban Seattle inadvertently hacks into a top-secret Defense Department mainframe. Before long, the protagonist and his female companion find themselves deep in NORAD’s mountain bunker, frenetically trying to reprogram a supercomputer that is determined to launch thousands of nuclear warheads. At the last moment, Lightman’s programming skills save the world from mutually assured destruction. The story was pure Hollywood, but for the popcorn-chomping millions watching in the cool dark of American movie theaters that summer, WarGames wasn’t all that far from reality.

  After decades of test ban treaties and détente, America was once again ramping up its nuclear arsenal and employing the highest of high technology to do it. The post-Vietnam American military could no longer keep up with the Soviets in terms of size—the U.S. had abolished the draft, while the U.S.S.R. retained forced conscription—but it had a huge advantage when it came to technology. America was the capital of microelectronics, and even if Japan was nipping at the U.S.’s computing heels, the Soviet Union would have to spend mightily to get anywhere close. Defense Secretary Caspar Weinberger wasn’t particularly bullish on tech, but he believed in the power of an economic strategy to drain the Soviet treasury. So, too, did Secretary of State George Shultz, who had been a Northern Californian ever since the Nixon years, and who had close ties to Stanford and the Valley tech community. Just as it had in the early days of the Eisenhower Administration, the defense agenda once again took a high-tech turn.1

  Just three months before WarGames’ release, Ronald Reagan had announced an audacious new program to create a sophisticated missile shield in space using satellites, lasers, and all kinds of computer-controlled technology. Northern Californian fingerprints were all over the proposal: the laser-based system had been championed loudly and early by Berkeley’s Edward Teller, father of the H-bomb and director of the Lawrence Livermore Laboratories. David Packard endorsed the idea as well. Called the Strategic Defense Initiative, or SDI, the program pushed the outer edge of the technologically possible. And although the press reports included illustrations of lasers and satellites and pow-pow action in the upper atmosphere, SDI was really all about computers. Thus the arrival of SDI, and its accompanying political controversies, became inextricably intertwined with the other major DARPA initiative announced in the summer of 1983: Strategic Computing.2

  Reagan already was battling a reputation of being a warmonger—he famously proclaimed the USSR an “evil empire” mere weeks before the SDI announcement—and despite presidential assurances that the new program was about nuclear deterrence, many of the nation’s most prominent scientific names lined up to decry the program as a dangerous boondoggle. Technophilic Democrats in Congress howled too. Making the shield work would be “like hitting a bullet with a bullet,” one aide informed Senator Paul Tsongas, who had become a particularly vocal critic. The program quickly acquired a not-so-charitable nickname: “Star Wars.”3

  The prospect of space battles between the light and dark sides of The Force might have been far-fetched. The possibility of a WarGames-like scenario wasn’t. The program’s deep dependence on computer software raised the ominous possibility of bad code triggering accidental global annihilation. Worst of all, in the minds of Americans now conditioned to think of government as an inherently bad thing, it was going to be built and controlled by bureaucrats. “Technology is not a panacea for our ills,” explained WarGames director John Badham, when asked about the message of his film, “and bureaucracy is something that will surely get you in deep, deep trouble every time if left to run untrammeled.” Sam Ervin couldn’t have said it better himself.4

 
