At a moment when wars, protests, and moon shots dominated the national headlines, Hoefler’s coverage of the strange little technological Galapagos of Silicon Valley set the tone for reporters following in his wake. The charismatic personalities and intensely competitive culture of the semiconductor industry made for great copy, and America certainly needed some new heroes.
The Wagon Wheel was a good place to find them. As big contractors and aerospace companies up and down the West Coast went into their post-Vietnam tailspin, the chipmakers boomed. The Valley still had plenty of suit-and-tie outposts of Eastern electronics giants, but its semiconductor companies were the rising stars. They were still young and agile enterprises with low fixed capital costs—a stark contrast to the increasingly sclerotic old-line manufacturers scrambling to stay competitive. And they no longer needed defense contracts to survive.
The king of the Fairchildren was Intel, founded when Bob Noyce and Gordon Moore decamped from Fairchild Semiconductor in 1968 after years of chafing under the micromanagement of its East Coast parent company. In contrast, Intel was entirely venture-funded by local firms. Then there was National’s CEO Charlie Sporck, the cost-conscious son of a taxi driver, the guy who didn’t like to pay to mow the lawn and who, while at Fairchild, had pioneered the idea of offshoring chip assembly to East Asia. Down the Valley was Advanced Micro Devices (AMD), founded by Jerry Sanders, who had grown up a street-fighting kid from the South Side of Chicago and morphed into a sales executive fond of loud suits, sporty cars, and Gucci loafers. Sanders poached twelve other Fairchild employees to come along with him.3
The late-’60s hot market for mergers and acquisitions added to the diaspora, as Eastern electronics giants bought up local start-ups. Faced with the prospect of adapting to stuffy corporate culture or—even worse—relocating to corporate headquarters, the employees of these acquired companies, as one of them put it, “started looking for other pastures.”4
Yet Don Hoefler was still writing about a very, very niche market. If you were taking bets on the place that would become the center of the computerized universe at the start of the 1970s, Northern California remained a long shot. IBM still ruled the business-machine world. Texas pumped out far more microchips. The new technology setting the computer world on fire, minicomputers, was a Boston business. And the vast majority of investment capital—including venture capital operations like the funds headed by Ned Heizer and David Morgenthaler—was still based east of the Mississippi.
Wall Street analysts had no interest in following the semiconductor industry (“The computer industry is IBM,” one coolly informed Regis McKenna), and The Wall Street Journal refused to write about any company that wasn’t listed on the stock exchange. Making Silicon Valley more illegible to the wider world was the fact that its firms sold to other electronics companies, not to consumers. An Intel chip might be inside the computer down the hall or in the calculator on your desk, but you wouldn’t know it.
Ten years on, the Valley had vastly increased in size and influence and was very much in the consumer-electronics business. Another decade after that, Don Hoefler’s snappy headline had become shorthand to refer to the entire computer hardware and software industry.
Why did “Silicon Valley” not only beat out its regional competitors, but become two words that were synonymous with the entire American high-tech industry? Technology, it turns out, was only part of the story.
THE COMPUTER ON A CHIP
Out of the Cold War cradle, Northern California chipmakers had established a healthy non-defense business by the start of the 1970s, making memory chips for Eastern computer manufacturers as well as for a booming new market: electronic calculators. Only a few months after its founding, Intel had gotten a commission from a Japanese manufacturer to custom-build a sophisticated chip for a line of desktop calculators. This precipitated a design process that eventually led to a breakthrough rivaling Shockley’s transistor and Noyce’s integrated circuit: the microprocessor. The device leapt beyond merely placing multiple circuits on a chip: now there were even more of them, and they were programmable. With a stored-memory microprocessor inside, any sort of appliance or device—a car, a telephone, a bedside clock—effectively turned into a computer. Fast, powerful, and less expensive than mechanical controls, the microprocessor could “be stuck in every place,” as Gordon Moore put it.5
Marketed as “a computer on a chip,” the Intel 4004 made its public debut with an ad in Electronic News in November 1971, less than a year after Hoefler’s giddy series gave Silicon Valley its name. Only a few months later, Intel followed with the twice-as-powerful 8008, and then the 8080 in 1974. By that time, Intel had marketing as carefully designed as the products it was selling, having brought in Regis McKenna Himself to provide vision and execution. Other shops in town could do ads and brochures, but McKenna understood the semiconductor business like no one else.
“First time technologies required ‘education of the market,’” McKenna remembered. It wasn’t just about ads and sales brochures; it was about placing articles in trade journals where systems designers could see them, and running educational seminars for corporate managers who didn’t know the first thing about semiconductor design and application. But the ads mattered too: crisply modern and speaking a language that regular business people could understand, Intel’s were a different breed from the usual kind seen in the industry, which tended to be heavy on technical specs and light on illustrations. “The 8080 Microcomputer is here,” blazed one brightly colored spread, “incredibly easy to interface, simple to program and up to 100 times the performance.”6
Now, a sliver of silicon contained all the computing power of a mainframe or a minicomputer that cost tens of thousands of dollars. Dearly expensive and space-hogging technology was on the verge of becoming accessible to nearly anyone. The microprocessor set the miniaturization of the computer into hyperdrive, turned all sorts of products from analog to digital, and gave Intel and the rest of the chipmakers a conviction that they were truly changing the world. “We are really the revolutionaries in the world today—not the kids with the long hair and beards who were wrecking the schools a few years ago,” said Gordon Moore.7
By 1975, Intel had 3,200 employees and sales of $140 million. National Semiconductor had sales of $235 million. Northern California’s 1950s and 1960s had been about manufacturing bespoke, expensive products for a small number of deep-pocketed customers: the Defense Department, NASA, the mainframe computer makers. Silicon Valley’s 1970s were about turning these small electronics into market commodities. Intel operations chief Andy Grove famously referred to the company’s products as “high technology jelly beans.” But their blueprint for making microchips at scale wasn’t the mass-production assembly line of Henry Ford. It was the franchise model of McDonald’s hamburgers. Manufacturing grew by building small-to-medium fabrication plants across the country and, increasingly, overseas.8
Within headquarters, chip executives grouped their employees into small teams that competed against one another to develop the best product. “Big is bad,” Bob Noyce declared in a keynote address to a group of businessmen in December 1976. “The spirit of the small group is better and the work is much harder.” Intel avoided hiring people over age thirty. But this wasn’t a search for anti-establishment rebels—it was a quest to find people with ambition to create a new industry.9
Amid Seventies malaise, semiconductor industry profits soared. Silicon Valley’s denizens became more unabashed about the money they were making and their desire to make even more. “The basic thing that drives the technology is the desire to make money,” said Robert Lloyd of National. Don Hoefler made big-spending vignettes a feature of his industry coverage. By 1972, he had so much good copy that he began his own weekly newsletter, Microelectronics News, chronicling all the happenings at local companies. The biggest personalities got the most attention. “Hardly had the ink dried on Jerry Sanders’s order for his $64,000 Rolls-Royce Corniche,” Hoefler dished in late 1975, “than the Mercedes-Benz importer phoned him from New York to offer a 7.5 liter bomb for $40,000, which M-B bows next year. Jerry’s response: wrap one up; I’ll take it. So goes all of Jerry’s 1976 salary.”10
As the cash flowed and the flash increased, more East Coast journalists started trekking out to Silicon Valley. The term “Silicon Valley” very gradually started popping up in the business sections of The New York Times and The Wall Street Journal (it nearly always appeared in quotation marks). Gene Bylinsky of Fortune rolled out a series of euphoric articles about high-tech execs and the venture capitalists who financed them, sketching portraits of risk-taking iconoclasts that sounded a lot like Hoefler’s Wagon Wheel chronicles and blind items. This wasn’t just another business story: it was an entrepreneurial story of people audacious or foolhardy enough to strike out on their own. “If you are a capitalist—and I am—you graduate to the Olympics of capitalism by starting new businesses,” one Silicon Valley executive told Bylinsky.11
THE SILICON VALLEY STYLE
It certainly seemed to outsiders like this was something different. The shadow of the world’s worst boss, the rigid and imperious Bill Shockley, still haunted the industry. The chipmakers didn’t want to be my-way-or-the-highway micromanagers; they wanted to give their employees room to test out new ideas. They remained men of the electronics lab, too, choosing their hires on the basis of who was “smart” and priding themselves on their commitment to meritocracy.
Yet Valley “meritocracy” also placed great value on known quantities: people who came from familiar, top-ranked engineering programs, or who had worked at familiar local companies, or whose references came from known and trusted sources. The high degree of job-hopping between companies facilitated this, creating a mobile workforce that often worked in a series of different enterprises, sometimes with the same managers and colleagues.
The hiring habits set in place by the semiconductor companies continued over the Valley’s successive technological generations. By the end of the 1990s, dot-com-era firms were filling close to 45 percent of engineering vacancies by referrals from current employees. By the 2010s, software giants were throwing “Bring a Referral” happy hours and offering up free vacations and cash bonuses to employees who helped snag a successful hire. It made sense: topflight engineering distinguished great tech companies from the merely good. From the age of first-generation chipmakers to the era of Google and Facebook, this talent was in chronically short supply. Plus, hires who were known quantities were able to hit the ground running, adapt quickly, and produce results at the speed the market demanded.12
And it was a fiercely competitive market. Rising up at a moment when America’s postwar boom was giving way to economic precariousness and new global rivalries, the chipmakers of Silicon Valley took the technology-driven, total-immersion ethos of HP and added a topcoat of Darwinian struggle that reflected an insanely competitive business of high risk and high reward. There were no reserved parking spots for top executives—a message of meritocracy, but also a signal of the value of pulling long hours. Everyone knew who came in early and snagged the best spots by the front door. Everyone saw whose car lingered in the parking lot long after dark.13
And although most of its leaders exuded genial charm, the chip business was unrepentantly macho. Beyond the secretaries and the quick-fingered women on the microchip assembly lines, the industry was nearly entirely male. The result was a profanity-laced, chain-smoking, hard-drinking hybrid of locker room, Marine barracks, and scientific lab. Meanwhile, as one executive’s wife told Noyce biographer Leslie Berlin, the women “stayed home and did your thing so that the warriors could go and build the temple.” In firms trying to keep up with a frenetic product cycle, the all-or-nothing nature of hardware and software design—things either worked, or they didn’t—translated into business organizations where work overtook family life, unvarnished criticism was the norm, and self-doubt was a fatal weakness.14
The high-testosterone vibe reverberated throughout the Valley. When Ann Hardy discovered that she was the only Tymshare manager not invited to an offsite retreat, and confronted the meeting’s organizer about the omission, he responded, “If we include you, then we need to include all the spouses.” Why is that a problem? she asked. “Well,” he said matter-of-factly, “we only go to these offsite meetings so we can spend our evenings with prostitutes.” Hardy marched off to CEO Tom O’Rourke to complain. The organizer disappeared. Hardy wasn’t sure what happened to the prostitutes.15
The culture of 1970s Silicon Valley could be as old-school as its gender relations. At a moment when the Bay Area had become synonymous with the drop-in, drop-out counterculture and the freewheeling Me Decade that followed, the chipmakers’ main concession to the changing times was to grow slightly longer sideburns. They leaned toward free-market Republicanism like that practiced by Dave Packard, yet were aware of how government shaped their operations, and paid deference to the system. As Bob Noyce put it in 1970: “This really is a controlled society, controlled out of Washington, and if you’re trying to steer around in all the traffic out there, you’d better listen to what the cop is telling people.”16
The organization charts of these growing companies looked a lot like those of typical “old economy” corporations. They featured all the requisite support functions (sales, marketing, human resources) that had become critical to doing business in the modern era. Yet they differed in important ways. For one, they moved through product cycles far more quickly than other kinds of manufacturing, as the propulsive force of Moore’s Law made their products faster, cheaper, and more ubiquitous by the year. For another, from Hewlett and Packard to Noyce, Moore, and Grove, the founders of firms often stayed at the helm as their CEOs or chairmen. They blended the organizational chart of the twentieth-century corporation with the personal sensibilities of the nineteenth-century sole proprietorship.
Of course, when company founders were sober-minded engineers, this highly personalized approach worked well. When they were more freewheeling, it could generate chaos. Take, for example, the video game pioneer Atari.
Founded near San Jose in 1972 by a group led by a charismatic twenty-nine-year-old named Nolan Bushnell, Atari was an early market leader in one of many industries made possible by faster and cheaper microchips. Within months of its founding, Atari was disrupting the slightly seedy world of pinball and Pop-A-Shot with its arcade phenomenon Pong.
Both in its arcade form and in the home-console version that Atari released three years later, Pong was marvelously simple, devilishly difficult, and irresistibly addictive. The tech was straightforward: a black screen, with pixelated white lines on each border representing table tennis paddles, and a white digital dot of a ball that pinged back and forth. You had three choices of game: singles, doubles, and catch—where instead of returning the ball back to a partner, you tried to snag it in a small opening in your paddle.
The semiconductor guys may have surrendered reserved parking spaces and let loose after work at the Wagon Wheel, but Atari took California casual to a whole new level. The very early years were characterized by management squabbles among its young executives, drug use on the manufacturing line, and goofy product ideas that never would have made it past most corporate decision-making structures. There were games that could only have come out of a company whose designers and engineers were all young men, most notoriously 1973’s Gotcha, whose controls were designed to look and feel like women’s breasts. “They didn’t have bumps on them or anything,” helpfully explained an Atari designer, “but the way they were the size of grapefruits next to each other, you got the picture of what they were supposed to be.”
As one Atari employee recalled mistily, the company was “a bunch of free thinking, dope smoking, fun loving people. We sailed boats, flew airplanes, smoked pot and played video games.” Atari executives—“known to partake of the ganja as well”—were self-aware enough to realize that their company vibe was a little edgy for its quiet suburban surroundings. The first employee newsletter opened with a plea to “show as much sophistication to the outside community as possible,” because “the thought of a company composed of longhairs is frightening to them.”17
Amid this fast and loose organizational structure, Atari had some stumbles as it tried to turn Pong’s success into a lasting business. But in 1975, it hit the consumer-product jackpot with its semiconductor-powered gaming consoles that plugged into living room televisions. Atari wasn’t alone, as the established and well-capitalized electronics giant Magnavox entered the home market at the same time. Magnavox’s Odyssey and Atari’s home-game version of Pong became the must-have Christmas gifts of the year, addicting a generation of children and teenagers to the hypnotic blips and beeps of video games.
Atari’s products were exactly the right diversion from inflation-wracked family incomes and oil-embargoed hours spent waiting in line for gas. In big cities and small towns across the country, retail giant Sears placed working consoles at the center of their showrooms for prospective customers. Kids who’d associated Sears with boring family shopping trips for Toughskins and washing machines now lined up three deep for their turn at Pong—the first arcade game that you didn’t need a quarter to play. Then they’d run home and plead with their parents to buy them one.
Providing escape from ’70s stagflation paid off handsomely. In 1976, after deciding against a Wall Street IPO, Bushnell sold Atari to Warner Communications for $28 million, netting $15 million personally. The video game revolution—one that introduced America’s future software engineers to the wonders of manipulating pixels on a transistorized screen—had begun.18