This anger, though understandable and predictable, as with any failed union, is ultimately misdirected. It rightly belongs with J.C.R. Licklider and Vint Cerf. Without specifically intending to, the founders of the Internet had foreordained by the radicalism of their conception that Levin and Case’s great image of the future would have—despite its head of gold, its belly of brass, and legs of iron—feet of clay.
The design of the Internet blesses some companies and curses others. For if net neutrality destroyed the value of AOL Time Warner, it would catapult to riches the likes of Google and Amazon, firms that, far from discouraging or circumscribing consumer choice, would aim to put everything one could want within easy reach. In this fulfillment of the Net’s dream of connecting any user with any other lies the power behind the great business success stories of the still young Internet age. In such a world, the advantage of owning everything from soup to nuts is far from evident; it may be no advantage at all.
In 2008, at Revolution headquarters, I asked Case whether he regretted the merger. “Yes,” he said, without hesitation. What would he have done differently? Acknowledging that nonintegrated, or “pure play,” firms like Google would in the end succeed where AOL failed, Case has a different vision in hindsight: “I would have bought Google.”
In some sense the work remaining to our story is to assess the merits of that answer.
CHAPTER 20
Father and Son
Steve Jobs stood before an audience of thousands, many of whom had camped out overnight to share this moment. In his signature black turtleneck and blue 501s he was completely in his element: in perfect control of his script, the emotions of the crowd, and the image he projected to the world. Behind him was an enormous screen—another of his trademarks—flashing words, animations, and surprising pictures. In the computer world, and particularly among members of the cult of the Mac, the annual Jobs keynote at Apple’s Macworld is a virtual sacrament. During this one, on January 9, 2007, Jobs was to announce his most important invention since the Apple Macintosh.1
“Today,” said Jobs, “we’re introducing three revolutionary new products. Three things: a widescreen iPod with touch controls; a revolutionary mobile phone; a breakthrough Internet communications device.”
Loud cheers.
“An iPod, a phone … are you getting it? These are not three separate devices!”
Louder cheers.
“We are calling it iPhone!”
The audience rose to its feet. On the screen: “iPhone: Apple reinvents the phone.”
The iPhone was beautiful; it was powerful; it was perfect. After demonstrating its many features, Jobs showed how the iPhone could access the Internet as no phone ever had before, through a real, full-featured browser.
“Now, you can’t—you can’t really think about the Internet, of course, without thinking about Google.… And it’s my pleasure now to introduce Dr. Eric Schmidt, Google’s CEO!”
To more cheers, Schmidt came jogging in from stage left, wearing an incongruously long orange tie. The two men shook hands warmly at center stage, like two world leaders. A member of the Apple board, Schmidt thanked Jobs and began his comments with a perhaps ill-advised joke about just how close Apple and Google had become. “There are a lot of relationships between the boards, and I thought if we just sort of merged the companies we could call them AppleGoo,” he said. “But I’m not a marketing guy.”
Indeed, in 2007 Google and Apple were about as close as two firms could be. Schmidt was not the only member of both corporate boards. The two firms were given to frequent and effusive public acclamations of each other. Their respective foundings a generation apart, Google and Apple were, to some, like father and son—both starting life as radical, idealistic firms, dreamed up by young men determined to do things differently. Apple was the original revolutionary, the protocountercultural firm that pioneered personal computing, and, in the 1970s, became the first company to bring open computing, then merely an ideological commitment, to mass production and popular use. Google, meanwhile, having overcome skepticism about its business model at every turn, had by the new millennium become the incarnation of the Internet gospel of openness. It had even hired Vint Cerf, one of the network’s greatest visionaries, giving him the title “Chief Internet Evangelist.”2
Their corporate mottoes, “Think Different” and “Don’t Be Evil,” while often mocked by critics and cynics, were an entirely purposeful way of propounding deeply counterintuitive ideas about corporate culture. Both firms, launched out of suburban garages a few miles apart, took pride in succeeding against the grain. Google entered the search business in 1998, when searching was considered a “commodity,” or low-profit operation, and flourished even after the tech boom went bust. Apple’s revolution had been even more fundamental: in the 1970s, still the era of central mainframe machines, it built a tiny personal computer and later gave it a “desktop” (the graphical user interface of windows and icons and toolbars that is now ubiquitous), as well as a mouse. The two firms also shared many real and imagined foes: Microsoft, mainstream corporations, and uptight people in general.
Back in San Francisco, Schmidt, done with his jokes, continued his presentation.
“What I like about this new device [the iPhone] and the new architecture of the Internet is that you can actually merge without merging.… Internet architectures allow you now to take the enormous brain trust that is represented by the Apple development team and combine that with the open protocols and data service that companies like Google [provide].”
Unnoticed by most, here was enunciated a crucial idea, a philosophy of business organization radical in its implications. Schmidt was suggesting that, on a layered network, in an age of open protocols, all the advantages of integration—the “synergies” and efficiencies of joint operation—could be realized without actual corporate mergers. Call it Google’s theory of the firm. With the Internet, there was no need for mergers and exclusive partnerships. Each company could focus just on what it did best. Thanks to the Internet, the age of Vail, Rockefeller, and Carnegie, not to mention the media conglomerates created by Steven Ross and Michael Eisner—the entire age of giant corporate empires—was, according to this revelation, over.
But was it really? The warmth of Jobs’s greeting concealed the fact that Apple’s most important partner for the iPhone launch was not Google—not by a long shot—but rather one of Google’s greatest foes. At the end of his speech, in an understated way, Jobs dropped a bomb. The iPhone would work exclusively on the network of one company: AT&T.*
“They are the best and most popular network in the country,” said Jobs. “Fifty-eight million subscribers. They are number one. And they’re going to be our exclusive partner in the U.S.”
In entering this partnership, Apple was aligning itself with the nemesis of everything Google, the Internet, and once even Apple itself stood for.
• • •
We don’t know whether Ed Whitacre, Jr., AT&T’s CEO, was listening to Eric Schmidt’s speech at the iPhone launch. But we can be sure that he would have disagreed with Schmidt that the age of grand mergers was over. Just one week earlier, Whitacre had quietly gained final federal approval for the acquisitions that would bring most of the old Bell system back under AT&T’s control. Unfazed by the arrival of the Internet, Whitacre and his telephone Goliath were practicing the old-school corporate strategies of leveraging size to achieve domination, just as AT&T had done for more than a hundred years. The spirit of Theodore Vail was alive and well in the resurrected dominion of the firm with which Apple was now allied.
Within two years of the iPhone launch, relations between Apple and Google would sour as the two pursued equally grand, though inimical, visions of the future. In 2009 hearings before the FCC, they now sat on opposite sides. Steve Jobs accused Google of wasting its time in the mobile phone market; a new Google employee named Tim Bray in 2010 described Apple’s iPhone as “a sterile Disney-fied walled garden surrounded by sharp-toothed lawyers.… I hate it.”3
As this makes clear, where once there had been only subtle differences there now lay a chasm. Apple, while it had always wavered on “openness,” had committed to a program that fairly suited not just the AT&T mind-set but also the ideals of Hollywood and the entertainment conglomerates. Despite the many missteps, including the AOL–Time Warner merger, the conglomerates were still at bottom looking for their entry point into the Internet game. By 2010, Apple would clearly seem the way in—whether through its iTunes music store, its online videos, or the magic of the iPad. In fact, the combination of Apple, AT&T, and Hollywood now held out an extremely appealing prospect: Hollywood’s content, AT&T’s lines, and Apple’s gorgeous machines—an information paradise of sorts, succeeding where AOL–Time Warner had failed.
For its part, Google would remain fundamentally more radical, with utopian, even vaguely messianic, ideals. As Apple befriended the old media, Google’s founders continued to style themselves the challengers to the existing order, to the most basic assumptions about the proper organization of information, the nature of property, the duties of the American corporation, and even the purpose of life. They envisioned taking the Internet revolution into every sector of the information realm—to video and film, television, book, newspaper, and magazine publishing, telephony—every way that humans send or receive information.
You might think that such splits are simply the way the capitalist cookie crumbles and one shouldn’t dwell overmuch on the rupture between two firms. But these are not just any two firms. These are, in communications, the industrial and ideological leaders of our times. These are the companies that are determining how Americans and the rest of the world will share information. If Huxley could say in 1927 that “the future of America is the future of the world,” we can equally say that the future of Apple and Google will form the future of America and the world.4
What should be apparent to any reader having reached this point is that here in the twenty-first century, these firms and their allies are fighting anew the age-old battle we’ve recounted time and time again. It is the perennial Manichaean contest informing every episode in this book: the struggle between the partisans of the open and of the closed, between the decentralized and the consolidated visions of a proper order. But this time around, as compared with any other, the sides are far more evenly matched.
APPLE’S RADICAL ORIGINS
Apple is a schizophrenic company: a self-professed revolutionary closely allied with both of the greatest forces in information, the entertainment conglomerates and the telecommunications industry. To work out this contradiction we need to go back to Apple’s origins and see how far it has come. Let’s return to 1971, when a bearded young college student in thick eyeglasses named Steve Wozniak was hanging out at the home of Steve Jobs, then in high school. The two young men, electronics buffs, were fiddling with a crude device they’d been working on for more than a year. To them it must have seemed just another attempt in their continuing struggle to make a working model from a clever idea, just as Alexander Bell and Watson had done one hundred years earlier.5
That day in 1971, however, was different. Together, they attached Wozniak’s latest design to Jobs’s phone, and as Wozniak recalls, “it actually worked.”6 It would be their first taste of the eureka moment that would-be inventors have always lived for. The two used the device to place a long-distance phone call to Orange County. Apple’s founders had managed to hack AT&T’s long-distance network: their creation was a machine, a “blue box,” that made long-distance phone calls for free.
Such an antiestablishment spirit of enterprise would underlie all of Jobs and Wozniak’s early collaborations and form the lore that still gives substance to the image long cultivated: the iconoclast partnership born in a Los Altos garage, which, but a few years later, in March of 1976, would create a personal computer called “the Apple,” one hundred years to the month after Bell invented the telephone in his own lonely workshop.
In the 1970s this imagery would be reinforced by the pair’s self-styling as bona fide counterculturals, with all the accoutrements—long hair, opposition to the war, an inclination to experiment with chemical substances as readily as with electronics. Wozniak, an inveterate prankster, ran an illegal “dial-a-joke” operation; Jobs would travel to India in search of a guru.
But, as is often the case, the granular truth of Apple’s origins was a bit more complicated than the mythology. For even in the beginning, there was a significant divide between the two men. There was no real parity in technical prowess: it was Wozniak, not Jobs, who had built the blue box. And it was Wozniak who would conceive of and build the Apple and the Apple II, the most important Apple products ever, and arguably among the most important inventions of the later twentieth century.* For his part, Jobs was the businessman and the dealmaker of the operation, essential as such, but hardly the founding genius of Apple Computer, the man whose ideas were turned into silicon to change the world; that was Wozniak. The history of the firm must be understood in this light. For while founders do set the culture of a firm, they cannot dictate it in perpetuity; as Wozniak withdrew from the operation, Apple became concerned more with, as it were, the aesthetics of radicalism than with its substance.
Steve Wozniak is not the household name that Steve Jobs is, but his importance to communications and culture in the postwar period merits a closer look. While Apple’s wasn’t the only personal computer invented in the 1970s, it was the most influential. For the Apple II took personal computing, an obscure pursuit of the hobbyist, and made it into a nationwide phenomenon, one that would ultimately transform not just computing, but communications, culture, entertainment, business—in short, the whole productive part of American life.
We’ve seen these moments before, when a hobbyist or limited-interest medium becomes a mainstream craze; it happened with the telephone in 1894, with the birth of radio broadcasting in 1920, and with cable television in the 1970s. But the computer revolution was arguably more radical than any of these advances on account of having posed such a clear ideological challenge to the information economy’s status quo. As we’ve seen, for most of the twentieth century, innovators would lodge the control and power of new technologies within giant institutions. Innovation begat industry, and industry begat consolidation. Wozniak’s computer had the opposite effect: he took the power of computing, formerly the instrument of large companies with mainframe resources, and put it in the hands of individuals. That feat, and every manifestation of communications freedom that has flowed from it, is doubtless his greatest contribution to society. It was almost unimaginable at the time: a device that made ordinary individuals sovereign over information by means of computational powers they could tailor to their individual needs. Even if that sovereignty was limited by the primitive capacities of the Apple II—48 KB of RAM at most, puny compared not only with our present-day telephones but even with industrial computers of the time—the machine nevertheless planted the seed that would change everything.
With slots to accommodate all sorts of peripheral devices and an operating system that ran a variety of software, the Wozniak design was open in ways that might be said still to define the concept in the computing industries. Wozniak’s ethic of openness extended even to disclosing design specifications. He once put the point this way in a talk: “Everything we knew, you knew.”7 In the secretive high-tech world, such transparency was unheard of, as it still is today. Google, for example, despite its commitment to network openness, keeps most of its code and operations secret, and today’s Apple, unlike the Apple of 1976, guards technical and managerial information the way Willy Wonka guarded candy recipes.
Put another way, Wozniak welcomed the amateur enthusiast, bringing the cult of the inspired tinkerer to the mass-produced computer. That ideology wasn’t Wozniak’s invention; rather, in the 1970s it was an orthodoxy among computing hobbyists, like those of the Bay Area’s Homebrew Computer Club, where Wozniak offered the first public demonstration of the Apple I in 1976. As Wozniak described the club, “Everyone in the Homebrew Computer Club envisioned computers as a benefit to humanity—a tool that would lead to social justice.” These men were the exact counterparts of the radio pioneers of the 1910s—hobbyist-idealists who loved to play with technology and dreamed it could make the world a better place. And while a computer you can tinker with and modify may not sound so profound, Wozniak contemplated a spiritual relationship between man and his machine, the philosophy one finds in Matthew Crawford’s Shop Class as Soulcraft or the older Zen and the Art of Motorcycle Maintenance. “It’s pretty rare to make your engineering an art,” said Wozniak, “but that’s how it should be.”8
The original Apple had a hood, and as with a car, the owner could open it up and get at the guts of the machine. Indeed, although it was a fully assembled device, not a kit like earlier PC products, one was encouraged to tinker with the innards, to soup it up, make it faster, add features, whatever. The Apple’s operating system, using a form of BASIC as its programming language and operating environment, was, moreover, one that anyone could program. It made it possible to write and sell one’s programs directly, creating what we now call the “software” industry.
In 2006, I briefly met with Steve Wozniak on the campus of Columbia University.
“There’s a question I’ve always wanted to ask you,” I said. “What happened with the Mac? You could open up the Apple II, and there were slots and so on, and anyone could write for it. The Mac was way more closed. What happened?”
“Oh,” said Wozniak. “That was Steve. He wanted it that way. The Apple II was my machine, and the Mac was his.”