The Innovators

by Walter Isaacson


  * * *

  Peer-to-peer sharing and commons-based collaboration were nothing new. An entire field of evolutionary biology has arisen around the question of why humans, and members of some other species, cooperate in what seem to be altruistic ways. The tradition of forming voluntary associations, found in all societies, was especially strong in early America, evidenced in cooperative ventures ranging from quilting bees to barn raisings. “In no country in the world has the principle of association been more successfully used, or more unsparingly applied to a multitude of different objects, than in America,” Alexis de Tocqueville wrote.139 Benjamin Franklin in his Autobiography propounded an entire civic creed, with the motto “To pour forth benefits for the common good is divine,” to explain his formation of voluntary associations to create a hospital, militia, street-sweeping corps, fire brigade, lending library, night-watch patrol, and many other community endeavors.

  The hacker corps that grew up around GNU and Linux showed that emotional incentives, beyond financial rewards, can motivate voluntary collaboration. “Money is not the greatest of motivators,” Torvalds said. “Folks do their best work when they are driven by passion. When they are having fun. This is as true for playwrights and sculptors and entrepreneurs as it is for software engineers.” There is also, intended or not, some self-interest involved. “Hackers are also motivated, in large part, by the esteem they can gain in the eyes of their peers by making solid contributions. . . . Everybody wants to impress their peers, improve their reputation, elevate their social status. Open source development gives programmers the chance.”

  Gates’s “Letter to Hobbyists,” complaining about the unauthorized sharing of Microsoft BASIC, asked in a chiding way, “Who can afford to do professional work for nothing?” Torvalds found that an odd outlook. He and Gates were from two very different cultures, the communist-tinged radical academia of Helsinki versus the corporate elite of Seattle. Gates may have ended up with the bigger house, but Torvalds reaped antiestablishment adulation. “Journalists seemed to love the fact that, while Gates lived in a high-tech lakeside mansion, I was tripping over my daughter’s playthings in a three-bedroom ranch house with bad plumbing in boring Santa Clara,” he said with ironic self-awareness. “And that I drove a boring Pontiac. And answered my own phone. Who wouldn’t love me?”

  Torvalds was able to master the digital-age art of being an accepted leader of a massive, decentralized, nonhierarchical collaboration, something that Jimmy Wales at Wikipedia was doing at around the same time. The first rule for such a situation is to make decisions like an engineer, based on technical merit rather than personal considerations. “It was a way of getting people to trust me,” Torvalds explained. “When people trust you, they take your advice.” He also realized that leaders in a voluntary collaborative have to encourage others to follow their passion, not boss them around. “The best and most effective way to lead is by letting people do things because they want to do them, not because you want them to.” Such a leader knows how to empower groups to self-organize. When it’s done right, a governance structure by consensus naturally emerges, as happened both with Linux and Wikipedia. “What astonishes so many people is that the open source model actually works,” Torvalds said. “People know who has been active and who they can trust, and it just happens. No voting. No orders. No recounts.”140

  * * *

  The combination of GNU with Linux represented, at least in concept, the triumph of Richard Stallman’s crusade. But moral prophets rarely indulge in victory celebrations. Stallman was a purist. Torvalds wasn’t. The Linux kernel he eventually distributed contained some binary blobs with proprietary features. That could be remedied; indeed Stallman’s Free Software Foundation created a version that was completely free and nonproprietary. But there was a deeper and more emotional issue for Stallman. He complained that referring to the operating system as “Linux,” which almost everybody did, was misleading. Linux was the name of the kernel. The system as a whole should be called GNU/Linux, he insisted, sometimes angrily. One person who was at a software expo recounted how Stallman had reacted when a nervous fourteen-year-old boy asked him about Linux. “You ripped into that boy and tore him a brand new asshole, and I watched as his face fell and his devotion to you and our cause crumpled in a heap,” the onlooker later berated Stallman.141

  Stallman also insisted that the goal should be to create what he called free software, a phrase that reflected a moral imperative to share. He objected to the phrase that Torvalds and Eric Raymond began to use, open-source software, which emphasized the pragmatic goal of getting people to collaborate in order to create software more effectively. In practice, most free software is also open-source and vice versa; they are usually thrown together under the rubric of free and open-source software. But to Stallman it mattered not only how you made your software but also your motivations. Otherwise the movement might be susceptible to compromise and corruption.

  The disputes went beyond mere substance and became, in some ways, ideological. Stallman was possessed by a moral clarity and unyielding aura, and he lamented that “anyone encouraging idealism today faces a great obstacle: the prevailing ideology encourages people to dismiss idealism as ‘impractical.’ ”142 Torvalds, on the contrary, was unabashedly practical, like an engineer. “I led the pragmatists,” he said. “I have always thought that idealistic people are interesting, but kind of boring and scary.”143

  Torvalds admitted to “not exactly being a huge fan” of Stallman, explaining, “I don’t like single-issue people, nor do I think that people who turn the world into black and white are very nice or ultimately very useful. The fact is, there aren’t just two sides to any issue, there’s almost always a range of responses, and ‘it depends’ is almost always the right answer in any big question.”144 He also believed that it should be permissible to make money from open-source software. “Open source is about letting everybody play. Why should business, which fuels so much of society’s technological advancement, be excluded?”145 Software may want to be free, but the people who write it may want to feed their kids and reward their investors.

  * * *

  These disputes should not overshadow the astonishing accomplishment that Stallman and Torvalds and their thousands of collaborators wrought. The combination of GNU and Linux created an operating system that has been ported to more hardware platforms, ranging from the world’s ten biggest supercomputers to embedded systems in mobile phones, than any other operating system. “Linux is subversive,” wrote Eric Raymond. “Who would have thought that a world-class operating system could coalesce as if by magic out of part-time hacking by several thousand developers scattered all over the planet, connected only by the tenuous strands of the Internet?”146 Not only did it become a great operating system; it became a model for commons-based peer production in other realms, from Mozilla’s Firefox browser to Wikipedia’s content.

  By the 1990s there were many models for software development. There was the Apple approach, in which the hardware and the operating system software were tightly bundled, as with the Macintosh and iPhone and every iProduct in between. It made for a seamless user experience. There was the Microsoft approach, in which the operating system was unbundled from the hardware. That allowed more user choices. In addition, there were the free and open-source approaches, which allowed the software to be completely unfettered and modifiable by any user. Each model had its advantages, each had its incentives for creativity, and each had its prophets and disciples. But the approach that worked best was having all three models coexisting, along with various combinations of open and closed, bundled and unbundled, proprietary and free. Windows and Mac, UNIX and Linux, iOS and Android: a variety of approaches competed over the decades, spurring each other on—and providing a check against any one model becoming so dominant that it stifled innovation.

  * * *

  I. After they became successful, Gates and Allen donated a new science building to Lakeside and named its auditorium after Kent Evans.

  II. Steve Wozniak’s unwillingness to tackle this tedious task when he wrote BASIC for the Apple II would later force Apple to license BASIC from Allen and Gates.

  III. Reading a draft version of this book online, Steve Wozniak said that Dan Sokol made only eight copies, because they were hard and time-consuming to make. But John Markoff, who reported this incident in What the Dormouse Said, shared with me (and Woz and Felsenstein) the transcript of his interview with Dan Sokol, who said he used a PDP-11 with a high-speed tape reader and punch. Every night he would make copies, and he estimated he made seventy-five in all.

  IV. The lawyers were right to be worried. Microsoft later was involved in a protracted antitrust suit brought by the Justice Department, which charged that it had improperly leveraged its dominance of the operating system market to seek advantage in browsers and other products. The case was eventually settled after Microsoft agreed to modify some of its practices.

  V. By 2009 the Debian version 5.0 of GNU/Linux had 324 million source lines of code, and one study estimated that it would have cost about $8 billion to develop by conventional means (http://gsyc.es/~frivas/paper.pdf).

  Larry Brilliant (1944– ) and Stewart Brand on Brand’s houseboat in 2010.

  William von Meister (1942–1995).

  Steve Case (1958– ).

  CHAPTER TEN

  * * *

  ONLINE

  The Internet and the personal computer were both born in the 1970s, but they grew up apart from one another. This was odd, and all the more so when they continued to develop on separate tracks for more than a decade. This was partly because there was a difference in mind-set between those who embraced the joys of networking and those who got giddy at the thought of a personal computer of their very own. Unlike the utopians of the Community Memory project who loved forming virtual communities, many early fans of personal computers wanted to geek out alone on their own machines, at least initially.

  There was also a more tangible reason that personal computers arose in a way that was disconnected from the rise of networks. The ARPANET of the 1970s was not open to ordinary folks. In 1981 Lawrence Landweber at the University of Wisconsin pulled together a consortium of universities that were not connected to the ARPANET to create another network based on TCP/IP protocols, which was called CSNET. “Networking was available only to a small fraction of the U.S. computer research community at the time,” he said.1 CSNET became the forerunner of a network funded by the National Science Foundation, NSFNET. But even after these were all woven together into the Internet in the early 1980s, it was hard for an average person using a personal computer at home to get access. You generally had to be affiliated with a university or research institution to jack in.

  So for almost fifteen years, beginning in the early 1970s, the growth of the Internet and the boom in home computers proceeded in parallel. They didn’t intertwine until the late 1980s, when it became possible for ordinary people at home or in the office to dial up and go online. This would launch a new phase of the Digital Revolution, one that would fulfill the vision of Bush, Licklider, and Engelbart that computers would augment human intelligence by being tools both for personal creativity and for collaborating.

  EMAIL AND BULLETIN BOARDS

  “The street finds its own uses for things,” William Gibson wrote in “Burning Chrome,” his 1982 cyberpunk story. Thus it was that the researchers who had access to the ARPANET found their own use for it. It was supposed to be a network for time-sharing computer resources. In that regard, it was a modest failure. Instead, like many technologies, it shot to success by becoming a medium for communications and social networking. One truth about the digital age is that the desire to communicate, connect, collaborate, and form community tends to create killer apps. And in 1972 the ARPANET got its first. It was email.

  Electronic mail was already used by researchers who were on the same time-sharing computer. A program called SNDMSG allowed a user of a big central computer to send a message to the personal folder of another user who was sharing the same computer. In late 1971 Ray Tomlinson, an MIT engineer working at BBN, decided to concoct a cool hack that would allow such messages to be sent to folders on other mainframes. He did it by combining SNDMSG with an experimental file transfer program called CPYNET, which could exchange files between distant computers on the ARPANET. Then he came up with something that was even more ingenious: in order to instruct a message to go to the file folder of a user at a different site, he used the @ sign on his keyboard to create the addressing system that we all use now, username@hostname. Thus Tomlinson created not only email but the iconic symbol of the connected world.2
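
  The convention Tomlinson chose is simple enough to state in a few lines of code: everything before the @ names the user, everything after it names the host machine. Here is a minimal Python sketch of that idea (the sample address and function name are illustrative, not historical):

```python
# A minimal sketch of the username@hostname addressing convention:
# everything before the @ names the user, everything after it
# names the host machine. The sample address is illustrative.
def parse_address(address: str) -> tuple[str, str]:
    """Split an address of the form username@hostname."""
    username, hostname = address.split("@", 1)
    return username, hostname

print(parse_address("ray@remote-host"))  # ('ray', 'remote-host')
```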

  The ARPANET allowed researchers at one center to tap into the computing resources somewhere else, but that rarely happened. Instead email became the main method for collaborating. ARPA’s director, Stephen Lukasik, became one of the first email addicts, thus causing all researchers who needed to deal with him to follow suit. He commissioned a study in 1973 which found that, less than two years after it was invented, email accounted for 75 percent of the traffic on the ARPANET. “The largest single surprise of the ARPANET program has been the incredible popularity and success of network mail,” a BBN report concluded a few years later. It should not have been a surprise. The desire to socially network not only drives innovations, it co-opts them.

  Email did more than facilitate the exchange of messages between two computer users. It led to the creation of virtual communities, ones that, as predicted in 1968 by Licklider and Taylor, were “selected more by commonality of interests and goals than by accidents of proximity.”

  The earliest virtual communities began with email chains that were distributed to large self-selected groups of subscribers. They became known as mailing lists. The first major list, in 1975, was SF-Lovers, for science fiction fans. The ARPA managers initially wanted to shut it down out of fear that some senator might not be amused by the use of military money to support a sci-fi virtual hangout, but the moderators of the group successfully argued that it was a valuable training exercise in juggling large information exchanges.

  Soon other methods of forming online communities arose. Some used the backbone of the Internet; others were more jury-rigged. In February 1978 two members of the Chicago Area Computer Hobbyists’ Exchange, Ward Christensen and Randy Suess, found themselves snowed in by a huge blizzard. They spent the time developing the first computer Bulletin Board System, which allowed hackers and hobbyists and self-appointed “sysops” (system operators) to set up their own online forums and offer files, pirated software, information, and message posting. Anyone who had a way to get online could join in. The following year, students at Duke University and the University of North Carolina, which were not yet connected to the Internet, developed another system, hosted on personal computers, which featured threaded message-and-reply discussion forums. It became known as “Usenet,” and the categories of postings on it were called “newsgroups.” By 1984 there were close to a thousand Usenet terminals at colleges and institutes around the country.

  Even with these new bulletin boards and newsgroups, most average PC owners could not easily join virtual communities. Users needed a way to connect, which wasn’t easy from home or even most offices. But then, in the early 1980s, an innovation came along, part technological and part legal, that seemed small but had a huge impact.

  MODEMS

  The little device that finally created a connection between home computers and global networks was called a modem. It could modulate and demodulate (hence the name) an analog signal, like that carried by a telephone circuit, in order to transmit and receive digital information. It thus allowed ordinary people to connect their computers to others online by using phone lines. The online revolution could now begin.
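
  To make “modulate” concrete: an early modem turned each bit into a short burst of audio tone, one frequency for a binary 1 (“mark”) and another for a 0 (“space”). The following Python sketch generates such a waveform at 300 bits per second, using the frequency pair associated with Bell 103–style acoustic couplers of the era; treat the numbers and framing as illustrative rather than a faithful implementation:

```python
# A minimal sketch of frequency-shift-keying (FSK) modulation,
# in the spirit of a 300-baud acoustic-coupler modem. Frequencies
# are the Bell 103 originate-side pair; details are illustrative.
import numpy as np

SAMPLE_RATE = 8000   # audio samples per second
BAUD = 300           # bits per second
MARK_HZ = 1270       # tone for a binary 1 ("mark")
SPACE_HZ = 1070      # tone for a binary 0 ("space")

def modulate(bits):
    """Turn a sequence of 0/1 bits into an FSK audio waveform."""
    samples_per_bit = SAMPLE_RATE // BAUD  # ~26 samples per bit
    phase = 0.0
    out = []
    for bit in bits:
        freq = MARK_HZ if bit else SPACE_HZ
        for _ in range(samples_per_bit):
            out.append(np.sin(phase))
            phase += 2 * np.pi * freq / SAMPLE_RATE  # continuous phase
    return np.array(out)

def frame_byte(byte):
    """Frame a byte as async serial: start bit, 8 data bits LSB-first, stop bit."""
    return [0] + [(byte >> i) & 1 for i in range(8)] + [1]

# Modulate the two characters "HI"; the result could be played
# through a speaker into a telephone handset's mouthpiece.
waveform = modulate([b for ch in b"HI" for b in frame_byte(ch)])
```

  The receiving modem performs the reverse trick, demodulation: it listens for which of the two tones is present in each bit interval and reassembles the bits into bytes.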

  It was slow in coming because AT&T had a near-monopoly over the nation’s phone system, even controlling the equipment you could use in your home. You couldn’t connect anything to your phone line, or even to your phone, unless Ma Bell leased it to you or approved it. Although AT&T offered some modems in the 1950s, they were clunky and costly and designed mainly for industrial or military use, rather than being conducive to homebrew hobbyists creating virtual communities.

  Then came the Hush-A-Phone case. It involved a simple plastic mouthpiece that could be snapped onto a phone to amplify your voice while making it harder for those nearby to overhear you. It had been around for twenty years, causing no harm, but then an AT&T lawyer spotted one in a shopwindow, and the company decided to sue on the absurd ground that any external device, including a little plastic cone, could damage its network. It showed how far the company would go to protect its monopoly.

  Fortunately, AT&T’s effort backfired. A federal appeals court dismissed the company’s claim, and the barriers to jacking into its network began to crumble. It was still illegal to connect a modem into the phone system electronically, but you could do so mechanically, such as by taking your phone’s handset and cradling it into the suction cups of an acoustical coupler. By the early 1970s there were a few modems of this type, including the Pennywhistle, designed for the hobbyist crowd by Lee Felsenstein, that could send and receive digital signals at three hundred bits per second.I

 
