The Innovators


by Walter Isaacson


  Innovation requires having at least three things: a great idea, the engineering talent to execute it, and the business savvy (plus deal-making moxie) to turn it into a successful product. Nolan Bushnell scored a trifecta when he was twenty-nine, which is why he, rather than Bill Pitts, Hugh Tuck, Bill Nutting, or Ralph Baer, goes down in history as the innovator who launched the video game industry. “I am proud of the way we were able to engineer Pong, but I’m even more proud of the way I figured out and financially engineered the business,” he said. “Engineering the game was easy. Growing the company without money was hard.”28

  * * *

  I. A sample of Doc Smith’s prose, from his novel Triplanetary (1948): “Nerado’s vessel was completely ready for any emergency. And, unlike her sister-ship, she was manned by scientists well-versed in the fundamental theory of the weapons with which they fought. Beams, rods and lances of energy flamed and flared; planes and pencils cut, slashed and stabbed; defensive screens glowed redly or flashed suddenly into intensely brilliant, coruscating incandescence. Crimson opacity struggled sullenly against violet curtains of annihilation. Material projectiles and torpedoes were launched under full-beam control; only to be exploded harmlessly in mid-space, to be blasted into nothingness or to disappear innocuously against impenetrable polycyclic screens.”

  II. Three years later, in 1975, when Atari decided to build a home version of Pong, the venture capital industry had caught fire, and Bushnell was able to get $20 million in funding from Don Valentine, who had just founded Sequoia Capital. Atari and Sequoia helped to launch each other.

  J. C. R. Licklider (1915–90).

  Bob Taylor (1932– ).

  Larry Roberts (1937– ).

  CHAPTER SEVEN

  * * *

  THE INTERNET

  VANNEVAR BUSH’S TRIANGLE

  Innovations often bear the imprint of the organizations that created them. For the Internet, this was especially interesting, for it was built by a partnership among three groups: the military, universities, and private corporations. What made the process even more fascinating was that this was not merely a loose-knit consortium with each group pursuing its own aims. Instead, during and after World War II, the three groups had been fused together into an iron triangle: the military-industrial-academic complex.

  The person most responsible for forging this assemblage was Vannevar Bush, the MIT professor who in 1931 built the Differential Analyzer, the early analog computer described in chapter 2.1 Bush was well suited to this task because he was a star in all three camps: dean of the MIT School of Engineering, a founder of the electronics company Raytheon, and America’s top military science administrator during World War II. “No American has had greater influence in the growth of science and technology than Vannevar Bush,” MIT’s president Jerome Wiesner later proclaimed, adding that his “most significant innovation was the plan by which, instead of building large government laboratories, contracts were made with universities and industrial laboratories.”2

  Bush was born near Boston in 1890, the son of a Universalist minister who had begun his career as a cook on a mackerel smack. Both of Bush’s grandfathers were whaling captains, which instilled in him a salty and forthright manner that helped make him a decisive manager and charismatic administrator. Like many successful technology leaders, he was an expert in both engineering products and making crisp decisions. “All of my recent ancestors were sea captains, and they have a way of running things without any doubt,” he once said. “That left me with some inclination to run a show once I was in it.”3

  Also like many good technology leaders, he grew up loving both the humanities and the sciences. He could quote Kipling and Omar Khayyam “by the yard,” played the flute, loved symphonies, and read philosophy for pleasure. His family, too, had a basement workshop, where he built little boats and mechanical toys. As Time later reported in its inimitable old style, “Lean, sharp, salty, Van Bush is a Yankee whose love of science began, like that of many American boys, in a passion for tinkering with gadgets.”4

  He went to Tufts, where in his spare time he built a surveying machine that used two bicycle wheels and a pendulum to trace the perimeter of an area and calculate its size, in effect an analog device for doing integral calculus. He got a patent on it, which became the first of forty-nine that he would accumulate. While at Tufts, he and his roommates consulted for a series of small companies and then, after graduating, founded Raytheon, which grew into a sprawling defense contractor and electronics firm.
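
  As a rough digital illustration (mine, not from the book): Bush’s machine integrated mechanically, accumulating the area inside a boundary as its wheels traced the perimeter. The short Python sketch below does the same job numerically, using the shoelace form of Green’s theorem on a list of traced boundary points; the function name and sample coordinates are invented for illustration.

    # Numerical stand-in for the integration Bush's machine did mechanically:
    # integrate around a traced perimeter to get the enclosed area
    # (shoelace form of Green's theorem). All data here is hypothetical.

    def enclosed_area(perimeter_points):
        """Return the area enclosed by a closed polygonal trace of (x, y) points."""
        twice_area = 0.0
        n = len(perimeter_points)
        for i in range(n):
            x1, y1 = perimeter_points[i]
            x2, y2 = perimeter_points[(i + 1) % n]   # wrap around to close the loop
            twice_area += x1 * y2 - x2 * y1          # signed contribution of one edge
        return abs(twice_area) / 2.0

    if __name__ == "__main__":
        # A 4 x 3 rectangular plot traced corner to corner: prints 12.0
        print(enclosed_area([(0, 0), (4, 0), (4, 3), (0, 3)]))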

  Bush earned a PhD in electrical engineering jointly from MIT and Harvard, then became a professor and dean of engineering at MIT, where he built his Differential Analyzer. His passion was elevating the role of science and engineering in society at a time, the mid-1930s, when not much exciting seemed to be happening in either field. Televisions were not yet a consumer product, and the most notable new inventions put into the time capsule at the 1939 New York World’s Fair were a Mickey Mouse watch and a Gillette Safety Razor. The advent of World War II would change that, producing an explosion of new technologies, with Vannevar Bush leading the way.

  Worried that America’s military was lagging in technology, he mobilized Harvard president James Bryant Conant and other scientific leaders to convince President Franklin Roosevelt to form the National Defense Research Committee and then the military’s Office of Scientific Research and Development, both of which he headed. With an ever-present pipe in his mouth and a pencil in his hand, he oversaw the Manhattan Project to build the atom bomb as well as the projects to develop radar and air-defense systems. Time dubbed him “General of Physics” on its cover in 1944. “If we had been on our toes in war technology ten years ago,” the magazine quoted him as saying as he banged his fist on his desk, “we would probably not have had this damn war.”5

  With his no-nonsense style tempered by a personal warmth, he was a tough but endearing leader. Once a group of military scientists, frustrated by some bureaucratic problem, walked into his office to resign. Bush couldn’t figure out what the snafu was. “So I just told them,” he recalled, “ ‘One does not resign in time of war. You chaps get the hell out of here and get back to work, and I’ll look into it.’ ”6 They obeyed. As MIT’s Wiesner later observed, “He was a man of strong opinions, which he expressed and applied with vigor, yet he stood in awe of the mysteries of nature, had a warm tolerance for human frailty, and was open-minded to change.”7

  When the war ended, Bush produced a report in July 1945 at Roosevelt’s behest (which ended up being delivered to President Harry Truman) that advocated government funding of basic research in partnership with universities and industry. Bush chose an evocative and quintessentially American title, “Science, the Endless Frontier.” His introduction deserves to be reread whenever politicians threaten to defund the research needed for future innovation. “Basic research leads to new knowledge,” Bush wrote. “It provides scientific capital. It creates the fund from which the practical applications of knowledge must be drawn.”8

  Bush’s description of how basic research provides the seed corn for practical inventions became known as the “linear model of innovation.” Although subsequent waves of science historians sought to debunk the linear model for ignoring the complex interplay between theoretical research and practical applications, it had a popular appeal as well as an underlying truth. The war, Bush wrote, had made it “clear beyond all doubt” that basic science—discovering the fundamentals of nuclear physics, lasers, computer science, radar—“is absolutely essential to national security.” It was also, he added, crucial for America’s economic security. “New products and new processes do not appear full-grown. They are founded on new principles and new conceptions, which in turn are painstakingly developed by research in the purest realms of science. A nation which depends upon others for its new basic scientific knowledge will be slow in its industrial progress and weak in its competitive position in world trade.” By the end of his report, Bush had reached poetic heights in extolling the practical payoffs of basic scientific research: “Advances in science when put to practical use mean more jobs, higher wages, shorter hours, more abundant crops, more leisure for recreation, for study, for learning how to live without the deadening drudgery which has been the burden of the common man for past ages.”9

  Based on this report, Congress established the National Science Foundation. At first Truman vetoed the bill because it mandated that the director be appointed by an independent board rather than the president. But Bush turned Truman around by explaining that this would buffer him from those seeking political favors. “Van, you should be a politician,” Truman told him. “You have some of the instincts.” Bush replied, “Mr. President, what the hell do you think I’ve been doing around this town for five or six years?”10

  The creation of a triangular relationship among government, industry, and academia was, in its own way, one of the significant innovations that helped produce the technological revolution of the late twentieth century. The Defense Department and National Science Foundation soon became the prime funders of much of America’s basic research, spending as much as private industry during the 1950s through the 1980s.I The return on that investment was huge, leading not only to the Internet but to many of the pillars of America’s postwar innovation and economic boom.11

  A few corporate research centers, most notably Bell Labs, existed before the war. But after Bush’s clarion call produced government encouragement and contracts, hybrid research centers began to proliferate. Among the most notable were the RAND Corporation, originally formed to provide research and development (hence the name) to the Air Force; Stanford Research Institute and its offshoot, the Augmentation Research Center; and Xerox PARC. All would play a role in the development of the Internet.

  Two of the most important of these institutes sprang up around Cambridge, Massachusetts, just after the war: Lincoln Laboratory, a military-funded research center affiliated with MIT, and Bolt, Beranek and Newman, a research and development company founded and populated by MIT (and a few Harvard) engineers. Closely associated with both of them was an MIT professor with a Missouri drawl and an easygoing talent for teambuilding. He would become the single most important person in creating the Internet.

  J. C. R. LICKLIDER

  In searching for fathers of the Internet, the best person to start with is a laconic yet oddly charming psychologist and technologist, with an open-faced grin and show-me attitude, named Joseph Carl Robnett Licklider, born in 1915 and known to everyone as “Lick.” He pioneered the two most important concepts underlying the Internet: decentralized networks that would enable the distribution of information to and from anywhere, and interfaces that would facilitate human-machine interaction in real time. Plus, he was the founding director of the military office that funded the ARPANET, and he returned for a second stint a decade later when protocols were created to weave it into what became the Internet. Said one of his partners and protégés, Bob Taylor, “He was really the father of it all.”12

  Licklider’s father was a poor Missouri farm boy who became a successful insurance salesman in St. Louis and then, when the Depression wiped him out, a Baptist minister in a tiny rural town. As a doted-upon only child, Lick turned his bedroom into a model plane production facility and rebuilt clunker cars with his mother standing by his side handing him tools. Nevertheless, he felt trapped growing up in an isolated rural area filled with barbed-wire fences.

  He escaped first to Washington University in St. Louis and then, after getting a doctorate in psychoacoustics (how we perceive sounds), joined Harvard’s psychoacoustics lab. Increasingly interested in the relationship between psychology and technology, how human brains and machines interacted, he moved to MIT to start a psychology section based in the Electrical Engineering Department.

  At MIT Licklider joined the eclectic circle of engineers, psychologists, and humanists gathered around Professor Norbert Wiener, a theorist who studied how humans and machines worked together and coined the term cybernetics, which described how any system, from a brain to an artillery aiming mechanism, learned through communications, control, and feedback loops. “There was tremendous intellectual ferment in Cambridge after World War II,” Licklider recalled. “Wiener ran a weekly circle of forty or fifty people who got together. They would gather together and talk for a couple of hours. I was a faithful adherent to that.”13

  Unlike some of his MIT colleagues, Wiener believed that the most promising path for computer science was to devise machines that would work well with human minds rather than try to replace them. “Many people suppose that computing machines are replacements for intelligence and have cut down the need for original thought,” Wiener wrote. “This is not the case.”14 The more powerful the computer, the greater the premium that will be placed on connecting it with imaginative, creative, high-level human thinking. Licklider became an adherent of this approach, which he later called “man-computer symbiosis.”

  Licklider had a mischievous but friendly sense of humor. He loved watching the Three Stooges and was childishly fond of sight gags. Sometimes, when a colleague was about to give a slide presentation, Licklider would slip a photo of a beautiful woman into the projector’s carousel. At work he energized himself with a steady supply of Cokes and candies from the vending machines, and he gave out Hershey bars to his kids and students whenever they delighted him. He was also devoted to his graduate students, whom he would invite to dinners at his home in the Boston suburb of Arlington. “To him, collaboration was what it was all about,” his son Tracy said. “He wandered around setting up islands of people and encouraging them to be inquisitive and solve problems.” That was one reason he became interested in networks. “He knew that getting good answers involved distant collaboration. He loved spotting talented people and tying them together in a team.”15

  His embrace, however, did not extend to people who were pretentious or pompous (with the exception of Wiener). When he thought a speaker was spouting nonsense, he would stand up and ask questions that seemed innocent but were in fact devilish. After a few moments, the speaker would realize he had been deflated and Licklider would sit down. “He didn’t like poseurs or pretenders,” Tracy recalled. “He was never mean, but he slyly pricked people’s pretensions.”

  One of Licklider’s passions was art. Whenever he traveled he would spend hours at museums, sometimes dragging along his two reluctant children. “He became a nut about it, couldn’t get enough of it,” said Tracy. Sometimes he would spend five hours or more in a museum marveling at each brushstroke, analyzing how each picture came together, and attempting to fathom what it taught about creativity. He had an instinct for spotting talent in all fields, arts as well as sciences, but he felt that it was easiest to discern in its purest forms, such as the brushstroke of a painter or the melodic refrain of a composer. He said he looked for the same creative strokes in the designs of computer or network engineers. “He became a really skilled scout of creativity. He often discussed what made people creative. He felt it was easier to see in an artist, so he tried even harder to spot it in engineering, where you can’t see the brushstrokes quite as readily.”16

  Most important, Licklider was kind. When he worked at the Pentagon later in his career, according to his biographer Mitchell Waldrop, he noticed the cleaning woman admiring the art prints on his wall late one evening. She told him, “You know, Dr. Licklider, I always leave your room until last because I like to have time by myself, with nothing pressing, to look at the pictures.” He asked which print she liked most, and she pointed to a Cézanne. He was thrilled, since it was his favorite, and he promptly gave it to her.17

  Licklider felt that his love of art made him more intuitive. He could process a wide array of information and sniff out patterns. Another attribute, which would serve him well when he helped put together the team that laid the foundations for the Internet, was that he loved to share ideas without craving credit for them. His ego was so tamed that he seemed to enjoy giving away rather than claiming credit for ideas that were developed in conversation. “For all his considerable influence on computing, Lick retained his modesty,” said Bob Taylor. “His favorite kind of joke was one at his own expense.”18

  TIME-SHARING AND MAN-COMPUTER SYMBIOSIS

  At MIT Licklider collaborated with the artificial intelligence pioneer John McCarthy, in whose lab the hackers of the Tech Model Railroad Club had invented Spacewar. With McCarthy in the lead, they helped to develop, during the 1950s, systems for computer time-sharing.

  Up until then, when you wanted a computer to perform a task, you had to submit a stack of punch cards or a tape to the computer’s operators, as if handing an offering to the priests who shielded an oracle. This was known as “batch processing,” and it was annoying. It could take hours or even days to get results back; any little mistake might mean having to resubmit your cards for another run; and you might not be able to touch or even see the computer itself.

  Time-sharing was different. It allowed a whole lot of terminals to be hooked up to the same mainframe, so that many users could type in commands directly and get a response almost instantly. Like a grandmaster playing dozens of games of chess simultaneously, the mainframe’s core memory would keep track of all the users, and its operating system would be capable of multitasking and running many programs. This provided users with an enchanting experience: you could have a hands-on and real-time interaction with a computer, like a conversation. “We had a kind of little religion growing here about how this was going to be totally different from batch processing,” said Licklider.19
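
  To make the contrast with batch processing concrete, here is a minimal Python sketch (mine, not from the book) of the round-robin idea at the heart of time-sharing: a single machine cycles through many user sessions in short turns, so each user feels as if the computer is responding to them alone. The user names and commands are invented for illustration.

    # Toy round-robin scheduler illustrating time-sharing: one processor cycles
    # through many terminal sessions, giving each a short turn before moving on.
    # Everything here (user names, commands) is hypothetical.

    from collections import deque

    class Session:
        """One user's terminal session, holding commands still waiting to run."""
        def __init__(self, user, commands):
            self.user = user
            self.pending = deque(commands)

    def run_time_sharing(sessions, slice_per_turn=1):
        """Give each session a short time slice in turn until all work is done."""
        ready = deque(sessions)
        while ready:
            session = ready.popleft()
            # Run at most `slice_per_turn` commands for this user, then yield.
            for _ in range(min(slice_per_turn, len(session.pending))):
                print(f"[{session.user}] ran: {session.pending.popleft()}")
            if session.pending:              # unfinished work rejoins the queue
                ready.append(session)

    if __name__ == "__main__":
        run_time_sharing([
            Session("alice", ["edit report", "compile", "print"]),
            Session("bob", ["run simulation", "plot results"]),
            Session("carol", ["list files"]),
        ])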

 
