The Innovators


by Walter Isaacson


  Q: In the first line of your sonnet which reads “Shall I compare thee to a summer’s day,” would not “a spring day” do as well or better?

  A: It wouldn’t scan.

  Q: How about “a winter’s day.” That would scan all right.

  A: Yes, but nobody wants to be compared to a winter’s day.

  Q: Would you say Mr. Pickwick reminded you of Christmas?

  A: In a way.

  Q: Yet Christmas is a winter’s day, and I do not think Mr. Pickwick would mind the comparison.

  A: I don’t think you’re serious. By a winter’s day one means a typical winter’s day, rather than a special one like Christmas.

  Turing’s point was that it might not be possible to tell whether such a respondent was a human or a machine pretending to be a human.

  Turing gave his own guess as to whether a computer might be able to win this imitation game: “I believe that in about fifty years’ time it will be possible to programme computers . . . to make them play the imitation game so well that an average interrogator will not have more than 70 percent chance of making the right identification after five minutes of questioning.”

  In his paper Turing tried to rebut the many possible challenges to his definition of thinking. He swatted away the theological objection that God has bestowed a soul and thinking capacity only upon humans, arguing that this “implies a serious restriction of the omnipotence of the Almighty.” He asked whether God “has freedom to confer a soul on an elephant if He sees fit.” Presumably so. By the same logic, which, coming from the nonbelieving Turing was somewhat sardonic, surely God could confer a soul upon a machine if He so desired.

  The most interesting objection, especially for our narrative, is the one that Turing attributed to Ada Lovelace. “The Analytical Engine has no pretensions whatever to originate anything,” she wrote in 1843. “It can do whatever we know how to order it to perform. It can follow analysis; but it has no power of anticipating any analytical relations or truths.” In other words, unlike the human mind, a mechanical contrivance cannot have free will or come up with its own initiatives. It can merely perform as programmed. In his 1950 paper, Turing devoted a section to what he dubbed “Lady Lovelace’s Objection.”

  His most ingenious parry to this objection was his argument that a machine might actually be able to learn, thereby growing into its own agent and able to originate new thoughts. “Instead of trying to produce a programme to simulate the adult mind, why not rather try to produce one which simulates the child’s?” he asked. “If this were then subjected to an appropriate course of education, one would obtain the adult brain.” A machine’s learning process would be different from a child’s, he admitted. “It will not, for instance, be provided with legs, so that it could not be asked to go out and fill the coal scuttle. Possibly it might not have eyes. . . . One could not send the creature to school without the other children making excessive fun of it.” The baby machine would therefore have to be tutored some other way. Turing proposed a punishment and reward system, which would cause the machine to repeat certain activities and avoid others. Eventually such a machine could develop its own conceptions about how to figure things out.

  But even if a machine could mimic thinking, Turing’s critics objected, it would not really be conscious. When the human player of the Turing Test uses words, he associates those words with real-world meanings, emotions, experiences, sensations, and perceptions. Machines don’t. Without such connections, language is just a game divorced from meaning.

  This objection led to the most enduring challenge to the Turing Test, which came in a 1980 essay by the philosopher John Searle. He proposed a thought experiment, called the Chinese Room, in which an English speaker with no knowledge of Chinese is given a comprehensive set of rules instructing him on how to respond to any combination of Chinese characters by handing back a specified new combination of Chinese characters. Given a good enough instruction manual, the person might convince an interrogator that he was a real speaker of Chinese. Nevertheless, he would not have understood a single response that he made, nor would he have exhibited any intentionality. In Ada Lovelace’s words, he would have no pretensions whatever to originate anything but instead would merely do whatever actions he was ordered to perform. Similarly, the machine in Turing’s imitation game, no matter how well it could mimic a human being, would have no understanding or consciousness of what it was saying. It makes no more sense to say that the machine “thinks” than it does to say that the fellow following the massive instruction manual understands Chinese.95

  One response to the Searle objection is to argue that, even if the man does not really understand Chinese, the entire system incorporated in the room—the man (processing unit), instruction manual (program), and files full of Chinese characters (the data)—as a whole might indeed understand Chinese. There’s no conclusive answer. Indeed, the Turing Test and the objections to it remain to this day the most debated topic in cognitive science.

  * * *

  For a few years after he wrote “Computing Machinery and Intelligence,” Turing seemed to enjoy engaging in the fray that he provoked. With wry humor, he poked at the pretensions of those who prattled on about sonnets and exalted consciousness. “One day ladies will take their computers for walks in the park and tell each other ‘My little computer said such a funny thing this morning!’ ” he japed in 1951. As his mentor Max Newman later noted, “His comical but brilliantly apt analogies with which he explained his ideas made him a delightful companion.”96

  One topic that came up repeatedly in discussions with Turing, and would soon have a sad resonance, was the role that sexual appetites and emotional desires play in human thinking, unlike in machines. A very public example occurred in a January 1952 televised BBC debate that Turing had with the brain surgeon Sir Geoffrey Jefferson, moderated by Max Newman and the philosopher of science Richard Braithwaite. “A human’s interests are determined, by and large, by his appetites, desires, drives, instincts,” said Braithwaite, who argued that to create a true thinking machine, “it would seem to be necessary to equip the machine with something corresponding to a set of appetites.” Newman chimed in that machines “have rather restricted appetites, and they can’t blush when they’re embarrassed.” Jefferson went even further, repeatedly using “sexual urges” as an example and referring to a human’s “emotions and instincts, such as those to do with sex.” Man is prey to “sexual urges,” he said, and “may make a fool of himself.” He spoke so much about how sexual appetites affected human thinking that the BBC editors cut some of it out of the broadcast, including his assertion that he would not believe a machine could think until he saw it touch the leg of a female machine.97

  Turing, who was still rather discreet about being a homosexual, fell quiet during this part of the discussion. During the weeks leading up to the recording of the broadcast on January 10, 1952, he was engaged in a series of actions that were so very human that a machine would have found them incomprehensible. He had just finished a scientific paper, and he followed it by writing a short story about how he planned to celebrate: “It was quite some time now since he had ‘had’ anyone, in fact not since he had met that soldier in Paris last summer. Now that his paper was finished he might justifiably consider that he had earned another gay man, and he knew where he might find one who might be suitable.”98

  On Oxford Street in Manchester, he picked up a nineteen-year-old working-class drifter named Arnold Murray and began a relationship. When he returned from taping the BBC show, he invited Murray to move in. One night Turing told young Murray of his fantasy of playing chess against a nefarious computer that he was able to beat by causing it to show anger, then pleasure, then smugness. The relationship became more complex in the ensuing days, until Turing returned home one evening and found that his house had been burglarized. The culprit was a friend of Murray’s. When Turing reported the incident to the police, he ended up disclosing to them his sexual relationship with Murray, and they arrested Turing for “gross indecency.”99

  At the trial in March 1952, Turing pled guilty, though he made clear he felt no remorse. Max Newman appeared as a character witness. Convicted and stripped of his security clearance,VI Turing was offered a choice: imprisonment or probation contingent on receiving hormone treatments via injections of a synthetic estrogen designed to curb his sexual desires, as if he were a chemically controlled machine. He chose the latter, which he endured for a year.

  Turing at first seemed to take it all in stride, but on June 7, 1954, he committed suicide by biting into an apple he had laced with cyanide. His friends noted that he had always been fascinated by the scene in Snow White in which the Wicked Queen dips an apple into a poisonous brew. He was found in his bed with froth around his mouth, cyanide in his system, and a half-eaten apple by his side.

  Was that something a machine would have done?

  * * *

  I. Stirling’s formula, which approximates the value of the factorial of a number.

  II. The display and explanations of the Mark I at Harvard’s science center made no mention of Grace Hopper nor pictured any women until 2014, when the display was revised to highlight her role and that of the programmers.

  III. Von Neumann was successful in this. The plutonium implosion design would result in the first detonation of an atomic device, the Trinity test, in July 1945 near Alamogordo, New Mexico, and it would be used for the bomb that was dropped on Nagasaki on August 9, 1945, three days after the uranium bomb was used on Hiroshima. With his hatred of both the Nazis and the Russian-backed communists, von Neumann became a vocal proponent of atomic weaponry. He attended the Trinity test, as well as later tests on Bikini Atoll in the Pacific, and he argued that a thousand radiation deaths was an acceptable price to pay for the United States attaining a nuclear advantage. He would die twelve years later, at age fifty-three, of bone and pancreatic cancer, which may have been caused by the radiation emitted during those tests.

  IV. In 1967, at age sixty, Hopper was recalled to active duty in the Navy with the mission of standardizing its use of COBOL and validating COBOL compilers. By vote of Congress, she was permitted to extend her tour beyond retirement age. She attained the rank of rear admiral, and finally retired in August 1986 at age seventy-nine as the Navy’s oldest serving officer.

  V. The U.S. Constitution empowers Congress “to promote the progress of science and useful arts by securing for limited times to authors and inventors the exclusive Right to their respective writings and discoveries.” The U.S. Patent and Trademark Office throughout the 1970s generally would not grant patents to innovations whose only departure from existing technology was the use of a new software algorithm. That became murky in the 1980s with conflicting appeals court and Supreme Court rulings. Policies changed in the mid-1990s, when the DC Circuit Court issued a series of rulings permitting patents for software that produces a “useful, concrete and tangible result” and President Bill Clinton appointed as head of the Patent Office a person who had been the chief lobbyist for the Software Publishing Industry.

  VI. At Christmas 2013 Turing was posthumously granted a formal pardon by Queen Elizabeth II.

  John Bardeen (1908–91), William Shockley (1910–89), and Walter Brattain (1902–87) in a Bell Labs photograph in 1948.

  The first transistor at Bell Labs.

  William Shockley (at head of table) being toasted by colleagues, including Gordon Moore (seated, left) and Robert Noyce (standing, center, with wine glass), on the day he won the Nobel Prize in 1956.

  CHAPTER FOUR

  * * *

  THE TRANSISTOR

  The invention of computers did not immediately launch a revolution. Because they relied on large, expensive, fragile vacuum tubes that consumed a lot of power, the first computers were costly behemoths that only corporations, research universities, and the military could afford. Instead the true birth of the digital age, the era in which electronic devices became embedded in every aspect of our lives, occurred in Murray Hill, New Jersey, shortly after lunchtime on Tuesday, December 16, 1947. That day two scientists at Bell Labs succeeded in putting together a tiny contraption they had concocted from some strips of gold foil, a chip of semiconducting material, and a bent paper clip. When wiggled just right, it could amplify an electric current and switch it on and off. The transistor, as the device was soon named, became to the digital age what the steam engine was to the Industrial Revolution.

  The advent of transistors, and the subsequent innovations that allowed millions of them to be etched onto tiny microchips, meant that the processing power of many thousands of ENIACs could be nestled inside the nose cone of rocket ships, in computers that could sit on your lap, in calculators and music players that could fit in your pocket, and in handheld devices that could exchange information or entertainment with any nook or node of a networked planet.

  Three passionate and intense colleagues, whose personalities both complemented and conflicted with one another, would go down in history as the inventors of the transistor: a deft experimentalist named Walter Brattain, a quantum theorist named John Bardeen, and the most passionate and intense of them all—tragically so by the end—a solid-state physics expert named William Shockley.

  But there was another player in this drama that was actually as important as any individual: Bell Labs, where these men worked. What made the transistor possible was a mixture of diverse talents rather than just the imaginative leaps of a few geniuses. By its nature, the transistor required a team that threw together theorists who had an intuitive feel for quantum phenomena with materials scientists who were adroit at baking impurities into batches of silicon, along with dexterous experimentalists, industrial chemists, manufacturing specialists, and ingenious tinkerers.

  BELL LABS

  In 1907 the American Telephone and Telegraph Company faced a crisis. The patents of its founder, Alexander Graham Bell, had expired, and it seemed in danger of losing its near-monopoly on phone services. Its board summoned back a retired president, Theodore Vail, who decided to reinvigorate the company by committing to a bold goal: building a system that could connect a call between New York and San Francisco. The challenge required combining feats of engineering with leaps of pure science. Making use of vacuum tubes and other new technologies, AT&T built repeaters and amplifying devices that accomplished the task in January 1915. On the historic first transcontinental call, in addition to Vail and President Woodrow Wilson, was Bell himself, who echoed his famous words from thirty-nine years earlier, “Mr. Watson, come here, I want to see you.” This time his former assistant Thomas Watson, who was in San Francisco, replied, “It would take me a week.”1

  Thus was the seed planted for a new industrial organization that became known as Bell Labs. Originally located on the western edge of Manhattan’s Greenwich Village overlooking the Hudson River, it brought together theoreticians, materials scientists, metallurgists, engineers, and even AT&T pole climbers. It was where George Stibitz developed a computer using electromagnetic relays and Claude Shannon worked on information theory. Like Xerox PARC and other corporate research satellites that followed, Bell Labs showed how sustained innovation could occur when people with a variety of talents were brought together, preferably in close physical proximity where they could have frequent meetings and serendipitous encounters. That was the upside. The downside was that these were big bureaucracies under corporate thumbs; Bell Labs, like Xerox PARC, showed the limits of industrial organizations when they don’t have passionate leaders and rebels who can turn innovations into great products.

  The head of Bell Labs’ vacuum-tube department was a high-octane Missourian named Mervin Kelly, who had studied to be a metallurgist at the Missouri School of Mines and then got a PhD in physics under Robert Millikan at the University of Chicago. He was able to make vacuum tubes more reliable by devising a water-cooling system, but he realized that tubes would never be an efficient method of amplification or switching. In 1936 he was promoted to research director of Bell Labs, and his first priority was to find an alternative.

  Kelly’s great insight was that Bell Labs, which had been a bastion of practical engineering, should also focus on basic science and theoretical research, until then the domain of universities. He began a search for the country’s brightest young physics PhDs. His mission was to make innovation something that an industrial organization could do on a regular basis rather than ceding that territory to eccentric geniuses holed up in garages and garrets.

  “It had become a matter of some consideration at the Labs whether the key to invention was a matter of individual genius or collaboration,” Jon Gertner wrote in The Idea Factory, a study of Bell Labs.2 The answer was both. “It takes many men in many fields of science, pooling their various talents, to funnel all the necessary research into the development of one new device,” Shockley later explained.3 He was right. He was also, however, showing a rare flash of feigned humility. More than anyone, he believed in the importance of the individual genius, such as himself. Even Kelly, the proselytizer for collaboration, realized that individual genius also needed to be nurtured. “With all the needed emphasis on leadership, organization and teamwork, the individual has remained supreme—of paramount importance,” he once said. “It is in the mind of a single person that creative ideas and concepts are born.”4
