by Steven Levy
The next day Bill Bennett came back to his office and found his mangled screwdriver with a sign on it. The sign read USED UP.
Chapter 6. Winners and Losers
By 1966, when David Silver took his first elevator ride to the ninth floor of Tech Square, the AI lab was a showcase community, working under the hallowed precepts of the Hacker Ethic. After a big Chinese dinner, the hackers would go at it until dawn, congregating around the PDP-6 to do what was most important in the world to them. They would waddle back and forth with their printouts and their manuals, kibitzing around whoever was using the terminal at that time, appreciating the flair with which the programmer wrote his code. Obviously, the key to the lab was cooperation and a joint belief in the mission of hacking. These people were passionately involved in technology, and as soon as he saw them, David Silver wanted to spend all his time there.
David Silver was fourteen years old. He was in the sixth grade, having been left back twice. He could hardly read. His classmates often taunted him. Later, people would reflect that his problem had been dyslexia; Silver would simply say that he “wasn’t interested” in the teachers, the students, or anything that went on in school. He was interested in building systems.
From the time he was six or so, he had been going regularly to Eli Heffron’s junkyard in Cambridge (where TMRC hackers also scavenged) and recovering all sorts of fascinating things. Once, when he was around ten, he came back with a radar dish, tore it apart, and rebuilt it so that it could pick up sounds—he rigged it as a parabolic reflector, stuck in a microphone, and was able to pick up conversations thousands of feet away. Mostly he used to listen to faraway cars, or birds, or insects. He also built a lot of audio equipment and dabbled in time-lapse photography. Then he got interested in computers.
His father was a scientist, a friend of Minsky’s, and a teacher at MIT. He had a terminal in his office connected to the Compatible Time-sharing System on the IBM 7094. David began working with it—his first program was written in LISP and translated English phrases into pig Latin. Then he began working on a program that would control a tiny robot—he called it a “bug”—which he built at home out of old telephone relays that he got at Eli’s. He hooked the bug to the terminal, and working in machine language, he wrote a program that made the two-wheeled bug actually crawl. David decided that robotics was the best of all pursuits—what could be more interesting than making machines that could move on their own, see on their own . . . think on their own?
So his visit to the AI lab, arranged by Minsky, was a revelation. Not only were these people as excited about computers as David Silver was, but one of the major activities at the lab was robotics. Minsky was extremely interested in that field. Robotics was crucial to the progress of artificial intelligence; it let us see how far man could go in making smart machines do his work. Many of Minsky’s graduate students concerned themselves with the theory of robotics, crafting theses about the relative difficulty of getting a robot to do this or that. The hackers were also heavily involved in the field—not so much in theorizing as in building and experimenting. Hackers loved robots for much the same reasons that David Silver did. Controlling a robot was a step beyond computer programming in controlling the system that was the real world. As Gosper used to say, “Why should we limit computers to the lies people tell them through keyboards?” Robots could go off and find out for themselves what the world was like.
When you program a robot to do something, Gosper would later explain, you get “a kind of gratification, an emotional impact, that is completely indescribable. And it far surpasses the kind of gratification you get from a working program. You’re getting a physical confirmation of the correctness of your construction. Maybe it’s sort of like having a kid.”
One big project that the hackers completed was a robot that could catch a ball. Using a mechanical arm controlled by the PDP-6, as well as a television camera, Nelson, Greenblatt, and Gosper worked for months until the arm could finally catch a Ping-Pong ball lobbed toward it. The arm was able to determine the location of the ball in time to move itself into position to catch it. It was something the hackers were tremendously proud of, and Gosper especially wanted to go further and begin work on a more mobile robot which could actually play Ping-Pong.
“Ping-Pong by Christmas?” Minsky asked Gosper as they watched the robot catch balls.
Ping-Pong, like Chinese restaurants, was a system Gosper respected. He’d played the game in his basement as a kid, and his Ping-Pong style had much in common with his hacking style: both were based on his love of the physically improbable. When Gosper hit a Ping-Pong ball, the result was something as loony as a PDP-6 display hack—he put so much English on the ball that complex and counterintuitive forces were summoned, and there was no telling where the ball might go. Gosper loved the spin, the denial of gravity that allowed you to violently slam a ball so that instead of sailing past the end of a table it suddenly curved down, and when the opponent tried to hit it, the ball would be spinning so furiously that it would fly off toward the ceiling. Or he would chop at a ball to increase the spin so much that it almost flattened out, nearly exploding in mid-air from the centrifugal force. “There were times when in games I was having,” Gosper would later say, “a ball would do something in mid-air, something unphysical, that would cause spectators to gasp. I have seen inexplicable things happen in mid-air. Those were interesting moments.”
Gosper was obsessed for a while with the idea of a robot playing the game. The hackers actually did get the robot to hold a paddle and take a good swat at a ball lobbed in its direction. Bill Bennett would later recall a time when Minsky stepped into the robot arm’s area, floodlit by the bright lights required by the vidicon camera; the robot, seeing the glare reflecting from Minsky’s bald dome, mistook the professor for a large Ping-Pong ball and nearly decapitated him.
Gosper wanted to go all the way: have the robot geared to move around and make clever shots, perhaps with the otherworldly spin of a good Gosper volley. But Minsky, who had actually done some of the hardware design for the ball-catching machine, did not think it an interesting problem. He considered it no different from the problem of shooting missiles out of the sky with other missiles, a task that the Defense Department seemed to have under control. Minsky dissuaded Gosper from going ahead on the Ping-Pong project, and Gosper would later insist that that robot could have changed history.
Of course, the idea that a project like that was even considered was thrilling to David Silver. Minsky had allowed Silver to hang out on the ninth floor, and soon Silver had dropped out of school totally so he could spend his time more constructively at Tech Square. Since hackers care less about people’s age than about someone’s potential contribution to hacking, fourteen-year-old David Silver was accepted, at first as sort of a mascot.
He immediately proved himself of some value by volunteering to do some tedious lock-hacking tasks. It was a time when the administration had installed a tough new system of high-security locks. Sometimes the slightly built teenager would spend a whole night crawling over false ceilings, to take apart a hallway’s worth of locks, study them to see how the mastering system worked, and painstakingly reconstruct them before the administrators returned in the morning. Silver was very good at working with machinist’s tools, and he machined a certain blank which could be fashioned into a key to open a particularly tough new lock. The lock was on a door protecting a room with a high-security safe which held . . . keys. Once the hackers got to that, the system “unraveled,” in Silver’s term.
Silver saw the hackers as his teachers—he could ask them anything about computers or machines, and they would toss him enormous chunks of knowledge. This would be transmitted in the colorful hacker jargon, loaded with odd, teddy-bearish variations on the English language. Words like winnitude, Greenblattful, gronk, and foo were staples of the hacker vocabulary, shorthand for relatively nonverbal people to communicate exactly what was on their minds.
Silver had all sorts of questions. Some of them were very basic: What are the various pieces computers are made of? What are control systems made of? But as he got more deeply into robotics he found that the questions you had to ask were double-edged. You had to consider things in almost cosmic terms before you could create reality for a robot. What is a point? What is velocity? What is acceleration? Questions about physics, questions about numbers, questions about information, questions about the representation of things . . . it got to the point, Silver realized later, where he was “asking basic philosophical questions like what am I, what is the universe, what are computers, what can you use them for, and how does that relate? At that time all those questions were interesting because it was the first time I had started to contemplate, and started to know enough about computers, and was relating biological-, human-, and animal-type functions, and starting to relate them to science and technology and computers. I began to realize that there was this idea that you could do things with computers that are similar to the things intelligent beings do.”
Silver’s guru was Bill Gosper. They would often go off to one of the dorms for Ping-Pong, go out for Chinese food, or talk about computers and math. All the while, Silver was soaking up knowledge in this Xanadu above Cambridge. It was a school no one else knew about, and for the first time in his life he was happy.
The computer and the community around it had freed him, and soon David Silver felt ready to do serious work on the PDP-6. He wanted to write a big, complicated program: he wanted to modify his little robot “bug” so that it would use the television camera to actually “fetch” things that people would toss on the floor. The hackers were not fazed at the fact that no one, even experienced people with access to all sorts of sophisticated equipment, had really done anything similar. Silver went about it in his usual inquisitive style, going to ten or twenty hackers and asking each about a specific section of the vision part of the program. High-tech Tom Sawyer, painting a fence with assembly code. Hardware problems, he’d ask Nelson. Systems problems, Greenblatt. For math formulas, Gosper. And then he’d ask people to help him with a subroutine on that problem. When he got all the subroutines, he worked to put the program together, and he had his vision program.
The bug itself was a foot long and seven inches wide, made of two small motors strapped together with a plastic harness. It had erector-set wheels on either end, an erector-set bar going across the top, and copper welding bars sticking out in front, like a pair of antlers. It looked, frankly, like a piece of junk. Silver used a technique called “image subtraction” to let the computer know where the bug was at any time—the camera would always be scanning the scene to see what had moved, and would notice any change in its picture. Meanwhile the bug would be moving randomly until the camera picked it up and the computer directed it to the target, which would be a wallet that someone tossed nearby.
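The image-subtraction idea can be sketched in modern terms. The following is a minimal illustration only (the original was PDP-6 assembly working from a vidicon camera; the function names and the toy grids of brightness values here are invented for the sketch): compare two successive frames, note which cells changed, and take the average of the changed cells as a crude fix on the bug's position.

```python
# Illustrative sketch of "image subtraction" (frame differencing).
# Frames are small grids of brightness values; anything that changed
# more than a threshold between frames is treated as motion.

def find_motion(prev_frame, curr_frame, threshold=10):
    """Return (row, col) cells whose brightness changed more than threshold."""
    changed = []
    for r, (prev_row, curr_row) in enumerate(zip(prev_frame, curr_frame)):
        for c, (p, q) in enumerate(zip(prev_row, curr_row)):
            if abs(q - p) > threshold:
                changed.append((r, c))
    return changed

def centroid(cells):
    """Average position of the changed cells -- a crude fix on the bug."""
    if not cells:
        return None
    rs = sum(r for r, _ in cells) / len(cells)
    cs = sum(c for _, c in cells) / len(cells)
    return (rs, cs)

# A 4x4 scene: a bright blob (the "bug") moves one cell to the right.
frame1 = [[0, 0, 0, 0],
          [0, 90, 0, 0],
          [0, 0, 0, 0],
          [0, 0, 0, 0]]
frame2 = [[0, 0, 0, 0],
          [0, 0, 90, 0],
          [0, 0, 0, 0],
          [0, 0, 0, 0]]

cells = find_motion(frame1, frame2)
print(cells)            # [(1, 1), (1, 2)] -- old and new positions both register
print(centroid(cells))  # (1.0, 1.5)
```

Note that simple differencing reports both where the object was and where it now is, which is exactly why the scheme works for tracking something that moves constantly, like Silver's randomly wandering bug.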
Meanwhile, something was happening that was indicative of a continuing struggle in this hacker haven. David Silver was getting a lot of criticism. The criticism came from nemeses of the Hacker Ethic: the AI theorists and grad students on the eighth floor. These were people who did not necessarily see the process of computing as a joyful end in itself: they were more concerned with getting degrees, winning professional recognition, and the, ahem, advancement of computer science. They considered hackerism unscientific. They were always demanding that hackers get off the machine so they could work on their “Officially Sanctioned Programs,” and they were appalled at the seemingly frivolous uses to which the hackers put the computer. The grad students were all in the midst of scholarly and scientific theses and dissertations which pontificated on the difficulty of doing the kind of thing that David Silver was attempting. They would not consider any sort of computer-vision experiment without much more planning, complete review of previous experiments, careful architecture, and a setup which included pure white cubes on black velvet in a pristine, dustless room. They were furious that the valuable time of the PDP-6 was being taken up for this . . . toy! By a callow teenager, playing with the PDP-6 as if it were his personal go-cart.
While the grad students were complaining about how David Silver was never going to amount to anything, how David Silver wasn’t doing proper AI, and how David Silver was never going to understand things like recursive function theory, David Silver was going ahead with his bug and PDP-6. Someone tossed a wallet on the grimy, crufty floor, and the bug scooted forward, six inches a second, moved right, stopped, moved forward. And the stupid little bug kept darting forward, right, or left until it reached the wallet, then rammed forward until the wallet was solidly between its “antlers” (which looked for all the world like bent shirt-hangers). And then the bug pushed the wallet to its designated “pen.” Mission accomplished.
The graduate students went absolutely nuts. They tried to get Silver booted. They claimed there were insurance considerations springing from the presence of a fourteen-year-old in the lab late at night. Minsky had to stand up for the kid. “It sort of drove them crazy,” Silver later reflected, “because this kid would just sort of screw around for a few weeks and the computer would start doing the thing they were working on that was really hard, and they were having difficulties and they knew they would never really fully solve [the problem] and couldn’t implement it in the real world. And it was all of a sudden happening and I pissed them off. They’re theorizing all these things and I’m rolling up my sleeves and doing it . . . you find a lot of that in hacking in general. I wasn’t approaching it from either a theoretical point of view or an engineering point of view, but from sort of a fun-ness point of view. Let’s make this robot wiggle around in a fun, interesting way. And so the things I built and the programs I wrote actually did something. And in many cases they actually did the very things that these graduate students were trying to do.”
Eventually the grad students calmed down about Silver. But the schism was constant. The grad students viewed the hackers as necessary but juvenile technicians. The hackers thought that grad students were ignoramuses with their thumbs up their asses who sat around the eighth floor blindly theorizing about what the machine was like. They wouldn’t know what The Right Thing was if it fell on them. It was an offensive sight, these incompetents working on Officially Sanctioned Programs which would be the subjects of theses and then tossed out (as opposed to hacker programs, which were used and improved upon). Some of them had won their sanctions by snow-jobbing professors who themselves knew next to nothing about the machines. The hackers would watch these people “spaz out” on the PDP-6 and rue the waste of perfectly good machine time.
One of these grad students, in particular, drove the hackers wild—he would make certain mistakes in his programs that would invariably cause the machine to try to execute faulty instructions, so-called “unused op-codes.” He would do this for hours and days on end. The machine had a way of dealing with an unused op-code—it would store it in a certain place and, assuming you meant to define a new op-code, get ready to go back to it later. If you didn’t mean to redefine this illegal instruction, and proceeded without knowing what you’d done, the program would go into a loop, at which point you’d stop it, look over your code, and realize what you’d done wrong. But this student, whom we will call Fubar in lieu of his long-forgotten name, could never understand this, and kept putting in the illegal instructions. Which caused the machine to loop wildly, constantly executing instructions that didn’t exist, waiting for Fubar to stop it. Fubar would sit there and stare. When he got a printout of his program, he would stare at that. Later on, perhaps, after he got the printout home, he would realize his mistake, and come back to run the program again. Then he’d make the same error. And the hackers were infuriated because by taking his printout home and fixing it there all the time, he was wasting the PDP-6—doing thumb-sucker, IBM-style batch processing instead of interactive programming. It was the equivalent of cardinal sin.
So one day Nelson got into the computer and made a hack that would respond to that particular mistake in a different way. People made sure to hang around the next time Fubar was signed up for the machine. He sat down at the console, taking his usual, interminably long time to get going, and sure enough, within a half hour, he made the same stupid mistake. Only this time, on the display screen, he saw that the program was not looping, but displaying the part of his code which had gone wrong. Right in the middle of it, pointing to the illegal instruction he’d put in, was a huge, gleaming, phosphorescent arrow. And flashing on the screen was the legend, “Fubar, you lose again!”
Fubar did not respond graciously. He wailed about his program being vandalized. He was so incensed that he completely ignored the information that Nelson’s hack had given him about what he was doing wrong and what he might do to fix it. He was not, as the hackers had somehow hoped, thankful that this wonderful feature had been installed to help him find the error of his ways. The brilliance of the hack had been wasted on him.
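The trap-and-patch mechanism can be sketched with a toy interpreter. This is not PDP-6-accurate; every name below is invented for the illustration. The point is the shape of the hack: undefined op-codes fall into a trap handler, and what the machine does next depends entirely on which handler is installed, so Nelson only had to swap the handler.

```python
# Toy sketch of the unused-op-code trap. Unknown op-codes land in a
# trap handler; the stock handler just reports that it is waiting on
# an undefined op, while "Nelson's patch" points at the offending
# instruction instead.

def run(program, opcodes, trap_handler):
    """Execute a list of (opcode, arg) pairs against a dict of defined ops."""
    acc = 0
    for pc, (op, arg) in enumerate(program):
        if op in opcodes:
            acc = opcodes[op](acc, arg)
        else:
            return trap_handler(pc, op)   # illegal instruction: trap out
    return acc

OPCODES = {"ADD": lambda acc, n: acc + n,
           "SUB": lambda acc, n: acc - n}

def stock_trap(pc, op):
    # The machine's default: stash the op-code and wait for a definition
    # (here we just report the loop condition).
    return f"looping on undefined op {op!r} at instruction {pc}"

def nelson_trap(pc, op):
    # The hack: show exactly where the program went wrong.
    return f"--> instruction {pc}: illegal op {op!r}. Fubar, you lose again!"

program = [("ADD", 5), ("FROB", 1), ("SUB", 2)]
print(run(program, OPCODES, stock_trap))
print(run(program, OPCODES, nelson_trap))
```

Under the stock handler the bad program simply hangs on the undefined `FROB`; under the patched handler the same mistake yields a pointer to the exact failing instruction, which is the diagnostic Fubar ignored.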
• • • • • • • •
The hackers had a word to describe those graduate students. It was the same word they used to describe almost anyone who pretended to know something about computers and could not back it up with hacker-level expertise. The word was “loser.” The hackers were “winners.” It was a binary distinction: people around the AI lab were one or the other. The sole criterion was hacking ability. So intense was the quest to improve the world by understanding and building systems that almost all other human traits were disregarded. You could be fourteen years old and dyslexic, and be a winner. Or you could be bright, sensitive, and willing to learn, and still be considered a loser.
To a newcomer, the ninth floor was an intimidating, seemingly impenetrable passion palace of science. Just standing around the likes of Greenblatt or Gosper or Nelson could give you goose bumps. They would seem the smartest people in the world. And since only one person at a time could use the PDP-6, it took a lot of guts to sit down and learn things interactively. Still, anybody who had the hacker spirit in him would be so driven to compute that he would set self-doubt aside and begin writing programs.