Baran felt that it was better to wait until “a competent organization came along.” And with that, after five years of struggle, Paul Baran shifted his attention to other things. It was, in no small way, a visit back to the “Are we rich or are we poor?” question he had posed decades earlier to his parents, whose starkly contrasting answers helped him understand that most things in life are a matter of perspective.
In London in the autumn of 1965, just after Baran halted work on his project, Donald Watts Davies, a forty-one-year-old physicist at the British National Physical Laboratory (NPL), wrote the first of several personal notes expounding on some ideas he was playing with for a new computer network much like Baran’s. He sent his set of notes to a few interested people, but—certain he would encounter stiff resistance from the authorities in charge of the British Post Office’s telephone service monopoly—mostly he kept his ideas to himself. Davies wanted time to validate his concepts. By the following spring, confident that his ideas were sound, he gave a public lecture in London describing the notion of sending short blocks of data—which he called “packets”—through a digital store-and-forward network. As the meeting was breaking up, a man from the audience approached Davies and said that he was from the Ministry of Defence. He told Davies about some remarkably similar work that had been circulated in the American defense community by a man named Paul Baran. Davies had never heard of Baran or his RAND studies.
Donald Davies was the son of working-class parents. His father, a clerk at a coal mine in Wales, died the year after Donald and his twin sister were born. Their mother moved her young family to Portsmouth, a British naval port, where she went to work as a counter clerk in the post office. Donald experimented with radio at a young age and took an early interest in physics. He was not yet fourteen the day his mother brought home a book, something an engineer had left behind at the post office, all about telephony. The technical volume described the logic and design of telephone-switching systems and “made hours of fascinating reading,” Davies recalled years later.
An exceptional student, Davies was offered scholarships to several universities. To celebrate its star student, his school declared a half-day holiday. “For a short time I was the most popular boy in the school,” he recalled. Davies chose the University of London’s Imperial College, and by the time he was twenty-three had earned degrees in physics and mathematics. In 1947 he joined a team of scientists led by the mathematician Alan Turing at the National Physical Laboratory, where he played a leading part in building the Pilot ACE, at the time the fastest digital computer in England. In 1954 Davies won a fellowship to spend a year in the United States; part of that year, he was at MIT. He then returned to England, rose swiftly at the NPL, and in 1966, after describing his pioneering work on packet-switching, he was appointed head of the computer science division.
The technical similarity between Davies’ and Baran’s work was striking. Not only were their ideas roughly parallel in concept, but by coincidence they had even chosen the same packet size and data-transmission rate. Independently, Davies also came up with a routing scheme that was adaptive, like Baran’s, but different in detail.
There was just one major difference in their approaches. The motivation that led Davies to conceive of a packet-switching network had nothing to do with the military concerns that had driven Baran. Davies simply wanted to create a new public communications network. He wanted to exploit the technical strengths he saw in digital computers and switches, to bring about highly responsive, highly interactive computing over long distances. Such a network would have greater speed and efficiency than existing systems. Davies was concerned that circuit-switched networks were poorly matched to the requirements of interacting computers. The irregular, bursty characteristics of computer-generated data traffic did not fit well with the uniform channel capacity of the telephone system. Matching the network design to the new types of data traffic became his main motivation.
Rather than conduct the kind of redundancy and reliability studies to which Baran had devoted so much time, Davies focused on working out the details of configuring the data blocks. He also foresaw the need to overcome differences in computer languages and machine-operating procedures—differences in hardware and software—that would exist in a large public network. He envisioned the day when someone would sit down at one kind of computer and interact with a machine of a different kind somewhere else. To bridge the gap between the widely divergent computer systems of the era, he began outlining the features of an intermediary device—a new computer—that would serve as a translator, assembling and disassembling digital messages for the other machines.
The idea of splitting messages into uniform data “packets”—each the length of one typical line of text—was something Davies hit upon after studying advanced time-sharing systems and how they allocated processing time to multiple users. On a trip to the United States in 1965 he had observed MIT’s Project MAC time-sharing system. A few months later, the NPL had played host in London to a group from MIT, including Larry Roberts, for further discussions about time-sharing. It was in those meetings that the packet idea first struck Davies. In contrast to the chilly reception AT&T had given Baran, the British telecommunications establishment embraced Davies’ ideas. He was encouraged to seek funding for an experimental network at NPL.
Time-sharing systems had already solved the nagging problem of slow turnaround time in batch processing by giving each user a slice of computer processing time. Several people could be running jobs at once without noticing any significant delay in their work. Analogously, in a digital communications network, a computer could slice messages into small pieces, or packets, pour those into the electronic pipeline, and allow users to share the network’s total capacity. Davies, like Baran, saw in the digital age the feasibility for a new kind of communications network.
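Davies’ idea is easy to picture with a small modern sketch, offered purely as illustration rather than as anything historical: a message is sliced into fixed-size, sequence-numbered packets that can be interleaved with other users’ traffic on the same line and reassembled at the far end. The 128-byte packet size and the field names below are arbitrary choices for the example, not figures from Davies’ proposal.

```python
# Illustrative sketch of packetizing and reassembling a message, in the
# spirit of Davies' store-and-forward idea. Packet size and field names
# are invented for this example.

PACKET_SIZE = 128  # bytes of payload per packet (arbitrary for illustration)

def packetize(message: bytes) -> list[dict]:
    """Slice a message into sequence-numbered packets."""
    return [
        {"seq": i, "payload": message[offset:offset + PACKET_SIZE]}
        for i, offset in enumerate(range(0, len(message), PACKET_SIZE))
    ]

def reassemble(packets: list[dict]) -> bytes:
    """Rebuild the original message from packets, whatever order they arrived in."""
    return b"".join(p["payload"] for p in sorted(packets, key=lambda p: p["seq"]))

message = b"highly interactive computing over long distances " * 10
packets = packetize(message)
packets.reverse()  # simulate out-of-order arrival
assert reassemble(packets) == message
```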
Davies’ choice of the word “packet” was very deliberate. “I thought it was important to have a new word for one of the short pieces of data which traveled separately,” he explained. “This would make it easier to talk about them.” There were plenty of other possibilities—block, unit, section, segment, frame. “I hit on the word packet,” he said, “in the sense of small package.” Before settling on the word, he asked two linguists from a research team in his lab to confirm that there were cognates in other languages. When they reported back that it was a good choice, he fixed on it. Packet-switching. It was precise, economic, and very British. And it was far easier on the ear than Baran’s “distributed adaptive message block switching.” Davies met Baran for the first time several years later. He told Baran that he had been thoroughly embarrassed to hear of Baran’s work after he had finished his own, and then added, “Well, you may have got there first, but I got the name.”
Mapping It Out
In December 1966, when Larry Roberts arrived at the Pentagon, he knew Donald Davies from his trip to London the previous year, but didn’t know about Davies’ subsequent work in packet switching. And he had never heard the name Paul Baran.
A few years earlier, Roberts had decided that computing was getting old and everything worth doing inside a computer had already been done. This had come to him as a revelation at a 1964 conference held in Homestead, Virginia, where Roberts, Licklider, and others stayed up until the small hours of the morning talking about the potential of computer networks. Roberts left the meeting resolved to begin work on communications between computers.
His first opportunity came a year later when he oversaw one of the first real experiments in uniting disparate machines over long distances. In 1965 psychologist Tom Marill, who had studied under Licklider and who was similarly entranced by computers, started a small time-sharing company, Computer Corporation of America (CCA). When Marill’s largest investor backed out at the last minute, Marill cast about for some R&D work. He proposed to ARPA that he conduct a networking experiment tying Lincoln’s TX-2 computer to the SDC Q-32 in Santa Monica. Marill’s company was so small, however, that ARPA recommended carrying out his experiment under the aegis of Lincoln Laboratory. Officials at Lincoln liked the idea and put Larry Roberts in charge of overseeing the project.
The objective was clear. As Marill argued in a letter to Roberts in 1965, computing had reached an unfortunate state of affairs; time-sharing projects were proliferating, but there was no “common ground for exchange of programs, personnel, experience, or ideas.” His impression of the computer science community was “of a number of essentially similar projects, each going off in its own direction with complete disregard for the others.” Why waste resources?
As far as it went, the TX-2 experiment was ambitious. The link between the two computers was made using a special Western Union four-wire full-duplex service (full duplex provides simultaneous transmission in both directions between two points). To this Marill attached a crude form of modem, operating at 2,000 bits per second, which he called an automatic dialer. By directly linking the machines, Marill got around the problem of incompatibilities between them. The idea was to connect the computers like a pair of Siamese twins, running programs in situ. Though not transferring files back and forth, the experiment did allow the machines to send messages to each other. Marill set up a procedure for grouping characters into messages, sending them across the link, and checking to see if the messages arrived. If no acknowledgment followed, the message was retransmitted. Marill referred to the set of procedures for sending information back and forth as a message “protocol,” prompting a colleague to inquire, “Why do you use that term? I thought it referred to diplomacy.”
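Marill’s procedure amounts to what would later be called a stop-and-wait scheme: send a block, wait for an acknowledgment, and retransmit if none arrives. A rough sketch of that logic follows; the function names and the simulated lossy link are invented for illustration and are not drawn from the actual TX-2/Q-32 software.

```python
import random

# Rough sketch of a send / acknowledge / retransmit loop in the spirit of
# Marill's message "protocol." The lossy link is simulated; nothing here
# reproduces the 1965 experiment itself.

def send_over_link(message: str) -> bool:
    """Pretend to transmit a message; return True if an acknowledgment comes back."""
    return random.random() > 0.3  # assume roughly 30% of acknowledgments go missing

def send_with_retransmit(message: str, max_attempts: int = 5) -> bool:
    """Keep retransmitting the message until it is acknowledged or we give up."""
    for attempt in range(1, max_attempts + 1):
        if send_over_link(message):
            print(f"acknowledged after {attempt} attempt(s)")
            return True
        print(f"no acknowledgment, retransmitting (attempt {attempt})")
    return False

send_with_retransmit("HELLO FROM TX-2")
```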
In a 1966 report summarizing the initial results of the experiment, Marill wrote that he could “foresee no obstacle that a reasonable amount of diligence cannot be expected to overcome.” Diligence notwithstanding, when Marill and Roberts actually connected the two machines, the results were mixed. The connection itself worked as planned. But the reliability of the connection and the response time were, as Roberts would describe them several years later, just plain lousy.
Bringing together two different computers was one thing, but the project for which Roberts had been pulled away from Lincoln to work at ARPA was another, much greater challenge. Interconnecting a matrix of machines, each with distinct characteristics, would be exceedingly complicated. To pull it off was probably going to require calling on every expert Roberts knew in every area of computing and communications.
Fortunately, Roberts’s circle of colleagues was wide. One of his best friends from Lincoln Laboratory, with whom he had worked on the TX-2, was Leonard Kleinrock, a smart and ambitious engineer who had attended MIT on a full scholarship. If anyone influenced Roberts in his earliest thinking about computer networks, it was Kleinrock.
Kleinrock’s dissertation, proposed as early as 1959, was an important theoretical work that described a series of analytical models of communication networks. And in 1961, while working with Roberts, Kleinrock had published a report at MIT that analyzed the problem of data flow in networks. Kleinrock had also worked on random routing procedures, and had some early thoughts on dividing messages into blocks for efficient use of communication channels. Now Kleinrock was at UCLA, and Roberts gave him an ARPA contract to set up the official Network Measurement Center there, a lab devoted to network performance testing.
The friendship between Roberts and Kleinrock went well beyond the professional interests they shared. Brain teasers were one common interest. Money-making schemes were another. Each reinforced the other’s openness to fiscal adventure. Those who thought Roberts was all work and no play had never seen him in action with his friends.
Roberts and Kleinrock were inveterate casino gamblers. Roberts developed a “high-low” counting scheme for blackjack and taught it to Kleinrock. They never made it into the official rogues’ gallery of blacklisted card counters, but more than once, they were spotted by casino detectives and asked to leave.
And in another daring episode, Roberts and Kleinrock cooked up a plan to cash in on the physics of roulette. The idea was to predict, employing rudimentary laws of motion, just when the ball would fall off its trajectory. To do this, they needed to know the speed of the ball, which traveled in one direction, and the speed of the wheel, which traveled in the other. They decided to build a small machine that would make the predictions, but they needed a little data. So Roberts got a tape recorder, put a microphone in his hand, and made a cast that made it appear he had a broken wrist. The two sat down at the table and Roberts placed his hand next to the wheel to record the sound of the passing ball, from which they could later extrapolate its speed. Kleinrock’s job was to distract the pit boss by playing several rounds of roulette. “Everything was working fine except for one thing,” Kleinrock said. “I started winning. It drew attention to me. The pit boss looks over and sees this guy with a broken arm with his hand near the roulette wheel, and he grabs Larry’s arm and says, ‘Let me see your arm!’ Larry and I made fast tracks out of there.”
• • •
Roberts agreed with Taylor that fast response time for the network was critical, because a low message delay time was crucial to interactivity. Anyone who had used time-sharing systems that passed data over standard communications lines knew how sluggish they could be. Data traveled to and from the main computer at excruciatingly slow rates of a few hundred bits per second. Retrieving or sending even a small amount of information was a process that left plenty of time to pour yourself a cup of coffee, or even brew an entire pot, while the modem churned away. No one wanted a sluggish network.
During an early meeting of the loose group of advisors Roberts had assembled, someone banged his fist on the table and said, “If this network can’t give me a one-second response, it’s no good.” Optimistically, a half-second response time was written into the requirements. The second priority, of course, was reliability. If a network was to be effective, users needed complete confidence in its ability to send data back and forth without snafus.
Another source of consternation was the question of how the network would be mapped out. Several people had proposed that the resource sharing be done on a single centralized computer, sitting in, say, Omaha, a popular place for long-distance telephone switches because it lay at the nation’s geographic center. If centralization made sense for a telephone network, why not for a computer network? Perhaps the network should use dedicated phone lines—a question that was still unresolved—which would help keep costs uniform. Baran had avoided a centralized system because centralization increased a network’s vulnerability. Roberts, too, was opposed to a centralized approach, but decided to delay his final decision until he could bring up the topic with a large group. His chance came soon, at a meeting for ARPA’s principal investigators in Ann Arbor, Michigan, in early 1967.
Taylor had called the meeting, and the principal item on the agenda was the networking experiment. Roberts laid out his initial plan. The idea, as he described it, was to connect all of the time-sharing computers to one another directly, over dial-up telephone lines. The networking functions would be handled by the “host” computers at each site. So, in other words, the hosts would do double duty, as both research computers and as communications routers. The idea was greeted with little enthusiasm. People from the proposed host sites foresaw no end of trouble. No one wanted to relinquish an unknown portion of valuable computing resources to administer a network about which they were hardly excited. Then there were dozens of idiosyncratic variations to cope with, not the least of which was the fact that each machine spoke a language substantially different from the others. It seemed nearly impossible to standardize around one set of protocols.
The Ann Arbor meeting revealed the lack of enthusiasm, if not downright hostility, to Taylor and Roberts’s proposal. Few ARPA principal investigators wanted to participate in the experiment. This attitude was especially pronounced among researchers from the East Coast universities, who saw no reason to link up with campuses in the West. They were like the upper-crust woman on Beacon Hill who, when told that long-distance telephone service to Texas was available, echoed Thoreau’s famous line: “But what would I possibly have to say to anyone in Texas?”
Douglas Engelbart, a computer scientist at Stanford Research Institute (SRI) in 1967, remembered the meeting clearly. “One of the first reactions was, ‘Oh hell, here I’ve got this time-sharing computer, and my resources are scarce as it is.’ Another reaction was, ‘Why would I let my graduate students get sucked off into something like this?’” Nonetheless, it quickly became clear just how serious Roberts was. First he tried to allay the skepticism about resource-sharing by pointing out that everyone had something of interest on his computer that others might want. “I remember one guy turning to the other and saying, ‘What have you got on your computer that I could use?’” Engelbart recalled, “And the other guy replied, ‘Well, don’t you read my reports?’” No one was taken with the idea. “People were thinking, ‘Why would I need anyone else’s computer when I’ve got everything right here?’” recalled Jon Postel, then a graduate student at UCLA. “What would they have that I want, and what would I have that I want anyone else to look at?”
An even more difficult problem lay in overcoming the communications barriers between disparate computers. How could anyone program the TX-2, for instance, to talk to the Sigma-7 at UCLA or the computer at SRI? The machines, their operating systems, their programming languages were all different, and heaven only knew what other incongruities divided them.