by Matthew Lyon
Vint was a wiry, intense, effusive kid. He joined his high school ROTC unit to avoid gym class. On the days he didn’t show up at school in his ROTC uniform, Vint wore a jacket and tie. And he always toted a large brown briefcase. By local standards, it was an unusual mode of dress, even in the late 1950s. “I used the coat and tie to distinguish myself from the crowd—maybe a nerd’s way of being different,” he recalled. Nonetheless, much to the consternation of his friends, Vint never had trouble attracting the attention of the opposite sex. He was, everyone agreed, one of a kind.
From an early age, Vint aspired to match the accomplished track record of his father, who had risen through the ranks to become a senior executive at North American Aviation (now Rockwell International). Both of Vint’s younger brothers played football and took turns as president of the student body. Vint was the bookworm. His literary tastes tilted toward fantasy. Well into his adult life, he regularly set aside several days to reread The Lord of the Rings trilogy. Vint did particularly well in chemistry, but his passion was math. When Steve Crocker started the math club at Van Nuys High, Vint was one of the first to join.
As a result of premature birth, Vint was hearing-impaired. Although hearing aids in both ears later corrected much of the deficit, he grew up devising clever strategies for communicating in the hearing world. Years later, after they became friends, Bob Kahn brought some of Cerf’s aural tricks to his friends’ attention, and Cerf eventually wrote a paper called “Confessions of a Hearing-Impaired Engineer,” in which he shared some of his secrets.
In particularly noisy environments (cafeterias, restaurants, and homes with dogs and small children), the deaf person’s reliance on conversational context often suffers badly. A typical strategy here is to dominate the conversation, not by doing all the talking, but by asking a lot of questions. In this way, the deaf listener will at least know what question the speaker is addressing, even if he cannot hear all of the response. In a group conversation, this can backfire embarrassingly if the question you ask is one which was just asked by someone else. A variation (equally embarrassing) is to enthusiastically suggest something just suggested, for example:
Friend A: I wonder what the origin of this term is?
Friend B: Why don’t we look it up in The Oxford English Dictionary?
Friend A: Yeah, but too bad we don’t have an O.E.D.
Cerf: I know. Why don’t we look it up in The Oxford English Dictionary?
Steve Crocker drifted in and out of Vint’s life. Steve’s parents were divorced, and he spent his high school years shuttling between suburban Chicago and the San Fernando Valley. Always precocious, Steve grew up knowing he was probably the smartest kid in any given room. At age thirteen, while home one day with a cold, he taught himself the elements of calculus. And at the end of tenth grade, he learned the rudiments of computer programming. “I remember being thrilled when I finally understood the concept of a loop,” Crocker recalled, “which enabled the computer to proceed with a very lengthy sequence of operations with only a relatively few instructions. I was a bit callow, but I remember thinking this was the kind of revelation that must have led Archimedes to run down the street naked yelling, ‘Eureka!’”
Around 1960, when Steve had returned to L.A., Vint followed him into the computer lab at UCLA. Although still in high school, Steve had gotten permission to use the UCLA computer, but the only free time he and Vint had was on the weekends. One Saturday they arrived to find the computer lab building locked. “I couldn’t see any choice but to give up and go home,” said Crocker. But they looked up and saw an open second-story window. They looked at each other. “Next thing I know, Vint is on my shoulders,” Crocker recalled. Cerf went through the window and, once inside, opened the door and taped the latch so they could get in and out of the building. “When the Watergate burglars did the same thing a dozen years later and got caught, I shuddered,” said Crocker.
After high school, Cerf attended Stanford on a four-year scholarship from his father’s company. He majored in math but soon got hooked on serious computing. “There was something amazingly enticing about programming,” he said. “You created your own universe and you were the master of it. The computer would do anything you programmed it to do. It was this unbelievable sandbox in which every grain of sand was under your control.”
After graduating in 1965, Cerf decided he wanted to work for a while before going on to graduate school. IBM was recruiting on the Stanford campus, and Cerf took a job at IBM in Los Angeles. He went to work as the systems engineer for an IBM time-sharing system. Realizing he needed better grounding in computer science, he soon joined his friend Crocker, now a graduate student in UCLA’s computer science department. Computer science was still a young discipline, and UCLA’s Ph.D. program—one of the first in the country—was one of only a dozen in existence at the time. Cerf arrived just as Crocker was leaving for MIT. Crocker’s thesis advisor at UCLA was Jerry Estrin, the same professor Paul Baran had worked with a few years earlier. Estrin had an ARPA contract for the “Snuper Computer,” which used one computer to observe the execution of programs running on a second machine. Estrin took on Cerf as a research student for the project; it became the basis for Cerf’s doctoral thesis. In the summer of 1968 Crocker returned to UCLA and joined Cerf in Estrin’s group.
For both Cerf and Crocker, 1968 marked the beginning of a lifelong fascination with the networking of computers. For Cerf, computer networking would become the centerpiece of his professional career. Although Crocker would move on to other things for long stretches at a time, he too would eventually return to the field of networking.
In the fall of 1968, ARPA transferred its contract from Estrin to Len Kleinrock at UCLA. Kleinrock was setting up his Network Measurement Center, with a $200,000 annual contract from ARPA. By coincidence, when Kleinrock got the contract, the person in the office next door moved out, so Kleinrock expanded his domain; he tore down the wall between the two offices and installed a large conference table for meetings with students and staff. The meetings were frequent as Kleinrock busily built a small empire.
In planning the ARPA network, Larry Roberts had conceived of the Network Measurement Center as the organization that would be responsible for most of the performance testing and analysis. The measurement center was intended to be roughly analogous to a test track where drivers push the outer limits of high-performance cars. Kleinrock and his group were in charge of gathering data—total network response time, traffic density, delays, and capacity—the measures needed to evaluate how the network was performing. Like Bob Kahn, Kleinrock had a theoretician’s bent; his business was simulation, modeling, and analysis. Through simulations, he had come as close as he could to monitoring the ways in which networks perform without actually having a network to run. He welcomed the chance to test his theories on the real thing.
The engineers at BBN didn’t pay too much attention to Kleinrock. They thought he was a trifle heavy on theory and fairly light on engineering. The skepticism was mutual, for Kleinrock believed that the BBN team was largely uninterested in performance. BBN’s programmers were outstanding, but, said Kleinrock, “By and large, a programmer simply wants to get a piece of software that works. That’s hard enough. Whether it works efficiently or well is not usually the issue.” He was unaware, perhaps, of Walden and Crowther’s obsession with software efficiency, but in any case, perfecting network performance, Kleinrock decided, was his job.
Before long Kleinrock was managing forty students who helped run the center. Crocker and Cerf were among the senior members of Kleinrock’s group. Another important member was Jon Postel. He had a long bushy beard, wore sandals year-round and had never put on a tie in his life. Always dapper and generally more conservative, Cerf presented a striking contrast to Postel’s steadfastly casual appearance. Crocker, the unofficial leader, was somewhere in the middle. He had grown a beard at MIT (“Cops looked at me a little harder, but girls were a lot friendlier, and that was a trade-off I could live with,” Crocker said), but was willing to put on a pair of dress shoes every now and then.
While Cerf and Crocker were academic stars, Postel, who was twenty-five, had had a more checkered academic career. He had grown up in nearby Glendale and Sherman Oaks, and he too had attended Van Nuys High School, where his grades were mediocre. Postel’s interest in computers developed at a local community college. By the time he got to UCLA to finish an undergraduate degree in engineering (the closest thing to computer science at the time), computing was his life. UCLA eventually decided to establish computer science as a formal department, at just about the time Postel was entering the university’s graduate school. Postel was quiet, but he had strong opinions. The people running the computer science department occasionally interpreted the firmness of Postel’s opinions as a bad attitude.
In 1966 Cerf had married a young illustrator named Sigrid. She was profoundly deaf. Their first meeting had been contrived by their hearing-aid dealer, who scheduled adjacent appointments for them one Saturday morning in hopes that they would cross paths and hit it off. They went to lunch and Sigrid was awestruck by her companion’s eclectic curiosity. Vint seemed to dance in his chair with excitement as he described his work with computers. They extended their tête-à-tête with a visit to the Los Angeles County Museum of Art to see some of Sigrid’s favorite paintings. Unschooled in art but eager to learn, Cerf stared for a long time at a huge Kandinsky. “This thing reminds me of a green hamburger,” he finally remarked. A year later they were married, with Steve Crocker as Vint’s best man (roles that would be reversed a few years later). Crocker’s electronics expertise came in handy when, minutes before the ceremony was to begin, he discovered the tape recorder for the wedding music was malfunctioning. Best man and frantic groom retreated to a tiny room near the altar and fixed it just in time.
Kleinrock, although only ten years older than the rest of his group, had a great reputation in queueing theory (the study of how long people and things spend waiting in lines, how long the lines get, and how to design systems to reduce waiting). He had already published a book and he was in charge of a growing lab; his energy seemed boundless. Moreover, he was one of just a handful of scientists who had produced analytic models of store-and-forward networks before Roberts got started on the ARPA project.
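To give a concrete sense of the arithmetic queueing theory trades in, here is a minimal sketch, in Python, of the textbook single-server “M/M/1” formulas relating arrival rate and service rate to average backlog and delay. The model choice and the numbers are purely illustrative; they are not drawn from Kleinrock’s actual network models.

    # Illustrative only: average backlog and delay for a single M/M/1 queue,
    # the simplest textbook model in queueing theory. The rates below are
    # made-up numbers, not measurements from any real network.
    def mm1_stats(arrival_rate, service_rate):
        """Return (utilization, avg_in_system, avg_time_in_system) for an M/M/1 queue."""
        if arrival_rate >= service_rate:
            raise ValueError("queue is unstable: arrivals outpace service")
        rho = arrival_rate / service_rate               # fraction of time the server is busy
        avg_in_system = rho / (1.0 - rho)               # L = rho / (1 - rho)
        avg_time = 1.0 / (service_rate - arrival_rate)  # W = 1 / (mu - lambda); Little's law: L = lambda * W
        return rho, avg_in_system, avg_time

    # Example: packets arrive at 80 per second, the line can serve 100 per second.
    print(mm1_stats(80.0, 100.0))   # -> (0.8, 4.0, 0.05), i.e. 50 ms average delay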
At the time, the UCLA computer science department owned a computer made by Scientific Data Systems called the Sigma-7, the latest in that firm’s line of computers. UCLA also had three major computer centers equipped with IBM 7094 mainframes. But the Sigma-7 was the machine assigned to the graduate students. No one liked the Sigma-7 much. It was unreliable and difficult to program. As a member of the UCLA team put it, the Sigma-7 was a dog. (“But it was our dog,” Cerf said years later.) It was also the only computer they had to play with—until, that is, the ARPA network came along. Not only would the computer scientists at UCLA be receiving the first IMP, but presumably the network would open doors to all kinds of different host machines at the other sites.
The most pressing task in the summer of 1969 was to build the interface—a combination of hardware and software—between the Sigma-7 and the IMP. As the UCLA guys understood it, BBN was working out some specifications for how to construct such a connection. The host-to-IMP interface had to be built from scratch each time a new site was established around a different computer model. Later, sites using the same model could purchase copies of the custom interface.
Nearly as urgent was the more far-reaching challenge of writing the software that allowed host computers throughout the network to communicate with one another. This was to be the host-to-host protocol, a very broad-based set of operating terms that would be common to all machines. It had to be like a traveler’s check: good anywhere and able to support a gamut of applications, from remote log-ins to file transfers to text processing. Inventing it wouldn’t be easy.
The Search for Protocols
In the summer of 1968, a small group of graduate students from the first four host sites—UCLA, SRI, UC Santa Barbara, and the University of Utah—had met in Santa Barbara. They knew that the network was being planned, but they’d been given few details beyond that. Still, networking in general, and the ARPA experiment in particular, were hot topics.
The meeting was seminal, if only because of the enthusiasm it generated. “We had lots of questions—how IMPs and hosts would be connected, what hosts would say to each other, and what applications would be supported,” Crocker said. “No one had any answers, but the prospects seemed exciting. We found ourselves imagining all kinds of possibilities—interactive graphics, cooperating processes, automatic database query, electronic mail—but no one knew where to begin.”
From that meeting emerged a corps of young researchers devoted to working on, thinking through, and scheming about the network’s host-to-host communications. To speed up the process, they decided to meet regularly. Theoretically, a computer network would cut down on some of the ARPA-funded travel, but before long Crocker was traveling enough that Kleinrock had to procure a separate travel budget for him.
A month or so after the new group began meeting, it became clear to Crocker and others that they had better start accumulating notes on the discussions. If the meetings themselves were less than conclusive, perhaps the act of writing something down would help order their thoughts. Crocker volunteered to write the first minutes. He was an extremely considerate young man, sensitive to others. “I remember having great fear that we would offend whoever the official protocol designers were.” Of course, there were no official protocol designers, but Crocker didn’t know that. He was living with friends at the time and worked all night on the first note, writing in the bathroom so as not to wake anyone in the house. He wasn’t worried about what he wanted to say so much as he wanted to strike just the right tone. “The basic ground rules were that anyone could say anything and that nothing was official.”
To avoid sounding too declarative, he labeled the note “Request for Comments” and sent it out on April 7, 1969. Titled “Host Software,” the note was distributed to the other sites the way all the first Requests for Comments (RFCs) were distributed: in an envelope with the lick of a stamp. RFC Number 1 described in technical terms the basic “handshake” between two computers—how the most elemental connections would be handled. “Request for Comments,” it turned out, was a perfect choice of titles. It sounded at once solicitous and serious. And it stuck.
“When you read RFC 1, you walked away from it with a sense of, ‘Oh, this is a club that I can play in too,’” recalled Brian Reid, later a graduate student at Carnegie-Mellon. “It has rules, but it welcomes other members as long as the members are aware of those rules.” The language of the RFC was warm and welcoming. The idea was to promote cooperation, not ego. The fact that Crocker kept his ego out of the first RFC set the style and inspired others to follow suit in the hundreds of friendly and cooperative RFCs that followed. “It is impossible to underestimate the importance of that,” Reid asserted. “I did not feel excluded by a little core of protocol kings. I felt included by a friendly group of people who recognized that the purpose of networking was to bring everybody in.” For years afterward (and to this day) RFCs have been the principal means of open expression in the computer networking community, the accepted way of recommending, reviewing, and adopting new technical standards.
Before long, the assemblage began calling itself the Network Working Group, or NWG. It was a high commission for the country’s young and exceptionally talented communication programmers. Its main challenge was to agree in principle about protocols—how to share resources, how to transfer data, how to get things done. In real terms, that meant writing programs, or at least adopting certain rules for the way programs got written, rules to which a majority could consent. Agreement was the sine qua non. This was a community of equals. They could all write code—or rewrite the code someone else had written. The NWG was an adhocracy of intensely creative, sleep-deprived, idiosyncratic, well-meaning computer geniuses. And they always half-expected, any day, to be politely thanked for their work and promptly replaced by others whom they imagined to be the field’s true professionals. There was no one to tell them that they were as official as it got. The RFC, a simple mechanism for distributing documentation open to anybody, had what Crocker described as a “first-order effect” on the speed at which ideas were disseminated, and on spreading the networking culture.
Anticipating the construction of the network, the Network Working Group continued meeting regularly, and new terms and inventions often emerged by consensus. The very word “protocol” found its way into the language of computer networking based on the need for collective agreement among network users. For a long time the word has been used for the etiquette of diplomacy and for certain diplomatic agreements. But in ancient Greek, protokollon meant the first leaf of a volume, a flyleaf attached to the top of a papyrus scroll that contained a synopsis of the manuscript, its authentication, and the date. Indeed, the word referring to the top of a scroll corresponded well to a packet’s header, the part of the packet containing address information. But a less formal meaning seemed even more fitting. “The other definition of protocol is that it’s a handwritten agreement between parties, typically worked out on the back of a lunch bag,” Cerf remarked, “which describes pretty accurately how most of the protocol designs were done.”
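By way of a rough modern sketch, a packet header is nothing more than a few fixed fields glued to the front of the data, much as the protokollon was glued to the front of the scroll. The field layout below is invented purely for illustration; it is not the ARPANET’s actual host-to-IMP message format.

    import struct

    # Illustrative only: a made-up packet format with a small fixed header
    # (source address, destination address, payload length) in front of the data.
    HEADER_FORMAT = "!BBH"          # 1-byte source, 1-byte destination, 2-byte payload length
    HEADER_SIZE = struct.calcsize(HEADER_FORMAT)

    def make_packet(src, dst, payload):
        """Prepend the header (the 'first leaf') to the payload bytes."""
        return struct.pack(HEADER_FORMAT, src, dst, len(payload)) + payload

    def parse_packet(packet):
        """Read the header back off the front and return (src, dst, payload)."""
        src, dst, length = struct.unpack(HEADER_FORMAT, packet[:HEADER_SIZE])
        return src, dst, packet[HEADER_SIZE:HEADER_SIZE + length]

    pkt = make_packet(1, 4, b"LOGIN")   # e.g. host 1 (UCLA) to host 4 (Utah)
    print(parse_packet(pkt))            # -> (1, 4, b'LOGIN')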
But the first few meetings of the Network Working Group were less than productive. Over the course of the spring and summer of 1969, the group continued struggling with the problems of host-protocol design. Everyone had a vision of the potential for intercomputer communication, but no one had ever sat down to construct protocols that could actually be used. It wasn’t BBN’s job to worry about that problem. The only promise anyone from BBN had made about the planned-for subnetwork of IMPs was that it would move packets back and forth, and make sure they got to their destination. It was entirely up to the host computer to figure out how to communicate with another host computer or what to do with the messages once it received them. This was called the “host-to-host” protocol.