Dealers of Lightning

by Michael Hiltzik


  Obviously, then, programmers must conform to a system. They instruct the machine to follow a series of conditions (if such-and-such a condition is met, do this; otherwise do that. . . . If you have done that, and such-and-such a state also exists, do this; else that). But the conditions must themselves conform to logic that has been burned into the machine's circuits by the designer, or it will not comprehend. Computer programming is the process of telling a computer in its own language how to read and follow this cascade of "ifs." The programmer establishes a set of rules that happen to conform in a very fundamental way to the machine's own. It is the ultimate recursive endeavor, the joint discovery of rules and regulations leading to the invention of more rules and regulations that allow the machine to extend and expand its abilities and, consequently, those of its programmer and user. Alan Kay would become an expert in this partnership (and in a related field, the programming of programmers; but that lay far in the future).
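
  In present-day terms, that cascade of "ifs" is simply conditional branching. A minimal sketch in Python, purely illustrative (the scenario, names, and thresholds are invented here and are not drawn from Kay or from the book), might read:

```python
def advise(temperature_f, raining):
    """A tiny cascade of 'ifs': test a condition and act on it,
    or fall through to the next test, as the passage describes."""
    if temperature_f < 32:
        return "stay inside"        # first condition met: do this
    elif raining:                   # otherwise, test the next condition
        return "take an umbrella"
    else:                           # no condition held: the default action
        return "go for a walk"

print(advise(20, False))   # -> stay inside
print(advise(55, True))    # -> take an umbrella
```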

  But there is a marvelous catch: These logical rules and regulations can apply to any abstract conditions the programmer chooses to define. As Kay put it years later: "Computers' use of symbols, like the use of symbols in language and mathematics, is sufficiently disconnected from the real world to enable them to create splendid nonsense... Although the hardware of the computer is subject to natural laws (electrons can move through the circuits only in certain physically defined ways), the range of simulations the computer can perform is bounded only by the limits of human imagination. In a computer, spacecraft can be made to travel faster than the speed of light, time to travel in reverse."

  His enchantment with a system so rigidly structured yet infinitely malleable may have had to do with his childhood in the bosom of a close but itinerant family. One year after his birth in 1940 in Springfield, Massachusetts, the family had moved to Australia, his father's native land. Only four years later they were on the move again, fleeing a Japanese fleet that had already reached New Guinea and seemed prepared to continue its way south without resistance. Back in the United States the Kays took up residence in the Hadley, Massachusetts, farmhouse of Alan's maternal grandfather. He was Clifton Johnson, a writer, musician, and pioneering documentary photographer early in the century. In this farmhouse Kay's education began.

  Clifton Johnson had died the year of Alan's birth, inspiring a family fancy that the old man's inquisitive and creative temperament had been infused into the grandson's. More prosaically, Johnson had filled the house with books, five thousand of them, addressing every topic under the sun. Alan reached first grade as a five-year-old autodidact. "By the time I got to school, I had already read a couple hundred books. I knew in the first grade that they were lying to me because I had already been exposed to other points of view. They didn't like the idea of having different points of view, so it was a battle."

  There were some respites from the combat. One was music, taught to him by his mother, who had received her own musical training from Johnson himself. But otherwise the contest continued through his entire school career. This ranged, thanks to his father's career as a university scientist and a physiologist, from the elite Brooklyn Technical High School to public school in Port Washington on suburban Long Island. There were sickly periods leading to further self-education, including a bad bout of rheumatic fever in his senior year of high school, and further contentiousness (a dismissal for insubordination in Brooklyn).

  Port Washington in Kay's recollection was a community suffused with music. "This was a place where football players played in the band and orchestra for status. It was the thing. The Congregational Church had five choirs, each with 100 voices. I'll never forget Easter, when they'd combine the choirs for sunrise services. Full orchestra. Five hundred voices. The best, best stuff." There he also met Chris Jeffers, who would introduce him to his first computer. Jeffers was a junior, a year behind Kay (although since Kay's illness lost him a year of school, they graduated together). He was also a superb pianist with perfect pitch and a thriving jazz band. Kay joined up on guitar. The band played Dixieland jazz from Jeffers's effortless arrangements, an interesting choice if one is looking for a form that imposes strict formal rules on players who are encouraged to break them according to another set of strict formal rules.

  They split for college, Jeffers to the University of Colorado and Kay to Bethany College, a small West Virginia school with a decent program in biology. Academic disaster reunited them. As Kay tells the story, Bethany took umbrage at his charge that the administration imposed a Jewish quota to control the number of New Yorkers in its pre-medical program. The dean instructed him not to return to campus after Easter recess. Kay called Jeffers, unaware that his friend had himself been suspended for spending all his time on a student musical production instead of classwork.

  "Guess what, Chris," Kay said. "I just got thrown out of school!"

  "Great, me too! When you coming out?"

  Jeffers had decided to stay in Denver, taking a job at the national reservations office of United Airlines, a vast computer depot located near Stapleton International Airport, until he could resume his education. The two friends took up residence in the basement of a condemned building not far from the end of the runway. Kay found work in a music store, where he could wait for lightning to jolt him into the next stage of his life.

  One day Jeffers invited him to visit United. Kay understood computers in the abstract, the way curious kids understood them in the days when the most modest machine represented a ten-million-dollar capital investment. United's IBM 305 RAMAC was the first one he ever touched. It was huge, specifically designed to manage colossal databases like the fifty-two weeks' worth of reservations and seating records consigned to Denver's safekeeping. But what really struck Kay was the primitiveness of its operational routine. The system was serviced by platoons of attendants, full-time menials doing nothing more refined than taking stacks of punch cards from one machine and loading them in the next. To his amazement, digital electronics turned out to be as mindless and labor-intensive as laying a sewer line. As Kay's eyes followed the drones traversing the workhouse floor, the germ of an idea took hold. There was an exorbitant discrepancy between the purpose of the machine—which was to simplify human endeavor—and the effort required to realize it. Kay banked the insight. He would not begin to understand it until much later, well after the lark of taking an Air Force aptitude test metamorphosed into a serious career choice.

  The two-week IBM course he received courtesy of Conway AFB was effective, but rudimentary. "Programming is in two parts," he said later. "The bricklaying part, which IBM taught, and the architecture part, which can take two or three years."

  In those days every computer was different. There was nothing like today's standardized architectures, according to which all IBM-compatible machines, for instance, respond to the same set of operating instructions even though they may be manufactured by different companies according to widely variant specifications of memory, data storage, and even microprocessor design. Standardization has helped make computers a mass-market phenomenon. It allows users to be reasonably confident that a program bought off the shelf will work properly regardless of who manufactured their computer, just as they know they will find the accelerator and brake pedal in the same location regardless of whether their car is a Ford or a Chevrolet.

  Nothing of the kind existed in the computer world in the 1960s. Machines differed in shape, size, and architecture down to the circuitry inside their cabinets and the sequences of digital ones and zeros delivering instructions to the central processing unit. The same eight-bit sequence, say "11110000," might tell a Burroughs computer to add two numbers together and a Control Data 6600 to divide one by the other. Each machine had its unique method for everything from storing files on disk or drum to performing basic mathematical functions. The differences were entirely arbitrary, no more consistent than if the pedal by the right foot operated the accelerator on a Ford but the headlights on a Chevy.
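
  To make the arbitrariness concrete, the sketch below uses Python to model two hypothetical opcode tables; the encodings are invented for illustration and do not reflect the real Burroughs or CDC 6600 instruction formats.

```python
# Hypothetical opcode tables for two fictional 1960s machines. The mappings
# are illustrative only; real Burroughs and CDC 6600 encodings were far more
# elaborate and are not reproduced here.
MACHINE_A_OPCODES = {0b11110000: "ADD"}     # on one machine, 11110000 means add
MACHINE_B_OPCODES = {0b11110000: "DIVIDE"}  # on another, the same byte means divide

instruction = 0b11110000
print("Machine A reads 11110000 as:", MACHINE_A_OPCODES[instruction])
print("Machine B reads 11110000 as:", MACHINE_B_OPCODES[instruction])
```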

  Nor did the manufacturers see any advantage to marketing machines even remotely like their competitors'. Once IBM sold a system to United Airlines it could rest assured that the frightful effort of rewriting software, retraining staff, and moving tons of iron and steel cabinets around would make United think very long and hard before replacing its IBM system with one made by, say, Honeywell.

  Therefore Kay, who had programmed everything from a Burroughs 5000 at the Air Force Air Training Command to a Control Data 6600 at NCAR, the National Center for Atmospheric Research, was compelled to become a student of computer architectures. Subconsciously his mind was absorbing the principles of programming that would grow a few years hence at PARC into an extraordinary advance in software design. As he recalled later, however, at the moment "I barely saw it."

  So too did he assimilate only subconsciously an article in a technical magazine he came upon while debugging NCAR's giant CDC 6600 in Chippewa Falls, Wisconsin, in 1965. The magazine was Electronics. For its thirty-fifth anniversary issue it had invited a few industry leaders to plot a technology curve for the next ten years. The research director at Fairchild Semiconductor Co., a brilliant engineer named Gordon Moore, contributed a four-page piece insouciantly entitled "Cramming More Components onto Integrated Circuits." The essay forecast that as circuits became more densely packed with microscopic transistors, computing power would exponentially increase in performance and diminish in cost over the years. Moore contended that this trend could be predicted mathematically, so that memory costing $500,000 in 1965 would come all the way down to $3,000 by 1985—an insight so basic to the subsequent growth and expansion of the computer industry that ever since then it has been known as "Moore's Law."
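
  The arithmetic behind those two price points is easy to check. Assuming a steady exponential decline between the figures quoted above (an inference from the text, not Moore's own statement of the doubling period), a few lines of Python give the implied halving time:

```python
import math

cost_1965 = 500_000                  # dollar figure quoted for 1965
cost_1985 = 3_000                    # dollar figure forecast for 1985
years = 1985 - 1965

ratio = cost_1965 / cost_1985        # roughly a 167-fold drop in cost
halvings = math.log2(ratio)          # about 7.4 successive halvings
print(f"implied halving interval: {years / halvings:.1f} years")  # ~2.7 years
```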

  That day in 1965, however, Alan Kay skimmed Moore's article and laid it aside, unmoved. The dream of a computer scaled down to serve a single human being would not come to him for another couple of years. As he toiled in Chippewa Falls on a room-sized, freon-cooled CDC 6600, Gordon Moore's astonishing prediction that electronics had embarked on a journey of unceasing miniaturization seemed to have no relevance to his life at all.

  "I was in an embryonic state. I didn't want to work and get a real job, but go to graduate school. The only criterion was that it had to be above four thousand feet in altitude."

  In 1966 Kay finally secured his bachelor's degree from the University of Colorado, a double major in mathematics and molecular biology. The only doctoral program he could find to fit his exacting specification was the one Dave Evans had established at Utah with a $5 million grant from Bob Taylor at ARPA. To his own amazement he got accepted as the seventh graduate student in the school's tiny department of computer science.

  "I discovered later that Evans never looked at my grades," Kay said. "He didn't believe in it. You had to send him a resume, which was all he ever looked at. He was like Al Davis of the Oakland Raiders; his theory was to let everybody into training camp and give them a really decent chance, then be incredibly savage cutting the roster. I was com­pletely thrilled that this guy seemed to think so much of my abilities. One thing I resolved was that he'd never find out the truth."

  Taylor's ARPA money had turned Utah into a hotbed of computer graphics. Kay discovered that the day he walked into Evans's office to meet his new mentor. Evans, an introverted gentleman of few words, reached over to a foot-high stack of documents bound in brown paper piled on his desk. He handed one to Kay and said, "Take this and read it."

  The title read, "Sketchpad: A Man-Machine Graphical Communication System." It was the 1963 MIT doctoral thesis of Ivan Sutherland, Taylor's predecessor at IPTO, and it described a program that had become the cornerstone of the young science of interactive computer graphics. Sketchpad worked on only one machine in the world, Wes Clark's TX-2 at Lincoln Lab. But its precepts were infinitely applicable to a whole range of increasingly nimble and powerful computers then coming into existence. Sketchpad was also, by Evans's mandate, the cardinal introduction to computing in his doctoral program. "Basically," Kay said, "you had to understand that before you were a real person at Utah."

  Sutherland's system could create graphic objects of dazzling complexity, all the more amazing given the severe limitations of the contemporary hardware. With Sketchpad the user could skew straight lines into curves ("rubber-banding"), make engineering-precise lines and angles (the system straightened out the draftsman's rough sketches), and zoom the display resolution in and out. The program pioneered the "virtual desktop," in which the user sketched on the visible portion of a theoretical drawing space about one-third of a mile square (the invisible portions were held in the computer's memory and could be scrolled into view). Contemplating the power of Sketchpad was "like seeing a glimpse of heaven," Kay said later. "It had all of the kinds of things that the computer seemed to promise. You could think of it as a light that was sort of showing us the way."

  That graphics could be a directly manipulable—and minutely personalized—element of the computer interface was one of dozens of new concepts that bombarded Kay in his first few weeks at Utah. His mind on fire, he spent hours in the library stacks photocopying everything that grabbed his interest in the computing literature. He emerged with hundreds of articles, virtually a living history of computing for his parched intellect to absorb.

  He soon came under other powerful influences. At one conference he heard the oracular Marvin Minsky speak. Minsky was an MIT psychologist and a computing pioneer, a disciple of the child psychologist Jean Piaget and a founder of the new science of artificial intelligence, which aimed to reproduce human psychology in the computer. His speech was a "terrific diatribe" about how traditional education destroys the learning aptitude of children, a subject that must have resounded to the precocious Kay's very soul. Minsky did not specifically prescribe computers as the answer. But he made intriguing mention of the work a colleague had done in designing a computer language to help children learn programming.

  Early the next year Kay got to meet this colleague. Seymour Papert was a burly, bushy-bearded South African, a Cambridge-trained mathematician who managed to combine a single-minded absorption with the learning skills of children with a profound absent-mindedness about everything else. Papert had devised a simple programming language known as "LOGO," the aim of which was to teach children about computers by giving them a tool to see the machine instantaneously respond to their commands. LOGO literally turned the computer into a toy. Its most conspicuous feature was a turtle-shaped robot the size of a dinner plate. This device would crawl about on a schoolroom floor according to simple commands children could type onto a computer screen: "forward 100" directed it in a straight line 100 turtle steps, "right 90" dictated a 90-degree right turn, and so on. A pen protruding from the turtle's belly would trace its path on the floor, allowing the more adept of its young programmers to create patterns of almost limitless intricacy.
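
  Python's standard turtle module imitates LOGO's turtle graphics, so the commands described above can still be tried today. A minimal sketch follows; the square it draws is an invented example, not one from the book.

```python
import turtle

t = turtle.Turtle()
t.pendown()               # the pen in the turtle's belly touches the floor

for _ in range(4):        # four sides and four right turns trace a square
    t.forward(100)        # "forward 100": go 100 turtle steps in a straight line
    t.right(90)           # "right 90": make a 90-degree right turn

turtle.done()             # keep the drawing window open
```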

  LOGO's genius was its ability to turn the abstract (one can command a computer to do something) into the concrete (one can direct the turtle to draw a parallelogram). To Kay it was a revelation to watch Papert's ten-, eleven-, and twelve-year-old subjects use a simple computer to create designs one would otherwise assume could only be achieved by mainframe systems loaded with complex algorithms. Papert showed the way toward reducing the machine from demigod to tool (in Wes Clark's phrase) by subjecting it to the unforgiving scrutiny of children. Kay never forgot the lesson. As he wrote later, "The best outputs that time-sharing can provide are crude green-tinted line drawings and square-wave musical tones. Children, however, are used to finger paints, color television and stereophonic records, and they usually find the things that can be accomplished with a low-capacity time-sharing system insufficiently stimulating to maintain their interest." Or as Kay and his colleague Adele Goldberg wrote later: "If 'the medium is the message,' then the message of low-bandwidth time-sharing is 'blah.'"

  When his turn came to design a programming language at PARC, he would invest it with several unmistakable elements of Papert's system: its visual feedback, its accessibility to novices, and its orientation to the wonder and creativity of childhood. Partially in deference to this last factor, he would call it "Smalltalk."

  While Kay was taking these first mind-blowing excursions into Idea-space, the caliber of graphics research at Utah was exploding. Ivan Sutherland had joined the faculty to work with his friend Dave Evans (they would eventually form a partnership to manufacture interactive military simulators). Kay's fellow grad student John Warnock achieved a graphics milestone by solving the famous "hidden-line problem," which applied to how computers could draw the outline of a form when it is partially hidden behind another—the sides of a triangle hidden behind a ball, for example—so all the visible sides and angles convincingly line up. (Warnock's solution is a tour de force of such compactness that his doctoral thesis, in which it is described, runs to only 32 pages.)

 
