Dealers of Lightning


by Michael Hiltzik


  That left them no choice but to unleash their neutron bomb. Thanks to a lucky premonition, Shoch had equipped the worm with a self-destruct mechanism, like a spymaster providing his agent with a suicide capsule as insurance against some unpredictable disaster. He injected a specially coded packet into the Ethernet that instructed every worm to instantly stop whatever it was doing. For a few nerve-racking seconds he waited. Then he checked the system.

  To his relief, all the worm activity had ceased. That was the good news. The bad news was that the entire Ethernet had been figuratively reduced to a smoking ruin. Scattered around the building were 100 dead Altos. "The embarrassing results," he said, "were left for all to see."

  With that lone exception, the worm ranked as a welcome addition to the PARC programming arsenal. Shoch took to calling the plain-vanilla version the "existential" worm—it simply reached out for hosts, copied itself, and self-destructed after a programmed interval. But there were dozens of other applications. The Billboard worm, for example, snaked through the system depositing on every machine a bitmapped image, such as a "cartoon of the day" to greet workers in the morning. The Alarm Clock worm maintained a table of wakeup calls, and at the appointed time accessed PARC's telephone directory and placed a call to the user's phone. The Peeker logged the results of each night's memory tests and notified PARC technicians which machines might need a new chip.
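
  In modern terms, the lifecycle Shoch gave the existential worm (find an idle host, copy yourself onto it, obey the kill packet, expire after a programmed interval) can be sketched in a few lines. The Python below is purely illustrative: the network object and every one of its methods, along with the KILL_PACKET code, are hypothetical stand-ins, and the real worm ran as Alto machine code over the experimental Ethernet.

```python
# Illustrative sketch only, not Shoch's code: one worm "segment" behaving
# as the passage describes. The `network` object and all of its methods
# are hypothetical stand-ins for the Alto/Ethernet machinery.
import time

KILL_PACKET = b"WORM-STOP"     # the specially coded "suicide capsule" broadcast
LIFETIME_SECONDS = 3600        # self-destruct after a programmed interval

def run_worm_segment(network):
    started = time.time()
    while True:
        # The emergency stop: a single broadcast packet halts every segment.
        if network.last_broadcast() == KILL_PACKET:
            return
        # The existential worm's only other exit: its programmed lifetime.
        if time.time() - started > LIFETIME_SECONDS:
            return
        # Otherwise it simply reaches out for hosts and copies itself.
        host = network.find_idle_host()
        if host is not None:
            network.send_copy(host, program=run_worm_segment)
        time.sleep(5)          # probe politely rather than flooding the net
```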

  Shoch always thought of these applications as "toy programs." In his view the worm's real value lay in tying widely distributed computers into multi-machine units of exceptional power. Yet the potential of parallel computing would not become evident to the outside world for a long time. Silicon microprocessors would soon become powerful enough to handle complex calculations without help. Only years later would scientists again need to harness the power of multiple processors at once, when massively parallel processing would become an integral part of supercomputing.

  Years later, too, the genealogy of Shoch’s worm would come full circle. Soon after he published a paper about the worm citing The Shockwave Rider, he received a letter from John Brunner himself. It seemed that most science fiction writers harbored an unspoken ambition to write a book that actually predicted the future. Their model was Arthur C. Clarke, the prolific author of 2001: A Space Odyssey, who had become world-famous for forecasting the invention of the geosynchronous communications satellite in an earlier short story.

  "Apparently they're all jealous of Arthur Clarke," Shoch reflected. "Brunner wrote that his editor had sent him my paper. He said he was 'really delighted to learn, that like Arthur C. Clarke, I predicted an event of the future.'" Shoch briefly considered replying that he had only bor­rowed the tapeworm's name but that the concept was his own and that, unfortunately, Brunner did not really invent the worm.

  But he let it pass.

  CHAPTER 21

  The Silicon Revolution

  Years later Lynn Conway could still remember the moment she first laid eyes on the chip that would launch a new science. It was a week or two after Christmas 1979. She was seated before her second-floor window at PARC, which looked down on a lovely expanse of valley in its coat of lush winter green, sloping down toward Page Mill Road just out of view to the south. But her eyes were fixed on a wafer of silicon that had just come back from a commercial fabrication shop.

  There were dozens of chip designs on the wafer, mostly student efforts from a Stanford course being taught under PARC's technical supervision. They all strived toward an intricate machine elegance, comprising as they did tens of thousands of microscopic transistors packed into rectangular spaces the size of a cuticle, all arranged on a wafer that could fit comfortably in the palm of one's hand. A few years earlier the same computing power could not have fit on an acre of real estate.

  One design stood out, and not only because it bore along its edge the assertive hand-etched legend: "Geometry Engine © 1979 James Clark." Where the others looked to be simple arrays of devices that formed simple digital clocks and arithmetic search engines and the like, Clark's was obviously something more—larger, deeper, more complex than the others, even when viewed with the naked eye.

  Clark's got something really amazing going on in there, Conway thought to herself. But who knows what?

  What Clark had going on, as it would turn out, was the cornerstone of an entirely original technology. The "Geometry Engine," which he designed with the help of several of his Stanford students, was unique in compressing into a single integrated circuit the huge computing resources needed to render three-dimensional images in real time. After the appearance of Clark's chip, the art and science of computer graphics would never be the same: The computer-aided design of cars and aircraft, the "virtual reality" toys and games of the modern midway, the lumbering dinosaurs of the movie Jurassic Park—they all sprang from the tiny chip Lynn Conway held by its edges that winter day.

  With the Geometry Engine as its kernel, Clark founded Silicon Graphics Incorporated and developed it into the multibillion-dollar company it is today. But without Lynn Conway and PARC, he could not have built the Geometry Engine. The irony is that when Conway first proposed that PARC step into the vanguard of the science of designing such extraordinarily complex integrated circuits, many of her colleagues doubted it was worth doing at all.

  Conway's program would never even have gotten started had not Bert Sutherland decided that PARC needed a shot of "havoc."

  Sutherland had taken over management of the Systems Science Lab in 1975 after leaving Bolt, Beranek & Newman, the Boston consulting firm that had earlier given PARC Jerry Elkind, Bob Metcalfe, Dan Bobrow, and Warren Teitelman. Like them, he held strong views about research methods which did not always conform to PARC orthodoxy, especially as it was practiced in Bob Taylor's Computer Science Lab. Sutherland believed that research conducted in a closed environment was doomed to suffocate, like an animal trapped in an airtight cage. He admired the Computer Science Lab's work but regarded Taylor and some of his engineers as overly prone to facile prejudices and snap judgments—conditions, he thought, that deprived CSL of the necessary aeration. The harvest was its self-destructive elitism.

  "They were the best and the brightest," he said later. "That was the good news. The bad news was that they knew it."

  Sutherland did not allow SSL to become so sequestered. His policy was to keep its atmosphere enriched via continual contact with the outside world. One of his first acts upon succeeding Hall, for example, had been to send the engineers Tim Mott and Bill Newman on an "archeological dig" to Xerox's copier sales office in Santa Clara, a few miles south of Palo Alto. The idea was for them to study how real office workers performed their daily routines, the better to design the equipment they would use in the future. This effort yielded OfficeTalk, a sophisticated and integrated system of office automation that heavily influenced the later design of the Star. Sutherland also recruited to SSL experts in cognitive science such as Stuart Card, Tom Moran, and John Seely Brown, whose research into how real people actually used computers, step by step and motion by motion, led to groundbreaking insights into man-machine ergonomics—insights that not even J. C. R. Licklider had anticipated when he wrote his own pioneering treatise on the subject in 1962.

  At CSL, unsurprisingly, Sutherland's democratic instincts provoked grumbling—wasting precious resources on anthropology, of all things!—even before he brought Carver Mead into the SSL tent. Then all hell broke loose.

  Mead was one of the most popular and influential professors in the computer science department at California Institute of Technology, where Sutherland's brother Ivan had recently become department chairman. Mead instantly struck him as the right person "to wander in and create some havoc" within PARC's insulating walls. For sheer intellectual brio, Sutherland knew, Carver Mead could stand toe to toe with Butler Lampson and the rest of Taylor's gunslingers any day. A compact, energetic man with a black mustache and goatee and lively, searching eyes, Mead possessed a confident mastery of electrical engineering, particularly at the extremes of the infinitely complex and the infinitesimally small—regions where ordinary engineers hesitated to venture but which he considered his personal preserve. He filled out that expertise with a breadth of interests that encompassed subjects as diverse as walnut farming and particle physics.

  At the time of his first visit to PARC, he and Ivan Sutherland were deeply engaged in studying what happened to electronic systems at the edges of the physical scale—in other words, how minuscule a transistor could be without its becoming non-functional, and how large and complex a system one could build without its becoming unmanageable. At their core these questions were identical, for as transistors got smaller and more densely crowded on the silicon surface of an integrated circuit, the chip became more complex. The implications of this dual phenomenon were only just becoming understood when Bert Sutherland invited Mead to give a technical address at the Systems Science Lab in 1976. Mead's formal topic was the design of silicon-based integrated circuits, but his real purpose was to propose a new way of thinking about computer design—one that threatened to make much of PARC's work obsolete.

  As Moore's Law predicted, the technology of integrated circuits had been surging ahead ever since Intel—the company Moore co-founded—introduced its first microprocessor in 1971. The 4004 chip was fundamentally an arrangement of microscopic transistors that packed into the space of a matchbook cover the computing power of a mainframe—circa 1946. That was hardly an achievement to prompt a major reconsideration of computer architectures; but a year later came the 8008, which had twice the power, and 1974 brought another doubling.
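
  The arithmetic of that curve is simple enough to sketch. Assuming the commonly cited figure of roughly 2,300 transistors for the 4004 (a number from the public record, not from this book) and the classic doubling every two years, the projection runs as follows:

```python
# Back-of-the-envelope view of the doubling the passage describes.
# The transistor count is the commonly cited approximation for the 4004,
# not a figure from the book.
base_year, base_count = 1971, 2300
doubling_period_years = 2          # the classic Moore's Law cadence

def projected_transistors(year):
    """Project transistor count assuming a doubling every two years."""
    return base_count * 2 ** ((year - base_year) / doubling_period_years)

for year in (1971, 1974, 1980, 1990):
    print(year, f"~{projected_transistors(year):,.0f} transistors")
```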

  There was no reason to think the trend would not continue well into the next millennium. From his academic aerie on Caltech's Pasadena campus, Mead imagined the curve of shrinking transistor size and mushrooming density extending almost limitlessly into the distance. He believed that the traditional principles of computer design, of which MAXC and the Alto represented the intellectual pinnacle, were fated to fall off this curve well before it disappeared over the horizon. Both machines employed integrated circuits to help control their slowest peripheral devices, like the keyboard and mouse, but even those chips were of the passing generation known as MSI, or "medium-scale integration." Mead had pioneered research into the next step—LSI, or "large-scale integration"—and he was still thinking ahead. In partnership with Ivan Sutherland, he began exploring the difficulties and possibilities presented by the coming quantum leap in miniaturization, which would bring them to VLSI, or "very large-scale integration." This was the gospel he came to preach at PARC.

  Traditional computer design, he reminded his listeners, was essentially a mathematical exercise. One chose from the standard inventory of Boolean logic gates—ORs, ANDs, NORs, and so on—and arranged them to operate sequentially on a stream of bits. This worked fine as long as the logic elements (mostly transistors) were slow and expensive and the wires connecting them were relatively fast and cheap, as had been true throughout the history of digital computing. But it also meant that the blinding speed of digital computation was something of an illusion. The logic elements were such data bottlenecks that when you really examined what was happening inside the system, you could see that computers were still constrained "to perform individual steps on individual items of data"—that is, to do only one thing at a time.
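
  A toy example makes the point concrete. The Python below is a didactic sketch, not anything from PARC: it builds an adder out of the standard gate inventory, and each bit of the result must wait for the carry from the bit before it, so the machine really is doing exactly one thing at a time.

```python
# Didactic sketch of the "mathematical exercise" Mead describes:
# computation as standard Boolean gates applied sequentially to a bit stream.

def AND(a, b): return a & b
def OR(a, b):  return a | b
def XOR(a, b): return a ^ b

def full_adder(a, b, carry_in):
    """One bit of addition, expressed as a fixed arrangement of gates."""
    partial = XOR(a, b)
    total = XOR(partial, carry_in)
    carry_out = OR(AND(a, b), AND(partial, carry_in))
    return total, carry_out

def add_bitstreams(xs, ys):
    """Ripple-carry addition: every step waits on the previous carry,
    so the computation proceeds strictly one step at a time."""
    carry, out = 0, []
    for a, b in zip(xs, ys):              # least-significant bit first
        total, carry = full_adder(a, b, carry)
        out.append(total)
    return out + [carry]

print(add_bitstreams([1, 0, 1], [1, 1, 0]))  # 5 + 3 = 8 -> [0, 0, 0, 1]
```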

  The new technology would turn that architecture inside out. As silicon-based chips got smaller and denser, the microscopic transistors that were packed on them to make up the logic became faster and cheaper than the wires linking them. The wires became the bottlenecks. Soon the most important factor limiting a computer's efficiency would not be the sequence of gates, but their geometric arrangement on a flake of silicon and the rising relative cost of transporting electrons over the minuscule pathways linking one to another. Computers were about to cease doing one thing at a time, in favor of doing many things simultaneously. Consequently, their architects would have to abandon the old methods of designing them simply as linear sequences of logical functions. They would also have to consider how to get bits from one logical function to another along the shortest path.

  Traditional digital technology required designers to think like factory planners figuring out how to get raw materials in one end of a building and finished product out the other. Silicon, however, "forced you to think like an urban planner," Conway said later. "You had to think hard about where the roads go." Just as cities reaching a certain size suddenly find themselves threatened by highway gridlock, she observed, in VLSI, "if you weren't careful you could end up having nothing but roads going nowhere." Fortunately VLSI also offered a way out of that quandary: Because the logic gates and other devices were now so cheap, "it didn't cost you anything to have more of them, if that paid you back by having less highway."

  For engineers who had reached the top of their game the old way, VLSI was full of murky ideas. Many doubted it was physically possible even to fabricate functioning devices as tiny as the ones Mead prophesied. Even those who thought VLSI an interesting idea with great potential questioned whether it would ever supplant the tried-and-true architectural structures that had brought them this far. In CSL the general opinion was that VLSI was more than they needed to have on their plates. "We didn't have to be able to design chips," Lampson said—not while the industrial designers at Intel and other chip companies were already hard at work on it.

  In any case, PARC could hardly hope to contribute much to this nebulous science. At CSL "they were already out front in their own revolution," one researcher later remarked. "To them VLSI was not really mainline, it was just this weird sort of thing happening somewhere else."

  But for two of Sutherland's laboratory scientists, Lynn Conway and Douglas Fairbairn, Mead's talk scored a direct hit.

  Conway was a rarity at PARC—an accomplished designer of advanced mainframes who chose to give the hardware gurus of the Computer Science Lab a wide berth. She ranked among PARC's senior veterans, having joined in 1972 from IBM, where she had helped design a supercomputer at the Yorktown Heights lab, and Memorex, where she had worked as an architect of minicomputers. But at PARC she had played no role in developing the Alto or MAXC. On the contrary, something about the intellectual gunplay of CSL alarmed her, as did the intimidating presence of Butler Lampson.

  "I always had a hard time dealing with Butler," she recalled. "He had this complete photographic memory of all theory that ever existed about anything, but sometimes that can be kind of a mental block to being cre­ative. You can be so confrontational and challenging about how smart you are that you can't always see that somebody else has got this cool idea."

  Like Kay, Tesler, and Shoup, Conway found the ambiance more obliging among the Systems Science Lab's lunatic fringe. "Taylor was someone who could manage the 'neats' and Bert could manage the 'scruffies,'" she remarked. "In SSL I could survive. I could get all excited about an idea that was half-formed and go tell Bert about it, and he'd get all excited about it, maybe tell me somebody I should talk to about it. In CSL I'd be really afraid to present anything until it was perfect, and it would probably get immediately shot down anyway."

  Her inaugural assignment at PARC had been something of an acid test in the implementation of half-formed ideas. The job was to design and build a combination fax and optical scanning system known as Sierra, the aim of which was to transmit pages of mixed text and graphics at high speed via the trick of stripping off the text and sending it in compressed digital form, leaving only the graphics to be conveyed by conventional (and slower) fax. The entire page, it was hoped, would therefore transmit much faster than if faxed as a single coherent image.
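
  The payoff of that trick is easy to estimate with back-of-the-envelope numbers. Every figure below is an assumption chosen purely for illustration, since the book gives no specifications for Sierra: a page scanned as a raw 200-dpi bitmap, versus the same page sent as compressed text plus a bitmap of only its graphics.

```python
# Rough, assumed arithmetic behind Sierra's scheme; none of these numbers
# come from the book. A character sent as compressed text costs a handful
# of bits, while the same character faxed as part of a bitmap costs hundreds.
page_pixels = 1700 * 2200            # ~8.5 x 11 inches at 200 dpi, 1 bit/pixel
full_fax_bits = page_pixels          # send the whole page as one image

chars_on_page = 3000                 # assumed: a densely typed page
bits_per_char = 12                   # assumed compressed text encoding
graphics_fraction = 0.2              # assumed share still sent as an image

sierra_bits = chars_on_page * bits_per_char + page_pixels * graphics_fraction
print(f"plain fax:    {full_fax_bits:,} bits")
print(f"split scheme: {sierra_bits:,.0f} bits "
      f"({full_fax_bits / sierra_bits:.1f}x smaller)")
```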

  Thanks to her big-iron training at IBM and hands-on experience at Memorex, Conway was able to get the machine built in eighteen months, to everyone's candid surprise. To their disappointment, it emerged as two gargantuan racks of special-purpose hardware that devoured so much power one could heat a building with it.

  "You could make it, but you wouldn't make any money off it," she recalled wistfully. "It was such a giant, kludged-up thing with so many exotic little systems that all it demonstrated was that architects could envision and build useful systems that would take too much circuitry to be financially viable."

  Sierra would never be feasible as long as it came in such an unwieldy package. Intel's new 4004, which packed thousands of transistors onto one chip—a full circuit board's worth of her hard-wired machine reduced to something you could hold between thumb and index finger—provided Conway with the first hint of how the circuitry might eventually be re-implemented in a manageable package. The hint of a new class of architectures was somewhere inside there, whispering to her. "The itch," she said, "was trying to be scratched."

  Doug Fairbairn, Mead's second true believer, had arrived at PARC by way of the Stanford artificial intelligence lab, where he had worked with Kay and Tesler. "After getting my master's at Stanford I'd gone to Europe," he recalled. "After six months I came back. I wasn't very driven to start a career but was thinking, what's my next job? Then I heard about Xerox and thought, 'If Alan Kay's there, I bet I won't have to wear a tie to the interview.' And I didn't."

 
