Collected Essays

by Rudy Rucker


  UNIVACs began selling to businesses in a small way. Slowly, the giant IBM corporation decided to get into the computer business as well. Though their machines were not as good as the UNIVACs, IBM had a great sales force, and most businesses were in the habit of using IBM calculators and punch card tabulating machines. By 1956, IBM had pulled ahead, with 76 IBM computers installed vs. 46 UNIVACs.

  Six Generations Of Computers

  The 1950s and 1960s were the period when computers acquired many of their unpleasant associations. They were enormously expensive machines used only by large businesses and the government. The standard procedure for running a program on one of these machines was to turn your program into lines of code and to use a keypunch machine to represent each line of code as a punch card. You would submit your little stack of punch cards, and when a sufficient number of cards had accumulated, your program would be run as part of a batch of programs. Your output would be a computer-printed piece of paper containing your results or, perhaps more typically, a series of cryptic error messages.

  The history of computers from the 1950s to the 1970s is usually discussed in terms of four generations of computers.

  The first generation of commercial computers ran from 1950 to about 1959. These machines continued to use vacuum tubes for their most rapid memory, and for the switching circuits of their logic and arithmetic units. The funky old mercury delay line memories were replaced by memories in which each bit was stored by a tiny little ring or “core” of a magnetizable compound called ferrite. Each core had three wires running through it, and by sending pulses of electricity through the wires, the bit in the core could be read or changed. Tens of thousands of these washer-like little cores would be woven together into a cubical “core stack” several inches on a side.
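  To give a feel for how those three wires worked, here is a toy sketch (in Python, which of course did not exist then; the CorePlane class and its method names are my own inventions for illustration). It models two things that are well documented about core memory: a core is selected by pulsing its X wire and its Y wire together, and reading a core erases it, so the hardware had to write the bit back afterwards.

    # Toy model of one plane of magnetic-core memory, purely illustrative.
    # A real plane selected one core out of thousands by coincident currents:
    # only the core whose X wire AND Y wire were both pulsed would switch.
    class CorePlane:
        def __init__(self, rows, cols):
            self.bits = [[0] * cols for _ in range(rows)]  # one core per bit

        def write(self, x, y, bit):
            # Pulse the x-th and y-th drive wires together to set one core.
            self.bits[x][y] = bit

        def read(self, x, y):
            # Drive the selected core toward 0; the sense wire picks up a pulse
            # only if the core flips, i.e. only if it was storing a 1.
            sensed = self.bits[x][y]
            self.bits[x][y] = 0        # the read is destructive
            if sensed:
                self.write(x, y, 1)    # so the bit must be written back
            return sensed

    plane = CorePlane(64, 64)          # 64 x 64 = 4096 cores, 4096 bits
    plane.write(10, 20, 1)
    print(plane.read(10, 20))          # prints 1, and the core is restored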

  The second generation of computers lasted from 1959 to 1963. During this period, computers used transistors instead of vacuum tubes. By now the vast majority of computers were made by IBM, but one of the most famous second generation computers was the first PDP (Programmed Data Processor) model from the Digital Equipment Corporation. The PDP-1 was of key importance because it was one of the first machines that people could use in real time. That is, instead of waiting a day to get your batch-processed answers back, you could program the PDP-1 and get answers back right away via its electric typewriter. It also had a screen capable of displaying a dozen or so characters at a time.

  The third generation of computers began with the IBM 360 series of computers in 1964. The first of these machines used “solid logic technology,” in which several distinct electronic components were soldered together on a ceramic substrate. Quite soon this kludge (a “kludge” being an ungainly bit of hardware or computer code) was replaced by small scale integrated circuits, in which a variety of electronic components were incorporated as etched patterns on a single piece of silicon. Over the decade leading up to 1975, the integrated circuits got more and more intricate, morphing into what came to be called VLSI, or “very large scale integrated,” circuits.

  The fourth generation of computers began around 1975, when VLSI circuits got so refined and so cheap that a computer’s complete logical and arithmetic processing circuits could fit onto a single mass-produced chip known as a microprocessor. A microprocessor is the heart of each personal computer or workstation, and every year a new, improved crop of them appears, not unlike Detroit’s annual new lines of cars.

  Although computer technology continues to advance as rapidly as ever, people have dropped the talk about generations. The “generations of computers” categorization became devalued and confused. On the one hand, there was a lot of meaningless hype from people claiming to be out to “invent the fifth generation computer”—the Japanese computer scientists of the 1980s were particularly fond of the phrase. On the other hand, the formerly dynastic advance of computing split up into a family tree of cousins. Another reason for the demise of the “generation” concept is that, rather than radically changing their design, microprocessor chips keep getting smaller and faster via a series of incremental rather than revolutionary redesigns.

  One might best view the coming of the decentralized personal computers and desktop workstations as an ongoing fifth generation of computers. The split between the old world of mainframes and the new world of personal computers is crucial. And if you want to push the generation idea even further, it might make sense to speak of the widespread arrival of networking and the Web as a late 1990s development which turned all of the world’s computers into one single sixth generation computer—a new planet-wide system, a whole greater than its parts.

  Moloch And The Hackers

  Though it was inspired by Fritz Lang’s Metropolis and the silhouette of the Sir Francis Drake Hotel against the 1955 San Francisco night skyline, the “Moloch” section of Allen Ginsberg’s supreme Beat poem “Howl” also captures the feelings that artists and intellectuals came to have about the huge mainframe computers such as UNIVAC and IBM:

  Moloch whose mind is pure machinery! Moloch whose blood is running money! Moloch whose fingers are ten armies! Moloch whose breast is a cannibal dynamo! Moloch whose ear is a smoking tomb!

  Moloch whose eyes are a thousand blind windows! Moloch whose skyscrapers stand in the long streets like endless Jehovahs! Moloch whose factories dream and croak in the fog! Moloch whose smokestacks and antennae crown the cities!

  Moloch whose love is endless oil and stone! Moloch whose soul is electricity and banks! Moloch whose poverty is the specter of genius! Moloch whose fate is a cloud of sexless hydrogen! Moloch whose name is the Mind!

  [Allen Ginsberg, Howl (annotated edition), HarperPerennial, 1995, p. 6. Our film still from Metropolis is found in this wonderful book.]

  A Moloch machine in the movie Metropolis.

  Despite the negative associations of computers, many of the people who worked with these machines were not at all interested in serving the Molochs of big business and repressive government. Even one of the very first von Neumann architecture machines, the 1949 EDSAC, was occasionally used for playful purposes. The EDSAC designer Maurice Wilkes reports:

  The EDSAC had a cathode ray tube monitor on which could be displayed…a matrix of 35 by 16 dots. It was not long before an ingenious programmer used these dots to make a primitive picture. A vertical line of dots in the center of the screen represented a fence; this fence had a hole in it that could be in either the upper or lower half of the screen, and by placing his hand in the light beam of the photoelectric paper tape reader, an operator could cause the hole to be moved from the lower half to the upper half. Periodically a line of dots would appear on the left hand side of the screen…in the upper or the lower half of the screen. If they met the hole in the fence, they would pass through; otherwise they would retreat. These dots were controlled by a learning program. If the operator moved the hole from top to bottom in some regular way, the learning program would recognize what was going on, and after a short time, the line of dots would always get through the hole. No one took this program very seriously. [Maurice Wilkes, Memoirs of a Computer Pioneer, (MIT Press).]
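  The original program is long gone, but the behavior Wilkes describes can be suggested with a short modern sketch (in Python; the FencePredictor class and its prediction rule are inventions of mine, only a guess at what the “learning” amounted to). The idea is simply to predict the operator’s next hole position from the pattern of previous ones.

    # Illustrative sketch of a program that "learns" where the hole will be.
    # The operator puts the hole in the top or bottom half each round; the
    # program predicts the next position from the pattern observed so far.
    from collections import Counter
    import random

    class FencePredictor:
        def __init__(self):
            self.history = []                  # sequence of 'top' / 'bottom'

        def predict(self):
            if len(self.history) < 2:
                return random.choice(['top', 'bottom'])
            # What did the operator usually do after a position like the last one?
            last = self.history[-1]
            followers = Counter(nxt for prev, nxt in zip(self.history, self.history[1:])
                                if prev == last)
            return followers.most_common(1)[0][0] if followers else last

        def observe(self, actual):
            self.history.append(actual)

    # An operator who alternates the hole: top, bottom, top, bottom...
    predictor = FencePredictor()
    hits = 0
    for round_no in range(20):
        actual = 'top' if round_no % 2 == 0 else 'bottom'
        if predictor.predict() == actual:
            hits += 1                          # the dots get through the hole
        predictor.observe(actual)
    print(hits, 'passes out of 20')            # climbs as the pattern is learned

  After a few rounds of any regular pattern, the prediction is nearly always right, which is all the original anecdote requires.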

  This kind of interactive, noodling computer exploration blossomed into a movement at the Massachusetts Institute of Technology during the 1960s and 1970s. The catalyst was DEC’s interactive PDP-1. As mentioned above, with the “real-time” PDP-1, instead of handing your batch of punch cards to the priestly keepers of a hulking giant mainframe, you could sit down at a keyboard, type things in, and see immediate feedback on a screen.

  Steven Levy’s wonderful book Hackers chronicles how the arrival of the PDP-1 at MIT in 1961 changed computing forever. A small cadre of engineering students began referring to themselves as computer hackers, and set to work doing creative things with the PDP-1. One of their best-known projects was a video game called Spacewar, in which competing spaceships fired torpedoes at each other while orbiting a central sun. Such games are of course commonplace now, but Spacewar was one of the very first.

 
  When the improved PDP-6 arrived at MIT in the mid-1960s, it was used for a wide range of hacker projects, including The Great Subway Hack, in which one of the hackers went down to New York City and managed to visit every single subway stop on a single subway token, thanks to a schedule that the PDP-6 updated interactively on the basis of phone calls from MIT train spotters stationed around Manhattan.

  (By the way, Brian Silverman and some other hackers have recently reconstructed Spacewar. They recreated a historically accurate binary of the program and are running it on a PDP-1 emulator they wrote in Java, so you can play the original game over the Web.)

  As I mentioned in the first essay, the meaning of the term “computer hacker” has changed over the years; “hacker” is now often used to refer to more or less criminal types who use computer networks for purposes of fraud or espionage. This linguistic drift has been driven by the kinds of stories about computers which the press chooses to report. Unable to grasp the concept of a purely joyous manipulation of information, the media prefer to look for stories about the dreary old Moloch themes of money, power and war. But in the original sense of the word, a computer hacker is a person who likes to do interesting things with machines—a person, if you will, who’d rather look at a computer monitor than at a television screen.

  According to Steven Levy’s book, the MIT hackers went so far as to formulate a credo known as the Hacker Ethic:

  1) Access to computers should be unlimited and total.

  2) All information should be free.

  3) Mistrust authority—promote decentralization.

  4) Hackers should be judged by their hacking, not bogus criteria such as degrees, age, race, or position.

  5) You can create art and beauty on a computer.

  6) Computers can change your life for the better.

  [Steven Levy, Hackers: Heroes of the Computer Revolution (Doubleday).]

  Personal Computers

  When first promulgated, the principles of the Hacker Ethic seemed like strange, unrealistic ideas, but now there are ever-increasing numbers of people who believe them. This is mostly thanks to the fact that personal computers have spread everywhere.

  In 1971, the Intel Corporation began making integrated circuit chips that each held an entire computer processor. The first of these chips used four-bit “words” of memory and was called the 4004; it was quickly followed by the eight-bit 8008 and then the 8080. An obscure company called MITS (Micro Instrumentation and Telemetry Systems) in Albuquerque, New Mexico, had the idea of putting the Intel 8080 chip in a box and calling it the Altair computer. A mock-up of the Altair appeared on the cover of the January 1975 issue of Popular Electronics, and the orders began pouring in. This despite the daunting facts that: firstly, the Altair was sold simply as a kit of parts which you had to assemble; secondly, once the Altair was assembled the only way to put a program into it was by flicking switches (eight flicks per byte of program code); and thirdly, the only way to get output from it was to look at a row of eight tiny little red diode lights.
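  To make “eight flicks per byte” concrete, here is a small sketch (in Python, purely for illustration; the switch_settings helper and the byte values are made up, not a real Altair program) showing the row of data-switch positions a hobbyist would have set, one byte at a time, before depositing each byte into memory:

    # Purely illustrative: the front-panel switch settings (1 = up, 0 = down)
    # needed to enter a few bytes of program by hand, one deposit per byte.
    def switch_settings(byte_value):
        """The eight data-switch positions for one byte, most significant bit first."""
        return [(byte_value >> bit) & 1 for bit in range(7, -1, -1)]

    program = [0x3E, 0x2A, 0x76]       # three arbitrary example byte values
    for address, byte_value in enumerate(program):
        print(f"address {address:04o}: switches {switch_settings(byte_value)}")

  Even a toy program of a few dozen bytes meant hundreds of careful flicks.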

  Nowhere was the Altair more enthusiastically greeted than in Silicon Valley, that circuit-board of towns and freeways that sprawls along the south end of the San Francisco Bay from San Jose to Palo Alto. This sunny, breezy terrain was already filled with electronics companies such as Fairchild, Varian and Hewlett-Packard, which did good business supplying local military contractors like Lockheed. Catalyzed by the Altair, a hobbyist group named the Homebrew Computer Club formed.

  One of the early Homebrew high points was when a hardware hacker named Steve Dompier found that if he put his radio next to his Altair, the electrical fields from certain of the computer’s operations could make the radio hum at various pitches. After several days of feverish switch flicking, Dompier was able to make his Altair-plus-radio system play the Beatles’ “Fool on the Hill”—followed by “Daisy,” the same song that the dying computer HAL sings in the classic science fiction movie 2001.

  One of the regulars at the Homebrew Computer Club meetings was a shaggy young man named Steve Wozniak. Rather than assembling an Altair, Woz concocted his own computer out of an amazingly minimal number of parts. He and his friend Steve Jobs decided to go into business in a small way, and they sold about 50 copies of Wozniak’s first computer through hobbyist publications. The machine was called an Apple, and it cost $666.66. And then Wozniak and Jobs started totally cranking. In 1977 they released the Apple II, which had the power of the old mainframe computers of the 1960s…plus color and sound. The Apple II sold and sold; by 1980, Wozniak and Jobs were millionaires.

  The next big step in the development of the personal computer happened in 1981, when IBM released its own personal computer, the IBM PC. Although not so well-designed a machine as the Apple II, the IBM PC had the revolutionary design idea of using an open architecture which would be easy for other manufacturers to copy. Each Apple computer included a ROM (read-only memory) chip with certain secret company operating system routines on it, and there was no legal way to copy these chips. IBM, on the other hand, published the technical details of how its machine worked, making it possible for other companies to clone it. Its processor was a standard Intel 8088 (not to be confused with the Altair’s 8080), a sixteen-bit chip with an eight-bit external data bus; later PCs moved on to faster Intel chips such as the 80286 and 80386. The floodgates opened and a torrent of inexpensive IBM PC compatible machines gushed into the marketplace. Apple’s release of the Macintosh in 1984 made the IBM PC architecture look shabbier than ever, but the simple fact that IBM PC clones were cheaper than Macintoshes led to these machines taking the lion’s share of the personal computer market. With the coming of the Microsoft Windows operating systems, the “Wintel” (for Windows software with Intel chips) clone machines acquired Mac-like graphical user interfaces that made them quite comfortable to use.

  This brings us reasonably close to the present, so there’s not much point in going over more chronological details. One of the things that’s exciting about the history of computers is that we are living inside it. It’s still going on, and no final consensus opinion has yet been arrived at.

  The Joy of Hacking

  For someone who writes programs or designs computer hardware, there is a craftsperson’s pleasure in getting all the details right. One misplaced symbol or circuit wire can be fatal. Simply to get such an elaborate structure to work provides a deep satisfaction for certain kinds of people. Writing a program or designing a chip is like working a giant puzzle with rules that are, excitingly, never quite fully known. A really new design is likely to be doing things that nobody has ever tried to do before. It’s fresh territory, and if your hack doesn’t work, it’s up to you to figure out some way to fix things.

  Hackers are often people who don’t relate well to other people. They enjoy the fact that they can spend so much time interacting with a non-emotional computer. The computer’s responses are clean and objective. Unlike, say, a parent or an officious boss, the computer is not going to give you an error message just because it doesn’t like your attitude or your appearance. A computer never listens to the bombast of the big men on campus or the snide chatter of the cheerleaders; no, the computer will only listen to the logical arabesques of the pure-hearted hacker.

  Anyone with a computer is of necessity a bit of a hacker. Even if all you use is a word processor or a spreadsheet and perhaps a little electronic mail, you soon get comfortable with the feeling that the space inside your computer is a place where you can effectively do things. You’re proud of the tricks you learn for making your machine behave. Thanks to your know-how, your documents are saved and your messages come and go as intended.

  The world of the computer is safe and controlled; inside the machine things happen logically. At least this is how it’s supposed to be. The computer is meant to be a haven from the unpredictable chaos of interpersonal relations and the bullying irrationality of society at large. When things do go wrong with your computer—like when you suffer up the learning curve of a new program or, even worse, when you install new hardware or a new operating system—your anxiety and anger can grow quite out of proportion. “This was supposed to be the one part of the world that I can control!” But all computer ailments do turn out to be solvable, sometimes simply by asking around, sometimes by paying for a new part or for the healing touch of a technician. The world of the computer is a place of happy endings.

  Another engaging thing about the computer is that its screen can act as a window into any kind of reality at all. Particularly if you write or use graphics programs, you have the ability to explore worlds never before seen by human eye. Physical travel is wearying, and travel away from Earth is practically impossible. But with a computer you can go directly to new frontiers just as you are.

  The immediacy of a modern computer’s response gives the user the feeling that he or she is interacting with something that is real and almost alive. The space behind the screen merges into the space of the room, and the user enters a world that is part real and part computer—the land of cyberspace. When you go outside after a long computer session, the world looks different, with physical objects and processes taking on the odd, numinous chunkiness of computer graphics and computer code. Sometimes new aspects of reality will become evident.

  I’ve always felt like television is, on the whole, a bad thing. It’s kind of sad to be sitting there staring at a flickering screen and being manipulated. Using a computer is more interactive than watching television, and thus seems more positive. But even so, computers are somewhat like television and are thus to some extent forces of evil. I was forcefully reminded of this just yesterday, when my son Rudy and I stopped in at the Boardwalk amusement park in Santa Cruz.

 
