by T. R. Reid
8
THE IMPLOSION
The decision that brought the chip into every household, and made “chip,” in its microelectronic sense, a household word, was a carbon copy of the decision a decade earlier that had done the same things for the chip’s immediate ancestor, the transistor. The decision maker in both cases was Patrick Haggerty, the farsighted and plucky chief executive of Texas Instruments. In the early 1950s, when the transistor was starting to become a cheap, reliable mass-production item, Haggerty developed a fascination, almost an obsession, with the notion that microelectronics should play a pervasive role in modern society. He came to believe that microelectronic devices would replace standard circuitry in existing electronic gear; that all tools and appliances controlled by traditional gears, springs, or timers would be fitted with chips; and that the availability of low-cost, low-power, high-reliability miniature components would create entire markets unknown before. “Pervasiveness” recurs like an idée fixe in Haggerty’s speeches and writings. The conviction that microelectronic devices would gradually pervade every aspect of life was at the root of many of his business decisions.
Haggerty first put the principle into practice in 1954, when Texas Instruments, then a small regional manufacturer of electronic parts, discovered that it was ahead of everybody else in transistor development. “We knew we were doing pretty well in our semiconductor endeavor,” Haggerty recalled twenty-five years later, “[but] we were facing a world that was pretty skeptical. . . . It seemed to me that it was imperative for T.I. to generate some kind of dramatic demonstration that reliable transistors really were available in mass production quantities and at moderate prices, and that T.I. was both ready and able to produce them.” In fact, Haggerty had a particularly dramatic demonstration project in mind.
He had called in his engineers and told them to produce something wholly new—a portable radio, completely transistorized, powered by penlight batteries, and small enough to carry in a pocket or install in a dashboard. Haggerty tried to persuade several major radio firms to sell the device. They all demurred, arguing that there was no market for a pocket radio—an accurate assertion, since there had never been a pocket radio. A “radio” in 1954 tended to be a big wooden tabletop thing; it wouldn’t fit in a suitcase, much less a pocket. But Haggerty persisted, and eventually found a small company, Regency, which introduced the pocket radio just in time for the Christmas sales rush in 1954. More than 100,000 Regency portables were sold the first year, and pocket radio sales increased astronomically thereafter. The product was as successful for TI as for Regency, because it made the transistor a familiar object throughout the world. Some Texas Instruments people say Haggerty’s multimillion-dollar crash program to put the transistor into a consumer product was actually aimed at a single consumer—Thomas Watson Jr., the head of IBM. If that was Haggerty’s aim, he hit a bull’s-eye. Watson bought a hundred Regency radios and distributed them among his engineers; according to an IBM executive, Watson told his people that “if that little outfit down in Texas can make these radios work . . . they can make transistors that will make our computers work, too.” Haggerty himself couldn’t have put it better. In 1957, Watson signed a purchase order that made Texas Instruments a key IBM supplier and provided a huge new market for TI’s transistors.
Eleven years after he had launched the radio, Haggerty got to thinking about the future of the integrated circuit. The chip was winning itself a niche in military and industrial markets but had yet to crack the computer industry or make even the smallest dent in the consumer market. Haggerty knew, and his engineers knew, that the chip represented a revolutionary advance in technology and that the product’s reliability, capacity, and value were increasing every year—but the world at large did not know that. What was needed, Haggerty decided, was a dramatic demonstration of the benefits that monolithic circuitry could provide. And he had a particularly dramatic demonstration project in mind.
On an airplane trip with one of his engineers in the fall of 1965, Haggerty started talking about his belief that tiny, inexpensive, but complex circuits in integrated form would lead to new dimensions of pervasiveness. Why, the day would come, he said, when the chip would be built into a wide range of consumer products—when there would be integrated circuits in every home. As a matter of fact, he had been thinking about a consumer product that would be a perfect vehicle for the monolithic circuit. Before the plane landed, Haggerty had ordered the engineer to invent something wholly new: a miniature calculator that would fit in the palm of a hand—much lighter, much smaller, and much cheaper than any calculating machine that had ever been thought of before.
The man on the receiving end of that order was one of the most respected and fastest-rising engineers in the company—Jack Kilby. Since his invention of the integrated circuit during his first month at Texas Instruments, Kilby had received a series of raises and promotions befitting an employee who had given the company a firm position on the leading edge of monolithic technology. In a company where titles really mattered, Kilby had progressed from simply “Engineer” to “Manager of Engineering” to “Manager, Integrated Circuits Development” to “Deputy Director, Semiconductor Research and Development Laboratory.” Best of all, from Kilby’s point of view, the company had essentially left him alone to do the kind of work he liked best—solving technical problems. When Haggerty came up with the formidable problem of building a pocket calculator, Jack Kilby was the obvious person to take it on.
“I sort of defined Haggerty’s goal to mean something that would fit in a coat pocket and sell for less than $100,” Kilby recalled. “It would have to add, subtract, divide, and multiply, and maybe do a square root, it would have to use some sort of battery as a power supply, to make it portable, and it couldn’t be too much heavier than a fairly small paperback book.” It was a tall order. Today, of course, the calculator Kilby described is utterly commonplace, but in 1965 it was something quite unprecedented. There were calculating machines available then, but none came close to meeting Haggerty’s terms. The standard electronic desk calculator was as large and as heavy as a full-size office typewriter. It contained racks and racks of electronic parts and dozens of feet of wire. It ran on 120 volts of electricity and cost roughly $1,200—about half the price of a family car.
The president’s own brainchild naturally became a matter of some priority at Texas Instruments. The company, which yields nothing to the CIA when it comes to secrecy, put Kilby in a shrouded office and told him always to refer to his new project by a code name. An earlier TI research program had been called Project MIT, so Kilby took the logical next step and named his effort Project Cal Tech. “It was a miserable choice,” Kilby recalled afterward. “Anybody who heard it would have figured out that we had a crash project going on calculator technology.” In any case, Cal Tech soon grew into a team effort. Kilby started looking around the semiconductor lab for engineers who would not be daunted by the sizable technical problems involved in inventing a new species of calculator. Among others, he settled on a friendly, easygoing young Texan named Jerry Merryman.
Merryman, who had come to Texas Instruments two years earlier at the age of thirty-one, represents a vanishing breed in the high-tech industry—the self-taught engineer. After finishing high school in the country town of Hearne, Texas, he floated around and through Texas A&M for a few years but never stayed in place long enough to establish a major, much less earn a degree. Instead, he learned electrical engineering on odd jobs here and there and developed an almost intuitive sense for circuitry. “He’s one of these guys who looks inside for a minute or two and then says, ‘Well, maybe if we put that green wire over there,’ ” Kilby said. Like Kilby, Merryman has a fundamental confidence that any technical problem can be solved. “I just know,” he said, “that you’re going to find an answer if you think about it right. Eventually it’ll come. A lot of inventions just happen on the way to work.”
It was almost an act of faith in 1965 to believe that you could reduce the size, weight, and cost of an electronic calculator by factors of ten or more. One of the first things Kilby realized was that tearing apart existing adding machines to see how they worked—a process known as reverse engineering—would offer little, if any, help, because the basic architecture of this pocket-size device would have to be completely new. And so the team started at ground zero, setting down the fundamental elements that their calculator would require.
In accordance with the architecture worked out by Alan Turing and John von Neumann, all digital devices, from the most powerful mainframe supercomputer to the simplest handheld electronic game, can be divided into four basic parts serving four essential functions:
Input: The unit that receives information from a human operator, a sensory device, or another computer and delivers it to the processing unit. For the calculator, this meant a keyboard.
Memory: The unit that holds data—numbers, words, or instruction code—until the processing unit is ready to receive it.
Processor: The central control circuit that transfers data to and from various memory segments and manipulates the data. In a calculator, the processor performs the arithmetic.
Output: The display unit that shows the results of each calculation.
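The four-part scheme can be sketched as a toy program. The following Python loop (every name here is hypothetical, nothing is taken from the TI design) shows a keyboard-style input stream feeding a single accumulator register, with a simple processor applying pending operations and the final value standing in for the display:

```python
# Toy sketch of the four-part architecture: input, memory, processor, output.
# All names and conventions here are illustrative, not from the TI calculator.

def run_calculator(keystrokes):
    """Process a list of single-character keystrokes and return the result."""
    register = 0          # memory: a single accumulator register
    pending_op = "+"      # processor state: the operation awaiting its operand
    operand = 0
    for key in keystrokes:                 # input: one key at a time
        if key.isdigit():
            operand = operand * 10 + int(key)
        else:                              # processor: apply the pending operation
            if pending_op == "+":
                register += operand
            elif pending_op == "-":
                register -= operand
            elif pending_op == "*":
                register *= operand
            elif pending_op == "/":
                register /= operand
            pending_op, operand = key, 0
    return register                        # output: the value shown on the display

print(run_calculator(list("12+34=")))      # 46
```

Even this toy version makes the division of labor visible: the keystroke list is the keyboard, the two variables are the memory, the if/elif chain is the arithmetic processor, and the returned value is what the display would show.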
In addition to these four basic sections, Kilby had to worry about something that is no problem on nonportable electronic devices—a power supply. A calculator meant to be carried in a pocket and used anywhere could not be designed for the 120 volts of power available from a wall socket. Instead, the calculator would have to operate from a battery; it would be limited to about 5 volts.
Because Haggerty was in a hurry, and because Kilby had a basic confidence that the job could be done, the team decided on an all-points attack: the group would work on all their problems at once and hope that everything came together in the end. The input section—basically, the design of a small, power-efficient keyboard—was assigned to an engineer named James Van Tassel. Kilby himself took on the output section and the power supply. The memory and central processor—the calculator’s electronic innards—were Merryman’s responsibility.
“The basic rule was, everything had to be smaller than you’d ever thought you could make it,” Merryman said later. “Now, one thing we did, we reduced the whole memory down to a single register. [“Register” is computerese for a short chain of transistors that can store a dozen or so binary digits.] The only problem with that was you’d need a lot of wires coming into that register, and there wasn’t much room inside that case for a lot of wires.” To solve that, Merryman designed a special “shift register”—a storage circuit that is laid out something like a large auditorium with only a single narrow aisle for people to come in and out. “That way, all the bits could come in there, sort of march in single file, and since they come in one at a time you get a lot of numbers in with just one wire.” Merryman’s most serious problem, though, was with the logic circuitry. The desktop electronic calculators of that time employed thousands of logic gates—AND, OR, and NOT circuits—to carry out binary calculations. “We were trying to build this whole circuit with only three bars,” Merryman said (“bar” is the engineer’s slang term for a silicon chip). “That left me about 400 gates in all—maybe 4,000 transistors. And I worked out a processing unit that only needed 400 gates. It almost worked, too—except it never could figure out how to get the decimal point in the right place. That took a whole bunch of extra gates.” In the end, the team had to settle for a four-chip design with a total of 535 logic gates.
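Merryman’s auditorium analogy, bits marching in single file through one wire, corresponds to what textbooks call a serial-in shift register. A toy Python sketch (purely illustrative, not the actual TI circuit) shows how one input line suffices to load a multi-bit number:

```python
# Minimal sketch of a serial-in shift register. Bits arrive one at a time
# on a single input line; each clock tick moves every stored bit one cell
# along, like people filing down a narrow aisle. Illustrative only.

class ShiftRegister:
    def __init__(self, width):
        self.cells = [0] * width          # the storage cells

    def clock_in(self, bit):
        """One clock tick: every bit shifts one cell; the new bit enters."""
        self.cells = self.cells[1:] + [bit]

    def value(self):
        """Read out in parallel: the earliest-arrived bit is the high-order bit."""
        return int("".join(map(str, self.cells)), 2)

reg = ShiftRegister(4)
for bit in [1, 1, 0, 1]:                  # the number 13 (binary 1101), sent MSB first
    reg.clock_in(bit)
print(reg.value())                        # 13
```

Four clock ticks on one wire have loaded a four-bit number that the rest of the circuit can then read out all at once, which is exactly the trade Merryman was making: time (one bit per tick) in exchange for wires.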
Meanwhile, Van Tassel had developed a working model for the keyboard, and Kilby had found a rechargeable battery that would run the device for three hours before running down. Memory, processor, input, and power supply were in good shape. That left only one problem—the output unit, for displaying the answer. But this proved to be an unusually thorny problem.
Contemporary desk calculators used a cathode ray tube—a miniature television set—for display, but such a system was far too heavy, fragile, and power-hungry for a portable machine. For a while, Kilby had hoped to use a row of tiny neon lights to display the answers. As it turned out, that system required at least 40 volts—out of the question. Just down the hall from the Cal Tech team, a TI researcher was working on a new electronic device—a light-emitting diode, or LED—that was supposed to shine with a bright colored light when a minute current passed through it. This technology did, in fact, become the standard display technique for calculators and watches a few years later; in 1965, though, the diodes were not yet emitting much light.
There was nothing to do but invent something new. So Kilby developed a thermal printing technique, in which a low-power printing head would “burn” images into heat-sensitive paper; the idea worked perfectly, and the process is still widely used in low-cost, power-efficient printers.
All this activity consumed a little more than twelve months. One day late in 1966, Merryman recalled, “the thing was all laid out on the table like a person spread out on an operating table, all split open, wire running all over, and we punched in a problem, and it worked!” Silently, and almost instantaneously, the right answer came spinning out of the machine on a strip of paper tape. The Cal Tech group took their prototype in to Haggerty, who nodded with satisfaction—and called in the patent lawyers. It took another year before the design was perfected and the patent application—for a “Miniature Electronic Calculator”—could be filed. Although handheld calculators have come a long, long way since then, the Cal Tech team’s architecture is still the gist of all such devices; even today many TI models carry the number of the original Kilby–Merryman–Van Tassel patent: 3,819,921.
The electronics of the new device were so far ahead of their time that it took years to turn out the initial production models. The world’s first pocket calculator, the Pocketronic, was not introduced until April 1971—April 14, to be exact (the marketing people thought they might win the attention of taxpayers working late on Form 1040). By today’s standards, that first model was a dinosaur—a four-function (add, subtract, multiply, divide) calculator that weighed 2½ pounds and cost about $150. But it sold like crazy.
You would need a fairly high-powered calculator to keep track of what happened next. Five million pocket calculators were sold in the United States in 1972. As new features were added and prices plummeted, sales doubled year after year. To borrow a word from Patrick Haggerty, the pocket calculator became pervasive. Within a decade after the first pocket calculator was sold in the United States, the country had more calculators than people. As Haggerty had predicted, the new microelectronic gadget created a market that had simply not existed before. Tens of millions of people who never considered purchasing an adding machine or a slide rule decided they wanted to own a pocket calculator. “How many housewives actually need to know the square root of a number?” wrote Ernest Braun and Stuart MacDonald, two English scholars who analyzed the phenomenon. “But then, the technology is ridiculously cheap. For a fraction of the cost of one week’s housekeeping, one can have permanent access to any number’s square root.”
Today, with a four-function calculator—a model the industry calls “plain vanilla”—available for $3.95 or so, the U.S. market is virtually saturated. Yet Americans still buy between 26 million and 30 million replacement calculators each year, including specialized models for stockbrokers, accountants, tax return preparers, bond traders, cattle ranchers, bicycle racers, cooks, and any other market niche the salespeople can dream up. Worldwide, the calculator market is a billion-dollar-per-year business, with sales approaching 100 million calculators each year. In Japan, Casio makes an abacus with a built-in calculator. Some people still use the ancient counter to check the results that pop up on the calculator’s display screen.
Another consumer application of the chip, born the same year as the handheld calculator and just as “pervasive” now, was the electronic, or digital, wristwatch. As a technical matter, the digital watch is markedly easier to make than a calculating machine. It is based on a convenient natural phenomenon called crystal oscillation, which is the physicist’s way of saying that pure crystals of certain materials will oscillate, or vibrate back and forth, when connected to a source of electric current (e.g., a small battery). The rate of vibration depends on the atomic structure of the material; for a given material, though, the rate never varies. Certain crystals of the common mineral quartz, for example, will vibrate back and forth precisely 3,579,545 times each second.
A precise oscillator, be it the 5-foot brass pendulum of a grandfather clock or the 0.5-cm flake of quartz in a wristwatch, is the heart of any timepiece. All the watchmaker needs is a mechanism to count the back-and-forth oscillations—and counting is one of the simple tasks that binary logic gates can perform. In the digital watch a logic circuit called a JK flip-flop counts the vibrations of the crystal. Every time the count hits 3,579,545, the circuit sends a pulse to the display unit and the watch records the passage of another second. Another set of gates on the same chip counts 60 seconds and updates the minute display; another counts minutes to update the hour. If you tore apart your digital watch (why not? you can get a new one for five bucks), you would find, in place of the gears, springs, bearings, and bushings of a traditional timepiece, only four parts: a battery, a crystal, a chip, and the display unit.
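The chain of counters described above is easy to mimic in software. This Python sketch (a simulation of the arithmetic, not the watch’s actual flip-flop hardware) divides a raw oscillation count down to hours, minutes, and seconds, using the per-second rate quoted in the text:

```python
# Sketch of a digital watch's divider chain: in hardware, stacks of
# flip-flops divide the crystal's oscillations down to seconds, minutes,
# and hours; here each // and divmod stands in for one counter stage.
# The rate below is the figure quoted in the text.

CRYSTAL_HZ = 3_579_545

def ticks_to_time(oscillations):
    """Convert a raw oscillation count to (hours, minutes, seconds)."""
    seconds = oscillations // CRYSTAL_HZ     # first counter: one pulse per second
    minutes, seconds = divmod(seconds, 60)   # second counter: 60 seconds -> 1 minute
    hours, minutes = divmod(minutes, 60)     # third counter: 60 minutes -> 1 hour
    return hours, minutes, seconds

# After ninety minutes' worth of vibrations:
print(ticks_to_time(CRYSTAL_HZ * 5400))      # (1, 30, 0)
```

Each stage is nothing more than a counter that rolls over and nudges the next stage, which is why the whole mechanism fits comfortably on one chip.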
Nonetheless, the first digital watch—produced by an American firm under the Pulsar brand name and introduced in the fall of 1971—was marketed as a decidedly high-bracket item. The 24-karat gold Pulsar was priced at $2,000; a stainless steel model cost $275. Characteristically, as the electronic watch improved—getting smaller, easier to read, more power-efficient—its price fell.
The American reader may have spotted in this history a perfectly legitimate excuse for chauvinism. Despite the predominance of names like Sony and Seiko, Canon and Casio, the major consumer products of the microelectronic age all resulted from pure Yankee ingenuity, as did the fundamental breakthrough—the monolithic idea—that made such advances possible in the first place. Polls show that many Americans consider the microelectronic revolution just another import from Japan—one more manifestation of the Japanese genius for technology and marketing. In fact, the flow of genius has gone in the other direction. The history of microelectronics has been a history of Japanese firms—and other companies around the world—learning at the feet of American innovators. This familiar pattern was played out once again with the development of the device that has taken microelectronics further than ever down the path of pervasiveness—the microprocessor.