The Chip: How Two Americans Invented the Microchip and Launched a Revolution


by T. R. Reid


  Noyce: “A large segment of the technical community was on the lookout for a solution.”

  Kilby: “There was just an awful lot going on. . . . It was pretty well accepted that this was the problem that had to be solved.”

  Noyce: “It was clear that a ready market awaited the successful inventor.”

  As it happened, the successful inventors were a pair of engineers working at competing manufacturing companies. That is to say, this global technological problem of earth-shaking importance was not turned over to scientists. It was not solved by academic researchers in a university laboratory, but rather by engineers at industrial lab benches just down the hall from the production line.

  Scientists and engineers tend to divide their work into two large categories, sometimes described as basic research and directed research. Some of the most crucial inventions and discoveries of the modern world have come about through basic research—that is, work that was not directed toward any particular use. Albert Einstein’s picture of the universe, Alexander Fleming’s discovery of penicillin, Niels Bohr’s blueprint of the atomic nucleus, the Watson-Crick “double helix” model of DNA—all these have had enormous practical implications, but they all came out of basic research. There are just as many basic tools of modern life—the electric light, the telephone, vitamin pills, the Internet—that resulted from a clearly focused effort to solve a particular problem. In a sense, this distinction between basic and directed research encompasses the difference between science and engineering. Scientists, on the whole, are driven by the thirst for knowledge; their motivation, as the Nobel laureate Richard Feynman put it, is “the joy of finding things out.” Engineers, in contrast, are solution-driven. Their joy is making things work.

  The monolithic idea was an engineering solution. It worked around the tyranny of numbers by reducing the numbers to one: a complete circuit would consist of just one part—a single (“monolithic”) block of semiconductor material containing all the components and all the interconnections of the most complex circuit designs. The tangible product of that idea, known to engineers as the monolithic integrated circuit and to the world at large as the semiconductor chip, has changed the world as fundamentally as did the telephone, the light bulb, and the horseless carriage. The integrated circuit is the heart of clocks, computers, cameras, and calculators, of pacemakers and Palm Pilots, of deep-space probes and deep-sea sensors, of toasters, typewriters, cell phones, and Internet servers. The National Academy of Sciences declared the integrated circuit the progenitor of the “Second Industrial Revolution.” The first Industrial Revolution enhanced man’s physical prowess and freed people from the drudgery of backbreaking manual labor; the revolution spawned by the chip enhances our intellectual prowess and frees people from the drudgery of mind-numbing computational labor. A British physicist, Sir Ieuan Maddock, Her Majesty’s Chief Science Advisor, called the integrated circuit “the most remarkable technology ever to hit mankind.” A California businessman, Jerry Sanders, founder of Advanced Micro Devices, Inc., offered a more pointed assessment: “Integrated circuits are the crude oil of the eighties.”

  All this came about because two young Americans came up with a new idea—or, more precisely, a not-so-new idea. In fact, the principle underlying the semiconductor revolution was one of the oldest ideas in electronics.

  2

  THE WILL TO THINK

  In addition to the laws, rules, constants, principles, axioms, theories, and hypotheses they have devised to explain the mysteries of the natural world, scientists and engineers have developed a series of unwritten rules that purport to explain the mysteries of their business. Among the latter is a humorous, or perhaps quasi-humorous, principle sometimes referred to as “the law of the most famous.” Briefly put, this natural law holds that whenever a group of investigators makes an important discovery, the most famous member of the group will get all the credit. And that seems to explain why the important principle of thermionic emission came to be known as the Edison Effect.

  Thermionic emission was observed for the first time in March 1883 in Thomas A. Edison’s Menlo Park laboratory, when the inventor and his associates noticed something strange going on inside one of his first light bulbs. In addition to the electric current flowing through the carbon filament, there seemed to be another, separate current flowing through the vacuum inside the glass bulb—something quite impossible, under contemporary explanations of electricity. Nobody at Menlo Park understood what was happening (the current was eventually found to be a flow of electrons boiling off the white-hot filament). But Edison, not one to miss a chance, wrote up the discovery and filed a patent for it. Then he set it aside. It was partly a matter of time and resources. Bedeviled by disputes with his creditors, legal battles over his patents, and frustrating efforts to improve his phonograph, microphone, and incandescent lamp, Edison was already working twenty hours a day, often more. But the main reason Edison abandoned the Edison Effect was that he saw no future in it. So what if current could flow through a vacuum? What good would that do?

  Looking back 120 years later, when solid-state, or semiconductor, devices have proven superior to vacuum tubes in just about all electronic equipment, we can see that Edison had a point. The science and industry of electronics was based on the thermionic effect inside a vacuum tube for more than fifty years; today, hindsight suggests that the half century of work on the vacuum tube was basically a digression. By the time Edison discovered his eponymous effect in 1883, electrical pioneers such as Edmond Becquerel, Ferdinand Braun, and Michael Faraday had already found that certain substances—known today as semiconductors—had a variety of useful electronic characteristics. Had the early work on these materials been continued, it is not too great a flight of fancy to suggest that the modern semiconductor revolution might have come a half century or more earlier than it did. But after the discovery of the Edison Effect, electronics research took a new direction—with dramatic results. The work at Menlo Park led, fourteen years later, to the experiment known as “the zero hour of modern physics”—the discovery of the electron—and from there, along a more or less straight line, to wireless telegraphy, radio, television, and the first generation of digital computers. It was all a digression, but a glorious one.

  In 1883, however, that point was considerably less than obvious, even to an exceptional visionary like Edison. At first, the Edison Effect held interest only for scientists—and scientific interest was not a commodity of great importance to Thomas A. Edison. “Well, I’m not a scientist,” the Wizard of Menlo Park said. “I measure everything I do by the size of the silver dollar. If it don’t come up to that standard then I know it’s no good.”

  That line was classic Edison. All his life he portrayed himself as the supreme pragmatist, relying on hard work and common sense to build a better life for his fellow man—and get rich in the process. It was an archetypal American picture, and essentially an accurate one, for Edison’s life has the elements of the classic American story. The seventh child of an infrequently successful businessman, he grew up in medium-size towns in Ohio and Michigan. He had about four years of school, counting the time his mother tutored him at home, and set out at the age of twelve to make his fortune. He sold snacks on the Detroit–Port Huron train. He started a newspaper called Paul Pry. He fell into and out of numerous jobs as a telegraph operator, a profession he enjoyed so much that he called his first two children Dot and Dash. Tinkering continually with his employers’ telegraphic equipment, he began to design useful improvements, and get paid for them. Gradually, he discovered that he could make a living as an inventor. By his thirty-fifth birthday, Edison was a millionaire, a leader of industry, and probably the best-known man on earth. When he announced early in 1878 that he might try to perfect an electric light, illuminating gas stocks plummeted on Wall Street. When the New York Daily Graphic reported that Edison had invented a machine that spun food and wine from mud and water, many newspapers failed to notice the April 1 dateline and ran the story straight—just one more miracle from Menlo Park. When Edison died, at eighty-four, in 1931, someone proposed that all the lights in the world be turned out for two minutes as a memorial. The idea was dropped on the ground that it would be impossible for the world to function that long without the electric light.

  Despite fame and fortune, Edison remained an uncouth hayseed who flaunted his disdain for cleanliness, fashion, order, religion, and science. A journalist touring the famous Menlo Park laboratory in 1878 described the proprietor this way: “The hair, beginning to be touched with gray, falls over the forehead in a mop. The hands are stained with acid, his clothing is ‘readymade.’ He has the air of a mechanic, or with his particular pallor, of a night-printer.” Edison worked in 40-, 80-, or 100-hour bursts, catching a few intervals of sleep under a bench in the lab. He adopted the motto Perseverantia omnia vincit, a phrase that he subsequently translated to “Genius is 1 percent inspiration and 99 percent perspiration.” He had absolute confidence in this formula; he was certain he could solve any problem if he just tried enough solutions. After an assistant noted wearily that the Menlo Park team had worked through 8,000 different formulas in a futile effort to build a storage battery, the inventor replied that “at least we know 8,000 things that don’t work.” Struggling to find an efficient filament for his incandescent light, Edison decided to try everything on earth until something worked. He made a filament from dried grass, but that went haywire. He tried aluminum, platinum, tungsten, tree bark, cat’s gut, horse’s hoof, man’s beard, and some 6,000 vegetable growths before finding a solution in a carbonized length of cotton thread.

  Publicly, at least, the great empiricist had no truck with scientists, mathematicians, or any college graduate—“filled up with Latin, philosophy, and all that ninny stuff.” With great fanfare, he invented an “ignorameter” to prove that intellectuals had no common sense. When he hired a “science advisor” of his own—Francis Upton, holder of a Ph.D. from Princeton and student of the great German physicist Hermann Helmholtz—Edison promptly named him “Culture,” a term of derision among the gang at Menlo Park. Soon enough the newspapers were reporting how Edison had asked his man Culture to determine the volume, in cubic centimeters, of an empty glass bulb. Upton set to the task with a complex series of differential equations that would lead, after a few hours, to a close approximation of the correct figure. With a smirk, the pragmatic inventor grabbed the bulb, filled it with water, and poured the water into a measuring cup—determining the precise volume in less than a minute.

  Culture eventually became acculturated to life in the lab, and Edison eventually came to realize that Upton’s mathematical and physical skills were an important asset. Upton designed and built the dynamo—in modern terms, the power plant—that provided safe, efficient power for the Edison lighting system and the Edison electric train. Upton’s skills as an observer and theoretician were so central to the discovery of thermionic emission that the phenomenon might more fairly be called the Upton Effect.

  Edison’s electric lamp depended, of course, on a clear glass bulb. The bulb enclosed a vacuum. In that vacuum, electricity running through the filament caused the filament to glow with a white heat (the vacuum was necessary because without air, the filament would not burn away). The glow showed through the glass bulb and gave light. When the electric light was still in its birth throes, Edison and Upton noticed that, over time, the clear glass bulb tended to grow black. This was a problem, and the inventor set out in his standard way to eliminate the problem. He launched into an exhaustive series of trial-and-error experiments to find out what was wrong. Upton, meanwhile, made careful observations of the phenomenon. In March 1883 he suggested putting a small metal plate inside the bulb. This didn’t help (the problem of darkening bulbs was eventually solved by using purer materials for the filament), but it led to a fascinating discovery. Being scientists, being naturally curious people, they spent some time looking over their failed idea—that is, the light bulb with a metal plate in it. They found, to their surprise, that electric current was flowing in the metal plate. Where in the world could this current come from? Current was flowing through the filament, of course, but there was no connection whatsoever between the filament and the plate. Being naturally curious people, the researchers tried some tests. They increased the current through the filament—and found that the current in the metal plate increased proportionally. Evidently, electric current was flowing across the vacuum from filament to plate. This was quite astonishing—at least to Upton, who trusted scientists—because the experts had established incontrovertibly that electric current could not traverse a vacuum. Now Edison had proven them wrong.

  Over the next few years, the mysterious current-in-a-vacuum became a prize exhibit at electrical exhibitions on both sides of the Atlantic. Given its famous discoverer, the impossible current was quickly dubbed the Edison Effect. John A. Fleming, a scientist on the staff of Edison’s British subsidiary, ordered some Edison Effect lamps and tried a number of experiments. Edison had always used a direct current of electricity in his work—a current in which the electricity flowed in the same direction all the time. During the winter of 1884–85, Fleming tried something different. He hooked up the filament to a generator that produced alternating current—a current that constantly changes direction, back and forth, back and forth, as often as 120 times per second. He was mystified to see that, even with an alternating current racing back and forth through the filament, the current flowing to the metal plate was still direct current, never changing direction. The lamp had converted alternating current to direct current.

  No one could explain this result—least of all Edison, of course, who stood aside and smiled as the scientists did their stuff. Research on the Edison Effect was merely aesthetics, Edison wrote to a friend, and “I have never had time to go into the aesthetic part of my work. . . . But it has, I am told, a very important bearing on some laws now being formulated by the Bulged-headed fraternity of the Savanic world.”

  It did indeed. Savants in the United States and Europe undertook extensive experimentation on the flow of electricity through a vacuum. The basic apparatus for this work was an elongated glass tube with a piece of carbon or metal at one end that was heated—just like the filament in Edison’s light bulb—until the thermal energy emitted an electric current (hence “thermionic emission”) through the vacuum. The piece of metal that emitted the current was called a cathode, so the glass tube was known as a cathode ray tube. The current would beam down the tube to the far end; at the spot where the beam hit, the glass would phosphoresce, or glow. At the time, this exotic piece of scientific apparatus was found only in the finest laboratories in the United States and Europe. Today it is found in living rooms, basements, and bars everywhere, in the form of the television picture tube.

  How a cathode ray tube works.

  The savants hoped that the tube would provide a clear enough picture of cathode rays to permit an explanation of the electric force. As the nineteenth century neared an end, there was a curious gap between the engineers and the scientists. Thanks to the engineers and pragmatic inventors like Edison, electricity powered much of the world and made the nights shine as day. But the scientists still didn’t know precisely what this mighty force was. The mystery of electricity had prompted a number of contradictory hypotheses. Early researchers had postulated that electricity was a fluid (which is why we still talk today of “current” and “flow”). Then, in the 1880s, this notion gave way to a pair of competing theories. One view held that electricity was a wave phenomenon, like sound and light; the other school of thought considered the electric beam in the cathode ray tube to be a stream of particles, like grains of sand. Wave or particle? The greatest minds in physics pondered, debated, speculated over the question. The answer finally came from one of the most fascinating and formidable intellects in the history of physics—Professor Sir Joseph John Thomson.

  J. J. Thomson was born in 1856, the son of a bookseller in Manchester. The family’s plan had been that J.J. would be apprenticed to a local engineer and take up that profession. But his father’s death, when J.J. was sixteen, left the Thomsons unable to pay the fees engineers charged for training apprentices. The boy won a scholarship at a local college and quickly came under the spell of mathematics and physics. The timing was perfect—Thomson’s professional career spanned the most fertile era in physics since Isaac Newton’s day—but nobody knew that during his college days. Quite the contrary, in fact: in the 1870s, there was a general sense that the interesting part of physics was over, and all that remained was refining the measurements. After all, everyone knew that all matter was made of indivisible particles called atoms, and that differences among elements were due to differences in their atoms. The great theoretician James Clerk Maxwell, on the august occasion of his nomination as Cavendish professor at Cambridge—in essence, the physicist laureate of England—noted the consensus “that in a few years all the great physical constants will have been approximately estimated, and that the only occupation which will then be left to men of science will be to carry on those measurements to another place of decimals.” Already scientists had measured the mass of the smallest object in the universe—the hydrogen atom, weighing about .0000000000000000000000017 gram (which is to say, about .00000000000000000000000006 ounce).

  Thomson went to Cambridge in 1876, studied under the great Maxwell, and quickly distinguished himself. He won the Trinity Prize for the interesting discovery that a drop of water would not evaporate if given an electric charge. In an 1881 paper he suggested that there was a connection between the mass and the energy of a moving sphere—an idea that was crystallized by another physicist twenty-four years later in the equation E = mc². As the preeminent Cambridge physicist, Thomson succeeded in 1884 to the chair—both the professorship and the piece of furniture—that Maxwell had held. He was twenty-eight years old. “Things have come to a pretty pass,” an older colleague groused, “when mere boys are appointed professors.” The mere boy took over Maxwell’s old office in the Cavendish Laboratory, a cream yellow neo-Gothic pile with the Cavendish family’s motto emblazoned over the door: Cavendo tutus, or “Always on the Lookout.”

 
