by T. R. Reid
Is it because people really believe that computers contain “electronic brains”—and thus don’t care to know about the human brainpower that made these mechanisms possible? Is it because we have swallowed the Orwellian notion that digital technology is a brutalizing, tyrannical force—and thus we don’t want to honor, or even know, the men who made it? Is it because we have grown so accustomed to new ideas coming out of huge corporate and governmental enterprises that we no longer recognize individual invention? Is it because the media that purvey fame and recognition among our contemporaries—People, Oprah, Larry King, Good Morning America, and the like—don’t trust their audiences to appreciate genuine intellectual accomplishment? Or maybe it’s because wealth matters more than achievement when it comes to choosing the people our society will look up to. Bill Gates was an accomplished and innovative programmer who launched the global industry of personal computer software; but who had ever heard of him until he showed up in People magazine as “the richest man in the world”? A Time magazine story about Bob Noyce shortly before his death focused on his investment earnings and described him as a “financial genie”—a classic case of missing the real point.
The list of the “most admired” in today’s world—a list assembled annually by the sophisticated surveying apparatus of the Gallup poll—suggests that the current vogue in admiration among Americans runs heavily to political figures, with an occasional clergyman or entertainer thrown in. For decades George Gallup and his organization have been asking Americans to name the two people they admire most. The answers vary little from year to year. The pope and Billy Graham are often on the list. Bob Hope and Walter Cronkite show up occasionally. The other names tend to be drawn from government; Ronald Reagan, Bill Clinton, and Colin Powell are among the hardy perennials in this garden. As the Gallup organization points out in a caveat accompanying its survey, the poll “tends to favor those who are currently in the news.” It’s hardly surprising, then, that men and women engaged in science and engineering tend to be left out, for such people are generally not treated as news—unless they become avid self-promoters (as Edison and Ford were) or unless, like William Shockley, they set aside their technical work and begin proselytizing for political causes.
And so, in an era when everybody is supposed to be famous for fifteen minutes, Jack Kilby and Bob Noyce have never come into their allotted quarter hour. There have been occasional stories about them in newspapers and magazines, particularly in the local media of Dallas and Silicon Valley. In Dallas, as a matter of fact, Jack Kilby is almost a minor-league celebrity, largely because of genuine pride in the hometown boy who did well. The local media call him the “Texas Edison.” There’s an official Texas Historical Marker on the site of the lab where Jack first wrote down the monolithic idea. Texas Instruments is particularly proud of the eminent inventor in its ranks, and that has increased Jack’s stature in Dallas. When the company opened its new Kilby Center in 1997, it hung a massive canvas poster on the wall of the building, visible to every commuter whizzing by on I-635: “The Chip That Jack Built Changed the World.” When a group of Dallas citizens created an annual award for people who contribute to human happiness through science—the laureates include physicians, chemists, botanists, etc.—they gave their prize the most appropriate name they could think of: the Kilby Awards.
Next to Dallas, the one place where the name “Jack Kilby” is fairly broadly recognized is Japan. The country that honors W. Edwards Deming also honors the inventor of the microchip, a product that helped lift the Japanese to global industrial prominence in the 1980s. This is partly because Jack’s name was a hot news item in Japan for almost two decades while Texas Instruments was battling Japanese semiconductor firms for patent license fees. The original TI patent for the integrated circuit is known in Japan as the “Ki-ru-bee tokkyoken”—that is, the “Kilby patent”—and thus the name was for years a common term on Japanese front pages. Beyond that, Jack Kilby is exactly the kind of person that Japan tends to admire. The founder of Sony, Akio Morita, another technologist who achieved heroic stature in Japan, said that the real key to his island country’s economic success was that “we are a society that honors engineers.” So it is natural that the Japanese would honor an engineer who changed everything.
Accordingly, Jack has traveled regularly from Texas to Tokyo, where the media literally line up for interviews with him. When Texas Instruments opened a big new engineering center in the “science city” of Tsukuba, Japan, the company, of course, sent its best-known engineer to do the honors. The ribbon-cutting ceremony was performed in characteristic leave-nothing-to-chance Japanese style. Jack and a host of Japanese officials were each given white gloves and golden scissors festooned with white, red, and purple streamers. They were directed to their assigned places at the entrance of the new building, and an announcer instructed them: “Hold the ribbon in your right hand, hold the scissors in your left, bring the ribbon up to the blade, pause ten seconds for the cameras, and now—CUT!” With the ribbon successfully cut, the announcer offered a breathless replay of what had just happened. “We are deeply honored,” he ended, “that an engineer who has won the esteem of all Japan could join us today to open our humble center.”
Back home, Jack could not quite claim the esteem of all America. But there were moments. Jack’s first integrated circuit was presented to the Smithsonian Institution, where—unlike Keuffel & Esser’s last slide rule—it is on permanent display in the National Museum of American History. The U.S. Postal Service issued a 33-cent stamp honoring the invention (but did not include the names of the two Americans who invented it). And shortly after Jack Kilby joined Ford and Edison in the Inventors Hall of Fame, Diane Sawyer flew to Dallas to interview him for the CBS Morning News. The segment lasted about three hundred seconds, with Sawyer tossing out peppy questions and Jack responding in his slow, laconic way.
“I mean, if you have to think of one thing that kept the United States at the forefront of technology,” Sawyer said, “it was really your invention.” Kilby paused, mulling it over. “Well, I hadn’t thought of it in those terms,” he said quietly. “Have you made money from this invention?” Sawyer asked. “Some, yeah,” Kilby replied. Things were just starting to get interesting when Sawyer got a signal from the director: time to move on. She turned quickly to the camera and said, “Coming up in a moment, Dr. Jerry Brodie on how to handle the death of a pet.” Jack Kilby’s moment in the sun was over.
Of course, if Jack had remained in the sun much longer, he probably would have been running toward the shade. The man is so down-to-earth, so genuinely modest, that he seems uncomfortable when people get wound up about his inventions. Far from encouraging interest in his achievements, Jack tends to play them down. When Texas Instruments opened its Kilby Center, the company magazine interviewed, or at least tried to interview, the center’s namesake about the monolithic idea and its consequences. “Did you have any idea that you were going to have such a profound effect on everybody’s daily life?” Jack was asked. “Well,” he replied, “I don’t know that I get credit for their profound effect. . . . What you see today is the work of probably tens of thousands of the world’s best engineers, all concentrating on improving the product, reducing the cost, things of that sort.” The interviewer persisted: “Is it nice to know that you made one of the most major contributions in making Texas Instruments what it is today?” Jack was having none of it: “Well, there’s a lot of water under the dam since that time,” he said, “and it’s hard to take any direct credit for the TI of today.”
One thing that Kilby particularly insists on is that he not be treated as somebody special. When a school board member back in Kansas proposed changing the name of Great Bend High School to Jack S. Kilby High, the school’s most distinguished alumnus immediately scotched the idea. “When they play football against Dodge City, they don’t want to be ‘Kilby,’ ” Jack said. “They’ve got to be ‘Great Bend.’ Anyway, the whole thing would be a lot of trouble. I’m not worth the fuss.”
In the fall of the year 2000, though, Jack Kilby found himself at the center of an enormous fuss. In the predawn hours of October 10, reporters in Europe started placing frantic calls to “J. Kilby” in Dallas. Jack does, in fact, have a listed telephone number, but he’s in the directory as J. S. Kilby. The J. Kilby who received those calls was his sister Jane. “As soon as the first fellow said the words ‘Nobel Prize,’ I knew it was going to be a busy day,” Jane said later. She raced over to her brother’s house and found a cluster of reporters on the porch, banging fruitlessly on the front door of a man who had removed his hearing aid for the night. Eventually, Jack came to the door in a green robe and expressed gratitude to the Royal Swedish Academy of Sciences. Then he headed back to the kitchen for breakfast.
But it was evident that Jack Kilby’s cherished life outside the spotlight was going to change, at least for a while. As the oldest, richest, and most prestigious honor in science, the Nobel Prize is the big enchilada of global awards, the ultimate accolade. The media might have looked the other way when Kilby won the Kyoto Prize in Advanced Technology or the Holley Medal of the American Society of Mechanical Engineers or the Institute of Electrical and Electronics Engineers Medal of Honor. But the Nobel Prize was not to be ignored.
A dozen years earlier, when it was already clear that the integrated circuit was a development of historic dimensions, I had asked both Jack Kilby and Robert Noyce why their invention hadn’t won them a Nobel Prize in something. Both men gave the same answer: The monolithic idea wasn’t the type of thing that won Nobel Prizes. “Basically, we’re looking at an engineering development,” Noyce replied, after noting that he was flattered to be asked. “If you look at the Nobel Prize, it goes to important discoveries in science.” Maybe so, I shot back, but the transistor won a Nobel Prize, just nine years after it was invented. The microchip was already thirty years old—and nary a word from Stockholm. “But that’s different,” Noyce said, in the tones of a patient teacher. “The 1956 prize was not awarded for the invention of the device. It was for the physics of the transistor effect.”
Actually, there has been considerable criticism of the Royal Swedish Academy for the excessively esoteric nature of its science prizes, and its failure to recognize the microchip was Exhibit A for the critics. Nobel laureates in physics and chemistry win the award for important discoveries, of course, but often the discoveries are important only to a small clique of specialists in a subset of the field. By honoring the academic and ignoring the practical, the Nobel Committee was arguably slighting the work of people who made genuine contributions—and frustrating the design of Alfred Nobel, who stated explicitly that his award was intended for “those who . . . shall have conferred the greatest benefit on mankind.” In the last half of the twentieth century, there was no scientific development that conferred greater benefits than the chip—but there was no Nobel for the chip’s inventors. By ignoring the invention for more than three decades, the committee denied the prize completely to Robert Noyce, who was dead by the time the Nobel people honored the integrated circuit; Nobel Prizes are not awarded posthumously. The Wall Street Journal saw in this a plot by European left-wingers to deny Noyce his legitimate deserts: “Why? Because he was a businessman and because his work had been commercial. The Swedish Academy would never stand for a dirty capitalist, especially a very, very rich one, getting the prize.”
For its first award of the new century, however, the Nobel Committee adopted a thoroughly practical stance. The official announcement that went out from the Royal Swedish Academy on October 10 explained that the Nobel Prize in Physics for 2000 recognized the connection between physics and the information age of computers and telecommunications. Accordingly, the physics prize was split. Half would go to Jack S. Kilby “for his part in the invention of the integrated circuit.” The other half of the prize money was to be divided between two academic physicists, Zhores I. Alferov of Russia’s Ioffe Physico-Technical Institute, and Herbert Kroemer, of the University of California at Santa Barbara, “for developing semiconductor heterostructures used in high-speed and opto-electronics.” In essence, Alferov and Kroemer had made valuable improvements on Jack’s basic idea. The term “heterostructure” refers to a chip made up of alternating layers of different semiconductors. Alferov and Kroemer had independently determined how heterostructures could be used to generate a laser beam—such as the one in the read/write head of a CD player—and the tiny but powerful amplifier required to make cellular phones work. The academy’s announcement conceded that “the integrated circuit is more of a technical invention than a discovery in physics.” But that was all right, because “it is evident that it embraces many physical issues . . . [such as] how to produce dense layers that are only a few atoms thick.”
As October 10 dawned over Texas, it quickly became obvious that Jack’s brief appearance on the front porch was not going to satisfy the media hordes. Accordingly, later that day, the newly named laureate was ushered into a conference room at Texas Instruments for a formal press conference. Jack first offered a tribute to Robert Noyce: “I’m sorry he’s not still alive. If he were, I suspect we’d share this prize.” He then answered endless questions about his invention and himself, particularly about his own low-tech lifestyle. “Well, I may be the only person in the room without a cell phone,” Kilby conceded. As always, he did his best to deflate the suggestion that he had changed the world, insisting that many other engineers shared the credit. The reporters were smitten by this understated performance from a world-class overachiever. “He looked like everyone’s grandfather,” observed The Dallas Morning News, “not the man who spawned the Information Age.”
Two months later, with granddaughters, daughters, son-in-law, and sister in tow, the man who spawned the information age arrived in Stockholm for a glittering round of luncheons, receptions, royal audiences, and the like. “I’m not worth the fuss,” Jack said again, as he climbed into yet another Volvo limousine for yet another function in his honor. Still, he appeared to be enjoying himself. Several old friends were on hand to share the occasion. One of the more poignant moments of the week came in the lobby of Stockholm’s Grand Hotel, when Jack spotted another acquaintance from years back: Gordon Moore. Bob Noyce’s engineering and entrepreneurial colleague, the man who was on the receiving end when Noyce had first enunciated the monolithic idea forty-one years earlier, routinely received invitations to the Nobel Prize ceremony. But now for the first time he had decided to attend. “I did it for my friend,” Moore said. “The Nobel Prize for the integrated circuit would have been shared by my colleague Bob Noyce if he were alive. I thought I should come so that Bob would have a presence when his invention was honored.”
Other than showing up in white tie and tails for the lavish awards ceremonies—the event is so fancy that even the traffic cops outside wear tuxedos, and the sterling silver laid out for the ensuing banquet is never used for any other function—a Nobel laureate’s only unavoidable duty during prize week is to deliver a lecture. Jack Kilby’s Nobel lecture in physics took place in a classically Scandinavian lecture hall, all blond wood and sleek modern furniture, on the campus of Stockholm University. Jack was introduced by a Swedish physicist who noted that “Dr. Kilby’s” invention had launched the global digital revolution, making possible calculators, computers, digital cameras, pacemakers, the Internet, etc., etc. Naturally, Jack wasn’t going to let that go unanswered. “When I hear that kind of thing,” he said, “it reminds me of what the beaver told the rabbit as they stood at the base of Hoover Dam: ‘No, I didn’t build it myself, but it’s based on an idea of mine.’ ” Everybody liked that joke, so Jack quickly added that he had borrowed the story from Charles H. Townes, an American who won the physics prize in 1964.
Kilby then went on, in his deep, slow Kansas drawl, to describe the tyranny of numbers and the various efforts to overcome it in the 1950s. He described the solution he hit upon at TI and explained how Bob Noyce’s work shortly thereafter had complemented his own approach. He showed the slide of his original phase-shift oscillator, with the hand-carved chip glued to a glass laboratory slide and the flying wires sticking out every which way. “Had I realized that I would have to look at that thing for 42 years, I would have put a little more effort into its appearance,” he said. Jack expressed astonishment at the way the invention had changed electronics: “The development in the last 42 years has been more rapid than in the first 400 years after William Gilbert coined the word ‘electricity.’ ” Improvements in the chip, and particularly the constant cost reduction, had made high-tech wonders available to ordinary people almost everywhere, he went on. “I’m happy to have had even a small part in this process of turning human ingenuity and creativity into practical reality.”
On that characteristically humble note, the man who made the microchip strode back to his seat. And, as it turned out, back to obscurity.
For a few weeks, just before that awards ceremony, the Nobel Prize seemed to trigger a new surge of interest in Kilby, in Noyce, and in the broader notion that the two inventors who launched a revolution deserved the recognition and esteem of their countrymen. Something about the turning of the century, about America’s stature as the world’s dominant political, financial, and industrial power at the dawn of a new millennium, led to a national focus on how the United States—which had been, after all, a mere snip of a developing nation one century before—had achieved its exalted status. It was obvious that technologists like Noyce and Kilby were a key part of the answer. And the Nobel Prize in the year 2000 seemed to emphasize the point. Many Americans were proud. Texas Instruments put a huge new poster on that building beside I-635: “The Chip That Jack Built Won the Nobel Prize.” The outgoing president, Bill Clinton, invited Jack Kilby to the White House for coffee. The incoming president, George W. Bush, in one of his last official acts as governor of Texas, welcomed Jack Kilby as a charter inductee into the new Texas Science Hall of Fame. Newsmagazines and cable talk shows actually paid some attention to the integrated circuit and the man who had made the first one. The Topeka Capital-Journal named Jack St. Clair Kilby its “Kansan of the Year” for 2000. The front-page article announcing the choice was a textbook example of that old journalistic rule, Find the Local Angle. Harking back to the blizzard of ’37, the story began: “A Kansas ice storm more than 60 years ago set the wheels in motion for an invention that would change the world.”