Seven Elements That Have Changed the World


by John Browne


  The first commercial applications of the transistor were not, however, computers, but technologies that used its other function as an amplifier. The first of these was a hearing aid produced by Sonotone in 1952. The same principle was applied in radios, amplifying the electromagnetic waves received from transmitting stations. The small size of the transistor dramatically reduced the size and cost of radios, making them portable and opening up their ownership to a vast new market. The transistor radio, the ‘trannie’, heralded a new era of popular music heard anywhere by everyone. As these new products took off, the importance of the transistor began to be widely recognised. In March 1953, Fortune published an article entitled ‘The Year of the Transistor’. ‘In the transistor and the new solid-state electronics,’ Fortune wrote, ‘man may hope to find a brain to match atomic energy’s muscle.’72 Silicon had joined uranium and titanium in a class of post-war ‘wonder elements’.

  Silicon chip

  Soon after Shockley, Bardeen and Brattain had invented the transistor, their relationship began to break down. Shockley, paranoid and competitive in the extreme, felt he was not being given sufficient credit for the invention.73 He became unhappy at Bell Labs where his apparent lack of management abilities, combined with his foul temperament, led him to be overlooked for promotion. In 1956, he left to set up his own company, Shockley Semiconductor, in California, having been encouraged to move there by Frederick Terman, the Dean of Stanford’s School of Engineering. Terman had the vision to see the potential of the semiconductor industry and wanted his graduate students to become a part of it. Together, Shockley and Terman shifted the centre of the industry from the East to the West Coast of the US, laying the foundations of Silicon Valley.

  At Shockley Semiconductor, a number of brilliant individuals began investigating the potential of silicon. ‘Neither the processing nor the physics of [silicon] was well understood,’ wrote Gordon Moore, an employee at the company. ‘We were just exploring the technology and figuring out what could be done and we had a lot of things to make work before we could try and build something.’74 However, working at Shockley Semiconductor was a trying experience. Shockley’s bad temper and poor management practices wore down his workforce. He was known to stage public firings and demand lie-detector tests over trivial matters. Shockley and his employees disagreed not only about the direction of the company but also about which new inventions should be commercialised. After about a year, a group of eight of Shockley’s most talented and ambitious employees decided to leave the company. The ‘traitorous eight’ were put in touch with Bud Coyle and Arthur Rock, the latter now remembered as the father of venture capital. Coyle and Rock persuaded them that, rather than being employed by another company, they should set up on their own. With US $1.4 million of funding from Sherman Fairchild, an inventor and businessman with a large stockholding in IBM, the group founded Fairchild Semiconductor in nearby Palo Alto, California.

  At the time, one of the biggest technical problems was making transistor circuits reliable. The transistor itself was far more dependable than the cumbersome and failure-prone vacuum tube, but each one had to be connected into a circuit by wires installed by hand. As the number of circuits in a computer grew, the chance of one of these connections failing rose significantly. Other components in the circuit, such as resistors, were made not of silicon but of carbon and other materials. All of this made circuit production an expensive and inefficient process.75 In 1958, Jack Kilby, a scientist at Texas Instruments, started making changes to these circuits that would eventually lead to the development of the ‘integrated circuit’, in which all the components were made out of silicon. But Kilby’s circuits were still connected with fine wires.

  At Fairchild Semiconductor, a method had recently been developed to package and to protect these components by using the silicon dioxide layer that naturally forms on the surface of silicon.76 Robert Noyce, a co-founder of Fairchild, described the process as being like ‘building a transistor inside a cocoon of silicon dioxide so that it never gets contaminated. It’s like setting up your jungle operating room. You put the patient inside a plastic bag and operate inside of that, and you don’t have all the flies of the jungle sitting on the wound.’77 Noyce began to think about what else could be done with the new process. He realised that the oxide layer could be used to simplify the production, and so could reduce the cost, of entire electronic circuits. The insulating properties of the layer enabled all the components of a circuit to be produced simultaneously on one piece of silicon. Instead of using wires, the components of the circuits could be connected by using a thin sheet of metal spread on top of the dioxide layer. Wherever a hole was punched through it an electrical connection would be made with the component underneath. Electrical connections could now be ‘printed’ on to a circuit, rather than made with fragile wires. He called this invention the ‘integrated circuit’, in which transistors, capacitors and resistors were printed and connected simultaneously on a piece of silicon.78 This dramatically increased reliability, so much so that NASA used integrated circuits on the early Apollo space missions. Production costs were also dramatically reduced. Bardeen believed that recognising the natural tendency of silicon to form a protective dioxide layer had led to an invention which was as ‘important as the wheel’.79

  Moore’s law

  Fairchild Semiconductor became the leader in the development and production of integrated circuits as a result of Noyce’s invention. The company grew rapidly; revenue was US $500,000 in 1958 but was over forty times that by 1960.80 Around it many more computer technology and software companies were created and their location came to be known as Silicon Valley.

  In 1965, Gordon Moore, one of the Fairchild ‘traitorous eight’, noticed a consistent trend in the way in which the price and size of silicon transistors fell; that trend has underpinned the extraordinary growth of Silicon Valley ever since. Moore’s eponymous law states that the number of electronic devices (such as transistors, resistors and capacitors) which can fit on to a computer chip will double every year.81 In 1965, Moore expected that this rate of increase would continue for at least ten years, so that by 1975 the number of components that could fit on to a computer chip would have grown from 60 to 60,000. To everyone’s surprise he was right. But the realisation of Moore’s law did not stop in 1975. The exponential rate of increase in computing power, and the consequential reduction in the cost of that power, has been going on ever since.82
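
  A quick back-of-the-envelope check makes the arithmetic concrete. The short Python sketch below is purely illustrative (the function name is my own, and the two-year variant reflects Moore’s later 1975 revision of the law, not the 1965 figure quoted above):

    # Moore's 1965 observation: components per chip double every year.
    def projected_components(start, years, doubling_period=1.0):
        """Project a component count forward under exponential doubling."""
        return start * 2 ** (years / doubling_period)

    # 1965 -> 1975 at an annual doubling: 60 components become 61,440,
    # in line with the 'from 60 to 60,000' quoted above.
    print(projected_components(60, 10))
    # Under the later two-year doubling, growth is far slower: ~1,920.
    print(projected_components(60, 10, doubling_period=2.0))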

  Today the most advanced microprocessors contain over two and a half billion transistors, the size of which has now shrunk to an incredibly small 22 nanometres, just nine times wider than a DNA chain. When combined with the overall growth of the computer industry, this leads to an extraordinary result: more than 10¹⁸ (a one with eighteen zeros following it) transistors were produced in 2011. This is more than the number of grains of rice grown across the world each year and more than the world’s yearly output of printed characters. It costs even less to produce a transistor than it does to print each letter in a book, newspaper or magazine. The process of miniaturisation, described by Moore’s law, produces faster and cheaper chips. And as chips became smaller and cheaper, they were used in more and more devices and were embedded into our daily lives. As Moore wrote in the article in which he first outlined his law: ‘the future of integrated circuits is the future of electronics itself’.83
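
  The rice comparison is easy to sanity-check. A rough calculation, assuming world production of about 700 million tonnes of rice a year and roughly 25 milligrams per grain (ballpark figures of mine, not from the text), supports the claim:

    # Rough sanity check of the rice comparison (assumed, rounded inputs).
    world_rice_kg = 7.0e8 * 1000     # ~700 million tonnes a year, in kilograms
    grain_mass_kg = 2.5e-5           # ~25 milligrams per grain
    grains_per_year = world_rice_kg / grain_mass_kg
    print(f"{grains_per_year:.1e}")  # ~2.8e16 grains, comfortably below 1e18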

  In 1968, Moore and Noyce were bought out of Fairchild and used the money to create their own company: Intel. I joined the board of Intel in 1997 at the suggestion of Mike Spence, the Dean of Stanford’s Graduate School of Business. I had been chairman of the school’s advisory board, having studied there, and I wanted to stay involved in California’s thriving business sector; I believed I could learn a lot at Intel. Before I joined the board, I met Andy Grove, Intel’s CEO, who had worked with Noyce and Moore at Fairchild. Grove remains one of the most impressive business thinkers I have ever met. He had the intellect and dynamism to execute successful strategic plans, again and again, in the fast-paced semiconductor industry. But Grove also deeply understood the science behind Intel’s products; he had written textbooks on semiconductor physics. With the veteran venture capitalist Arthur Rock as a member and the master engineer Gordon Moore as its chairman, the board was formidable; the company’s management was world class. Grove’s mantra was ‘only the paranoid survive’.84 He acted that way and he made the board and management follow his lead. In this fast-moving industry you had to be aware of what changes were on the horizon. More than that, you had to check that you were on top of, or ahead of, them. Grove called the most important of these changes ‘10x forces’ because the change ‘becomes an order of magnitude larger than what that business is accustomed to’.85 The invention of the integrated circuit brought about one such change. More recently, the invention of the internet has brought about another.

  Silicon communication

  In the early 1990s, at the European Organization for Nuclear Research (CERN) near Geneva, Tim Berners-Lee, a computer scientist, was trying to find a way to help CERN’s thousands of scientists work together more effectively. Each of their experiments on particle collisions created vast quantities of data, but without a communication network to share information it was impossible to collaborate substantively. A lot of research had already been done on the theory and practical design of information-sharing networks by the US military. In the 1950s, tensions in the Cold War called for a decentralised communication network. If communication relied on a single line running from one point to another, it would be capable of fatal interruption. However, if that line formed part of a much larger interconnected network, messages between points could easily be re-routed along different paths, thereby providing diversified back-up.
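
  The redundancy argument can be seen in a toy model. The Python sketch below (an illustration of the principle, not of any historical military system) finds a route between two nodes in a small mesh, then shows that a route still exists after one link is cut:

    from collections import deque

    def find_path(graph, start, goal):
        """Breadth-first search: return a path of nodes, or None if unreachable."""
        queue = deque([[start]])
        seen = {start}
        while queue:
            path = queue.popleft()
            if path[-1] == goal:
                return path
            for nxt in graph[path[-1]]:
                if nxt not in seen:
                    seen.add(nxt)
                    queue.append(path + [nxt])
        return None

    # A small mesh: every node can be reached by more than one route.
    mesh = {'A': {'B', 'C'}, 'B': {'A', 'D'}, 'C': {'A', 'D'}, 'D': {'B', 'C'}}
    print(find_path(mesh, 'A', 'D'))   # e.g. ['A', 'B', 'D']

    # Cut the B-D link: the message simply re-routes via C.
    mesh['B'].discard('D'); mesh['D'].discard('B')
    print(find_path(mesh, 'A', 'D'))   # ['A', 'C', 'D']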

  Building on this work, Berners-Lee created a system to link the computers at CERN; this later became the World Wide Web. Those in academia and industry were the first to use it to link the work of many collaborators. They already recognised the importance of computers; the immense processing power of these silicon-based machines was used in complex tasks such as mapping oilfields and simulating climate models. But Berners-Lee’s invention also made possible a simple, easy-to-use way of establishing a global communication network, and its use soon spread to the general public. That network was the internet, whose popular birth coincided with the rapid increase in the number of people owning a personal computer. By July 1995, 6.6 million computers were connected; the following year that number had almost doubled. Shortly after I joined Intel’s board in 1997, Andy Grove announced that Intel should lead the way in the creation of a billion connected personal computers around the globe; at the time it was difficult to believe that it would happen.

  Today, the internet connects over two billion people, fulfilling and surpassing Grove’s vision. And the network even extends out into space, connecting astronauts on board the International Space Station to the Earth. The creation of the internet also added a new dimension to silicon, initially used just for computers but now also for communication infrastructure and devices. Berners-Lee’s innovation relied upon the silicon infrastructure laid down by Shockley, Moore and the other Silicon Valley entrepreneurs. It also depended on silicon in a very different form: optical fibres made of silica glass, developed in the 1970s and 1980s, which replaced copper for long-distance telecommunication lines. As a result, the capacity of these lines was expanded hugely, enabling the internet to transmit information across the world at close to the speed of light.86
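
  A short worked example shows what transmission at close to the speed of light means in practice. The route length and refractive index below are rounded assumptions of mine, not figures from the text:

    # One-way latency of light through a long silica fibre (assumed inputs).
    c = 3.0e8               # speed of light in a vacuum, metres per second
    n_silica = 1.45         # typical refractive index of silica glass
    distance_m = 1.0e7      # an assumed ~10,000 km transoceanic route

    latency_s = distance_m * n_silica / c
    print(f"{latency_s * 1000:.0f} ms one way")   # ~48 ms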

  As the backbone, the internet was a necessary means of meeting the increasing demand from people in business and everyday life to communicate with each other in real time. However, something else was needed to satisfy this demand fully: the interface between human and machine had to be made simpler and more pleasurable to use. Apple recognised this need, and meeting it has made the company’s name over the last couple of decades. In May 2012, I met Sir Jony Ive, Apple’s senior vice-president of industrial design, in the Silicon Valley town of Cupertino, the home of Apple.

  Form and function

  I met Jony Ive outside, in a sunlit courtyard. We sat down for coffee and he began to elaborate on the process of design. ‘At its core, my job is to think about the relationship between function and form,’ he explained, before pointing to the table in front of us: ‘Look at this cup. When we drink from it, we don’t think about it because we know exactly what to do with it. Its form is intrinsically tied to its function. But around the time of the Industrial Revolution, this began to change. Mechanised objects created a disjoint between function and form, to the point where, in the smartphone today, there are an extraordinarily large number of functions with no intrinsically associated form.’

  A smartphone is powered by electrons zipping across atomic layers of silicon, but its complexity is hidden away behind a shiny metal case and sleek graphical interfaces. This exterior is as critical as the technology inside; it ensures the smooth and flawless functioning that makes our interactions with computers as easy as those with a teacup. The first personal computers looked threatening and scared off many potential users. They looked like laboratory equipment, beige and black boxes, designed by scientists for scientists. That created a barrier between the user and the computer, a barrier that Jony has spent his career at Apple breaking down. Later in the day, he took me into the entrance, but no further, of his design studio, where he directs the small team that creates the designs of the future. Frosted glass windows keep prying eyes out of the hushed and calm environment. Here ideas are plentiful, but masterpieces, his standard, are few. Days, weeks, even months are spent designing, modelling and re-forming each and every button and curve to create objects that transcend utility; the aim is to create objects of desire. That desire and power have spread throughout the world, with truly revolutionary consequences.

  Social media revolutions

  In the middle of December 2010, Mohamed Bouazizi was selling fruit from a street stall in Sidi Bouzid, Tunisia, when he was confronted by two police officers. He did not have a licence, or the money to pay the usual bribe expected by the officers, and so his cart was confiscated. He tried to lodge a complaint at the local governor’s office, but they just laughed at him. Helpless and in despair, he returned with a can of petrol, doused himself with its contents and lit a match. The news of his death spread quickly and triggered a series of protests, many of which were organised through the use of social networking sites. The president fled the country shortly after.

  Twitter and Facebook had provided a platform through which Tunisia’s despairing youth could communicate their shared grievances and coordinate political action. Around the Arab world, similar extraordinary protests followed, with the eventual demise of rulers in Egypt, Libya and Yemen. This was not the first era in which silicon has been used as an enabler of political revolution. Transistor radios were used during the Cold War by Radio Free Europe to broadcast anti-communist propaganda across the Soviet Union. In an attempted coup d’état, a revolutionary group invariably tries to take control of the state radio stations, as, for example, Hugo Chávez did in the early 1990s. Whoever controlled these stations controlled the country.

  Mobile communications and the internet, with their ability to spread information very widely, allowed the revolutions of 2011 to build momentum far more quickly and with greater effect. Silicon enabled this to happen by providing the tools to have debates and discussions. That is a noble purpose, but these tools also enable surveillance, snooping and persecution. In Tunisia, the authorities used the internet to target and arrest prominent bloggers before the revolution. In China and Russia, much of the information accessible online is censored, and those using social media to voice opposition to the government are kept under surveillance, usually to sinister ends. Just as with the other elements considered here, silicon can make the good and the bad happen. And it can do so very rapidly, without geographical boundaries.

  Silicon society

  ‘What news on the Rialto?’ asks Shylock in Shakespeare’s The Merchant of Venice. During the Renaissance, the Rialto was the financial and commercial centre of Venice, and to find out what was going on there you had actually to go and see for yourself. Local communication was limited to a walking pace, while international communication occurred only as fast as a ship could sail. Centuries later, during the Industrial Revolution, humanity harnessed the energy in coal and oil to power steam trains and ships, and then cars and aeroplanes. By transporting people further and faster, carbon expanded our geographical horizons and our capacity for communication.

  But it is silicon that has enabled the most recent and dramatic step change in our power to communicate. Just as with carbon-based transportation, silicon has transformed our individual daily lives, giving us greater choice over who our ‘friends’ are and enabling us to keep in touch with a much wider social network. The power of silicon, though, extends far beyond that of carbon. Even today, only around 15 per cent of the world’s population has a car; far fewer have ever flown in an aeroplane.

  Silicon’s use in mobile phones is far more pervasive. These have dramatically changed the way societies are developing. Mobile phones have existed for a long time but, just as with the early computers, they were once expensive, bulky and power-hungry; the batteries were so large that they had to be placed in the boot of a car. Individuals can now access computing power that was once only available to universities and big business: a smartphone contains more computing power than the whole of NASA had when man landed on the Moon in 1969. By the 1990s, mobile phones had become so cheap that they were affordable to many in the developing world. By 2002, there were over one billion mobile phone subscriptions, a milestone which took fixed telephone lines 128 years to reach. Today, around 75 per cent of the world has access to a mobile phone. By connecting so many previously isolated voices, silicon has shifted the balance of power within society. One only needs to look to the growing influence of NGOs and internet-based lobby groups to see how political change in nations across the world is being catalysed by silicon.

 
