Engines That Move Markets (2nd Ed)


by Alasdair Nairn


  Bush argued that America could no longer rely on Europe for fundamental research; that the government had to coordinate and sponsor research activities but allow private sector and academic involvement; that future technological superiority should be attained not by restriction of information but by its dissemination; and that a new body funded by Congress and named the National Research Foundation should be established. This foundation would be controlled by civilians and support basic university research in medicine and the natural sciences, as well as weapons research for the armed forces.

  This proved a visionary report. It not only helped create a body that would play an important role in future technological development, but also provided a powerful impetus to the dissemination of research conducted during the war, encouraging potential use by the private sector. This commercial deployment of wartime research distinguished America from other developed nations. The concept of private/public partnership and central funding for research was to underpin the development of the new information age, although implementing Bush’s vision was far from a smooth and orderly process. The 1947 bill setting up a National Research Foundation was vetoed by President Truman. The National Science Foundation that eventually emerged in 1950 differed somewhat from Bush’s original conception, and Bush himself opposed it. It departed from his ideas in that research was not coordinated by the new foundation as a single body, nor was it involved in military-related research. The NSF also had a limited remit and funds.

  The main research-funding bodies of the time were the Office of Naval Research and the Atomic Energy Commission. These bodies grew in importance with the Korean War, but it was the intensification of the Cold War that provided the real impetus for funding military computing applications. While the promotion of a national science-funding body was one of Bush’s enduring contributions, the conditions of the time ensured that the military maintained control for the foreseeable future. Bush wrote a visionary article in July 1945 in Atlantic Monthly entitled ‘As We May Think’. Much of this was devoted to advances in photography and the role this might play in the storage and retrieval of information. A further portion dealt with retailing and point-of-sale information-gathering. Interesting as these discussions were, they simply extrapolated existing scientific knowledge to potential applications, and in any event were soon made redundant by the advance of technology.

  The final and most prophetic part of the article, however, described a machine called the ‘memex’, which would allow information to be stored in a manner that better mimicked human needs. Rather than a hierarchical indexing system, Bush discussed a new system in which stored data or text was cross-referenced by association. This is the way, he pointed out, that the human mind operates. “With one item in its grasp, it snaps instantly to the next that is suggested by the association of thoughts, in accordance with an intricate web of trails. Wholly new forms of encyclopedias will appear, ready-made, with a mesh of associative trails running through them, ready to be dropped into the memex and there amplified. The lawyer has at his touch the associated opinions and decisions of his whole experience, and the experience of friends and authorities. The patent attorney has on call the millions of issued patents, with familiar trails to every point of his client’s interest…There is a new profession of trail blazers, those who find delight in the task of establishing trails which lead him all over civilisation at a particular epoch.”⁹⁵

  Just as the NSF took many years to appear, and did so in a somewhat different form from Bush’s original conception, so too the functionality associated with the memex took decades to emerge. It did eventually arrive, at least in part, in the form of the Internet and the World Wide Web, the 21st century’s encyclopedia of choice. Its appearance, though, proved largely a by-product of network development, and owed more to the military imperatives of the intervening period than to any direct attempt to bring Bush’s vision to reality.

  Timeshare computing: means to an end

  The vision of timeshare computing which ultimately led to the development of the Internet had been driven in part by the economics of computing. The high cost of a mainframe meant that, for most users, access to processing facilities was economic only through shared use. The first solution to this was access through timeshare facilities. The development of the computer industry quickly made this conclusion questionable, as the emergence of minicomputers and the microprocessor made processing power readily available to larger groups of users. Timeshare, though, was only one reason for linking up mainframe computers. Just as military needs during World War II had accelerated the development of the mainframe, so the perceived threats of the Cold War channelled funds into the development of networking.

  In retrospect, the psychological impact of the successful launch of the Sputnik satellite in 1957 probably far exceeded what the real state of Russian scientific capability warranted. At the time, though, the impact was immediate, and the US government reacted quickly to the perceived threat with the creation of a body charged specifically with the task of ensuring American scientific superiority. The body was titled the Advanced Research Projects Agency, or ARPA for short. Although its initial focus was on space research, within a few years these objectives had been transferred to NASA and ARPA’s main focus became defence. Its funding was to play a vital role in the development of networking technology.

  For ARPA, two main technology objectives were quickly established. First, computers had to talk to each other so as to share information. Second, the links between computers had to be robust so that cutting one link would not disable the entire system. In 1962, ARPA created a new department, the Information Processing Techniques Office (IPTO), charged with researching the issues of network command and control. Joseph Licklider, a behavioural psychologist from MIT, was appointed its first director. This was to prove a significant appointment as Licklider had been heavily involved with research to improve the functionality of computers, including the publication of a seminal work, ‘Man-Computer Symbiosis’, in 1960. He foresaw the development of graphical user interfaces to make computers easier to use and also the need for tools other than the keyboard to make navigation and other tasks easier – in effect, what is now the ‘mouse’. Licklider gradually moved the emphasis of IPTO away from command-and-control technology towards graphics, common languages, recognition conventions and timesharing. Although Licklider was director of IPTO for a relatively short period, his ideas continued to influence the future direction of programs funded under his successors at ARPA. Early in his tenure, Licklider had written a memo to ‘Members and Affiliates of the Intergalactic Computer Network’ that discussed the linking of multiple timeshare computer sites, and the need for common conventions if this was to be achieved. In 1963, the first practical expression was given to this when UCLA and UC Berkeley were commissioned to conduct research on network creation.

  Nurtured by the military

  Funding from ARPA produced a consortium of academic institutions that would work together to build a computing network. The embryonic ARPANET was some distance from the Internet of today. Licklider’s communication problem still had to be solved. Fundamental to this was the mental leap required to split timesharing concepts from those of networking. The problem with timesharing was that it caused users to be very protective of their allocation of time on the computer, and naturally resistant to anything that might encroach upon it, such as new networking languages and protocols. The solution was to create a network of linked minicomputers which would handle all these tasks and tie the individual mainframes to these interface machines.

  Each institution therefore had to work out how to communicate with its minicomputer, known as an interface message processor (IMP). The solution was an elegant one, which split timesharing and networking, allowing users to network with increased computing power. By the late 1960s, building upon work sponsored by the Department of Defense at the Rand Corporation, and concurrent work at the National Physical Laboratory in the UK, it was possible to use ‘packet switching’ technology to move information around a network without having to rely on a single unbroken series of computer links. In 1968, the Boston-based consultancy Bolt, Beranek and Newman (BBN) won the contract to develop interface message processors. BBN had been heavily involved in US government defence work in the past and also had a historic relationship with Licklider. Less than ten months after winning the contract, BBN installed the first IMP at UCLA and the second at the Stanford Research Institute. Subsequent nodes, or access points, were added in 1969 and the four host institutions – UCLA, Stanford Research Institute, UC Santa Barbara and the University of Utah – linked their machines to create the ARPANET. This would be the forerunner of today’s Internet, although at the time few suspected the uses to which such networking would eventually be put.
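  The principle at the heart of packet switching can be sketched in a few lines of code. The toy model below (in Python, with all names invented for illustration rather than drawn from the ARPANET’s actual software) splits a message into small numbered packets, lets them arrive in any order – as they might if each took a different route through the network – and reassembles them at the destination. This is the property that freed the network from depending on any single unbroken chain of links.

```python
import random
from dataclasses import dataclass


@dataclass
class Packet:
    seq: int        # sequence number used to reassemble the message
    payload: str    # a small fragment of the original message


def packetise(message: str, size: int = 8) -> list[Packet]:
    """Split a message into fixed-size packets, each carrying a sequence number."""
    return [Packet(seq=n, payload=message[i:i + size])
            for n, i in enumerate(range(0, len(message), size))]


def send_over_network(packets: list[Packet]) -> list[Packet]:
    """Crude stand-in for a packet-switched network: each packet may take a
    different route, so packets can arrive out of order."""
    in_transit = packets[:]
    random.shuffle(in_transit)   # different routes, different arrival order
    return in_transit


def reassemble(received: list[Packet]) -> str:
    """The receiving host puts the packets back into sequence order."""
    return "".join(p.payload for p in sorted(received, key=lambda p: p.seq))


if __name__ == "__main__":
    message = "Packets may take any available route and still arrive intact."
    received = send_over_network(packetise(message))
    text = reassemble(received)
    assert text == message
    print(text)
```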

  The second key piece of work to emerge from ARPA was the research it sponsored under Doug Engelbart at Stanford. Engelbart had worked on the ARPANET project, as well as on a wide range of tools and techniques for information manipulation and transfer. Initially Engelbart had been funded by the Air Force Office of Scientific Research, and this resulted in a paper discussing interactive computing. In 1963 Engelbart published ‘A Conceptual Framework for the Augmentation of Man’s Intellect’, which focused on the need to design computer systems and tools to complement human capabilities. He set out an illustrative general system referred to as H-LAM/T (Human using Language, Artifacts, and Methodology in which he is Trained). This proposed system built on the vision of Vannevar Bush and pulled together the various strands of development in human-computer interfaces and the concepts of linking associated text laid out in the ‘memex’. Development of the concepts in this paper was funded by the ARPA IPTO under Licklider.

  The culmination of this work was a presentation in San Francisco in December 1968, later dubbed ‘the mother of all demos’. At the presentation, Engelbart used a mouse to navigate a screen projected so that the audience could take in the full detail of the mixture of text, graphics and video. It was effectively a multimedia display incorporating expandable and embedded text with links to additional documents, all of it shared between two users in different locations. In this one remarkable presentation, many of the tools that were to take decades to refine and become usable were shown to a stunned audience. The presentation demonstrated a research project known as the oN-Line System (NLS), which continued until 1977 when financial considerations caused Stanford to cut its funding.

  Engelbart moved to a company called Tymshare, but many of his staff were recruited by Xerox, which had set up a new research centre at Palo Alto, named the Palo Alto Research Center, or PARC. The technology and concepts that emerged from PARC were to find their way back into mainstream computing, but (as we saw in chapter 9) only via a circuitous route and through the opening of a completely new market segment, the personal computer. The development of the personal computer would take place separately from the Internet, although as prices fell and functionality increased, it became the equivalent of the telephone handset, connected over the same telephone cables but for purposes of data transmission.

  Engelbart was not alone in following up Bush’s vision. In the early 1960s a graduate sociology student at Harvard, Ted Nelson, began to recognise the latent power of computing for information storage and retrieval. By the mid-1960s Nelson had coined the term ‘hypertext’ to describe the linkages first outlined by Bush, although Nelson’s conception had moved some distance beyond that of either Bush or Engelbart. Nelson would pursue these ideas through Project Xanadu, a software development project, and texts such as Dream Machines (1974) and Literary Machines (1987). On each occasion, the Xanadu software was described as ‘forthcoming/imminent’ but, as with Babbage in chapter 8, it always seemed to be postponed or shelved in favour of new improvements and further developments. One should not minimise the importance of Nelson’s ideas, despite the fact that they failed to find full expression. He contributed to the evolution of the Internet just as Bush and Engelbart did. Products such as Apple’s HyperCard software, Lotus Notes and the Mosaic Internet graphical interface had their genesis in Nelson’s publications and presentations. Whether his original conception of hypertext as a two-way linkage will ever resurface, as Babbage’s ideas did, remains to be seen.
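  Nelson’s insistence on two-way links is easy to state concretely. The web’s hyperlinks are one-way: a page knows what it points to, but not what points to it. A minimal sketch, assuming nothing beyond Nelson’s general idea (the class and document names below are invented for illustration), shows how a store that records both directions of every link can answer the question the one-way web cannot.

```python
from collections import defaultdict


class HypertextStore:
    """Toy store of documents with two-way links, in the spirit of Nelson's
    conception rather than the web's one-way hyperlinks."""

    def __init__(self):
        self.links_from = defaultdict(set)   # document -> documents it cites
        self.links_to = defaultdict(set)     # document -> documents citing it

    def link(self, source: str, target: str) -> None:
        # Recording the link once makes it discoverable from either end.
        self.links_from[source].add(target)
        self.links_to[target].add(source)

    def citations(self, doc: str) -> set:
        """What does this document point to? (Any browser can answer this.)"""
        return self.links_from[doc]

    def backlinks(self, doc: str) -> set:
        """What points to this document? With one-way links this requires
        crawling everything; a two-way store answers it directly."""
        return self.links_to[doc]


store = HypertextStore()
store.link("memo-as-we-may-think", "patent-opinion-17")
store.link("trail-blazer-notes", "patent-opinion-17")
print(store.backlinks("patent-opinion-17"))
# e.g. {'memo-as-we-may-think', 'trail-blazer-notes'} (set order may vary)
```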

  Marketing the dream

  Innovative as the work of Engelbart and Nelson was, it remained of peripheral importance to funding bodies. Defence was the core function. The research on networking took place in a devolved framework, with different groups working on different segments of the project. Although the research was mainly conducted in a university environment, the sponsor remained the US Department of Defense. Over the next two to three years, other nodes were added to the system, but the lack of immediate uses looked likely to consign the work to a side role, a backup that would only be required in desperate defence conditions – what Licklider called the “rare occasion”.

  To combat this, a conference was organised to try to stimulate interest and ongoing support. The International Conference on Computer Communications was held in Washington in October 1972. Like all good marketing efforts it included a number of headline-catching demonstrations. It boasted, for example, a simulated psychotic patient at UCLA holding a conversation with a simulated doctor at BBN. It had remote games of chess and quizzes, but more importantly it involved the first use of a new application, which was later to prove a driving force in the development of the Internet. This was email, which had preceded the ARPANET but had then developed to assist interaction and coordination between users. The conference made public the existence of the ARPANET, the possibility of international communication and the evolving email functionality. That this functionality would give rise to the Internet of today was not recognised at the time. The reporting of the new network, though, was subdued to say the least. It appeared to be viewed as another scientific advance without much relevance to the wider world.

  10.1 – And the press reports the new network

  Source: New York Times, 15 April 1972.

  Fundamental to this evolution was the concept that the ARPANET would be a network for packet switching, based on a foundation of open architecture. In an open architecture, any individual network of computers can hook up to the wider network as long as it conforms to a particular set of standards – in the jargon, ‘a meta-level internetworking architecture’. This allows individual networks to be designed for their own specific purposes, rather than having to conform to a more restrictive structure. A key requirement of any open architecture system is that the ‘internetworking’ be robust. This resulted in a communications protocol named the Transmission Control Protocol/Internet Protocol (TCP/IP). A version of TCP/IP was written as a collaboration between Bob Kahn and Vint Cerf and presented in September 1973.
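  The practical consequence of that standard is that any two machines speaking TCP/IP can exchange data without caring what lies beneath. A minimal sketch in Python illustrates the point (a toy echo exchange over the loopback interface; the host, port and message are arbitrary choices for illustration, not details from the book): the operating system’s TCP/IP stack takes care of packetising, ordering and retransmission, while the application simply writes and reads bytes.

```python
import socket
import threading

HOST, PORT = "127.0.0.1", 9090   # loopback address and an arbitrary port
ready = threading.Event()        # signals that the server is listening


def echo_server() -> None:
    """Accept one TCP connection and echo back whatever it receives."""
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as srv:
        srv.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
        srv.bind((HOST, PORT))
        srv.listen(1)
        ready.set()
        conn, _addr = srv.accept()
        with conn:
            data = conn.recv(1024)   # TCP delivers these bytes reliably and in order
            conn.sendall(data)


if __name__ == "__main__":
    threading.Thread(target=echo_server, daemon=True).start()
    ready.wait()   # wait until the server is listening

    # The client speaks the same protocol, so it could equally be another machine.
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as cli:
        cli.connect((HOST, PORT))
        cli.sendall(b"any network that speaks TCP/IP can join in")
        print(cli.recv(1024).decode())
```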

  This work was extended with funding from DARPA (as ARPA had become). Between 1974 and 1979, four progressively refined versions were produced. The versions were tested and the fourth version was eventually accepted as the standard. The TCP/IP protocol was used to connect the ARPANET via terrestrial, radio and satellite links. The initial earth stations for the satellite connection were in the USA and the UK, but soon further stations were added in Germany, Norway and Italy. At the time, the personal computer industry remained in its infancy and the main connections were of necessity between large scientific complexes.

  The main users, though, remained the military. By 1979, 46 sites on the ARPANET were in the military/industrial sphere, as compared to 16 academic campus sites.⁹⁶ The development of the ARPANET increased the need for some form of coordination of protocols and the configuration of gateways (what we now know as ‘routers’). To this end, ARPA set up what later became the Internet Activities Board (IAB). With increasing participation by groups other than the military, the Department of Defense saw the need to separate its networking requirements from those of the other users. Thus, in the early 1980s, the Department of Defense’s own network was set up, but continued to use the TCP/IP standard and retained access to the ARPANET. The ARPANET gradually came to be known by its popular name, the Internet.

  In reality, the original network was taken out of service as its successors increased in importance. The importance of the Internet as a connection medium had not gone unnoticed, and the National Science Foundation in America had begun to provide funding support to allow increasing numbers of users to gain access to the ARPANET. Further support was forthcoming during the 1980s from the federal government, which set up the NSFNET to provide high-speed links between academics and its supercomputing centres, and between the centres themselves. The rationale for linking the five supercomputers was effectively the same as it had been for the original networking of mainframes. Supercomputers were very expensive, and there could only be a limited number of them. If widespread access was required, a network therefore had to be provided: timeshare again! The supercomputer links were augmented by links to regional networks, which prompted a wide range of educational institutions to connect to the NSFNET. In 1987, the NSF commissioned a group of companies, including IBM and MCI, to improve and manage the supercomputer network, which had begun to struggle as traffic growth threatened to overwhelm the system. The main cause of this was the enthusiastic adoption of email, the application that had been revealed at the Washington conference 15 years before.

 
