
Engines That Move Markets (2nd Ed)


by Alasdair Nairn


  The group commissioned by the NSF upgraded the network by replacing old lines with connections able to run at substantially higher speeds. The consequent higher functionality and specification of NSFNET led to it effectively replacing the ARPANET, which was finally decommissioned in 1990. Subsequently another network, the National Research and Education Network (NREN), was created, sitting initially on top of the NSFNET and funded by a bill sponsored by Al Gore to serve the needs of ‘lower’ education. The Internet had arrived.

  When the original protocols were being formulated in the early 1970s, the vision upon which they were based was one of linking large national networks. The expectation was based on the ARPANET model and hence only a small number of such networks was anticipated. Although technologies for linking local computers – the ethernet – were under development at Xerox at the time, the evolution of the Internet was not foreseen by those directly involved. As a consequence, the expectation that the number of linked networks would not exceed 256 determined the initial methodology employed to define addresses. This assumption did not hold for long, as both the ethernet and personal computers dramatically rearranged the working model. No longer was there simply a small number of large national networks: to these were added a growing number of regional networks and an exploding number of local area networks. A system that could only accommodate 256 networks was inadequate, and the growth required a new naming system – hence the domain name system (DNS). Early pioneers did foresee that the Internet would have commercial potential, but focused primarily on the knowledge- and data-sharing aspects and hence on how the infrastructure for such activity could be financed.
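The addressing constraint can be made concrete with a short sketch (illustrative only; the helper names are mine, not part of any historical implementation). An 8-bit network field can distinguish at most 256 networks, whereas a hierarchical naming scheme such as DNS grows by delegation rather than by widening a fixed field:

```python
# Illustrative sketch: a flat, fixed-width network number versus
# hierarchical naming. Function names here are hypothetical.
NETWORK_FIELD_BITS = 8  # the original assumption: 8 bits for the network

def max_networks(bits: int) -> int:
    """Number of distinct networks addressable with a fixed-width field."""
    return 2 ** bits

# The original scheme tops out at 256 networks.
print(max_networks(NETWORK_FIELD_BITS))  # 256

# A flat numbering space must be widened every time it fills up.
# DNS instead uses hierarchical names: each label in a name such as
# 'cs.stanford.edu' is delegated and managed independently.
def split_domain(name: str) -> list[str]:
    """Break a domain name into its delegated labels, most specific first."""
    return name.split(".")

print(split_domain("cs.stanford.edu"))  # ['cs', 'stanford', 'edu']
```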

  10.2 – The commercial potential of the Internet begins to surface

  Source: New York Times, 9 February 1990. Telecommunications, 1 June 1991.

  From academia to commercialisation

  In the early years, the networking of large computers had been largely restricted to arms of government and academia. It was funded and developed by the US government, largely through the military budget and for military purposes. It is unlikely that such projects would ever have been undertaken by the private sector, as networks involved very obvious costs but no obvious revenues of meaningful size. The costs included basic research and the cost of building physical networks and large-scale computing facilities. Even after the establishment of ARPA in the early 1960s, the cost of developing what is now the Internet continued to be largely borne by government out of public funds.

  Commercial interest for most of the period was confined to military functions and the provision of equipment, expertise and maintenance. Manufacturers of servers, cables and switching equipment witnessed growing demand as the network of computers expanded. However, the real growth was not to come until the network sped up and increasing numbers of local area networks (LANs) could be linked by the Internet to networks externally. The majority of early local networks were in academic institutions, with email the principal use. Gradually, however, businesses that would otherwise have been unable to fund such long-distance connections began to take advantage of the new medium. Of these, the most notable was Cisco Systems, which emerged from Stanford University in 1984, formed by a small group of academics seeking to capitalise on the improvements in computer system connectivity.

  At Stanford University, the husband-and-wife team of Sandy Lerner from the Graduate School of Business and Leonard Bosack of the Computer Science Department were so frustrated by the inability to communicate by electronic mail that they embarked on a course that would ultimately result in the creation of one of the world’s largest companies. They had met some years before when they both used a timesharing system in Stanford University’s computer science department. Later, as staff at the same institution, and despite both having access to computer networks, the couple could not communicate with each other because each department was connected to a different network. This was far from unique. In 1982 Stanford housed some 5,000 computers, the majority of which could not talk to each other directly.

  The only connection that could be made was through the ARPANET. This involved sending an email to the ARPANET for transmission before receiving it back through the IMP terminal and then on to the indicated recipient. This was an unduly clumsy method of communicating, when in theory a link between the local networks could complete the same task without going anywhere near the ARPANET. Stanford was not short of such sites, having developed many local area networks as a result of Xerox’s largesse in distributing ethernet equipment. The task facing Bosack and Lerner was somehow to link these local networks and allow them to communicate, while not interfering with the network’s jealously guarded operational independence.

  To achieve this, they enlisted some engineering help from colleagues and built an updated IMP, originally developed by BBN for the ARPANET. This ‘router’ assisted the email and information transfer between networks. Since email was the pre-eminent function, or ‘killer application’, on the Internet, the development of routers which speeded up the process was a vital cog in the future development of the Web. The efforts of the team including Bosack and Lerner were sufficiently successful for Stanford to adopt the system officially into the university network. Bosack and Lerner found that demand from other universities for the equipment grew sharply as word of mouth (or more accurately, email) disseminated their success. But when they tried to set up a commercial venture alongside their academic efforts, they found the university unwilling to sanction the use of either resources or office space. Bosack and Lerner left to set up their own operation, which they named ‘Cisco’.
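What a router does at its core – forwarding traffic between otherwise separate networks – can be sketched in a few lines. The network names and routing-table entries below are purely illustrative and bear no relation to Bosack and Lerner's actual design:

```python
# Toy sketch of a router's forwarding decision: look up the packet's
# destination network and pick the outgoing interface. All names are
# hypothetical, chosen to echo the Stanford anecdote above.
routing_table = {
    "net-business-school": "interface-0",
    "net-computer-science": "interface-1",
}

def forward(packet: dict) -> str:
    """Return the outgoing interface for a packet's destination network."""
    dest = packet["dest_network"]
    try:
        return routing_table[dest]
    except KeyError:
        raise ValueError(f"no route to network {dest!r}") from None

# An email from the business school to computer science no longer
# needs to leave the campus via the ARPANET:
print(forward({"dest_network": "net-computer-science"}))  # interface-1
```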

  Enter Cisco Systems

  The early development of Cisco was funded by the couple’s mortgage and credit cards, but such was the latent demand that it quickly established sales in excess of $250,000 per month. Their efforts to develop the business were supported by colleagues, some of whom threw in their lot with the operation, working incredibly long hours in cramped conditions to try and meet demand. In 1986, the operation moved to its own premises, but remained a relatively amateurish business. Advertising took place via email and word of mouth. The business was constrained by lack of capital and management. As a consequence, the couple decided to try and raise additional funds. Despite their early commercial success they did not find a receptive audience. The venture capital world remained more interested in the PC industry. As a consequence, they were rejected by the first 75 venture capital companies they approached. Eventually, after many failed presentations, Sequoia Capital agreed to provide $3m in return for a one-third equity stake. Sequoia also installed John Morgridge as CEO.

  The timing of this investment could not have been better, with product demand entering an explosive upward curve. With a readily available product and no meaningful competitors, the company prospered immediately. Growth was not the only explosive element, though: there were violent personality clashes between the founders and the new management team. One thing all parties did agree on was that it would be an ideal time to take the company public. Cisco came to the market in 1990. The business continued to prosper and expand, although the original founders were ousted soon after, following an irreconcilable breakdown in relations between staff and Sandy Lerner. Len Bosack also decided to leave, and the couple sold their two-thirds stake in Cisco for $170m.

  Although the initial reaction to the IPO was lukewarm, Cisco never looked back, thanks to its dominant position as a provider of intranet infrastructure. If Cisco had been in the railroad business, it would have been as the provider, installer and maintainer of rails, signalling equipment and timetables. As such, it benefited directly from the growth in networks, both as a supplier of physical items such as routers and as the owner of the source code of its Internetwork Operating System (IOS). This was the essential software to ensure compatibility between products, and Cisco had learned from the example of Microsoft how important it was for this code to become the industry standard. In addition, Cisco repeatedly acquired companies that appeared to be developing complementary technologies, and attempted to add these new business streams to its operations. These acquisitions were made substantially easier by a stock market that had gradually woken up to Cisco’s prospects and was ready to give the company a higher valuation, consistent with accelerating expectations of Internet growth. The ‘bubble’ element built into the company’s valuation at its peak was only deflated when the market’s TMT bubble burst in 2000.

  Cisco Systems

  10.3 – Cisco: a slowing growth stock

  Source: Thomson Reuters Datastream. Cisco annual reports.

  Cisco’s financial history is reminiscent of that of many other technology companies in times past and follows a familiar pattern. In the early stages, when the company has a technological lead, or meaningful patent protection, sales and profits increase at a rapid pace. The return on equity and assets rises accordingly. At some point, however, the company finds that, as market penetration increases, or as its technological lead is reduced, pricing assumes increasing importance. Sales growth slows and margins flatten. In such a scenario, the only ways to maintain previous rates of profit growth are to decrease the equity base or use increased debt to fund additional growth through acquisitions. When one has the market share commanded by Cisco, the room for manoeuvre is relatively narrow and strategy tends to veer towards warding off threats.

  Given the unavoidable constraints presented by its size, the company has shown itself to be a proven survivor and an impressively managed business. Return on capital has inevitably declined but earnings per share have continued to grow thanks to buybacks and acquisitions. The company has clearly become more vulnerable to the ups and downs of the economic cycle. Nevertheless, Cisco has largely managed to maintain its margins at a remarkably high level for such a mature business. Market maturity may not be the most significant challenge it faces. If new entrants are successful in breaking into the software element of Cisco’s business then this would expose the company to commoditisation of its hardware. Industry shifts to cloud services are concentrating the customer base, changing the balance of power and making Cisco vulnerable to a new, tougher pricing environment. It is hard to argue that the market is unaware of these challenges and what really stands out from its share price history is the valuation aberration of the TMT bubble period.

  Towards an electronic post office

  Early visionaries such as Bush, Engelbart and Nelson had all seen the potential in improved access to stored information. This potential was not simply the timeshare value of allowing many more individuals access to electronic libraries, but also the ability to move away from historical hierarchical referencing methods which until then had been the only practical method of cataloguing large sets of information. The creation of a network of networks had made the physical linking of information repositories a reality, but there remained no referencing system to allow these physical links to be exploited. As a result, the Internet in its early days remained largely the preserve of the users for whom it was originally constructed.

  In the 1970s, before email, the Internet had been the preserve of specialists. Sending and receiving information had required the ability to compile, transmit and decode information. Email greatly simplified the transmission of information and led to a rapid increase in the use of the Internet. Once information could be easily transmitted, what was then needed was the ability for it to be easily stored and accessed. Visionaries such as Ted Nelson sought to create such a system (in Nelson’s case, the long-running effort mentioned earlier, named Xanadu) but by the late 1980s nothing had emerged. This was to change in the 1990s as the practical demands of science, the increasing availability of local computing power and the availability of the Internet combined to produce what is now known as the ‘World Wide Web’.

  In the early 1980s Tim Berners-Lee was a consulting software engineer at CERN, the European particle-physics laboratory in Geneva. One of the problems he faced in his work was an almost impenetrable jungle of information. Projects at CERN typically involved many different individuals and groups of individuals and were frequently related to other projects either past or contemporaneous. In order to track the interdependencies, Berners-Lee wrote a program named Enquire. A page in Enquire contained information on a particular person/subject/object and represented a node which could only be created by linking it to another node. Each page contained a footnote of references and the relationship to other nodes.

  This system was fully constructed, though Berners-Lee left it behind when he departed CERN at the end of his contract period. In September 1984 he returned to CERN having obtained a fellowship specialising in the acquisition and control of data. He began by trying to recreate Enquire, but found that a departure was necessary if his program was to be able to access external information – information that had not necessarily been stored according to a centrally defined hierarchical classification system. Previous attempts at creating generalised storage and retrieval systems had all foundered on this rock. The problem was that in order for any system to work, all users were forced to conform to its rules and hence necessarily change some of their working methods.

  The solution had been articulated in theory many times, mainly in the United States, but never achieved. The Internet had not seen the same take-up in Europe as it had in America, for various reasons. What Berners-Lee saw was a network which already had a standardised set of protocols for packet switching (TCP/IP) and a system which was inclusive in that it allowed both VAX- and Unix-based users to have access. Berners-Lee submitted at least two proposals to CERN for the creation of a non-hierarchical hypertext-based system. Finally, after seeing his proposal twice put to one side, he embarked on the project on his own. The name he gave to the system under development was the World Wide Web.

  His challenge then was to persuade the scientific community to appreciate the merits of his proposed system. It was hard going. Different groups of users found it difficult to see their areas of specialisation in a wider context, and Berners-Lee found himself having to develop the necessary tools himself to demonstrate the power of his proposal. Using a NeXT computer – newly arrived on the market courtesy of Steve Jobs – he was able to exploit its capabilities to create a program for building, browsing and editing hypertext pages. This involved writing a protocol for transferring hypertext: the hypertext transfer protocol (HTTP). This effectively allowed computers to talk to each other over the Web using an addressing system he also developed, the universal resource identifier (URI). The next step was the evolution of a language (hypertext markup language, or HTML) which allowed the creation and formatting of pages with hypertext. Accessing information required a browser that could decode the URI addresses and provide editing facilities for Web pages. This constituted the Web client.
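The division of labour Berners-Lee arrived at – a URI to name the resource, HTTP to carry the request, HTML in the response – can be illustrated with a minimal sketch that composes an HTTP GET request. This is a simplified HTTP/1.0-style request built with Python's standard library, not the original NeXT code:

```python
# Sketch: turn a URI into the text of an HTTP request. The URL below is
# the address of the first website, at CERN; the function name is mine.
from urllib.parse import urlsplit

def build_get_request(url: str) -> str:
    """Compose a minimal HTTP/1.0 GET request for the given URL."""
    parts = urlsplit(url)
    path = parts.path or "/"  # an empty path means the site root
    return f"GET {path} HTTP/1.0\r\nHost: {parts.hostname}\r\n\r\n"

request = build_get_request("http://info.cern.ch/hypertext/WWW/TheProject.html")
print(request)
```

Sent over a plain socket to the server, a request like this would come back as HTML for the browser to render.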

  The challenge of access

  The next step was to create the Web server, that is, the software to facilitate retention of pages and access to them. By Christmas 1990 a prototype version was up and running at CERN, but to generalise the system Berners-Lee needed to establish a set of standards, just as the Internet had to establish the TCP/IP standards before it could function. Achieving this end required that he maintain and increase his efforts as an evangelist for the World Wide Web and the protocols which supported it, in the face of scepticism and many academic and commercial rivalries.
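A Web server of the kind described – software that retains pages and serves them on request – can be sketched with Python's standard library. This is obviously not the CERN prototype, merely a minimal modern illustration of the same idea:

```python
# Minimal illustrative Web server: hold one hypertext page in memory
# and return it for every GET request. The page content is invented.
from http.server import BaseHTTPRequestHandler, HTTPServer

PAGE = b"<html><body><h1>World Wide Web</h1><p>A prototype page.</p></body></html>"

class HypertextHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        # Serve the same stored page regardless of the requested path.
        self.send_response(200)
        self.send_header("Content-Type", "text/html")
        self.send_header("Content-Length", str(len(PAGE)))
        self.end_headers()
        self.wfile.write(PAGE)

if __name__ == "__main__":
    # Bind to an arbitrary free port on localhost; serving is left to
    # the caller (server.serve_forever() would block here).
    server = HTTPServer(("127.0.0.1", 0), HypertextHandler)
    print(f"Ready on port {server.server_address[1]}")
```

Any HTTP-speaking client – a browser decoding a URI into a host, port and path – could then fetch and render the page.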

  Unlike the Internet, where the funders could specify a protocol, the World Wide Web required others to adopt its conventions. Its non-invasive nature was a strong selling point, since it did not require users to make changes to their own systems in order to use it. Against it was the lack of developed tools and of a critical mass of established users. Berners-Lee described the process as akin to pushing a bobsleigh – an enormous initial effort until the sleigh picks up speed and gains a momentum of its own.⁹⁷ The next two years were spent frantically encouraging and cajoling others to develop browsers and create a common set of standards, beginning with the URL definitional structure. Without browsers, users could not access information efficiently, and without a common set of standards there would be no information to access. It was a classic chicken-and-egg situation again. Few wished to spend the time developing a browser with more than local area capabilities purely in the hope that the Internet would one day become more widely accessible. Equally, the move to agree common standards lacked the urgency of a widespread perceived practical need. There was no single or dominant funding body to force a resolution, as had happened with TCP/IP.

  Browsers began to emerge in a variety of academic settings, most frequently to assist in accessing information from an institution’s network, but also as standalone student projects. As traffic on the Web grew, browsers gradually began to be disseminated among users. The early browsers represented a great advance on what had gone before, but often involved a lot of effort to install, use and adapt. This is not surprising: they were not specifically designed for Internet use, and the enormous commercial potential that later unfolded was not widely foreseen. One exception proved to be at the National Center for Supercomputing Applications at the University of Illinois, Urbana-Champaign, where a group including Marc Andreessen and Eric Bina was concentrating on the development of a browser named Mosaic. This work differed from many other browser developments in that it focused specifically on client needs. The Mosaic browser was one of the first to be developed as an easy-to-use tool, to the great benefit of the Web’s development.

 
