
Engines That Move Markets (2nd Ed)


by Alasdair Nairn


  8.5 (a), (b) and (c) – Launched with a bang: the IBM System 360

  Source: New York Times, 21 January 1966, 21 August 1969 and 8 April 1964.

  However, the stakes were dramatically raised in the early 1960s when IBM took the ambitious but necessary step of unifying its product line. Up to that point, compatibility between different models of IBM computer had been limited. Each machine had been designed as a stand-alone product of a certain size rather than as one element in a broad product range. As a consequence, customers who wished to upgrade their computer faced the prospect of substantial reprogramming and retraining, which some saw as the equivalent of switching suppliers. Not only was this risky in terms of customer retention, it also imposed huge extra costs on IBM, which had to devote separate resources to each machine in its range. After much debate, IBM resolved to make its entire range compatible. This was a massive undertaking, spurred in large part by the success of the Honeywell 200 against IBM’s prime model. In 1964 a massive press campaign was launched simultaneously across the major cities of America announcing the launch of the System 360.

  8.6 – Big Blue under fire: 13 years of regulatory scrutiny for IBM

  Source: Montage – sources in art itself.

  The System 360 was a resounding success, reinforcing IBM’s dominant market position. Indeed, so successful were the results of the compatibility program that IBM became the subject of repeated investigation by regulators looking for evidence of market abuses, monopoly behaviour and violation of antitrust legislation. The company was ultimately successful in its defences against most charges, but the onslaught undoubtedly had an impact on the corporate psyche. IBM became a corporation where the legal implications of any of its actions had to be carefully evaluated. This was not an ideal situation in a dynamic and growing industry, and it was likely to slow down an already huge corporation.

  Timesharing: an idea before its time

  The issue that continually faced users was one of cost. A mainframe was expensive to purchase and maintain. To make it efficient required multiple users. To extend the number of users required the ability to share the use of the machine: in other words, to timeshare. Initially timesharing took the form of batch processing of computer jobs. Users would type their programs on cards and submit them to a processing centre. The processing centre would then aggregate and allocate all the jobs to try to ensure efficient usage of the machine. This was efficient for the machine, but not for the user, who would discover only after an extended delay whether the program had run successfully. What was required was a system that allowed multiple users to work on the computer simultaneously without interfering with each other’s progress.
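  The contrast between batch turnaround and interactive timesharing can be made concrete with a small scheduling sketch. The Python snippet below is purely illustrative and not from the original text; the job names, lengths and time quantum are invented. It compares the time at which each user receives results under first-come-first-served batch processing and under round-robin time slicing.

```python
# Illustrative only: compares user turnaround under batch (FCFS) scheduling
# with round-robin timesharing.  Job lengths and the quantum are invented.

def fcfs_turnaround(jobs):
    """Batch processing: each job runs to completion before the next starts."""
    clock, finish = 0, {}
    for name, length in jobs:
        clock += length
        finish[name] = clock          # the user sees results only when the job ends
    return finish

def round_robin_turnaround(jobs, quantum=1):
    """Timesharing: the machine cycles through all users in short time slices."""
    remaining = dict(jobs)
    queue = [name for name, _ in jobs]
    clock, finish = 0, {}
    while queue:
        name = queue.pop(0)
        step = min(quantum, remaining[name])
        clock += step
        remaining[name] -= step
        if remaining[name] == 0:
            finish[name] = clock
        else:
            queue.append(name)        # unfinished jobs rejoin the queue
    return finish

jobs = [("long job", 60), ("short job A", 2), ("short job B", 3)]
print(fcfs_turnaround(jobs))          # short jobs wait behind the long one
print(round_robin_turnaround(jobs))   # short jobs return almost immediately
```

  Total machine time is identical in both cases; what timesharing changes is the responsiveness seen by each individual user, which is precisely what batch processing could not deliver.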

  The logic behind the development of timesharing was founded on the view that there were economies of scale in computing, summarised at the time by Grosch’s Law, which stated that the ‘power of the computer varied with the square of its price’. Even if the relationship was not a scientifically proven one, it reflected the position that an extra dollar spent brought more than an incremental dollar of computing power. If this was the case, then logic suggested an operating model similar to that used in power generation, where a centralised generator distributed power to end users. Out of this was born the computer utility framework. The commercial focus of the time was therefore on ‘utility computing’ and this was reflected in the financial markets. The potential market such computing networks might achieve was almost exactly the same as that later claimed for the Internet. Although the technology was ultimately misspecified, the spectre of timeshare computing makes fascinating reading many years on.
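  Grosch’s Law can be expressed as a simple back-of-the-envelope calculation. The short sketch below is illustrative only and not from the original text; the relative prices and the proportionality constant are invented. It shows why, if power really did rise with the square of price, the cost per unit of computing power fell as machines grew larger, favouring one big shared machine over many small ones.

```python
# Illustrative sketch of Grosch's Law: computing power ~ k * price**2.
# The prices and the constant k are invented for the example.

def grosch_power(price, k=1.0):
    """Computing power delivered by a machine costing `price`, per Grosch's Law."""
    return k * price ** 2

for price in (1, 2, 4, 8):                       # relative machine prices
    power = grosch_power(price)
    print(f"price {price}: power {power:.0f}, cost per unit {price / power:.3f}")

# One machine costing 8 delivers 64 units of power; eight machines costing 1
# each deliver only 8.  Hence the period's logic: buy the largest machine
# available, centralise it, and timeshare it among many users.
```

  On this arithmetic the centralised ‘computer utility’ made sense; as the following paragraphs note, it was the breakdown of this relationship, together with the cost of the software needed to run such a utility, that undermined the model.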

  8.7 – University Computing Company

  Source: Montage – sources in art itself.

  The grand vision of universal access to computing and global information systems proved irresistible for financial markets, whose appetite had already been whetted by genuine growth in the technology sector. A technological commodity available to all certainly looked potentially lucrative. Since economic conditions were also buoyant, capital was available to fund such ventures. Almost any company perceived as being a beneficiary of the move to timesharing found its share price soaring. One such company was University Computing Company (UCC), a computing timeshare business based in Dallas, Texas. Such was the excitement that its share price rose from $1.50 to over $155 before crashing back to earth. The problem for UCC and the other timeshare companies was that the cost of the complex software needed to make centralised computing a reality, together with the falling cost of processing power, fatally undermined the initial commercial premise of the business. When the concept so heavily anticipated by financial markets was shown to be a dud, retribution was equally abrupt and brutal.

  8.8 – Computer timeshare: an idea ahead of its time

  Share price of University Computing Company relative to US total market

  Source: CRSP, Center for Research in Security Prices, Graduate School of Business, University of Chicago, 2000. (Used with permission. All rights reserved. www.crsp.uchicago.edu.)

  The US military shared the commercial sector’s vision of timeshare computing. There were, however, complications. Primary among these was the threat posed to central communications by nuclear attack. A timeshare-based system which tied together geographically disparate facilities and which could continue functioning with part of the system disabled was preferable to a system with a nerve centre that could be neutralised by a single attack. There therefore seemed to be both commercial and military reasons for developing a sophisticated communication and timeshare network. The reason it took 20 to 30 years for the vision to be fully realised was that the commercial premise behind the timeshare concept was flawed. Grosch’s Law – that computer performance increases as the square of the cost – proved to have a short lifespan, given that, almost as it was being promulgated, technology was already advancing to allow the emergence first of the minicomputer and then the personal computer.

  Nevertheless, at the time the mainframe dominated, and centralised processing remained paramount. The concern over potential vulnerability to nuclear attack was fuelled by the Cold War and the apparent lead held by the USSR in rocket and space technology. It provided the impetus for the creation of an agency charged with furthering military and defence-related research. The Advanced Research Projects Agency (ARPA) was established in 1958, spurred by the shock of the Sputnik launch, and within three years had moved its space operations to NASA to concentrate primarily on defence. From that point forward ARPA funding would play a vital role in directing the cutting edge of technology. Where timesharing was concerned, the main goal was to create a single network. Seven years after its creation, ARPA began funding research to overcome the problems associated with the lack of interaction between different timeshare networks.

  By the mid-1960s work had begun to link different networks, work which was to produce the ARPANET, out of which would later evolve the Internet. In computing, though, it remained a mainframe-dominated world, a world which was to be rudely upset by the continued advance of technology. This advance had been occurring concurrently with the development of the mainframe, but its true commercial significance would take some years to emerge.

  From mainframes to minicomputers

  The split in the market and IBM’s dominance were assisted by the barriers to entry associated with providing customers with a complete solution from software to peripherals. The costs to new entrants of filling these needs were prohibitive. As a consequence, the entry point typically took the form of addressing the needs of sophisticated customers who did not require the ‘complete solution’. The two major client groups in this respect were the academic community and the military. Frequently new entrants emerged from an association of these two groups. For example, Ken Olsen, who formed Digital Equipment Corporation with Harlan Anderson, had been involved in Project Whirlwind at MIT. DEC was formed in 1957 as a result of Olsen’s desire to move from theoretical demonstrations to practical applications. Funding of $70,000 was received from American Research and Development (ARD), one of the forerunners of the venture capital industry, which had been set up to exploit wartime technological advances. DEC could not compete head on with IBM and on ARD’s advice sought first to establish a beachhead in the sophisticated user segment of the market. The results would eventually be spectacular.

  DEC had begun by producing digital circuit boards for other manufacturers, but in 1960 it was able to put its first computer on the market. This was the Programmed Data Processor, or PDP–1. By focusing solely on the processor, and ignoring peripherals, DEC was able to sell the PDP–1 for $125,000 ($800,000), a fraction of the $1m or more that would be required for a mainframe. The downside was that such a computer could only be successful in a relatively small market segment. The big breakthrough for DEC was the PDP–8, the first of its computers to use integrated circuits. The PDP–8 was the answer to the bottleneck problem faced by all users of computer facilities. Up to the time of the PDP–8, users typically had to compete for access and time on an IBM mainframe. They were in the hands of the mainframe’s administrators and in competition with other users. To a substantial group of potential customers, therefore, the PDP–8 represented salvation.

  8.9 – Shrunken giant: IBM’s market share declines (1952–1996)

  Source: Montage – sources in art itself.

  Whether by design or chance, DEC filled the latent gap in the market and created the minicomputer segment. Customers included academia and medium-sized companies. It was not long before there was sufficient software to accommodate these users. Despite its overwhelming dominance of mainframes, and the rewards which flowed from that, IBM was caught flat-footed. Within a short space of time, DEC had grown to be the world’s number two producer of computers. The simplicity of what DEC had done soon attracted other competitors, but by this time DEC had gone public, raising $8.25m (just over $50m), and was looking to erect the same barriers to entry which had protected IBM from immediate assault. These included the development of peripherals, software and sales/customer support. The only problem was that customers had settled on the UNIX operating system developed by Bell Labs as their preferred choice. By this time the market had begun to segment. Large mainframes remained, but were now joined in the marketplace by minicomputers. Minicomputers were more than just slimmed-down versions of a mainframe at a cheaper price. They expanded the market and also increasingly encroached on the mainframe’s territory. What should have become increasingly evident was that the cost of processing power was on a sharply declining curve.

  Conclusions

  The emergence of computer technology differed from many other technologies in that it was largely funded by government. Certainly other industries had benefited from government injections of capital to accelerate development for the armed forces, but computing was fostered both in periods of peace and war. Governments required the ability to collect and analyse large volumes of data for purposes of both taxation and expenditure. In the military sphere, the field of ballistics had always been constrained by a lack of practical ability to fulfil the computational requirements. The demand for computational power from both business and government grew rapidly. Thus while the theoretical work of Babbage never made it to a practical commercial form, it was not long before others pursued the same route.

  The Babbage example shows that governments, just like other customers, are subject to the whims of opinion and perception. When war had to be fought, necessity became the mother of invention. During peacetime the development process proceeded at a more sedate pace. In the US, Hollerith won a major contract with the Census Bureau, and other companies such as American Arithmometer and National Cash Register began to emerge with computational products for the commercial sector. The lead enjoyed by these companies, together with some of their business practices, was to bring them into conflict with government authorities concerned with monopoly practices and the absence of sufficient competition. Although the business segments they occupied remained profitable and growing even through the difficult economic conditions of the early 1900s, their stock market performance remained relatively immune to the excitement and hype which surrounded other industries such as the telephone and the radio. Perhaps the reason lay in the scale of their business and the capital required to make entry plausible. The relatively well-defined demand profile also contributed, combined with the fact that the products by and large required precision-engineering processes which carried little mystique: success or failure was readily demonstrable.

  The industry continued to grow steadily through to World War II, when effectively a quantum leap took place and analogue machines were superseded by their new numerical counterparts. The war removed constraints on research funding and expenditure, while at the same time bringing together the foremost scientists of the day. Research took place not only in universities and government research facilities but also within existing large corporations. The net result was that by the end of the war the underpinning provided by basic research had advanced, but more importantly the emphasis had shifted to the production of machines which provided practical assistance. After the war, the mainframe sought a commercial peacetime home.

  The early phase of development therefore did not require continuous propaganda to maintain the confidence of private sector investing bodies, first by proving the theoretical capabilities and second by demonstrating some form of commercial viability. It was not that these tasks did not have to be undertaken, simply that the sponsoring body was by and large the government, and the funding took place under a special set of wartime conditions. The creation of a commercial machine from the foundations of the wartime effort did not happen overnight, nor did the financial markets or the press initially seem to realise the significance of what was happening. First, in the early stages the inventors had been courting the government rather than the private sector. Second, it took some time for the technology to move to a point where costs were falling while processing power was growing. Only when valves had been replaced by transistors was the trend established, and it later accelerated with the introduction of semiconductors. Suddenly the size of the market was no longer limited by the level of capital expenditure required, but driven by the ability of producers to bring the cost of the machines down to within reach of a correspondingly expanding customer base. The impact of this expansion shows up clearly in the results of IBM, the giant of the industry.

  For IBM, so lucrative was the expansion that the implications of the trend that caused it were largely ignored. As a consequence a niche was left open to be quickly and profitably filled by companies such as DEC. Ironically, even this example of a new business being created by producing smaller yet powerful machines did not immediately establish a trend. This was only to come with the advent of the personal computer. Equally, the relatively apathetic reaction to the introduction of the technology only changed when its growth prospects became clearer. When this happened, the stock market reaction was overwhelming – almost as if a dam had burst – and share prices moved to a position well in excess of that justified by the returns being earned or what could plausibly be expected.

  The 20 years after the war witnessed the emergence of a brand new industry. Cumbersome computation machines had been transformed into computers. Valves were replaced by transistors and subsequently semiconductors. New companies were formed to develop the technology further. There were competitors in the existing marketplace for mainframes – a market which remained dominated by IBM. There was a new marketplace for smaller machines, which had been exploited by DEC. Finally, there was a range of companies involved in supplying components, peripherals and software. Most notable among these was Intel, which had emerged, like the majority of the semiconductor specialists, from Fairchild Semiconductor.

  Compatibility was the key to the growth of competition. New, small-scale producers could only hope to enter the market if they could access the existing knowledge and product base. To do this they had to offer compatible products, otherwise customers would be unwilling to absorb the cost of switching vendor. This placed great power in the hands of the dominant supplier, which effectively set the standards for the industry. The false sense of security this created may have caused these companies to miss new opportunities and lag behind the evolution of the industry. The other ironic impact of compatibility was that the largest companies no longer controlled the pace of development in the industry, particularly as the design and production of microprocessors became the province of specialist companies.

 
