The virtual memory concept as originally proposed by Kilburn and colleagues is yet another prime example of the creation of a subparadigm within the larger stored-program computing paradigm. Clearly, the “memory management problem” inhered in the overall paradigm. In this sense, the memory management problem was a “normal (computer) science” problem, in Thomas Kuhn’s sense. However, the solution itself—virtual memory—was of such consequence that it created its own subparadigm; it spawned a whole range of problems—theoretical, experimental, design related, architectural, algorithmic—that needed to be solved for virtual memory to be realizable, to the extent that virtual memory became a fertile research arena.63 Among the problems to which the virtual memory concept gave rise, and that were investigated by computer scientists during the 1960s, were the design of algorithms for transferring pages/segments from virtual address space into main memory, algorithms for selecting pages/segments in main memory for replacement by “incoming” pages/segments, analysis of the performance of paging and segmentation schemes, and analysis of the performance of secondary memory that implemented virtual memory. Most interestingly, the virtual memory subparadigm unveiled new phenomena that became problems in their own right. One was the peculiar nature of “program behavior” in virtual memory environments.64
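The replacement problem named above can be made concrete with a minimal sketch of one classic policy, least recently used (LRU), written here in modern Python purely as an illustration; the function name and the trace are not drawn from the period literature.

```python
from collections import OrderedDict

def simulate_lru(reference_string, frame_count):
    """Count page faults under LRU replacement (illustrative sketch)."""
    frames = OrderedDict()  # pages resident in main memory, oldest first
    faults = 0
    for page in reference_string:
        if page in frames:
            frames.move_to_end(page)        # hit: mark page most recently used
        else:
            faults += 1                     # fault: page must be brought in
            if len(frames) >= frame_count:
                frames.popitem(last=False)  # evict the least recently used page
            frames[page] = None
    return faults
```

Running such a simulation over a reference string of page numbers, for varying numbers of main-memory frames, is exactly the kind of performance analysis of paging schemes the subparadigm posed.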
XIII
Operating systems, as earlier noted, came of age during the 1960s. They were also, by far, the largest and most complex software artifacts built. People were not so much speaking of writing operating systems as of building them. And building large software systems became an engineering enterprise. In October 1968, a conference sponsored by the North Atlantic Treaty Organization (NATO) was held in Garmisch, Germany, on software engineering; a year later, another NATO-sponsored conference on software engineering was held in Rome, Italy.65
The IBM OS/360, released in 1966,66 an operating system for the System/360, required an estimated 5,000 man-years between 1963 and 1966 for its design, implementation, and documentation.67 However, we get a more complete sense of the functional range and complexity of the most ambitious and innovative operating systems of this period by considering the Multics system.
In its mature state, Multics, built under the leadership of Fernando J. Corbató (1926–) at MIT, consisted of some 1500 program modules for a total of approximately one million lines of machine code.68 Its structure was a direct consequence of its overall objective: to create a general computer utility analogous to electric power and telephone utilities that would run continuously and reliably, and to provide a comprehensive range of services to a population of users interacting with it through remote terminal access. The designers refined this all-encompassing objective into a collection of more specific capabilities, including time-sharing facilities, virtual memory, protection of users’ programs from unauthorized access, a sophisticated programming environment in which users could work, a number of programming languages at users’ disposal, an interuser communication facility (a forerunner of present-day e-mail), flexibility for enhancing the system in response to new technology and user expectations, and certain other maintenance, monitoring, and management capabilities.
Multics was thus conceived as an instance of what historian of technology Thomas Parke Hughes (1923–) would describe as a “technological system.”69 Multics had an elaborate phylogeny (see Chapter 7, Section VII). It drew upon (a) an operating system called CTSS (Compatible Time-Sharing System), also built at MIT, between 1960 and 1963, which was, in fact, the first operational time-sharing system70; (b) the combination of the two virtual memory schemes of paging and segmentation; (c) multiprogramming; and (d) a very sophisticated scheme for protecting a user’s programs and data in memory from unauthorized access by other programs.71
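The combination of paging and segmentation in (b) can be sketched as a two-step address translation: a segment number selects a per-segment page table, and the offset within the segment is split into a page number and a within-page offset. The following is a schematic modern reconstruction, not Multics code; the dictionaries merely stand in for hardware descriptor and page tables.

```python
def translate(seg_no, offset, segment_table, page_size):
    """Map a (segment, offset) virtual address to a physical address
    under combined segmentation and paging (schematic sketch)."""
    page_table = segment_table[seg_no]       # each segment carries its own page table
    page_no, page_offset = divmod(offset, page_size)
    frame = page_table[page_no]              # virtual page -> physical frame number
    return frame * page_size + page_offset
```

For example, with a 1,024-word page and segment 0 whose page 1 is mapped to physical frame 9, offset 1,030 into that segment lands at physical address 9 * 1024 + 6.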
The Multics designers drew on these earlier inventions, but they combined, expanded, and generalized them, and—with this synergy—created a significantly original but enormously complex artifact. Moreover, the project constituted a significant experiment in the use of a high-level programming language (PL/I, developed by IBM) to write a large operating system.72
XIV
There is a price to be paid for the kind of complexity that operating systems such as OS/360 and Multics manifested. How does a designer or programmer (or a programming team) deal with this kind of complexity? By the time of the 1968 NATO conference, there were rumblings of a “software crisis.” And, as Thomas Kuhn stated, a crisis in science leads to the possibility of revolt or even revolution, a paradigm shift.73 At the very least, it led to a serious challenge to the status quo. There were those, even before the NATO conferences, even before “software crisis” became an alarmed mantra, who were willing to lead a revolution. The arch-leader was a man from Holland.
NOTES
1. J. R. Rice & S. Rosen. (1994). History of the computer science department of Purdue University. In R. DeMillo & J. R. Rice (Eds.), Studies in computer science: In honor of Samuel D. Conte (pp. 45–72). New York: Plenum.
2. S. H. Lavington. (1998). A history of Manchester computers (p. 46). Swindon, UK: British Computer Society.
3. J. B. Dennis & D. P. Misunas. (1974). A preliminary architecture for a basic data flow processor. CSG memo 102. Cambridge, MA: Laboratory for Computer Science, Massachusetts Institute of Technology.
4. J. W. Backus. (1987). Can programs be liberated from the von Neumann style? A functional style and its algebra of programs (ACM Turing Award lecture for 1977). In Anon. ACM Turing Award lectures: The first twenty years 1966–1985 (pp. 63–130). Reading, MA: Addison-Wesley (original work published 1977).
5. D. A. Huffman. (1954). The synthesis of sequential switching circuits. Journal of the Franklin Institute, 257, 161–190; 275–303; G. H. Mealy. (1955). A method for the synthesis of sequential circuits. Bell System Technical Journal, 34, 1045–1079; S. C. Kleene. (1956). Representation of events in nervous nets and finite automata. In C. E. Shannon & E. F. Moore (Eds.), Automata studies (pp. 3–41). Princeton, NJ: Princeton University Press; E. F. Moore. (1956). Gedanken experiments on sequential machines. In Shannon & Moore (pp. 129–153), op cit.
6. See, for example, J. Hartmanis & R. E. Stearns. (1966). Algebraic structure theory of sequential machines. Englewood-Cliffs, NJ: Prentice-Hall.
7. See, for example, M. Minsky. (1967). Computation: Finite and infinite machines. Englewood-Cliffs, NJ: Prentice-Hall.
8. See, for example, Z. Kohavi. (1970). Switching and finite automata theory. New York: McGraw-Hill; G. G. Langdon, Jr. (1974). Logic design: A review of theory and practice. New York: Academic Press.
9. Shannon & Moore, op cit.
10. Minsky, op cit.
11. M. A. Arbib. (1969). Theories of abstract automata. Englewood-Cliffs, NJ: Prentice-Hall.
12. J. E. Hopcroft & J. D. Ullman. (1969). Formal languages and their relation to automata. Reading, MA: Addison-Wesley.
13. M. Davis. (1958). Computability and unsolvability. New York: McGraw-Hill; M. Davis. (Ed.). (1965). The undecidable. Hewlett, NY: Raven Press; H. Rogers. (1967). Theory of recursive functions and effective computability. New York: McGraw-Hill.
14. N. Chomsky. (1957). Syntactic structures (pp. 18–19). The Hague: Mouton.
15. N. Chomsky. (1959). On certain formal properties of grammars. Information and Control, 2, 137–167.
16. Hopcroft & Ullman, op cit.
17. R. W. Floyd. (1963). Syntactic analysis and operator precedence. Journal of the ACM, 10, 316–333; D. E. Knuth. (1965). On the translation of languages from left to right. Information & Control, 8, 607–639.
18. J. Earley. (1968). An efficient context-free parsing algorithm. PhD dissertation, Carnegie-Mellon University; D. H. Younger. (1967). Recognition and parsing of context-free languages in time n³. Information & Control, 10, 189–208.
19. H. D. Huskey & W. H. Wattenburg. (1961). A basic compiler for algebraic expressions. Communications of the ACM, 4, 3–9; B. Randell & L. J. Russell. (1964). Algol 60 implementation. New York: Academic Press.
20. F. E. Allen. (1969). Program optimization. In Annual Review in Automatic Programming (Vol. 5). Oxford: Pergamon Press.
21. Randell & Russell, op cit.
22. P. Z. Ingermann. (1960). A syntax-oriented translator. New York: Academic Press; J. A. N. Lee. (1967). Anatomy of a compiler. New York: Reinhold; F. R. A. Hopgood. (1969). Compiling techniques. London: MacDonald.
23. R. W. Floyd. (1964). The syntax of programming languages: A survey. IEEE Transactions on Electronic Computers, EC-13, 346–353.
24. R. A. Brooker, I. R. MacCallum, D. Morris, & J. S. Rohl. (1963). The compiler-compiler. In Annual Review in Automatic Programming (Vol. 3). Oxford: Pergamon Press; R. A. Brooker, D. Morris, & J. S. Rohl. (1967). Experience with the compiler-compiler. Computer Journal, 9, 345–349.
25. W. M. McKeeman, J. J. Horning, & D. B. Wortman. (1970). A compiler generator. Englewood-Cliffs, NJ: Prentice-Hall.
26. T. S. Kuhn. (1970). The structure of scientific revolutions (2nd ed.). Chicago, IL: University of Chicago Press.
27. G. M. Amdahl, G. A. Blaauw, & F. P. Brooks, Jr. (1964). Architecture of the IBM System/360. IBM Journal of Research & Development, 8, 87–101.
28. R. F. Rosin. (1969a). Contemporary concepts of microprogramming and emulation. Computing Surveys, 1, 197–212.
29. S. G. Tucker. (1965). Emulation of large systems. Communications of the ACM, 8, 753–761.
30. C. G. Bell & A. Newell. (1971). Computer structures: Readings and examples. New York: McGraw-Hill.
31. S. Dasgupta. (1989). Computer architecture: A modern synthesis (Vol. 1). New York: Wiley.
32. D. E. Knuth. (1968). The art of computer programming: Vol. 1. Fundamental algorithms. Reading, MA: Addison-Wesley.
33. D. E. Knuth. (1969). The art of computer programming: Vol. 2. Seminumerical algorithms. Reading, MA: Addison-Wesley.
34. The third volume of the series was: D. E. Knuth. (1973). The art of computer programming: Vol. 3. Sorting and searching. Reading, MA: Addison-Wesley—“Knuth Volume 3.”
35. For collections of his papers and essays on his different interests in computer science, see, for example, D. E. Knuth. (1992). Literate programming. Stanford, CA: Center for the Study of Language and Information; D. E. Knuth. (1996). Selected papers on computer science. Stanford, CA: Center for the Study of Language and Information.
36. Knuth, 1968, op cit., pp. 1–2.
37. Oxford English Dictionary (2nd ed.). Available: http://www.oed.com
38. Minsky, op cit., p. 106.
39. Knuth, 1968, op cit., pp. 8–9.
40. Knuth, 1968, op cit., pp. 4–6.
41. D. E. Knuth. (1966). Letter to the editor. Communications of the ACM, 9, 654.
42. Knuth, 1968, op cit., p. 7.
43. See, for example, A. V. Aho, J. E. Hopcroft, & J. D. Ullman. (1974). The design and analysis of computer algorithms (pp. 2–5). Reading, MA: Addison-Wesley.
44. J. Hartmanis & R. E. Stearns. (1965). On the computational complexity of algorithms. Transactions of the American Mathematical Society, 117, 285–306; P. M. Lewis, II, R. E. Stearns, & J. Hartmanis. (1965). Memory bounds for recognition of context-free and context-sensitive languages. Conference Record, IEEE 6th Annual Symposia on Switching Circuit Theory and Logic Design, 191–202.
45. Oxford English Dictionary (2nd ed). Available: http://www.oed.com
46. Ibid.
47. Ibid.
48. Ibid.
49. A. Padegs. (1964). The structure of System/360. Part IV: Channel design considerations. IBM Systems Journal, 3, 165–180.
50. R. F. Rosin. (1969b). Supervisory and monitor systems. Computing Surveys, 1, 37–54.
51. G. H. Mealy, B. I. Witt, & W. A. Clark. (1966). The functional structure of the OS/360. IBM Systems Journal, 5, 3–51.
52. Rosin, 1969b, op cit.
53. W. Y. Stevens. (1964). The structure of System/360. Part II: System implementation. IBM Systems Journal, 3, 136–143.
54. T. Kilburn, D. B. G. Edwards, M. J. Lanigan, & F. H. Sumner. (1962). One-level storage system. IRE Transactions on Electronic Computers, EC-11, 223–235. An earlier but much briefer paper on this same proposal was written by J. Fotheringham. (1961). Dynamic storage allocation in the Atlas computer, including the automatic use of a backing store. Communications of the ACM, 4, 435–436.
55. P. J. Denning. (1970). Virtual memory. Computing Surveys, 2, 153–190.
56. Lavington, op cit.
57. Ibid., p. 41.
58. Ibid.
59. J. K. Iliffe & J. G. Jodeit. (1962). A dynamic storage allocation scheme. Computer Journal, 5, 200–209; J. K. Iliffe. (1972). Basic machine principles (2nd ed., pp. 25–29). London: Macdonald.
60. Ibid.
61. J. B. Dennis. (1965). Segmentation and design of multiprogrammed computer systems. Journal of the ACM, 12, 589–602; J. B. Dennis & E. C. Van Horn. (1966). Programming semantics for multiprogrammed computations. Communications of the ACM, 9, 143–155; R. C. Daley & J. B. Dennis. (1968). Virtual memory, processes, and sharing in MULTICS. Communications of the ACM, 11, 306–312.
62. E. I. Organick. (1972). The Multics system: An examination of its structure. Cambridge, MA: MIT Press.
63. See, for example, Denning, op cit., for an excellent picture of this subparadigm as it was in 1970.
64. P. J. Denning. (1968). The working set model of program behavior. Communications of the ACM, 11, 323–333; P. J. Denning. (1968). Thrashing: Its causes and prevention. Proceedings of the AFIPS 1968 Fall Joint Computer Conference, 33, 915–922.
65. J. N. Buxton, P. Naur, & B. Randell. (Eds.). (1975). Software engineering. New York: Petrocelli (report on two NATO conferences held in Garmisch, Germany [October 1968] and Rome, Italy [October 1969]).
66. Mealy, Witt, & Clark, op cit.
67. F. P. Brooks, Jr. (1975). The mythical man-month: Essays in software engineering (p. 31). Reading, MA: Addison-Wesley.
68. F. J. Corbató, J. H. Saltzer, & C. T. Clingen. (1975). Multics: The first seven years. In P. Freeman (Ed.), Software system principles (pp. 556–577). Chicago: SRA (see especially p. 560).
69. T. P. Hughes. (1987). The evolution of large technological systems. In W. E. Bijker, T. P. Hughes, & T. J. Pinch (Eds.), The social construction of technological systems (pp. 51–82). Cambridge, MA: MIT Press.
70. F. J. Corbató. (1963). The compatible time sharing system. Cambridge, MA: MIT Press.
71. For a discussion of memory protection schemes developed in the time frame of the Multics development, see M. V. Wilkes. (1975). Time sharing computer systems (3rd ed., pp. 52–83). London: Macdonald & Jane’s (original work published 1968).
72. F. J. Corbató. (1969). PL/I as a tool for system programming. Datamation, May, 68–76.
73. Kuhn, op cit.
16
Aesthetica
I
IN 1965, THE Dutch computer scientist Edsger Dijkstra (1930–2002), then professor of mathematics at the Technische Hogeschool Eindhoven (THE) in the Netherlands, wrote a paper titled “Programming Considered as a Human Activity,” thereby announcing the birth of a movement to which, a few years later, he gave the name structured programming.1 Within the next 10 years, the movement would cause so much upheaval in the realm of programming that some came to call it a revolution—the structured programming revolution—and Dijkstra was viewed as its originator.2
The movement did not precipitate an overthrow of the stored-program computing paradigm as a whole, but insofar as designing and building software systems was a major component of this paradigm, structured programming altered the very essence of the subparadigm in computer science that came to be called programming methodology. It brought about a new mentality concerning programming and its methodology.
A major part of this mini-revolution actually occurred during the 1970s, but its foundations were laid during the second half of the 1960s by just a handful of publications. And Edsger Dijkstra was the revolutionary-in-chief. He laid out the gospel.
II
Dijkstra’s undergraduate training was in mathematics and physics at the University of Leyden; he went on to obtain a PhD in computing in 1959 from the University of Amsterdam while working at the Mathematisch Centrum, where he remained until 1962 before accepting a chair in mathematics at the Technische Hogeschool Eindhoven.3
For Dijkstra the computer scientist, mathematics was a source of inspiration, not only in its methods of proof construction, but also in the mathematician’s search for beauty in mathematical reasoning. He quoted the 19th-century English logician and mathematician George Boole, who spoke of perfectness in mathematical reasoning not just in terms of efficiency, but also in terms of whether a method exhibited “a certain unity and harmony.”4 And he tells us that, contrary to the tacit assumption on the part of many that such aesthetic considerations as harmony and elegance were unaffordable luxuries in the hurly-burly world of programming, it actually paid to cultivate elegance. This became a mantra for him.5
So the search for beauty in programming—a programming aesthetic—was a prime desideratum for Dijkstra. He did not mention the English mathematician Godfrey Harold Hardy (1877–1947) and his haunting book A Mathematician’s Apology (1940), but he would surely have approved of Hardy’s assertion that there is no room for ugliness in mathematics.6 There was certainly no place for ugliness in Dijkstra’s world of computing.
But Dijkstra’s search for a programming aesthetic and the inspiration he took from mathematical reasoning were also prompted by a more practical consideration: mathematical reasoning served as an exemplar of how the human mind, with its limited cognitive capacity, can deal with complexity.7