Japan, too, has been following a similar trajectory in which the expansion in services has been at the expense of industry. For a detailed discussion, see Henry Rosovsky, “Japan’s Economic Future,” Challenge, July/August, 1973. In this essay Rosovsky develops a definition of “economic maturity” which is interesting in the light of sector changes that have taken place in the industrializing countries in the last fifty years. He writes: “Economic maturity is a difficult term to define but as used here it has a narrow meaning. Let us call it that state in which the incentives of sectoral labor force reallocation have become minimal—in the extreme case, impossible” (p. 16).
1 The other participants included S. N. Eisenstadt of Hebrew University, Reinhard Bendix of Berkeley, Zbigniew Brzezinski of Columbia, Michel Crozier of Paris, Zygmunt Bauman of Tel Aviv, Helio Jaguaribe of Brazil, Juan Linz of Yale, Ota Sik of Basle, Andrew Shonfield of Chatham House, David Lockwood of Essex, Stanley Hoffmann of Harvard, and Stephen Graubard of Cambridge, Mass.
1 In his essay, “Has Futurology a Future?” Robert Nisbet writes: “The essence of futurology is that the future lies in the present just as our present once lay in the past.... Fundamental, it seems to me, to futurology is the attractive, but utterly fallacious, assumption that the continuity of time is matched by the continuity of change or the continuity of events.” Encounter (November 1971); emphasis in the original. To use an old Russian proverb, Mr. Nisbet is knocking down an open door. He has set up a group of metaphors—the future, time, change—with no reference to content or relationships, so that incongruities between the words as words can easily be created. The methodological issue is what kinds of forecasts can be made about different sorts of social phenomena. That is why I have never liked or used the term futurology; it is essentially meaningless.
2 This is a common confusion. For example, one hears much talk of consciousness or consciousness-raising. Yet as William James pointed out long ago, there is no such thing as consciousness, only consciousness of something. See “The Stream of Consciousness,” chap. 2 of Psychology: The Briefer Course (New York, 1961; originally published in 1892).
3 The specific techniques are discussed in the section on technological forecasting in chap. 3.
4 The illustration is taken from Kenneth Boulding, “Expecting the Unexpected: The Uncertain Future of Knowledge and Technology,” in Prospective Changes in Society by 1980, Designing Education for the Future, vol. 1 (Colorado Department of Education, 1966).
5 The Brookings Quarterly Econometric Model of the United States, ed. James S. Duesenberry, Gary Fromm, Lawrence R. Klein, and Edwin Kuh (Chicago, 1965), p. 734.
The Brookings model breaks down the economy into 36 producing sectors and government, plus 18 other major components (such as consumer demand, twenty variables; labor force and marriage, eighteen variables; residential construction, twenty-three variables; foreign trade, nine variables; etc.). See chap. 18, “The Complete Model: A First Approximation.”
6 The paradigm for this is the “second-degree lie” exemplified in the Minsk-Pinsk joke. Two men are standing in a railroad station. The first asks: “Where are you going?” “To Minsk,” the other replies, “to buy some cotton goods.” “Huh,” snorts the first, “you’re telling me you are going to Minsk to buy some cotton goods to make me think you are going to Pinsk to buy some woolen goods, but I know you are going to Minsk to buy some cotton goods; so why do you tell me lies!”
7 For a general survey of social trend analysis, see Otis Dudley Duncan, “Social Forecasting: The State of the Art,” The Public Interest, no. 17 (Fall 1969).
8 For Tocqueville, see Democracy in America, translated by George Lawrence and edited by J. P. Mayer and Max Lerner (New York, 1966), Author’s Introduction, pp. 5–6. For Weber, see chap. 11, “Bureaucracy,” in Economy and Society, vol. 3 (New York, 1968). Written between 1914 and 1920, it was left unfinished at Weber’s death in 1920. (The first German edition appeared in 1922.) As Weber wrote, “The United States still bears the character of a polity which, at least in the technical sense, is not fully bureaucratized. But the greater the zones of friction with the outside and the more urgent the needs for administrative unity at home become, the more this character is inevitably and gradually giving way formally to the bureaucratic structure.” Ibid., p. 971.
9 Quoted in Werner Heisenberg, Physics and Beyond: Encounters and Conversations (New York, 1971), p. 63.
10 Walter Buckley, Sociology and Modern Systems Theory (Englewood Cliffs, N.J., 1967), pp. 42-45, passim.
11 For Weber’s formulation, see the General Economic History (London, n.d.), chap. 30, esp. p. 354.
12 For a vivid illustration of these differences, see the beautiful maps of Richard Edes Harrison, Look at the World (New York, 1944).
13 There is an inherent risk in taking a concept derived from one field bodily into another, and the social sciences particularly have been bedeviled by such borrowing: for example, the use of the terms force and power from physics, and of structure and function from biology. Complementarity was used by Niels Bohr to explain the contradictory behavior of light as wave and particle, but Bohr did feel, according to my colleague the physicist Gerald Holton, that the principle applied to many phenomena in nature and society. This may have been the hubris of a great man infatuated with the discovery of a compelling principle. Since the concept is suggestive, let me say only that I use it simply as a metaphor and not as an explanatory device.
This discussion of axial structures and conceptual schemata is elaborated in my essay “Macro-Sociology and Social Change,” in Theories of Social Change: A Stocktaking, which I have edited for the Russell Sage Foundation, to be published in 1974. A different use of the idea of conceptual schemata appears in Georges Gurvitch’s The Social Frameworks of Knowledge (Oxford, 1971; published originally in French in 1966). Gurvitch seeks to define a succession of historical social types and the kinds of cognitive systems associated with each. To that extent he is elaborating the kind of sociology of knowledge developed by Max Scheler in his Die Wissensformen und die Gesellschaft (1926).
14 Alfred North Whitehead, Science and the Modern World (New York, 1960; original edition, 1925), p. 46.
15 In Table 3 the projected figure for the number of professional and technical persons in 1975 is given as 13.2 million and in Table 4 as 12.9 million. The discrepancy arises in part because the figure in Table 4 was calculated five years later, and in part because different assumptions were made about the unemployment rate. I have let the figures stand to indicate the range.
16 Matthew Josephson, Edison (New York, 1959), p. 361.
17 Aviation is an interesting transitional case. The first inventors were tinkerers, but the field could develop only through the use of scientific principles. Langley (1891) and Zahm (1902-1903) started the new science of aerodynamics by studying the behavior of air currents over different types of airfoils. At the same time, in 1900, the Wright brothers began tinkering with gliders, and in 1903 put a gasoline-powered engine into an airplane. But further work was possible only through the development, after 1908, of experiments (such as models in wind tunnels) and mathematical calculations (such as airflows over different angles of wings) based on physical laws.
18 See Eduard Farber, “Man Makes His Materials,” in Kranzberg and Pursell, eds., Technology and Western Civilization, vol. 2 (New York, 1967).
19 See L. F. Haber, The Chemical Industry, 1900-1930 (Oxford, 1971), chap. 7, pp. 198-203. As Haber writes:
“The Haber process... was still largely an unknown factor when the Great War broke out. The synthesis of ammonia... represents one of the most important advances in industrial chemistry.... The process, discovered by Fritz Haber and developed industrially by Carl Bosch, was the first application of high-pressure synthesis; the technology of ammonia production, appropriately modified, was used later in the synthesis of methanol and the hydrogenation of coal to petroleum. Its influence extends to present-day techniques of oil refining and use of cracker gases from refining operations for further synthesis.” Ibid., p. 90.
20 Thomas Jones, A Diary with Letters (New York, 1954), p. 125. Lindsay is A. D. Lindsay, Master of Balliol College for twenty-five years until 1949. T. J. is an ironic reference by Jones to himself.
21 See Rexford G. Tugwell, The Democratic Roosevelt (New York, 1957), chap. 15, esp. pp. 312–313.
22 The Keynesian revolution in economics actually occurred after most of the economies had recovered from the depression even though many policies, particularly so-called unbalanced budgets or deficit financing, were adopted by trial-and-error and had “Keynesian” effects. The most self-conscious effort to use the new economics was in Sweden, where the socialist finance minister, Ernst Wigforss, broke away from Marxist thinking and, on the advice of the economists Erik Lindahl and Gunnar Myrdal, pursued an active fiscal and public-works policy which was Keynesian before Keynes, i.e. before the publication of Keynes’ General Theory in 1936.
23 Thirty years ago few, if any, graduate schools taught mathematical economics. The turning point, probably, was the publication of Paul Samuelson’s Foundations of Economic Analysis in 1947, which presented a mathematically formalized version of neoclassical economics. Today, no one can work in economic theory without a solid grounding in mathematics.
24 It is striking that during the depression there was no real measure of the extent of unemployment because of the confusion over a conceptual definition and the lack of sample survey techniques to make quick counts; the government relied on the 1930 census and some estimates from manufacturing establishments. In 1921, when President Harding called a conference of experts to discuss the unemployment that accompanied the postwar depression, estimates ranged widely and the final figure published was decided, literally, by majority vote. The confusions about who should be counted, or what constituted the “labor force,” continued through the 1930s and a settled set of definitions and figures emerged only in the 1940s. Nor were there, of course, the Gross National Product and national-income accounts to give a view of the economy as a whole. This came into public-policy use only in 1945. (I am indebted to an unpublished dissertation at MIT by Judith de Neufville, on social indicators, for the illustration on unemployment statistics.)
25 Charles Wolf, Jr., and John H. Enns have provided a comprehensive review of these developments in their paper “Computers and Economics,” Rand Paper P-4724. I am indebted to them for a number of illustrations.
26 Mathematically speaking, an input-output matrix represents a set of simultaneous linear equations—in this case 81 equations in 81 variables, which are solved by matrix algebra. See Wassily Leontief, The Structure of the American Economy: Theoretical and Empirical Explorations in Input-Output Analysis (New York, 1953). Ironically, when the Bureau of Labor Statistics tried to set up an input-output grid for the American economy in 1949, it was opposed by business on the ground that it was a tool for socialism, and the money was initially denied.
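In modern matrix notation (the symbols here are the standard textbook ones, not Leontief’s own), the system can be stated compactly: if A is the matrix of technical coefficients, d the vector of final demands, and x the vector of gross outputs, then

$$x = Ax + d \quad\Longrightarrow\quad (I - A)\,x = d \quad\Longrightarrow\quad x = (I - A)^{-1} d,$$

so that solving the 81-equation system amounts to inverting an 81-by-81 matrix, a computation which, for a system of that size, is practical only by machine.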
27 Their conclusion: the largest impact on real GNP came from increases in government nondurable and construction expenditures. Income-tax cuts were less of a stimulant than increases in expenditures. Gary Fromm and Paul Taubman, Policy Simulations with an Econometric Model (Brookings Institution, Washington, D.C., 1968), cited in Wolf and Enns, op. cit.
28 With modern economic tools, Robert M. Solow argues, an administration can, within limits, get the measure of economic activity it wants, for the level of government spending can redress the deficits of private spending and step up economic activity. But in so doing, an administration has to choose between price stability and full employment; this dilemma seems to be built into the market structure of capitalist economies. An administration has to make a trade-off, and this is a political choice. Democrats have preferred full employment, even with inflation; Republicans, price stability, even with slow economic growth.
In the last few years, however, there has been a new phenomenon: simultaneous high unemployment and high inflation. For reasons that are not clear, unemployment no longer “disciplines” an economy into bringing prices down, whether because of substantial welfare cushions (e.g. unemployment insurance), wage-push pressure in organized industries, or the persistent expectation of price rises that discounts inflation.
The two turning points in modern economic policy were President Kennedy’s tax cut in 1964, which canonized Keynesian principles in economic policy, and President Nixon’s imposition of wage and price controls in 1971. Though mandatory controls were relaxed in 1973, the option to use them now remains.
29 Technology: Processes of Assessment and Choice, Report of the National Academy of Sciences, U.S. House of Representatives, Committee on Science and Astronautics, July 1969.
30 To further the idea of technology assessment, the National Academy of Engineering undertook three studies in developing fields: computer-assisted instruction and instructional television; subsonic aircraft noise; and multiphasic screening in health diagnosis. The studies concluded that technology assessment was feasible, and outlined the costs and scope of the necessary inquiries. In the case of technological teaching aids, the study considered eighteen different impacts they might have. In the case of noise, it examined the costs and consequences of five alternative strategies, from relocating airports or soundproofing nearby homes to modifying the airplanes or their flight patterns. See A Study of Technology Assessment, Report of the Committee on Public Engineering Policy, National Academy of Engineering, July 1969.
The idea of “technology assessment” grew largely out of studies made by the House Science and Astronautics Committee, and in 1967 a bill was introduced in the House by Congressman Daddario for a Technology Assessment Board. The bill was passed in 1972 and the Congress, not the Executive, is charged with setting up a Technology Assessment Office.
31 Science and the Modern World, p. 141.
32 Warren Weaver, “Science and Complexity,” in The Scientists Speak, ed. Warren Weaver (New York, 1947). I am indebted to a former special student at Columbia, Norman Lee, for this citation and for a number of other suggestions in this section.
33 Jagjit Singh, Great Ideas of Operations Research (New York, 1968).
34 Harvey Brooks, “Technology and the Ecological Crisis,” lecture given at Amherst, May 9, 1971, p. 13 of the unpublished text; emphasis added. For an application of these views, see the reports of two committees chaired by Professor Brooks: Technology: Processes of Assessment and Choice, Report of the National Academy of Sciences, published by the Committee on Science and Astronautics, U.S. House of Representatives, July 1969; and Science, Growth and Society, OECD (Paris, 1971).
35 Most of the day-to-day problems in economics and management involve decision-making under conditions of certainty, i.e. the constraints are known. These are such problems as proportions of product mixes under known assumptions of cost and price, production scheduling by size, network paths, and the like. Since the objectives are clear (the most efficient routing, or the best profit yield from a product mix), the problems are largely mathematical and can be solved by such techniques as linear programming. The theory of linear programming derives from a 1937 paper by John von Neumann on the general equilibrium of a uniformly expanding closed economy. Many of the computational procedures were developed by the Soviet economist L. V. Kantorovich, whose work was ignored by the regime until Stalin’s death. Similar techniques were devised in the late 1940s by the Rand mathematician G. B. Dantzig, in his simplex method. The practical application of linear programming had to await the development of the electronic computer and its ability (in some transportation problems, for example) to handle 5,200 equations and 600,000 variables in sequence. Robert Dorfman has applied linear programming to the theory of the firm, and Dorfman, Samuelson, and Solow used it in 1958 in an inter-industry model of the economy to allow for substitutability of supply and a criterion function that allows a choice of solutions for different objectives within a specified sector of final demand.
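By way of illustration only, here is a minimal sketch, in the Python language with the scipy library, of the kind of product-mix problem just described; the two products, their profits, and the resource limits are invented for the example and carry no empirical weight.

```python
# A hypothetical product-mix problem: choose the output of two products so as
# to maximize profit subject to known machine-hour and labor-hour limits.
from scipy.optimize import linprog

profit = [-40, -30]            # profit per unit; negated because linprog minimizes
usage = [[2, 1],               # machine-hours consumed per unit of each product
         [1, 3]]               # labor-hours consumed per unit of each product
capacity = [100, 90]           # machine-hours and labor-hours available

result = linprog(c=profit, A_ub=usage, b_ub=capacity,
                 bounds=[(0, None), (0, None)], method="highs")

print(result.x)                # the optimal quantities of the two products
print(-result.fun)             # the maximum attainable profit
```

Under certainty, as the note indicates, nothing but the mathematics is at issue: the objective and the constraints are given, and the program simply locates the best corner of the feasible region.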
Criteria for decision-making under conditions of uncertainty were introduced by the Columbia mathematical statistician Abraham Wald in 1939. Wald’s rule specifies a “maximin” criterion in which one is guided by an expectation of the worst outcome. Leonid Hurwicz and L. J. Savage have developed other strategies, such as Savage’s charmingly named “criterion of regret,” whose subjective probabilities may cause one to increase or decrease a risk.
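The difference between the criteria can be shown schematically. The payoff table below is invented, and the sketch (again in Python) merely computes which action each rule selects.

```python
# Decision under uncertainty: Wald's maximin rule versus Savage's regret rule,
# applied to an invented payoff table whose rows are actions and whose columns
# are states of the world of unknown probability.
payoffs = {
    "expand plant": [90, 10, -30],
    "modernize":    [60, 40,  10],
    "do nothing":   [20, 20,  20],
}

# Maximin: choose the action whose worst outcome is least bad.
maximin_choice = max(payoffs, key=lambda a: min(payoffs[a]))

# Regret: for each state, regret is the best attainable payoff minus the payoff
# received; choose the action whose largest regret is smallest.
best_by_state = [max(column) for column in zip(*payoffs.values())]
worst_regret = {a: max(b - p for b, p in zip(best_by_state, row))
                for a, row in payoffs.items()}
regret_choice = min(worst_regret, key=worst_regret.get)

print(maximin_choice)   # "do nothing" on this table
print(regret_choice)    # "modernize" on this table
```

On this table the two rules select different actions, which is precisely the sense in which the choice of a criterion is itself a judgment about how much risk one is prepared to bear.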
Game theory has a long history but the decisive turn occurred in a 1928 paper of John von Neumann which provided a mathematical proof of a general minimax strategy for a two-person game. The 1944 book by von Neumann and Morgenstern, Theory of Games and Economic Behavior (Princeton), extended the theory to games of more than two persons and applied the theorem to economic behavior. The strategy proposed by von Neumann and Morgenstern—that of minimax, or the minimization of maximum loss—is defined as the rational course under conditions of uncertainty.
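Stated in the now-standard notation (the symbols are the textbook ones, not von Neumann’s), the 1928 theorem asserts that for any finite two-person zero-sum game with payoff matrix A there exist mixed strategies x and y such that

$$\max_{x}\,\min_{y}\; x^{\mathsf{T}} A\, y \;=\; \min_{y}\,\max_{x}\; x^{\mathsf{T}} A\, y \;=\; v,$$

where v is the value of the game; by playing the maximizing x, the first player guarantees himself at least v whatever his opponent does, which is what the minimization of maximum loss amounts to.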
Games-and-decision theory was given an enormous boost during World War II, when its use was called “operations research.” There was, for example, the “duel” between the airplane and the submarine. The former had to figure out the “best” search pattern for air patrol of a given area; the latter had to find the best escape pattern when under surveillance. Mathematicians in the Anti-Submarine Warfare Operations Research Group, using a 1928 paper of von Neumann, figured out a tactical answer.
The game-theory idea has been widely applied—sometimes as metaphor, sometimes to specify numerical values for possible outcomes—in bargaining and conflict situations. See Thomas C. Schelling, The Strategy of Conflict (Cambridge, Mass., 1960).
36 R. Duncan Luce and Howard Raiffa, Games and Decisions (New York, 1957). My discussion of rationality is adapted from the definition on p. 50; that of risk, certainty, and uncertainty from p. 13.
37 See Charles J. Hitch, “Analysis for Air Force Decisions,” in Analysis for Military Decisions: The Rand Lectures on Systems Analysis, ed. E. S. Quade (Chicago, 1964). His illustration is conjectural. A more relevant but much more complicated illustration is Quade’s case history, in the same volume, on the selection and use of strategic air bases.