Complexity and the Economy


by W. Brian Arthur


[Figure 1: Structural depth (number of parts in the parse tree) of the currently best expression, plotted against the number of generations of search, in the problem of finding a Fourier series expression to match a given function (from Koza,[8] p. 502). Vertical axis: structural depth, 0–140; horizontal axis: generation, 0–50.]

Collapse near the base of a system can be seen in a very different context, the history of science, when new theories suddenly replace old, elaborate ones. An example is the collapse of Ptolemaic astronomy caused by the Kepler-Newton version of the Copernican theory. This novel system, which explained planetary orbits using only a few simple laws, struck at the root of the hugely complicated Ptolemaic system; and it had such superior explanatory power that the Ptolemaic system never recovered. Similarly, Whittle's jet engine, with its extraordinarily simple propulsion principle, largely replaced the piston aero engine of the 1930s, which had become incurably complicated in attempts to overcome the limitations of operating internal combustion engines at high speed in the very thin air of higher altitudes.[4] And so in evolving systems, bursts of simplicity often cut through growing complexity and establish a new basis upon which complication can again grow. In this back-and-forth dance between complexity and simplicity, complication usually gains a net edge over time.

So far I have described two apparently separate mechanisms. In the first, ecosystems—collections of many individuals—become more complex, more diverse, in the course of evolution; in the second, individuals within ecosystems become internally more complex, structurally deeper, in the course of evolution. In many systems, of course, these mechanisms operate simultaneously, and they may interact, alternate, and even compete.


This can be seen clearly in Kristian Lindgren's study of strategies that evolve in a game-theoretic setting.[10] Lindgren sets up a computerized model populated by strategies that meet randomly and accumulate profit by playing, one-on-one, a finite version of the iterated prisoners' dilemma. The competing strategies are described as coded bit-strings, where the bits represent memory of previous plays that the strategies can take account of. The strategies can occasionally mutate. Successful ones proliferate in this coevolutionary environment; unsuccessful ones die out. In Lindgren's world, it can clearly be seen that the diversity of strategies increases as new coevolving strategies provide niches that can be exploited by fresh, new strategies, exactly as in the first mechanism I have discussed. But the strategies themselves also become increasingly "deep"—their code string or memory lengthens—as competition rewards increasingly subtle strategies, as in the second mechanism. In fact, the two mechanisms interact, in that the arrival of a new, successful, deeper strategy eliminates many of the previous, simpler strategies. Diversity collapses, and with it many of the niches it provides. There follows a phase in which the newer, deeper strategies mutate and proliferate, so that diversity increases again. And so new depth can both destroy old diversity and feed a new round of increased diversity among newer, deeper strategies. In this way, the growth of coevolutionary diversity alternates in a sporadic way with the growth of structural depth in the strategies. This process has obvious parallels in the history of biological evolution. Some biologists suggest, for example, that increased "depth" in the form of the appearance of multicellular, eukaryotic organisms fueled the Cambrian explosion of diversity 600 million years ago.[15]
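
Lindgren's actual model is richer than this summary (his strategies remember their own past moves as well as the opponent's, and play is subject to noise), but the core dynamic can be conveyed in a short sketch. The following Python fragment is my own minimal reconstruction, not Lindgren's code: strategies are lookup tables indexed by the opponent's last few moves, point mutations flip individual responses, and a rare duplication doubles the table, lengthening memory exactly as described above.

```python
import random

# Payoff to the first player: (my_move, their_move) -> score,
# with 1 = cooperate, 0 = defect (standard prisoner's dilemma values).
PAYOFF = {(1, 1): 3, (1, 0): 0, (0, 1): 5, (0, 0): 1}

def play(a, b, rounds=50):
    """Play a finite iterated prisoners' dilemma between two
    bit-string strategies; return their total scores."""
    hist_a = hist_b = 0            # each side's memory of the opponent
    score_a = score_b = 0
    for _ in range(rounds):
        move_a = a[hist_a]         # a table of length 2**m reads the
        move_b = b[hist_b]         # last m opponent moves as an index
        score_a += PAYOFF[(move_a, move_b)]
        score_b += PAYOFF[(move_b, move_a)]
        hist_a = (hist_a * 2 + move_b) % len(a)
        hist_b = (hist_b * 2 + move_a) % len(b)
    return score_a, score_b

def mutate(s, p_flip=0.02, p_dup=0.005):
    """Point mutation flips response bits; duplication doubles the
    table, deepening memory without (at first) changing behavior."""
    s = [bit ^ 1 if random.random() < p_flip else bit for bit in s]
    if random.random() < p_dup:
        s = s + s
    return s

random.seed(0)
# Start from memory-1 strategies (two response bits each).
pop = [[random.randint(0, 1) for _ in range(2)] for _ in range(100)]

for generation in range(200):
    scores = [0] * len(pop)
    for _ in range(500):                       # random pairwise encounters
        i, j = random.sample(range(len(pop)), 2)
        sa, sb = play(pop[i], pop[j])
        scores[i] += sa
        scores[j] += sb
    # Successful strategies proliferate; unsuccessful ones die out.
    pop = random.choices(pop, weights=[s + 1 for s in scores], k=len(pop))
    pop = [mutate(s) for s in pop]

depth = sum(len(s).bit_length() - 1 for s in pop) / len(pop)
print("mean memory depth after 200 generations:", depth)
```

In runs of this toy model one can watch mean memory depth drift upward as competition rewards subtler strategies, though the sketch omits much of what makes Lindgren's full results striking.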

  CAPTURING SOFTWARE

The third mechanism in the growth of complexity that I will propose is completely different from the first two. Actually it has more to do with the rapid emergence of complexity than with any slow growth. It is a phenomenon I will call capturing software. This is the taking over and "tasking" of simpler elements by an outside system for its own (usually informational) purposes. Typically the outside system "discovers" the simpler elements and finds it can use them for some elementary purposes. The elements turn out to have a set of rules that govern how they may be combined and used—an "interactive grammar." This grammar typically allows many combinations of the simple elements; and as the outside system begins to learn this grammar, it also learns to take advantage of the elements in combination. At full fruition, the outside system learns to use this interactive grammar to "program" the simple elements and use them in complicated combinations for its own multipurpose ends.


This mechanism may sound somewhat strange and unfamiliar, so let me clarify it with some examples. A very simple one would be electronics, taken as a technology. As humans, we have learned over the last couple of centuries to "task" electrons to carry out such activities as transmitting sound and vision, controlling sophisticated machinery, and computing. Originally, in the days of Faraday and Franklin, the workings of electrons and of static electricity were poorly understood, and so uses were few. But in the last century and in the early decades of this one, we began to learn the "grammar" of electricity—the set of operational rules involving induction, capacitance, and impedance that govern the movements of electrons and the amplification of their flow. And so we slowly learned to "capture" and "program" electrons for our own use. In this case the simple elements referred to above are electrons. The outside system is ourselves, the human users. The grammar is the laws of electromagnetism. And the programmable outputs are the various technical uses to which electronics are put. At the output level, there is swift "adaptation": the various technological purposes for which we use electrons as a "programmable software" shift and expand rapidly. But at the grammar and carrier level, in this case, adaptation is absent. The behavior of electricity and of electrons is fixed by physical laws that are, within the human time frame at least, immutable.

Sometimes with capturing software, the interactive grammar is not laid down unalterably, but can itself change and evolve in the process of "capturing" the software. An example is the way in which human language evolved. Early humans learned, perhaps several hundred thousand years ago, that crude, emitted sounds could be used for communicating warnings, pleasure, or simple needs. Very slowly, and comparatively recently on an evolutionary time scale, elementary rules—a grammar—began to emerge to organize these into simple concatenated expressions. Eventually, over many thousands of years, these sounds or phonemes plus grammar evolved into a complex interactive system—a language. This could be "programmed" to form statements, queries, and commands that conveyed a high degree of nuance and subtlety. In this example, the simple, carrier elements are the sounds or phonemes of human speech. The outside system is the human community that "captures" them and makes them into a software, a language. And the grammar is the syntactical system that develops to ensure consistency and commonality of meaning. Of course, there is no single syntactical grammar for all human languages. A grammar must emerge by the slow evolution of a social convention, with constraints exercised by the need for linguistic efficiency and consistency, and by the way linguistic activities are organized in the human brain.[9] (Of course, both human vocal anatomy and the brain also changed in response to the evolution of language.) The overall language that results from this evolutionary process is a programmable software whose potential output we may think of as the set of all meaningful sentences or statements the language can express.


Adaptation in this case can occur at all levels. At the program output level, adaptation is instantaneous: we can think of a sentence uttered as a one-off, extremely rapid adaptation of software output to the purpose of what the sentence is intended to communicate. At the grammar level, adaptation implies change in the language itself. This commonly takes the form of drift, and it happens slowly and continuously, because any abrupt alteration or large deviation in grammar would invalidate current "output programs." At the phoneme or simple-element level, adaptation—or change and drift—is slowest of all. Slight changes at this carrier level, if not subtle and continuous, might upset all that is expressed in the system. Slow drift may occur, as when vowels shift over the course of a generation or two; but there is a powerful mechanism acting to keep the carrier elements locked in to a constant way of behaving.

A particularly telling example of capturing software is the way in which sophisticated derivatives have arisen in recent years and come to be used in financial markets. In this case the outside system is the financial community. It begins with the simple trading of something of value—soybeans, securities, foreign currencies, municipal bonds, Third World debt, packages of mortgages, Eurodollars—anything to which title can be held. Such items may fluctuate in value and can be swapped and traded. They are called underlyings in financial jargon, and they form the simple, carrier elements of the system I want to consider now.

In the early days of such markets, an underlying is typically held and traded simply for its intrinsic value. But over time, a grammar forms. Traders find they can: (a) usefully arrange options associated with contingent events that affect the underlying; (b) put several underlyings together to create an associated index, as with a stock index; (c) issue futures contracts to deliver or obtain an underlying at some time, say, 60 days or one year, in the future; and (d) issue securities backed by the underlying. But notice that such "derivatives" as contingent-event options, indices, futures, and securities are themselves elements of value. Thus they, too, can become underlyings, with their own traded values. Once again the market can apply (a), (b), (c), or (d) to these new underlyings. We may then have options on securities, index futures, options on futures, securities indices, and so on, with trades and swaps of all these.
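
The essential feature of this grammar is that it is closed: each of operations (a) through (d) yields something that is itself an underlying, so the operations compose to arbitrary depth. A minimal Python sketch makes the recursion explicit; the class names and fields here are illustrative placeholders, not market conventions.

```python
from dataclasses import dataclass
from typing import List

# Anything tradable with an uncertain value can serve as an underlying;
# every derivative below is itself an Underlying, so the four operations
# (option, index, future, securitization) compose recursively.

class Underlying:
    pass

@dataclass
class Asset(Underlying):        # a primitive carrier: soybeans, a bond, ...
    name: str

@dataclass
class Option(Underlying):       # (a) an option contingent on an event
    on: Underlying
    strike: float

@dataclass
class Index(Underlying):        # (b) several underlyings combined
    members: List[Underlying]

@dataclass
class Future(Underlying):       # (c) delivery at some future time
    on: Underlying
    days: int

@dataclass
class Security(Underlying):     # (d) a security backed by an underlying
    backed_by: Underlying

# Because the grammar is closed, instruments nest to any depth:
# an option on a 60-day future on an index of two stocks.
package = Option(
    on=Future(on=Index([Asset("stock A"), Asset("stock B")]), days=60),
    strike=105.0,
)
print(package)
```

The point of the sketch is only the closure: `package` is an Option, and an Option is an Underlying, so all four constructors could be applied to it once again.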

With such a grammar in place, derivatives experts "program" these elements into a package that provides a desired combination of financing, cash flow, and risk exposure for clients with highly particular, sophisticated financial needs. Of course, financial markets did not invent such programming all at once. It evolved semi-independently in several markets, as a carrier element was used simply at first and then in conjunction with the natural grammar of finance.

From the examples I have given, it may seem that the system that captures and uses simple elements for its own purposes is always a human one. But, of course, this is not the case. Let me point out two examples in the biological sphere. One is the formation of neural systems. As certain organisms evolved, they began to "task" specialized cells for the simple purposes of sensing and modulating reactions to outside stimuli. These specialized cells, in turn, developed their own interactive grammar; and the overall organism used this to "program" the interconnected neural system to its own purposes. Similarly, the ancestors of the cells found in the immune systems of higher organisms were used originally for simple purposes. Over time, these, too, developed useful rules of interaction—an interactive grammar—thereby eventually becoming a highly programmable system that could protect against outside antigens.

Biological life itself can be thought of in this way. Here the situation is much more complicated than in the previous examples. Biological organisms are built from modules—cells, mainly—that are in turn built from a relatively small number (about 50 or so) of fairly simple molecules.[12] These molecules are universal across all terrestrial life and are the carriers of biological construction. They are combined into appropriate structures using a grammar consisting of a relatively small number of metabolic chemical pathways. This metabolic grammar, in turn, is modulated or programmed by enzymes. The enzymes doing the programming of course have no conscious purpose. In fact, they are themselves the carriers in a second programmed system. They are governed by a complicated gene-expression "grammar," which switches on or inhibits their production from the genes or DNA that code for them, according to feedback received from the state of the organism they exist in. And so we have one captured software system, the programming of the simple metabolic pathways via proteins or enzymes to form and maintain biological structures, modulated by another captured software system, the programming of proteins or enzymes via nucleic acids and the current state of the organism.
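
As a deliberately cartoonish illustration of this two-level layering (none of the names, rules, or numbers below are real biochemistry), here is a Python sketch in which a gene-expression layer switches invented enzymes on or off according to the organism's state, and the enzymes in turn select which metabolic rules fire on the molecular carriers.

```python
# Layer 2 grammar: which genes express their enzyme, given feedback
# from the organism's current state. The rules are invented.
def expressed_enzymes(state):
    enzymes = set()
    if state["glucose"] > 0:
        enzymes.add("glycolysis_enzyme")
    if state["energy"] < 5:
        enzymes.add("storage_release_enzyme")
    return enzymes

# Layer 1 grammar: a small, fixed set of metabolic pathways, each
# usable only when its enzyme has been switched on.
PATHWAYS = {
    "glycolysis_enzyme":      lambda s: {**s, "glucose": s["glucose"] - 1,
                                         "energy": s["energy"] + 2},
    "storage_release_enzyme": lambda s: {**s, "glucose": s["glucose"] + 1},
}

state = {"glucose": 3, "energy": 1}
for step in range(6):
    for enzyme in sorted(expressed_enzymes(state)):  # layer 2 programs layer 1
        state = PATHWAYS[enzyme](state)              # layer 1 transforms carriers
    print(step, state)
```

The toy shows the pattern of adaptation rates described above: the organism "reprograms" itself from moment to moment at the enzyme level, while the pathway grammar and the molecular carriers stay fixed.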

In this case the entire system is closed—there is no outside system programming the biological one to its own purposes. In the short term, each organism programs itself according to its current development and current needs. In the long term, the overall system—the resulting biospheric pattern of organisms that survive, interact, and coevolve—together with environmental and climatic influences, becomes the programmer, laying down its code in the form of the collection of gene sequences that survive and exist at any time. Of course, without an outside system, we cannot say these programmable systems were ever "captured." Instead they emerged and bootstrapped themselves, developing carriers, grammar, and software as they went. Viewed this way, the origin of life is very much the emergence of a software system carried by a physical system—the emergence of a programmable system learning to program itself.

Capturing software in all the cases discussed here is an enormously successful evolutionary strategy. It allows the system to adapt extremely rapidly by merely reprogramming the captured system to form a different output. But because changes in grammars and in carriers would upset existing "programs," we would expect them to be locked in and to change slowly, if at all. This explains why a genetic sequence can change easily, but the genetic code cannot; why new organisms can appear, but the cell and metabolic chemistry remain relatively fixed; why new financial derivatives are constantly seen, but the securities-and-exchange rules stay relatively constant.[3]

  CONCLUSION

In this chapter, I have suggested three ways in which complexity tends to grow as evolution takes place. It may grow by increases in diversity that are self-reinforcing; or by increases in structural sophistication that break through performance limitations; or by systems "capturing" simpler elements and learning to "program" these as "software" to be used to their own ends. Of course, we would not expect such growth in complexity to be steady. On the contrary, in all three mechanisms we would predict it to be intermittent and epochal. And we would not expect it to be unidirectional. The first two mechanisms are certainly reversible, so we would expect collapses in complexity to occur randomly from time to time.

As we study evolution more deeply, we find that biology provides by no means all of the examples of interest. Any system with a lineage of inherited, alterable structures pressured to improve their performance shows evolutionary phenomena. And so it is likely that we will increasingly find connections between complexity and evolution by drawing examples not just from biology, but from the domains of economics, adaptive computation, artificial life, and game theory. Interestingly, the mechanisms described in this chapter apply to examples in all of these evolutionary settings.

  ACKNOWLEDGMENTS

  This paper was originally presented at the Santa Fe Institute’s Integrative

  Themes Workshop in July 1992. I thank Dan McShea, Brian Goodwin, and the

  workshop participants for useful comments. I am grateful to Harold Morowitz

  in particular for several conversations on the themes of this essay.

  [3] Carriers do change, of course, if they can be substituted for one another easily. For example, options can be built on any underlying; and so, in this case, carriers can and do change rapidly. But the essential property of underlyings—that of being an object that carries uncertain value—remains necessary in all cases and does not change.

 
