puted incorrectly). The circuit for the 2-bit-adder is constructed from TECH-712 by adding circuitry to correct this error.
Some of our evolved circuits contain unused parts. The unconnected 3-bit-adder on the right of Figure 3 is an example. In the course of the experiment such redundancies usually disappear because less “costly” circuits replace ones with needless complication.
Our experiment starts from the NAND primitive. In other versions of the
experiment we used “implication” as the primitive. Similarly complicated
Figure 1: Two circuits “invented” for simple goals (a “3-bit xor” and a “3-bit and”).
circuits evolved. The process simply constructs the more elementary needs
such as “not,” “and,” and “xor” from the new “implication” primitive and pro-
ceeds as before.
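To make this concrete, the elementary functions can be derived from “implication” (together with the constant 0) as follows. This is our own illustrative sketch in Python, not code from the experimental system, which is written in Common Lisp:

```python
# Deriving "not," "or," "and," and "xor" from the implication primitive alone.
# IMPLY(a, b) = (not a) or b; the constant 0 is also available to the algorithm.

def IMPLY(a, b):
    return int((not a) or b)

def NOT(a):
    return IMPLY(a, 0)              # a -> 0  is  not a

def OR(a, b):
    return IMPLY(IMPLY(a, b), b)    # (a -> b) -> b  is  a or b

def AND(a, b):
    return NOT(IMPLY(a, NOT(b)))    # not(a -> not b)  is  a and b

def XOR(a, b):
    return AND(OR(a, b), NOT(AND(a, b)))

# Sanity check over all input combinations:
for a in (0, 1):
    for b in (0, 1):
        assert (OR(a, b), AND(a, b), XOR(a, b)) == (a | b, a & b, a ^ b)
```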
The emergence of circuits such as 8-bit adders may not seem particularly remarkable. But consider the combinatorics. If a component has n inputs and m outputs there are $(2^{2^n})^m$ possible phenotypes, each of which could be realized in a practical way by a large number of different circuits. For example, an 8-bit adder is one of over $10^{177{,}554}$ phenotypes with 16 inputs and 9 outputs. The likelihood of such a circuit being discovered by
random combinations in 250,000 steps is negligible. Our experiment—or
algorithm—arrives at complicated circuits by first satisfying simpler needs
and using the results as building blocks to bootstrap its way to satisfy more
complex ones.
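The arithmetic behind this count can be verified directly: each of the m output wires realizes one of the $2^{2^n}$ Boolean functions of the n inputs, so
\[
\left(2^{2^n}\right)^{m} = 2^{m\cdot 2^n},
\qquad
\left(2^{2^{16}}\right)^{9} = 2^{9\cdot 65{,}536} = 2^{589{,}824} = 10^{589{,}824\,\log_{10}2} \approx 10^{177{,}565} > 10^{177{,}554}.
\]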
Figure 2: TECH-712 is useful towards satisfying the “2-bit-adder” goal because the two high-order bits are computed correctly. (The low-order bit is on the left. For multi-bit adders, input bits are interleaved.)
Figure 3: A “4-bit-adder” circuit with an unconnected module (the “3-bit-adder” on the right).
THE BUILD-OUT OF TECHNOLOGIES
To talk about the build-out of technologies we need two definitions. The col-
lection of all methods and devices (all circuits) ever used we call the standing reserve. The technologies that are currently viable—in current use—and have not yet been replaced, we call the active repertoire.
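In computational terms the distinction is simply between a grow-only collection and one that can also shrink. A minimal bookkeeping sketch in Python (the names are ours, not the system’s internals):

```python
# The standing reserve only ever grows; the active repertoire also shrinks
# whenever a new invention replaces technologies currently in use.

standing_reserve = set()   # every technology ever invented
active_repertoire = set()  # technologies currently viable and in use

def invent(tech):
    standing_reserve.add(tech)
    active_repertoire.add(tech)

def replace(new_tech, obsolete):
    invent(new_tech)
    active_repertoire.difference_update(obsolete)  # obsolete techs stay in the reserve
```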
Figure 4 shows the growth over time of the standing reserve, the technologies ever invented. In contrast, the growth of the active repertoire, the number of technologies actually in use, is not monotonic. This indicates that important inventions render older technologies obsolete. Figure 4 also shows that there is continual improvement in accomplishing truth-function “needs,” as indicated by the growing number of replacements.
Tick marks along the time axis of Figure 4 indicate when one of the
needs has been satisfied. Progress is slow at first: the experiment runs for
some time without meeting any goals exactly, and then functional species
begin to appear, leading to further species. The evolution is not smooth. It is punctuated by clusters of arrivals: from time to time key technologies—key building-block components—are “discovered” that quickly
enable other technologies. For example, after a circuit for OR is invented,
circuits for 3-, 4-, and 5-bit OR and bitwise-OR operations follow in short
order. This appearance of key building blocks that quickly make possible
further technologies has analogies in the real world (think of the steam
engine, the transistor, the laser) and with the build-out of species in bio-
logical evolution [2].
The order of invention makes a difference. Although “negation” is a
simpler function than “implication,” it happens that in some runs of the
Figure 4: The standing reserve, by definition, grows monotonically. The same is not true for the active repertoire because new inventions may improve upon and replace several existing ones. (Log-log plot of counts of the standing reserve, the active repertoire, cumulative replacements, and satisfied goals against simulation step.)
experiment the latter is invented first and is then used as a key building block.
Figure 5 shows the result of one such experiment. “Implication” was used in
a large number of other technologies and became much more prevalent than
“negation.” But eventually, its usage as a component declined as negation and
other, less costly components offered themselves for combination. For com-
parison, the figure shows a third technology, TECH-69, which also performs
implication but has 3 additional redundant inputs and contains unneeded
components. Eventually, all uses of TECH-69 are replaced with the functionally equivalent but more efficient implication.
There is a trade-off between the number of needs or goals posted and the
creation of new technologies. To illustrate this, we performed an experiment
masking some of the needs and retaining a subset that we considered useful
for the construction of adders (“not,” “imply,” “2-way-or,” “2-way-xor,” “full-adder,” and “k-bit-adder” for 1 ≤ k ≤ 8). (We can also streamline the process by adding more difficult needs, as measured by the number of inputs and outputs, to
the experiment only after simpler ones have been satisfied.) An 8-bit adder
evolved very quickly within 64,000 simulation steps. In contrast, using more
general goals, some simulation runs took over 675,000 steps before even a
4-bit adder evolved. A large disparate set of needs leads to broad generation
of functionalities within the circuit design space, but is slow in arriving at particular complex needs. Narrowly focused goals lead to a deep search that
Figure 5: “Implication,” being invented before “negation” in this example, is used more heavily. Usage declines over time as better technologies are invented. (Log-log plot of the number of uses of “imply,” “not,” and TECH-69 against simulation steps.)
reaches particular complex needs quickly, but produces a narrow library of
functionalities.
The algorithm does not produce complex circuits without intermediate needs present. If we start without these, the repertoire of necessary building blocks is missing. For instance, if the “full-adder” goal is omitted from the goals for adders listed above, not even a 2-bit adder is found in one million steps. When the “full-adder” goal is present, it occasionally happens that the 2-bit adder is found before the “full-adder” is invented. This is because technologies invented in building toward the “full-adder” goal are also useful for the 2-bit adder.
The fact that at each step only circuits combining fewer than 12 existing components are considered defines a set of possible experimental circuits at any time—a large number—which we can think of as the adjacent probable [4]: a probabilistic cloud that surrounds the existing technologies and that gradually leads to new ones as points near intermediate goals are realized. Thus if a goal is too complicated it cannot be reached—realized—with reasonable probability, and so if stepping-stone goals are not present the algorithm does not work.
The technologies we have listed as needs or goals are well-ordered in the
sense that the more complicated ones can be constructed from the more ele-
mentary ones by repeating these in simple patterns. For example, a compli-
cated circuit such as a 4-bit adder can be constructed from simpler elements
such as adders and half-adders that repeat in combination.
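As a sketch of such a repetitive pattern, here is a ripple-carry construction in Python; this is our own illustration of the principle, not a circuit produced by the experiment:

```python
# A 4-bit adder built by repeating one simple element, the full-adder.

def full_adder(a, b, carry_in):
    s = a ^ b ^ carry_in
    carry_out = (a & b) | (carry_in & (a ^ b))
    return s, carry_out

def adder_4bit(a_bits, b_bits):
    """a_bits, b_bits: lists of 4 bits, least-significant bit first."""
    carry, out = 0, []
    for a, b in zip(a_bits, b_bits):
        s, carry = full_adder(a, b, carry)
        out.append(s)
    return out + [carry]   # 5 output bits, including the final carry

# 6 + 7 = 13:  0110 + 0111 = 01101 (shown least-significant bit first)
assert adder_4bit([0, 1, 1, 0], [1, 1, 1, 0]) == [1, 0, 1, 1, 0]
```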
What if we choose complicated goals that are not easy to realize by repetitive patterns? We can do this by selecting random truth tables with n inputs and m outputs as needs. Not surprisingly, we find that often these cannot be reached from our standard intermediate steps. By the same token, what if we
replace our intermediate stepping-stone goals by random truth tables of the
same intermediate size? Again, these also do not perform as well. The algo-
rithm works best in spaces where needs are ordered (achievable by repetitive
pattern), so that complexity can bootstrap itself by exploiting regularities in constructing complicated objects from simpler ones.
PROPERTIES OF THE NETWORK
Each technology (or encapsulated circuit) that is currently used to construct
a technology is a node in the network of active technologies, and if two or
more technologies are directly used to create a novel technology they show a
(directed) link to this technology. A given technology A therefore links to its user technologies—the technologies it directly makes possible. As illustrated in Figure 6, some technologies have many links—are used very heavily to construct new ones—others have few. Usage approximates a power law (yielding
a scale-free network) but by no means perfectly.
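The usage counts plotted in Figure 6 are out-degrees in this network. A sketch of the bookkeeping in Python (the `components` mapping is a hypothetical stand-in for the system’s internal records):

```python
from collections import Counter

def usage_counts(components, active):
    """Count how often each technology is used directly as a component.

    components: maps a technology to the list of technologies combined
    directly in its construction; active: the active repertoire.
    """
    uses = Counter()
    for tech in active:
        for part in components[tech]:
            uses[part] += 1
    return uses  # heavy-tailed: a few technologies are used very often
```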
From time to time a new superior way of executing a certain functionality
(or truth table function) is discovered. The new circuit may have fewer com-
ponents or perform that function better. In this case the new circuit replaces the old one in all its descendant technologies (all the technologies below it in the network that use it directly or indirectly as a component). Replacement is immediate in our algorithm.
Figure 6: Very few key technologies are used heavily to directly construct new ones. The plot shows the average over 20 experiments at their termination of 250,000 steps each. (Log-log plot of frequency of occurrence against number of uses of a technology.)
Figure 7: Gales of destruction (or avalanches of collapse in use), average over 20 experiments. (Log-log plot of frequency of occurrence against avalanche size, the number of technologies rendered obsolete.)
Replacement can also cause the collapse of technologies backward in the
network. Suppose Tech 124 is used to construct Tech 136. Then, when a supe-
rior way to achieve Tech 136’s functionality is found, Tech 124 may be left
with no further use (it may neither satisfy any goal, nor be used in any active technology). In this case technology 124 disappears from the active set of
technologies. As it disappears, some of its component technologies may in turn be left with no further use. These also disappear from the active set.
In this way technologies may be swept from active use in large or small ava-
lanches of collapse—Schumpeter’s “gales of destruction” [1].
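The backward collapse is in effect a reference-counting sweep over the use-network. A minimal sketch in Python, in our own formulation (`components`, `used_by`, and `satisfies_goal` are hypothetical stand-ins for the system’s records):

```python
def collapse(tech, active, components, used_by, satisfies_goal):
    """Retire `tech` from the active set, then recursively retire any component
    left with no remaining users and no goal that it satisfies on its own."""
    active.discard(tech)
    avalanche = [tech]
    for part in components[tech]:
        used_by[part].discard(tech)
        if part in active and not used_by[part] and not satisfies_goal(part):
            avalanche += collapse(part, active, components, used_by, satisfies_goal)
    return avalanche  # its length is the avalanche size plotted in Figure 7
```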
Figure 7 shows that such sandpile avalanches of collapse follow a power
law. The scale on the size axis does not extend far, however, because the num-
ber of technologies in the network is never large. We can take Figure 7 as suggesting that our system of technologies exists at self-organized criticality [5].
CONCLUSION
Using an artificial system, we have demonstrated how technology can boot-
strap itself from extreme simplicity to a complicated level, both in terms of
the number of objects created and their intricacy. In the real world, of course, novel technologies are not usually constructed by random combination, nor
are the needs for which technologies are created specified in a posted list.
Nevertheless, all novel technologies are constructed by combining assemblies and components that already exist; the needs they satisfy are usually clearly
signaled economically and technically; and existing technologies form a sub-
strate or library of building blocks for future ones. The model captures certain phenomena we see in real life. Most technologies are not particularly useful
as building blocks, but some (enabling technologies such as the laser or the
transistor) are key in creating descendant technologies. Within our model, we
find a strong indication that our collection of active technologies is subject to statistics similar to those of earthquakes and sand-piles: it exists at self-organized criticality. Our model also shows that the build-out of technology depends
crucially on the existence of earlier technologies constructed for intermediate or simpler needs. This mirrors the finding of Lenski et al. in biological systems that complex features can be created, but only if simpler functions are first
favored and act as stepping stones [2].
A COMMENT ON THE ALGORITHM
Just as biological evolution has been the model for genetic algorithms and
genetic programming, technology-based evolution may inspire a new form of
automatic programming and problem solving. The algorithm we develop here,
viewed abstractly, operates by discovering objects of intermediate complexity
and bootstrapping complication by using these as building blocks in further com-
binations. It bears some resemblance to other evolutionary algorithms such as genetic programming. But unlike these it does not attempt to solve a single given problem. Rather, it attempts to create a toolbox of useful functionalities that can then be applied to further problem solving. In doing so it sets up no parallel population of trial solutions for a problem. Instead it creates a growing collection of objects that might be useful in solving problems of increasing complexity within the same class. In this sense it does not resemble standard programming methods inspired by genetic mechanisms; rather, it is an abstraction of methods already used in combinatorial chemistry, synthetic biology, and software construction, which build libraries of objects for the creation of further objects.
EXPERIMENTAL CONDITIONS
Our experimental system is implemented in Common Lisp. Different sets of
goals can be added to the system manually. The detailed behavior of the sys-
tem is controlled by a number of configuration parameters. (The values we
give below are default ones.) Extensive experiments with different settings
have shown that our results are not particularly sensitive to the choice of these parameters.
To construct new circuits, at each step a small number of components (up to a maximum of 12) is selected and combined. The selection is made each
time by randomly drawing a component either from the set of primitives, or
the constants 0 and 1, or the set of circuits encapsulated as technologies, with probabilities 0.5, 0.015, and 0.485, respectively (and then choosing with equal probability within these sets). (For the purpose of selection, components that satisfy a goal exactly are added to the primitives’ set.) Selected components
may then be combined randomly to each other two circuits at a time, or to
combinations of each other, to form new circuits for testing. To combine two
circuits C₁ and C₂, each input of C₁ becomes an input of the combination; each input of C₂ becomes an input of the combination with probability 0.2; otherwise it is connected to a random output of C₁. All outputs of C₁ and C₂ become outputs of the combination. The step stops when a useful circuit has been
found, or when a limit to combinations tested has been reached.
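A sketch of one such construction step in Python, using the default probabilities above (the `Circuit` container and its field names are our own; the actual system operates on encapsulated circuits in Common Lisp):

```python
import random
from dataclasses import dataclass, field

@dataclass
class Circuit:  # hypothetical container standing in for an encapsulated circuit
    inputs: list
    outputs: list
    wiring: dict = field(default_factory=dict)   # input pin -> output feeding it
    parts: list = field(default_factory=list)

def draw_component(primitives, technologies):
    """Draw one building block with probabilities 0.5 / 0.015 / 0.485."""
    r = random.random()
    if r < 0.5:
        return random.choice(primitives)     # e.g. the NAND primitive
    elif r < 0.515:
        return random.choice([0, 1])         # the constants
    else:
        return random.choice(technologies)   # circuits encapsulated as technologies

def combine(c1, c2, p_new_input=0.2):
    """Every input of c1 stays an input; each input of c2 becomes a fresh input
    with probability 0.2, otherwise it is fed by a random output of c1.
    All outputs of both circuits become outputs of the combination."""
    inputs, wiring = list(c1.inputs), {}
    for pin in c2.inputs:
        if random.random() < p_new_input:
            inputs.append(pin)
        else:
            wiring[pin] = random.choice(c1.outputs)
    return Circuit(inputs, list(c1.outputs) + list(c2.outputs), wiring, [c1, c2])
```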
The cost of a circuit is the sum of the costs of its components. The cost of a primitive component is 1. The cost of a circuit encapsulated as a new technology/component is the number of its components. Thus, the cost of a technol-
ogy is less than the cost of the circuit it encapsulates, reflecting the idea that it becomes a commodity. We use the cost function to decide when to replace
an existing technology by a cheaper one.
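A sketch of this cost rule in Python (the `primitives` and `encapsulated` sets and the `parts` field are our own stand-ins):

```python
def cost(circuit, primitives, encapsulated):
    """Primitives cost 1. A circuit encapsulated as a technology costs only the
    number of its components, not their summed cost, reflecting the idea that
    an encapsulated technology becomes a commodity."""
    if circuit in primitives:
        return 1
    if circuit in encapsulated:
        return len(circuit.parts)
    return sum(cost(part, primitives, encapsulated) for part in circuit.parts)
```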
Logic functions are represented by binary decision diagrams (BDDs) [6, 7].
The phenotypes of goals and circuits are described by vectors of BDDs, one
for each output wire. BDDs make it efficient to determine the equality of two
logic functions. The representation also makes possible an efficient distance
measure on logic functions, namely the number of input values for which two
functions differ. We use this distance measure to define when one circuit C₁ better approximates a goal G than another circuit C₂. This is the case if for each output g of G, circuit C₁ computes a function f that is closer to g than any of the functions computed by C₂. Note that this relation is a partial order, i.e., two circuits need not be comparable. A circuit C is encapsulated as a new technology if there is a goal G and no other circuit better approximates G than C. Only outputs of C appropriate for G become outputs of the new component, possibly making some parts of C redundant. In general, several circuits may approximate the same goal G at the same time, as when each circuit best satisfies some aspect (subset of the outputs) of the goal but none strictly dominates the others.
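For illustration, the distance measure and the dominance relation can be written directly over enumerated truth tables; the actual system evaluates them on BDDs, and the representation of functions as Python callables here is our own simplification:

```python
from itertools import product

def distance(f, g, n_inputs):
    """Number of input assignments on which two logic functions differ."""
    return sum(f(*x) != g(*x) for x in product((0, 1), repeat=n_inputs))

def better_approximates(c1_outputs, c2_outputs, goal_outputs, n_inputs):
    """C1 better approximates the goal than C2 if, for every goal output g,
    some output function of C1 is strictly closer to g than every output
    function of C2. This is only a partial order: neither circuit may
    dominate the other."""
    return all(
        min(distance(f, g, n_inputs) for f in c1_outputs)
        < min(distance(f, g, n_inputs) for f in c2_outputs)
        for g in goal_outputs
    )
```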