The Cybernetic Brain


by Andrew Pickering


  12. Walter later interpreted the other brain rhythms as also modes of adaptive search: the delta rhythms were associated with a search for order and stability; thetas were a search for specific pleasurable entities. See Hayward (2001b).

  13. In Cybernetics Wiener also discussed the alpha rhythms as a scanning mechanism, citing Walter, and he developed this idea further in the second edition (1948, 141; 1961, 198). See also John Stroud's contribution (1950) to the sixth Macy conference.

  14. The prototypical version of this argument is to be found in Rosenblueth, Wiener, and Bigelow's foundational essay "Behavior, Purpose and Teleology" (1943). Rosenblueth et al. suggest that purposeful behavior in humans, animals, and machines can be understood in terms of a single model: the servomechanism with negative feedback. The tortoise can be seen as a specific instantiation of this idea.

  15. The notion of exploration is important here and marks a difference between the cybernetic devices we will be examining and earlier ones such as the thermostat and the steam-engine governor. Those earlier devices did not interrogate their environments; they were "hard wired" to respond to certain features: if the temperature goes up, the thermostat automatically turns the heating down, and so on. The tortoise, in contrast, went looking for specific elements in the environment, like lights. This, of course, contributed to the lively and lifelike quality of its behavior, and its effectiveness in undermining the boundary between animals and machines.
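  The contrast drawn in this note can be put schematically. The following Python sketch is purely illustrative and corresponds to nothing in Walter's circuitry; the function names and numbers are invented. It simply sets a hard-wired thermostat rule alongside a scan-and-steer routine, to show that the second interrogates its environment while the first only reacts to one fixed quantity.

```python
# Illustrative sketch only: a hard-wired negative-feedback rule versus an
# exploratory light-seeker. Names and numbers are invented for this example.

def thermostat_step(temperature, setpoint, heater_on):
    """Hard-wired response: the rule never varies and nothing is explored."""
    if temperature > setpoint:
        return False          # too warm: switch the heating off
    if temperature < setpoint:
        return True           # too cool: switch the heating on
    return heater_on          # at the setpoint: leave the heater as it is

def exploratory_step(light_levels):
    """Tortoise-like behaviour: scan the surroundings and steer toward
    whatever light the scan happens to find."""
    if max(light_levels) == 0:
        return "keep scanning"    # nothing found yet: go on exploring
    return f"steer toward direction {light_levels.index(max(light_levels))}"

# The thermostat answers one fixed question; the explorer's behaviour
# depends on what its scan of the environment turns up.
print(thermostat_step(21.5, setpoint=20.0, heater_on=True))   # False
print(exploratory_step([0, 0, 3, 1]))                         # steer toward direction 2
```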

  16. The tortoise's predecessors and contemporaries typically did not have the opportunity to display this kind of inscrutable variability. The prewar brain models in experimental psychology discussed by Cordeschi (2002) usually aimed to emulate a narrowly circumscribed range of performances, such as the adaptation of the eye to varying light intensities or simple phototropism, and they either succeeded or failed in that predetermined task—likewise the cybernetic maze-running robots of the early 1950s mentioned above.

  17. The individual mechanical and electrical components of the tortoise were likewise nonadaptive—the electronic valves, say, did not change their properties in the course of the tortoise's explorations. Adaptation consisted in variations of the interconnections of parts. It is, of course, hard to imagine building any fully adaptive system. Models for the latter might be biological evolution, including the evolution of the nonbiological environment, and the coevolution of science, technology, and society.

  18. Strictly speaking, the tortoises were capable of modifying their environment, but only in a simple and mechanical fashion: small obstacles moved when the tortoises bumped into them, which helped the tortoises to navigate past them. And in multitortoise configurations, the tortoises constituted lively environments for each other, as in the mating dance. But Walter's writings thematize neither of these observations, and it is better to postpone this phase of the ontological discussion to the next chapter.

  19. Craik was the author of a highly original book on philosophy and psychology and of a posthumous collection of essays (Craik 1943, 1966). Craik's essay "Theory of the Human Operator in Control Systems" (1947) derives from his wartime research and outlines a detailed vision of the mechanical re-creation of the human operator of a weapons system. On Craik and his relation to cybernetics in Britain, see Hayward (2001b, 295–99), Clark (2002), Gregory (1983), and Zangwill (1980). John Stroud's paper (1950) at the sixth Macy conference attests to the importance of Craik's work in the history of U.S. cybernetics.

  20. The quote continues, "But it will be a worse 'animal' for though it will keep more closely to its beam it will have to be aimed roughly in the right direction and will not 'speculate'—that is, spy out the land—nor will it solve Buridan's dilemma [of choosing between two targets]." I think "the usual way" here harks back to the prewar tradition in experimental psychology of constructing phototropic robots. As discussed by Cordeschi (2002), such robots typically used a pair of photocells to home in on their targets.

  21. For more on this, see Holland (2003, 2090–91), which includes a description of Walter's workshop.

  22. In this connection it is also profitable to read Warren McCulloch's biographical account of his own route to cybernetics: McCulloch (2004).

  23. On the Burden Neurological Institute's transformation during World War II, Cooper and Bird (1989, 14) record that "less than six months after the opening ceremony [at the Burden] war started and the Emergency Medical Service used the Institute as a neurological hospital for the whole of the West Country, it possessing the only neurosurgical theatre West of London. Despite the strain due to the requirements of the neurosurgical unit, the laboratories continued to function as centres for clinical research in neurology and psychiatry." Walter was presumably exempt from military service during the war by virtue of his occupation. One can get some feeling for his work during the war from an undated handwritten document by him entitled "The Genesis of Frenchay" (Science Museum, BNI papers, 6/38). Frenchay was a hospital built close to the Burden, "as a sort of reverse land-lease project to provide the U.S. Army Medical Corps with a clinical service not too far from the rapidly advancing front after D-Day. . . . My task was to help the neurologists & neurosurgeons by taking EEGs with a home-made portable in cases of head-wounds." After the Battle of the Bulge, "I can still see in vivid horror what is now Ward 2 with beds touching, and the pressure on, not only for treatment, but to get the men back to the States as soon as possible for local morale over there." Besides his EEG work, "there were also cases of 'battle fatigue' in the Hospital (what we called 'shell shock' in the first War) and I was engaged to help with these too, by giving ECT."

  24. In a quite different context, the history of the electronic synthesizer, Pinch and Trocco (2002, 280) make the same point about the importance of postwar army-surplus electronics, referring specifically to Lisle Street in London, "just full from one end to the other of second-hand, ex-Army, electronic stores."

  25. In another field, psychology, Richard Gregory's Mind in Science: A History of Explanations in Psychology and Physics (1981) refers only briefly to Walter, mainly in reproducing the circuit diagrams for the tortoises and CORA (286, 287). (His reference to Ashby is even briefer [82], and there is no mention of Bateson, Laing, Beer, or Pask.) We should note that Walter did find some support at the Burden. As noted, it was Bunny Warren, a Burden engineer, who built the machines that Walter displayed in the 1950s. But there is no indication of any sustained support for Walter's work there. A book on the history of the Burden coauthored by Ray Cooper, who worked there from 1955 onward and was director from 1971 until his retirement in 1988, devotes a chapter to Walter's work on the electrophysiology of the brain but mentions the tortoises only in passing, with the comment: "Cybernetics suffered from the fact that it was fine in theory [sic] but whenever a practical application was attempted the whole thing appeared to be facile. . . . A pity, for some general theory of brain function (if one exists) would be most welcome" (Cooper and Bird 1989, 45).

  26. Bates's letter and Walter's reply are in the Wellcome Archives, GC/179/B.1.

  27. This information is taken from an excellent PhD dissertation, Clark (2002). See Clark (77–80) on the Ratio Club; the quotation from Walter is from a letter to Bates dated 29 September 1949 (Clark 2002, 80n221). Clark includes an appendix collecting archival documentation on the Ratio meetings (207–15). At the meeting on 22 February 1951, Walter gave the address "Adaptive Behaviour" and demonstrated the tortoises; on 31 May 1951, Ashby delivered "Statistical Machines" (the homeostat, etc.); on 6 November 1952, Walter discussed Ashby's new book, Design for a Brain; the meeting on 2 July 1953 again featured a visit by McCulloch; and the meeting of 6 May 1955 included a demonstration at Barnwood House (Ashby's place of work) and lunch at the Burden Institute (Walter's).

  28. The Ratio Club's membership list grew somewhat relative to the people named in Bates's letter to Walter, the most eminent addition being the mathematician and computer pioneer Alan Turing, invited by Bates in a letter dated 22 September 1949 (Clark 2002, 80; on Turing,
see Hodges 1983). If one focuses on the Ratio Club as a whole, the case for a military origin of British cybernetics looks much stronger, and Clark makes such a case (2002, 77): "Craik appears . . . as the lost leader of a group of physiologists and psychologists, who as a consequence of their wartime redeployment, developed an interest in electronics and control mechanisms. The interest in electrical mechanisms in the nervous system was not new, it had been established before the war. But the redirection of research imposed by necessity, moved it toward engineering technology rather than pure science, something that, but for this wartime demand, might not have had such comprehensive influence." It is worth emphasizing therefore that neither Walter nor Ashby was mobilized as part of the war effort. Walter continued his work at the Burden throughout the war (n. 23 above) and Ashby worked at Barnwood House mental hospital until June 1945, when he was called up for military service in India.

  29. The lack of sociologists was in contrast to the important presence of social scientists at the Macy conferences, including Margaret Mead and Gregory Bateson. In this respect the U.S. cybernetics community was even more markedly interdisciplinary than the British one. See Heims (1991).

  30. Echoing the heterogeneity of these institutions, Cooper and Bird (1989, 20) note that "throughout the history of the [Burden] Institute there have been serious attempts to establish some formal links with the University of Bristol. For all kinds of reasons, some good some bad, often to do with finance, these never succeeded."

  31. At the first Namur conference, twenty-four papers were presented in the session "Cybernetics and Life," at which Walter presided. Several of the authors are listed in the Proceedings simply as doctors of medicine; others as neuropsychiatrists and neurosurgeons; Reginald Goldacre gave the Royal Cancer Hospital in London as his affiliation. Academics came from departments including electrical engineering, physics, physiology, and philosophy. Two authors were based in the cybernetics group of the Max Planck Institute for Biology in Tübingen. Albert Uttley listed the British National Physical Laboratory; French authors recorded affiliations to the Fédération Nationale de l'Automation, the Conseil en Organisation électronique et nucléaire pour l'Industrie, and the CNRS.

  32. Other robots modelled on the tortoise included an American robot squirrel called Squee, built by Edmund Berkeley (Berkeley 1952), dubbed M. speculatrix berkeleyi by Walter (1953, 132), and "la tortue du Vienne," built by Heinz Zemanek, an engineer at the Technical University in Vienna, and exhibited at the first Namur conference (Zemanek 1958). Zemanek also exhibited a copy of Ashby's homeostat. I thank Garnet Hertz for making me aware of Berkeley's manuscript.

  33. In his early work in science studies, Harry Collins (1974) emphasized how difficult it is to replicate an experiment with only a published description to rely upon and how important personal contacts are. My point concerning cybernetics is the obverse of this.

  34. See people.csail.mit.edu/brooks/.

  35. On Allen, see Brooks (2002, 32–44). For a technical discussion of Brooks's robotics at this time, including subsumption architecture and Allen (not named), see Brooks (1999 [1986]).

  36. Though not central to our story, a key event here was the extinction in the 1960s of funding for neural-network research stemming from the work of McCulloch and Pitts (1943), in favor of support for symbolic AI: see Olazaran (1996). Like Brooks's robotics, neural networks, too, emerged from the shadow of representational AI in the 1980s. The contrast between cybernetics and AI is sometimes phrased as that between analog and digital computing. Historically this stands up, but the continuity between Walter's analog machines and Brooks's digital ones suggests that the key difference does not lie there, but rather in the overall nature of the projects and their respective emphases on adaptive performance or symbolic representation.

  37. The standard AI response is, of course, wait for the next generation of processors.

  38. Brooks explicitly conceived the contrast between his approach and mainstream robotics as that between performative and representational approaches: see his essays "Intelligence without Reason" (1999 [1991]) and "Intelligence without Representation" (1999 [1995]).

  39. At the time of writing, details of the meeting are still to be found online at www.ecs.soton.ac.uk/~rid/wgw02/first.html. The proceedings were published as Damper (2003) but do not cover the full diversity of presentations (a keynote talk by Brooks is missing, for example). It is significant to note that the revival of this style of robotics coincided in the mid-1980s with a revival of neural network research in AI (n. 36 above), and, as is evident in the proceedings, neural networks have become an important feature of robot development. Here two branches of the original cybernetic synthesis as laid out by Wiener have been reunited in a symbiosis of research in engineering and brain science that echoes but goes far beyond Walter's work. A glance at the proceedings of the conference makes clear that the resurrection of cybernetics in robotics was not the work of Brooks alone; I discuss him as a conspicuous and illuminating example of this process. At much the same time as Brooks changed his style in robotics, Valentino Braitenberg published a book called Vehicles: Experiments in Synthetic Psychology (1984), which reviewed the imagined performances of fourteen different conceptual variants of tortoise-style robots, each of which could mimic distinct psychological performances (from "getting around," "fear and aggression," and "love" up to "egotism and optimism"). When he wrote the book, Braitenberg was director of the Max Planck Institute of Biological Cybernetics in Germany.
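  Braitenberg's simplest vehicles can be specified in a few lines. The Python sketch below is not taken from his book; it only illustrates, under the most minimal assumptions (two light sensors, two wheels, excitatory connections), how crossing or not crossing the sensor-motor wiring yields turning-away or turning-toward behaviour of the kind he glosses with labels such as "fear" and "aggression."

```python
# Illustrative sketch of Braitenberg-style sensor-motor wiring, not code from
# his text: two light sensors drive two wheels, either uncrossed (each sensor
# to the wheel on its own side) or crossed. The psychological labels are
# Braitenberg's glosses; here they are only mnemonic.

def wheel_speeds(left_light, right_light, crossed):
    """Return (left_wheel, right_wheel) speeds for excitatory connections."""
    if crossed:
        # Crossed wiring: the brighter side drives the opposite wheel,
        # so the vehicle turns toward the light.
        return right_light, left_light
    # Uncrossed wiring: the brighter side drives its own wheel,
    # so the vehicle turns away from the light.
    return left_light, right_light

# A light off to the vehicle's left (left sensor reads higher than the right):
print(wheel_speeds(0.9, 0.2, crossed=False))  # (0.9, 0.2): veers right, away from the light
print(wheel_speeds(0.9, 0.2, crossed=True))   # (0.2, 0.9): veers left, toward the light
```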

  40. More could be said on how Brooks and others managed to redirect their research fields, never an easy task in academia. The simple answer is that the performativity of Brooks's robots was attractive to outside funding agencies, which, in the United States at least, is a powerful argument in the university and makes it possible to support and train graduate students and postdoctoral fellows. Thus, the long list of acknowledgments in Brooks (2002, vii–viii) begins with "all my students over the years who have contributed to building robots and helping to solve the problems of making life-like behavior from nonlifelike components," immediately followed by "my sponsors at DARPA (the Defense Advanced Research Projects Agency) and the Office of Naval Research, and more recently from NTT (Nippon Telegraph and Telephone Corporation), who have had patience and faith over the years that something good would come of my crazy ideas." Information on Brooks's company, iRobot, can be found at www.irobot.com/home.cfm. Until recently, one product of iRobot was a robotic doll for children, but, echoing the failure to turn the tortoise into a commercial toy, this no longer appears on the website. The products now listed are Roomba, a robot vacuum cleaner, and PackBot, "a portable unmanned vehicle [which] is helping to protect soldiers." Exploration of this website reveals that much of the research and development at iRobot is funded by U.S. military agencies and directed toward military ends.

  41. To take this thread of the story a bit further, we could note that much current research on the mind, brain, and consciousness is recognizably in the Walter-Ashby model-building tradition. See, for example, Edelman (1992), who, despite considerable historical erudition, manages not to mention cybernetics.

  42. Walter (1951, 62): "This process may of course be accelerated by formal education: instead of waiting for the creature to hit a natural obstacle the experimenter can blow the whistle and kick the model. After a dozen kicks the model will know that a whistle means trouble, and it can thus be guided away from danger by its master."

  43. The "great Pavlov" quotation is from Walter (1966, 10), which speaks of "a period when I was working in Cambridge under the direction of the great I. P. Pavlov."

  44. There are some further interesting ideas about CORA and memory in Walter (1951, 63): "In M. Docilis the memory of association is formed by electrical oscillations in a feedback circuit. The decay of these oscillations is analogous to forgetting; their evocation, to recall. If several learning pathways are introduced, the creature's oscillatory memory becomes endowed with a very valuable feature: the frequency of each oscillation, or memory, is its identity tag. A latent memory can be detected and identified by a process of frequency analysis, and a complex of memories can be represented as a synthesis of oscillations which yields a characteristic wave pattern. Furthermore a 'memory' can be evoked by an internal signal at the correct frequency, which resonates with the desired oscillation. The implications of these effects are of considerable interest to those who study the brain, for the rhythmic oscillation is the prime feature of brain activity."
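  The frequency-tagging scheme Walter describes can be illustrated numerically. The following Python sketch is a loose analogue, not a model of CORA's valve circuitry, and its frequencies and decay rates are invented: it sums two decaying oscillations and recovers their identities from the spectrum, which is all that "detection by frequency analysis" amounts to here.

```python
# Numerical illustration of the quoted idea, not a model of CORA's circuits:
# each "memory" is a decaying oscillation with its own frequency, and latent
# memories are identified by frequency analysis of the summed signal.
# All frequencies and decay rates below are invented for the example.

import numpy as np

sample_rate = 1000                      # samples per second
t = np.arange(0, 2.0, 1 / sample_rate)  # two seconds of simulated activity

def memory_trace(freq_hz, decay_per_s):
    """A decaying oscillation: the decay stands in for forgetting."""
    return np.exp(-decay_per_s * t) * np.sin(2 * np.pi * freq_hz * t)

# Two learned associations, tagged by frequency (say 7 Hz and 12 Hz).
signal = memory_trace(7, 0.5) + memory_trace(12, 1.5)

# "Recall by frequency analysis": peaks in the spectrum identify which
# memories are latent in the combined waveform.
spectrum = np.abs(np.fft.rfft(signal))
freqs = np.fft.rfftfreq(len(signal), 1 / sample_rate)
peaks = freqs[np.argsort(spectrum)[-2:]]
print(sorted(int(round(f)) for f in peaks))  # expected: [7, 12]
```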

  45. At the first Namur conference in 1956, Heinz Zemanek's replication of the tortoise included a very much simplified version of CORA, which used a resistance with a negative temperature coefficient instead of the sophisticated differentiating and integrating circuits devised by Walter (Zemanek 1958).

  46. We could return here to the cybernetic discovery of complexity. Like the tortoise, the CORA-equipped tortoise displayed emergent properties that Walter had not designed into it, some of which he was not able to explain (1953, 180–82). He constructed a complicated after-the-fact explanation of M. docilis's display of different patterns of conditioning, analogous to "appetitive and defensive reflexes" in animals, when sounds were associated with lights or with obstacles, but also found that "such models show inexplicable mood changes. At the beginning of an experiment the creature is timid but accessible, one would say, to gentle reason and firm treatment; later, as the batteries run down there is a paradoxical reversal of attitude; either the reflex or the acquired response may be lost altogether, or there may be swings from intractability to credulity. Such effects are inevitable; however carefully the circuits are designed, minute differences and changes are cumulatively amplified to generate temperaments and tempers in which we can see most clearly how variations in quantity certainly do, in such a system, become variations in quality." Docilis, like speculatrix, can thus be seen in itself as an instructive ontological icon, demonstrating again that systems of known parts can display unexpected behavior, remaining, in this sense, Black Boxes knowable only in their performances.

 
