The River of Consciousness

by Oliver Sacks


  The notion of perception as “given” in some seamless, overall way was finally shaken to its foundations in the late 1950s and early 1960s when David Hubel and Torsten Wiesel showed that there were cells and columns of cells in the visual cortex which acted as “feature detectors,” specifically sensitive to horizontals, verticals, edges, alignments, or other features of the visual field. The idea began to develop that vision had components, that visual representations were in no sense “given,” like optical images or photographs, but were constructed by an enormously complex and intricate correlation of different processes. Perception was now seen as composite, as modular, the interaction of a huge number of components. The integration and seamlessness of perception had to be achieved in the brain.

  It thus became clear in the 1960s that vision was an analytic process, depending on the differing sensitivities of a large number of cerebral and retinal systems, each tuned to respond to different components of perception. It was in this atmosphere of hospitality to subsystems and their integration that Zeki discovered specific cells sensitive to wavelength and color in the visual cortex of the monkey, and he found them in much the same area that Verrey had suggested as a color center eighty-five years before. Zeki’s discovery seemed to release clinical neurologists from their almost century-long inhibition. Within a few years, scores of new cases of achromatopsia were described, and it was at last legitimized as a valid neurological condition.

  That conceptual bias was responsible for the dismissal and “disappearance” of achromatopsia is confirmed by the completely opposite history of central motion blindness, an even rarer condition that was described in a single case by Josef Zihl and his colleagues in 1983.*8 Zihl’s patient could see people or cars at rest, but as soon as they began to move, they disappeared from her consciousness, only to reappear, motionless, in another place. This case, Zeki noted, was “immediately accepted by the neurological…and the neurobiological world, without a murmur of dissent…in contrast to the more turbulent history of achromatopsia.” This dramatic difference stemmed from the profound change in intellectual climate which had come about in the years immediately before. In the early 1970s it had been shown that there was a specialized area of motion-sensitive cells in the prestriate cortex of monkeys, and the idea of functional specialization was fully accepted within a decade. There was no longer any conceptual reason for rejecting Zihl’s findings—indeed, quite the contrary; they were embraced with delight, as a superb piece of clinical evidence in consonance with the new climate.

  That it is crucially important to take note of exceptions—and not forget them or dismiss them as trivial—was brought out in Wolfgang Köhler’s first paper, written in 1913, before his pioneer work in gestalt psychology. In his paper, “On Unnoticed Sensations and Errors of Judgment,” Köhler wrote of how premature simplifications and systemizations in science, psychology in particular, could ossify science and prevent its vital growth. “Each science,” he wrote, “has a sort of attic into which things are almost automatically pushed that cannot be used at the moment, that do not quite fit….We are constantly putting aside, unused, a wealth of valuable material [that leads to] the blocking of scientific progress.”*9

  At the time that Köhler wrote this, visual illusions were seen as “errors of judgment”—trivial, of no relevance to the workings of the mind-brain. But Köhler would soon show that the opposite was the case, that such illusions constituted the clearest evidence that perception does not just passively “process” sensory stimuli but actively creates large configurations or “gestalts” that organize the entire perceptual field. These insights now lie at the heart of our present understanding of the brain as dynamic and constructive. But it was first necessary to seize on an “anomaly,” a phenomenon contrary to the accepted frame of reference, and, by according it attention, to enlarge and revolutionize that frame of reference.

  Can we draw any lessons from the examples I have been discussing? I believe we can. One might first invoke the concept of prematurity here and see the nineteenth-century observations of Herschel, Weir Mitchell, Tourette, and Verrey as having come before their times, so that they could not be integrated into contemporary conceptions. Gunther Stent, considering “prematurity” in scientific discovery in 1972, wrote, “A discovery is premature if its implications cannot be connected by a series of simple logical steps to canonical, or generally accepted, knowledge.” He discussed this in relation to the classic case of Gregor Mendel, whose work on plant genetics was so far ahead of its time, as well as the lesser-known but fascinating case of Oswald Avery, who demonstrated in 1944 that DNA is the carrier of hereditary information—a discovery totally overlooked because no one could yet appreciate its importance.*10

  Had Stent been a geneticist rather than a molecular biologist, he might have recalled the story of the pioneer geneticist Barbara McClintock, who in the 1940s developed a theory—of so-called jumping genes—which was almost unintelligible to her contemporaries. Thirty years later, when the atmosphere in biology had become more hospitable to such notions, McClintock’s insights were belatedly recognized as a fundamental contribution to genetics.

  Had Stent been a geologist, he might have given another famous (or infamous) example of prematurity—Alfred Wegener’s theory of continental drift, proposed in 1915, forgotten or derided for many years, but then rediscovered forty years later with the rise of plate tectonics.

  Had Stent been a mathematician, he might even have cited, as an astonishing example of “prematurity,” Archimedes’s invention of calculus two thousand years before Newton’s and Leibniz’s.

  And had he been an astronomer, he might have spoken not merely of a forgetting but of a most momentous regression in the history of astronomy. Aristarchus, in the third century B.C., clearly established a heliocentric picture of the solar system that was well understood and accepted by the Greeks. (It was further amplified by Archimedes, Hipparchus, and Eratosthenes.) Yet Ptolemy, five centuries later, turned this on its head and proposed a geocentric theory of almost Babylonian complexity. The Ptolemaic darkness, the scotoma, lasted 1,400 years, until a heliocentric theory was reestablished by Copernicus.

  Scotoma, surprisingly common in all fields of science, involves more than prematurity; it involves a loss of knowledge, a forgetting of insights that once seemed clearly established, and sometimes a regression to less perceptive explanations. What makes an observation or a new idea acceptable, discussable, memorable? What may prevent it from being so, despite its clear importance and value?

  Freud would answer this question by emphasizing resistance: the new idea is deeply threatening or repugnant, and hence is denied full access to the mind. This doubtless is often true, but it reduces everything to psychodynamics and motivation, and even in psychiatry this is not enough.

  It is not enough to apprehend something, to “get” something, in a flash. The mind must be able to accommodate it, to retain it. The first barrier lies in allowing oneself to encounter new ideas, to create a mental space, a category with potential connection—and then to bring these ideas into full and stable consciousness, to give them conceptual form, holding them in mind even if they contradict one’s existing concepts, beliefs, or categories. This process of accommodation, of spaciousness of mind, is crucial in determining whether an idea or discovery will take hold and bear fruit or whether it will be forgotten, fade, and die without issue.

  We have spoken of discoveries or ideas so premature as to be almost without connection or context, hence unintelligible, or ignored, at the time, and of other ideas passionately, even ferociously, contested in the necessary but often brutal agon of science. The history of science and medicine has taken much of its shape from intellectual rivalries that force scientists to confront both anomalies and deeply held ideologies. Such competition, in the form of open and straightforward debate and trial, is essential to scientific progress.*11 This is “clean” science, in which friendly or collegial competition encourages an advance in understanding—but there is a good deal of “dirty” science too, in which competition and personal rivalry become malignant and obstructive.

  If one aspect of science lies in the realm of competition and rivalry, another one springs from epistemological misunderstanding and schism, often of a very fundamental sort. Edward O. Wilson describes in his autobiography, Naturalist, how James Watson regarded Wilson’s early work in entomology and taxonomy as no more than “stamp collecting.” Such a dismissive attitude was almost universal among molecular biologists in the 1960s. (Ecology, similarly, was scarcely allowed status as a “real” science in those days and is still seen as much “softer” than, for example, molecular biology—a mind-set that is only now beginning to shift.)

  Darwin often remarked that no man could be a good observer unless he was an active theorizer as well. As Darwin’s son Francis wrote, his father seemed “charged with theorising power ready to flow into any channel on the slightest disturbance, so that no fact, however small, could avoid releasing a stream of theory, and thus the fact became magnified into importance.” Theory, though, can be a great enemy of honest observation and thought, especially when it hardens into unstated, perhaps unconscious, dogma or assumption.

  Undermining one’s existing beliefs and theories can be a very painful, even terrifying, process—painful because our mental lives are sustained, consciously or unconsciously, by theories, sometimes invested with the force of ideology or delusion.

  In extreme cases scientific debate can threaten to destroy the belief systems of one of the antagonists and with this, perhaps, the beliefs of an entire culture. Darwin’s publication of the Origin in 1859 instigated furious debates between science and religion (embodied in the conflict between Thomas Huxley and Bishop Wilberforce), and the violent but pathetic rearguard actions of Agassiz, who felt that his lifework, and his sense of a creator, were annihilated by Darwin’s theory. The anxiety of obliteration was such that Agassiz actually went to the Galápagos himself and tried to duplicate Darwin’s experience and findings in order to repudiate his theory.*12

  Philip Henry Gosse, a great naturalist who was also deeply devout, was so torn by the debate over evolution by natural selection that he was driven to publish an extraordinary book, Omphalos, in which he maintained that fossils do not correspond to any creatures that ever lived, but were merely put in the rocks by the Creator to rebuke our curiosity—an argument which had the unusual distinction of infuriating zoologists and theologians in equal measure.

  It has sometimes surprised me that chaos theory was not discovered or invented by Newton or Galileo; they must have been perfectly familiar, for example, with the phenomena of turbulence and eddies which are constantly seen in daily life (and so consummately portrayed by Leonardo). Perhaps they avoided thinking of their implications, foreseeing these as potential infractions of a rational, lawful, orderly Nature.

  This is much what Henri Poincaré felt more than two centuries later, when he became the first to investigate the mathematical consequences of chaos: “These things are so bizarre that I cannot bear to contemplate them.” Now we find the patterns of chaos beautiful—a new dimension of nature’s beauty—but this was certainly not how it originally seemed to Poincaré.

  The most famous example of such repugnance in our own century is, of course, Einstein’s violent distaste for the seemingly irrational nature of quantum mechanics. Even though he himself had been one of the very first to demonstrate quantum processes, he refused to consider quantum mechanics anything more than a superficial representation of natural processes, which would give way, with deeper insight, to a more harmonious and orderly one.

  With great scientific advances, there is often both fortuity and inevitability. If Watson and Crick had not cracked the double helix of DNA in 1953, Linus Pauling would almost certainly have done so. The structure of DNA, one might say, was ready to be discovered, though who did it, and how, and exactly when, remained unpredictable.

  The greatest creative achievements arise not only from extraordinarily gifted men and women but from their being confronted by problems of enormous universality and magnitude. The sixteenth century was a century of genius not because there were more geniuses around but because the understanding of the laws of the physical world, more or less petrified since the time of Aristotle, was beginning to yield to the insights of Galileo and others who believed that the language of Nature was mathematics. In the seventeenth century, similarly, the time was ripe for the invention of calculus, and it was devised by both Newton and Leibniz almost simultaneously, though in entirely different ways.

  In Einstein’s time, it was increasingly clear that the old mechanical, Newtonian worldview was insufficient to explain various phenomena—among them the photoelectric effect, Brownian motion, and the change of mechanics near the speed of light—and had to collapse and leave a rather frightening intellectual vacuum before a radically new concept could be born.

  But Einstein also took pains to say that a new theory does not invalidate or supersede the old but rather “allows us to regain our old concepts from a higher level.” He expanded this notion in a famous simile:

  To use a comparison, we could say that creating a new theory is not like destroying an old barn and erecting a skyscraper in its place. It is rather like climbing a mountain, gaining new and wider views, discovering unexpected connections between our starting point and its rich environment. But the point from which we started out still exists and can be seen, although it appears smaller and forms a tiny part of our broad view gained by the mastery of the obstacles on our adventurous way up.

  Helmholtz, in his memoir On Thought in Medicine, also used the image of a mountain climb (he was an ardent alpinist), describing the climb as anything but linear. One cannot see in advance, he wrote, how to climb a mountain; it can only be climbed by trial and error. The intellectual mountaineer makes false starts, turns into blind alleys, finds himself in untenable positions, and often has to backtrack, descend, and start again. Slowly and painfully, with innumerable errors and corrections, he makes his zigzag way up the mountain. It is only when he reaches the summit that he will see that there was, in fact, a direct route, a “royal road,” to the top. In presenting his ideas, Helmholtz says, he takes his readers along this royal road, but it bears no resemblance to the crooked and tortuous processes by which he constructed a path for himself.

  Often there is some intuitive and inchoate vision of what must be done, and this vision, once glimpsed, drives the intellect forward. Thus Einstein at the age of fifteen had fantasies about riding a light beam and ten years later developed the theory of special relativity, going from a boy’s dream to the grandest of theories. Was the achievement of the theory of special relativity, and then of general relativity, part of an ongoing, inevitable historical process? Or the result of a singularity, the advent of a unique genius? Would relativity have been conceived in Einstein’s absence? And how quickly would relativity have been accepted had it not been for the solar eclipse of 1919, which, by a rare chance, allowed the theory to be confirmed by accurate observation of the effect of the sun’s gravity on light? One senses the fortuitous here—and, not trivially, a requisite level of technology, one which could measure Mercury’s orbit accurately. Neither “historical process” nor “genius” is an adequate explanation—each glosses over the complexity, the chancy nature, of reality.

  “Chance favors the prepared mind,” as Louis Pasteur famously said, and Einstein was, of course, intensely alert, primed to perceive and seize whatever he could use. But if Riemann and other mathematicians had not developed non-Euclidean geometries (they had been worked out as pure abstract constructions, with no notion that they might be appropriate to any physical model of the world), Einstein would not have had the intellectual techniques available to move from a vague vision to a fully developed theory.

  A number of isolated, autonomous, individual factors must converge before the seemingly magical act of a creative advance, and the absence (or insufficient development) of any one may suffice to prevent it. Some of these factors are worldly ones—sufficient funding and opportunity, health and social support, the era into which one was born. Others have to do with innate personality and intellectual strengths or weaknesses.

  In the nineteenth century, an era of naturalistic description and phenomenological passion for detail, a concrete habit of mind seemed highly appropriate, and an abstract or ratiocinating one was suspect—an attitude beautifully brought out by William James in his famous essay on Louis Agassiz, the eminent biologist and natural historian:

  The only man he really loved and had use for was the man who could bring him facts. To see facts, not to argue or [reason], was what life meant for him; and I think he often positively loathed the ratiocinating type of mind….The extreme rigor of his devotion to this concrete method of learning was the natural consequence of his own peculiar type of intellect, in which the capacity for abstraction and causal reasoning and tracing chains of consequences from hypotheses was so much less developed than the genius for acquaintance with vast volumes of detail, and for seizing upon analogies and relations of the more proximate and concrete kind.

  James describes how the young Agassiz, coming to Harvard in the mid-1840s, “studied the geology and fauna of a continent, trained a generation of zoologists, founded one of the chief museums of the world, gave a new impulse to scientific education in America”—and all this through his passionate love of phenomena and facts, of fossils and living forms, his lyrical concreteness of mind, his scientific and religious sense of a divine system, a whole. But then there came a transformation: zoology itself was changing from a natural history, intent on wholes—species and forms and their taxonomic relationships—to studies in physiology, histology, chemistry, pharmacology, a new science of the micro, of mechanisms and parts abstracted from a sense of the organism and its organization as a whole. Nothing was more exciting, more potent than this new science, and yet it was clear that something was being lost, too. It was a transformation to which Agassiz’s mind could not well adapt, and he was pushed, in his later years, away from the center of scientific thought, becoming an eccentric and tragic figure.*13

 
