Such categories, he feels, depend on the “values” of the organism, those biases or dispositions (partly innate, partly learned) which, for Freud, were characterized as “drives,” “instincts,” and “affects.” The attunement here between Freud’s views and Edelman’s is striking; here, at least, one has the sense that psychoanalysis and neurobiology can be fully at home with one another, congruent and mutually supportive. And it may be that in this equation of Nachträglichkeit with “recategorization” we see a hint of how the two seemingly disparate universes—the universes of human meaning and of natural science—may come together.
* * *
*1 It was generally felt at this time that the nervous system was a syncytium, a continuous mass of nerve tissue, and it was not until the late 1880s and 1890s, through the efforts of Ramón y Cajal and Waldeyer, that the existence of discrete nerve cells—neurons—was appreciated. Freud, however, came very close to discovering this himself in his early studies.
*2 Freud published a number of neuroanatomical studies while in Meynert’s lab, focusing especially on the tracts and connections of the brain stem. He often called these anatomical studies his “real” scientific work, and he subsequently considered writing a general text on cerebral anatomy, but the book was never finished, and only a very condensed version of it was ever published, in Villaret’s Handbuch.
*3 If a strange silence or blindness attended Hughlings Jackson’s work (his Selected Writings were only published in book form in 1931–32), a similar neglect attended Freud’s book on aphasia. More or less ignored on publication, On Aphasia remained virtually unknown and unavailable for many years—even Henry Head’s great monograph on aphasia, published in 1926, makes no reference to it—and was only translated into English in 1953. Freud himself spoke of On Aphasia as “a respectable flop” and contrasted this with the reception of his more conventional book on the cerebral paralyses of infancy:
There is something comic about the incongruity between one’s own and other people’s estimation of one’s work. Look at my book on the diplegias, which I knocked together almost casually, with a minimum of interest and effort. It has been a huge success….But for the really good things, like the “Aphasia,” the “Obsessional Ideas,” which threatens to appear shortly, and the coming aetiology and theory of the neuroses, I can expect no more than a respectable flop.
*4 The same problem was also suggested to Joseph Babinski, another young neurologist attending Charcot’s clinics (and later to become one of the most famous neurologists in France). While Babinski agreed with Freud on the distinction between organic paralyses and hysterical ones, he later came to consider, when examining injured soldiers in World War I, that there was “a third realm”: paralyses, anesthesias, and other neurological problems based neither on localized anatomical lesions nor on “ideas” but on broad “fields” of synaptic inhibitions in the spinal cord and elsewhere. Babinski spoke here of a “syndrome physiopathique.” Such syndromes, which may follow gross physical trauma or surgical procedures, have puzzled neurologists since Silas Weir Mitchell first described them in the American Civil War, for they may incapacitate diffuse areas of the body which have neither specific innervation nor affective significance.
*5 Freud never reclaimed his manuscript from Fliess, and it was presumed lost until the 1950s, when it was finally found and published—although what was found was only a fragment of the many drafts Freud wrote in late 1895.
*6 The inseparability of memory and motive, Freud pointed out, opened the possibility of understanding certain illusions of memory based on intentionality: the illusion that one has written to a person, for instance, when one has not but intended to, or that one has run the bath when one has merely intended to do so. We do not have such illusions unless there has been a preceding intention.
The Fallibility of Memory
In 1993, approaching my sixtieth birthday, I started to experience a curious phenomenon—the spontaneous, unsolicited rising of early memories into my mind, memories that had lain dormant for upwards of fifty years. Not merely memories, but frames of mind, thoughts, atmospheres, and passions associated with them—memories, especially, of my boyhood in London before the Second World War. Moved by these, I wrote two short memoirs: one about the grand science museums in South Kensington, which were so much more important than school to me when I was growing up; the other about Humphry Davy, an early-nineteenth-century chemist who had been a hero of mine in those far-off days and whose vividly described experiments excited me and inspired me to emulation. I think a more general autobiographical impulse was stimulated, rather than sated, by these brief writings, and late in 1997 I launched on a three-year project of dredging, reclaiming memories, reconstructing, refining, seeking for unity and meaning, which finally became my book Uncle Tungsten.
I expected some deficiencies of memory, partly because the events I was writing of had occurred fifty or more years earlier and most of those who might have shared their memories, or checked my facts, were now dead. And partly because, in writing about the earliest years of my life, I could not call on the letters and journals I later started to keep from the age of eighteen or so.
I accepted that I must have forgotten or lost a great deal but assumed that the memories I did have—especially those which were very vivid, concrete, and circumstantial—were essentially valid and reliable, and it was a shock to me when I found that some of them were not.
A striking example of this, the first that came to my notice, arose in relation to the two bomb incidents that I described in Uncle Tungsten, both of which occurred in the winter of 1940–41, when London was bombarded in the Blitz:
One night, a thousand-pound bomb fell into the garden next to ours, but fortunately it failed to explode. All of us, the entire street, it seemed, crept away that night (my family to a cousin’s flat)—many of us in our pajamas—walking as softly as we could (might vibration set the thing off?). The streets were pitch dark, for the blackout was in force, and we all carried electric torches dimmed with red crêpe paper. We had no idea if our houses would still be standing in the morning.
On another occasion, an incendiary bomb, a thermite bomb, fell behind our house and burned with a terrible, white-hot heat. My father had a stirrup pump, and my brothers carried pails of water to him, but water seemed useless against this infernal fire—indeed, made it burn even more furiously. There was a vicious hissing and sputtering when the water hit the white-hot metal, and meanwhile the bomb was melting its own casing and throwing blobs and jets of molten metal in all directions.
A few months after the book was published, I spoke of these bombing incidents to my brother Michael. Michael was five years my senior and had been with me at Braefield, the boarding school to which we had been evacuated at the beginning of the war (and in which I was to spend four miserable years, beset by bullying schoolmates and a sadistic headmaster). My brother immediately confirmed the first bombing incident, saying, “I remember it exactly as you described it.” But regarding the second bombing, he said, “You never saw it. You weren’t there.”
I was staggered at Michael’s words. How could he dispute a memory I would not hesitate to swear on in a court of law and had never doubted as real?
“What do you mean?” I objected. “I can see it all in my mind’s eye now, Pa with his pump, and Marcus and David with their buckets of water. How could I see it so clearly if I wasn’t there?”
“You never saw it,” Michael repeated. “We were both away at Braefield at the time. But David [our older brother] wrote us a letter about it. A very vivid, dramatic letter. You were enthralled by it.” Clearly, I had not only been enthralled but must have constructed the scene in my mind, from David’s words, and then appropriated it and taken it for a memory of my own.
After Michael said this, I tried to compare the two memories—the primary one, on which the direct stamp of experience was not in doubt, with the constructed, or secondary, one. With the first incident, I could feel myself into the body of the little boy, shivering in his thin pajamas—it was December, and I was terrified—and because of my shortness compared with the big adults all around me, I had to crane my head upwards to see their faces.
The second image, of the thermite bomb, was equally clear, it seemed to me—very vivid, detailed, and concrete. I tried to persuade myself that it had a different quality from the first, that it bore evidences of its appropriation from someone else’s experience and its translation from verbal description into image. But although I knew, intellectually, that this memory was false, it still seemed to me as real, as intensely my own, as before.*1 Had it, I wondered, become as real, as personal, as strongly embedded in my psyche (and, presumably, my nervous system) as if it had been a genuine primary memory? Would psychoanalysis, or, for that matter, brain imaging, be able to tell the difference?
My false bomb experience was closely akin to the true one, and it could easily have been my own experience, had I been home from school at the time. I could imagine every detail of the garden I knew so well. Had this not been the case, perhaps the description of it in my brother’s letter would not have affected me so. But since I could easily imagine being there, and the feelings that would go with this, I took it as my own.
All of us transfer experiences to some extent, and at times we are not sure whether an experience was something we were told or read about, even dreamed about, or something that actually happened to us. This is especially apt to happen with one’s so-called earliest memories.
I have a vivid memory from about the age of two of pulling the tail of our chow, Peter, while he was gnawing a bone under the hall table, of Peter leaping up and biting me in the cheek, and of my being carried, howling, into my father’s surgery in the house, where a couple of stitches were put in my cheek. There is at least an objective reality here: I was bitten on the cheek by Peter when I was two and still bear the scar of this. But do I actually remember it, or was I told about it, subsequently constructing a “memory” which became more and more firmly fixed in my mind by repetition? The memory seems intensely real to me, and the fear associated with it is certainly real, for I developed a fear of large animals after this incident—Peter was almost as large as I was at two—a fear that they would suddenly attack or bite me.
Daniel Schacter has written extensively on distortions of memory and the source confusions that go with them, and in his book Searching for Memory he recounts a well-known story about Ronald Reagan:
In the 1980 presidential campaign, Ronald Reagan repeatedly told a heartbreaking story of a World War II bomber pilot who ordered his crew to bail out after his plane had been seriously damaged by an enemy hit. His young belly gunner was wounded so seriously that he was unable to evacuate the bomber. Reagan could barely hold back his tears as he uttered the pilot’s heroic response: “Never mind. We’ll ride it down together.” The press soon realized that this story was an almost exact duplicate of a scene in the 1944 film A Wing and a Prayer. Reagan had apparently retained the facts but forgotten their source.
Reagan was a vigorous sixty-nine-year-old at the time, would go on to be president for eight years, and only developed unmistakable dementia in his eighties. But he had been given to acting and make-believe throughout his life and had long displayed a vein of romantic fantasy and histrionism. Reagan was not simulating emotion when he recounted this story—his story, his reality, as he felt it to be—and had he taken a lie detector test (functional brain imaging had not yet been invented at the time), there would have been none of the telltale reactions that go with conscious falsehood, for he believed what he was saying.
It is startling to realize, though, that some of our most cherished memories may never have happened—or may have happened to someone else.
I suspect that many of my own enthusiasms and impulses, which seem entirely my own, may have arisen from others’ suggestions that have powerfully influenced me, consciously or unconsciously, and then been forgotten.
Similarly, while I often give lectures on certain topics, I can never remember, for better or worse, exactly what I said on previous occasions; nor can I bear to look through my earlier notes (or often, even the notes I have made for the talk an hour earlier). Losing conscious memory of what I have said before, I discover my themes afresh each time.
These forgettings may sometimes extend to auto-plagiarism, where I find myself reproducing entire phrases or sentences as if new, and this may be compounded, occasionally, by a genuine forgetfulness.
Looking back through my old notebooks, I find that many of the thoughts sketched in them are forgotten for years, and then revived and reworked as new. I suspect that such forgettings occur for everyone, and they may be especially common in those who write or paint or compose, for creativity may require such forgettings, in order that one’s memories and ideas can be born again and seen in new contexts and perspectives.
Webster’s defines “plagiarize” as “to steal and pass off as one’s own the ideas or words of another; use…without crediting the source…to commit literary theft; present as new and original an idea or product derived from an existing source.” There is a considerable overlap between this definition and that of cryptomnesia, and the essential difference is this: plagiarism, as commonly understood and reprobated, is conscious and intentional, whereas cryptomnesia is neither. Perhaps the term “cryptomnesia” needs to be better known, for though one may speak of “unconscious plagiarism,” the very word “plagiarism” is so morally charged, so suggestive of crime and deceit, that it retains a sting even if it is unconscious.
In 1970, George Harrison released an enormously successful song, “My Sweet Lord,” which turned out to have strong similarities to a song by Ronald Mack (“He’s So Fine”), recorded eight years earlier. When the matter went to trial, the court found Harrison guilty of plagiarism, but showed a great deal of psychological insight and sympathy in its judgment. The judge concluded,
Did Harrison deliberately use the music of “He’s So Fine”? I do not believe he did so deliberately. Nevertheless…this is, under the law, infringement of copyright, and is no less so though subconsciously accomplished.
Helen Keller was also accused of plagiarism, when she was only twelve.*2 Though deaf and blind from an early age and indeed languageless before she met Annie Sullivan at the age of six, Helen became a prolific writer once she learned finger spelling and Braille. She wrote, among other things, a story called “The Frost King,” which she gave to a friend as a birthday gift. When the story found its way into print in a magazine, readers soon realized that it bore great similarities to “The Frost Fairies,” a children’s short story by Margaret Canby. Admiration for Keller turned into condemnation, and she was accused of plagiarism and deliberate falsehood, even though she had no recollection of reading Mrs. Canby’s story. (She later realized that the story had been “read” to her, using finger spelling onto her hand.) The young Keller was subjected to a ruthless and outrageous inquisition, which left its mark on her for the rest of her life.
But she had defenders, too, including the plagiarized Margaret Canby, who was amazed that a story spelled into Keller’s hand three years before could be remembered or reconstructed by her in such detail. “What a wonderfully active and retentive mind that gifted child must have!” Canby wrote. Alexander Graham Bell, too, came to her defense, saying, “Our most original compositions are composed exclusively of expressions derived from others.”
Keller herself later said of such appropriations that they were most apt to occur when books were spelled into her hands, their words passively received. Sometimes when this was done, she said, she could not identify or remember their source, nor even, sometimes, whether they came from outside her or not. Such confusion rarely occurred if she read actively, using Braille, moving her finger across the pages.
Mark Twain wrote, in a letter to Keller,
Oh, dear me, how unspeakably funny and owlishly idiotic and grotesque was that “plagiarism” farce! As if there was much of anything in any human utterance, oral or written, except plagiarism!…For substantially all ideas are second-hand, consciously and unconsciously drawn from a million outside sources.
Indeed, Twain had committed such unconscious theft himself, as he described in a speech at Oliver Wendell Holmes’s seventieth birthday:
Oliver Wendell Holmes [was] the first great literary man I ever stole any thing from—and that is how I came to write to him and he to me. When my first book was new, a friend of mine said to me, “The dedication is very neat.” Yes, I said, I thought it was. My friend said, “I always admired it, even before I saw it in The Innocents Abroad.”
I naturally said, “What do you mean? Where did you ever see it before?”
“Well, I saw it first some years ago as Doctor Holmes’s dedication to his Songs in Many Keys.”
Of course, my first impulse was to prepare this man’s remains for burial, but upon reflection I said I would reprieve him for a moment or two and give him a chance to prove his assertion if he could. We stepped into a book-store, and he did prove it. I had really stolen that dedication, almost word for word….
Well, of course, I wrote to Doctor Holmes and told him I hadn’t meant to steal, and he wrote back and said in the kindest way that it was all right and no harm done; and added that he believed we all unconsciously worked over ideas gathered in reading and hearing, imagining that they were original with ourselves.