Every brain is the product of other brains, and there is good reason to argue that the very concept of a brain isolated from the environment and from other people, an organ lying in a proverbial vat calculating away on its own, is an analytical philosopher’s folly. We are innately social beings and, as soon as we are born, we are able physically to reflect the faces of others. Those expressive faces are crucial to development.39 It is hardly revolutionary to say that the socially neglected infant will not grow normally or that shocks, traumas, and deprivations affect his developing nervous system. Commenting on early imitation and proto-conversation in infants, Stein Bråten and Colwyn Trevarthen write, “The mutual mirroring and turn-taking which we find in mature verbal conversation is clearly foreshadowed in these first bouts of sympathetic mimetic play.”40
When research on early human development and its mutual attunements is coupled with the many studies on mirror systems in primates, we may begin to speculate on possible avenues for rethinking hysterical conversion through an intersubjective framework that involves shared neural networks at a preconscious or subliminal level. Although mirror neurons remain a subject of debate for some researchers, there is growing evidence for a prereflective understanding of other human beings’ actions, intentions, emotions, and sensations grounded in neuronal activity.41 Our access to other people is not purely cognitive. It is not purely a matter of linguistically represented machinations about what it might be like to be that other person. We do not know others only through elaborate conscious analogy but far more directly and bodily, and that ability, that corporeal imagination, is honed through and by our preverbal, gestural, and tactile relations with others. Forms of embodied imitation may well help explain emotional contagions and hysterical epidemics that do not belong to any single era or place but take various forms depending on the cultural context. There are many examples, but three will suffice. Think of the Salem witches in the late seventeenth century, the convulsionists of Saint Medard in the eighteenth,42 and the Cambodian women of the twentieth.
Despite all the worry about malingering in hysteria, it is safe to assume its simulations and metamorphoses do not take place on a self-reflective, conscious level, which is not to say that there isn’t a “hidden observer” beneath the surface or a version of the vacillating co-consciousness described so vividly by researchers such as Morton Prince in the early twentieth century.43 Like hypnotic trance, hysteria may indeed involve knowledge at a subliminal level, which can become conscious. The disturbance of hysteria may interfere with reflective self-consciousness itself, or what Antonio Damasio in Self Comes to Mind calls “the self as witness,” which is “that something extra that reveals the presence, in each of us, of events we call mental.”44
It is not strange to think that emotional shocks might derange this sense of self as witness. Research in PTSD has shown numerous changes in people who suffer from it, including altered cortisol levels.45 Charles S. Myers, who coined the term “shell shock” during the First World War, proposed that the hysterical men in his care had been subject to “a sudden snap or fission, whereby certain nervous or mental processes are ‘functionally dissociated,’ or ‘unconsciously repressed’ by inhibition, from the rest.”46 A Janetian theory. He also noticed that the hysterical patients demonstrated alternately what he called an apparently normal personality and an emotional personality. The apparently normal personality presented with functional symptoms, not the emotional one. Myers writes, “Nature’s purpose in repressing the patient’s painful experiences is obvious.”47 Although he does not develop this idea further, Myers implies that hysteria is adaptive. An involuntary physical handicap appears in place of an intolerable emotion or memory, and by doing so, liberates the patient’s present consciousness to an apparently normal condition.
This also suggests Freud’s idea of repression, a controversial notion in contemporary science to be sure, but one I suspect will not go away anytime soon. The work of Joseph LeDoux and others has demonstrated convincingly that cortical, executive brain regions are involved in affect regulation, and that people seem to keep unpleasant realities out of consciousness, something not only true of hysterical mental patients.48 Anosognosia (the failure to recognize an obvious illness or handicap) in a neurological patient might be described as another kind of dissociation or repression of the inconvenient reality of a paralysis, for example. The fact that the neglect that often accompanies anosognosia can be reversed temporarily by caloric stimulation (water administered in the ear) may suggest, as Mark Solms and Karen Kaplan-Solms have argued, that at a subliminal level the patient knows about the paralysis, just as the hysteric may know at some level that his paralysis can be reversed.49
Myers insisted that hysteria, as opposed to neurasthenia, which he regarded as a slow wear and tear on a man’s nerves, was “a necessary result of severe ‘shell shock.’ ” He further pointed out that neurasthenia occurred more often in officers and hysteria in the ordinary soldier. “The reasons for this difference,” he writes, “are not hard to find. The forces of education, tradition and example make for greater self-control in the case of the Officer. He, moreover, is busy throughout a bombardment, issuing orders and subject to worry over his responsibilities, whereas his men can do nothing during the shelling, but watch and wait until the order is received for an advance.”50 It is, I think, in the latter sentence that we find a link between the ordinary combat soldier with hysterical symptoms and the women, past and present, who have suffered from the same symptoms outside of war—a sense of helplessness in the face of overwhelming, uncontrollable circumstances. Women have traditionally had far less to say about their fates than men have. This is still the case in many places in the world, where female infants are murdered, girls are kept out of school, brides are burned, and sexual violence is routine. Hysteria might be described as a crisis of agency and volition within specific social contexts.
In Higher Cortical Functions in Man, A. R. Luria distinguishes animal and human movements by assigning a role to language in human action. “A great number of our voluntary movements and actions arise on the basis of a plan,” he writes, “formed with the intimate participation of speech, which formulates the aim of the action, relates it to its motive, and indicates the basic scheme for the solution of the problem with which the person is faced.”51 I am well aware that “free will” remains a huge question in philosophy and science. Although there has been considerable research on volition in neuroscience, the neural basis of agency, the feeling of “I” in “I move” is not well understood. The parietal cortex has been implicated as well as frontal motor areas.52 Patients who had the presupplementary motor area of their brains stimulated during neurosurgery have reported they felt an urge to move and with more stimulation they did move.53 Exactly how this might relate to everyday agency and volition, however, is murky. The role language plays in volition remains a question overdue for further reflection.54 Because hysteria appears to involve an imaginative embodiment of traumatic emotional experiences, and hypnosis and verbal suggestion have resolved the symptoms in some patients, as have the narrative co-constructions of psychotherapy, an urgent question appears: What is it about language that can alter the symptoms of hysteria?
Every conversion patient has a story, and his or her personal narrative is vital to understanding the physiological theater of the symptom. In Borderlands of Psychiatry (1943), Stanley Cobb describes a case of hysteria in a twenty-one-year-old woman he saw at the psychiatric clinic of Massachusetts General Hospital.55 Like so many girls and women who were admitted to the Salpêtrière, she presented with dramatic symptoms and a history of accidents and abuse. When she arrived she was hyperventilating and had contractures of her feet and hands, but appeared indifferent to her severe symptoms. She had la belle indifference. I have collapsed her story into a series of fragments taken from Cobb’s longer account: alcoholic father, feeble-minded mother; whooping cough, convulsions, and pneumonia before age one; at eighteen months, she fell into a cesspool and nearly drowned; frequent colds, falls, sleepwalking, and otitis (ear infections) as a child; raped at twelve; fainting spells, vomiting; raped again at sixteen; left school at seventeen after father’s death and found a job doing housework for long hours and low wages; right-side paralysis; uncontrollable twitching, twitching subsequently cured by divine healer; choreic (brief, irregular) movements and tetany (muscle spasms), followed by remission; took work with asthmatic employer and developed panting symptom; hospitalized.
Cobb treated the patient with hypnosis and suggestion. His summation of the case demonstrates his emphatic position that drawing a line between the mental and the physical is futile: “The observations on this patient are unique because it was possible to study a case of hyperventilation tetany by chemical analysis of the blood and then, at the same sitting, after curing the tetany by hypnosis, to study other samples of blood and show that they were normal. Moreover, the chemical changes in the patient’s blood were extreme, and illustrate that grave, demonstrable chemical changes can be wrought by the hysterical process. Conversely, it was proved that ‘objectively demonstrable changes’ in the patient’s blood chemistry were produced by hypnotic suggestion.”56
When Cobb wrote his book in the 1940s, the value of a single patient’s detailed narrative was still appreciated in American medicine, not as a beside-the-point literary exercise, but as a representation of the dynamic progress of an illness and, in this particular case, the series of emotional shocks that must be seen as essential to it. Indeed, Cobb defines psychogenic ailments as “due to maladjustments in interpersonal relations.”57 It is hardly surprising that the multiple traumas in this woman’s life had weakened her feeling that she had much control over what happened to her.
The Cambodian woman who saw her family forcibly taken away from her said that she wept for four years and when she stopped weeping, she was blind. In her case and in the cases of her fellow refugees, the transformation from witness of horror to a patient with functional blindness can be described as symbolically perfect, a kind of waking dream-work, if you will. The women’s bodies have become ambulatory metaphors of unbearable experience, not unlike the emotionally salient, concrete images of our dreams. Merleau-Ponty called the dream “a corporeal ontology, a mode of being with an imaginary body with weight.”58
Neither the bodily metamorphoses wrought by conversions nor the analgesia or other transformations that take place under hypnosis can be explained away by saying that nothing is organically wrong in the hysteric or altered in the hypnotized subject. We can say that we do not fully understand the organic phenomena. The application of the pragmatic parallelism advocated by Hughlings Jackson to hysteria is, I think, a dead end, as is a simplistic reduction that draws straight lines from the psychological level down to the physiological level. In a 1979 article on hypnosis and healing in the Australian Journal of Clinical and Experimental Hypnosis, Kenneth Bowers writes that the highly specific healing power of hypnosis is rooted in “the central nervous system’s capacity for imagery and symbolism” and must involve “the transduction of information from a semantic to a somatic level, possible through the mediation of imagery.”59 It may be impossible to free ourselves of “levels,” but it is important to begin to think of imagery, symbolism, and the imagination as bodily realities that do not fit the current model of psyche over soma or the tendency to regard the brain as an organ isolated from the rest of the nervous system as well as other systems of the body.
Understanding hysteria will require an upheaval in our understanding of what mind-brain actually is, a paradigm change that is already under way. “Embodiment” is the word of the moment, but it is in desperate need of elucidation. It is the body that carries meaning, meaning that is at once felt and symbolized. Our brains are in that body, which is made through other bodies in a world. We are intersubjective creatures, even before birth. The language we share is one of the body’s communicable gestures; it is in and of us, as is much that we learn. A deep comprehension of hysteria will also require multiple methods, brain-imaging certainly, and the neural locationism it inevitably inspires, combined with more dynamic, narrative models that include self-reports and case studies. It will also involve taking ideas from the past seriously and discarding the hubris of the present. It is well worth remembering what William James wrote at the very end of his Psychology: Briefer Course. The only hope for science is “to understand how great is the darkness in which we grope, and never to forget that the natural-science assumptions with which we started are provisional and revisable things.”60
Suicide and the Drama of Self-Consciousness
* * *
WHEN I worked as a volunteer writing teacher in a locked ward for psychiatric inpatients, I had a number of suicide survivors in my classes over the course of three and a half years. Some came attached to IVs, some wrapped in thick bandages, some with healing bruises. Others carried no visible sign of their ordeal. They were men and women, young and old. Their diagnoses varied—major depression, schizophrenia, bipolar, borderline, OCD. Their moods were just as variable. I had students continually on the verge of tears, students in states of volatile excitement, and students so withheld they could barely produce a whisper. One young man wrote a poem in the first person in which he recorded a long list of bungled suicide attempts. Despite his earnest efforts to destroy himself by hanging, drowning, leaping from buildings, and slicing his veins, he failed every time. The poem was funny, but the poet admitted that he had landed in the hospital because he had tried to kill himself yet again. He had turned his desperation into black comedy.
Humor allowed the patient in my class some distance from himself and his desire to die. I venture to guess it also let him express his relief that he had survived his attacks on himself. While he was writing in my class, he did seem to enjoy being alive. The subject of suicide is a terrible one, and I would like to say that I would not have taken it on if it had not touched me closely a few times in my life. When I was in college, the mother of a beloved friend killed herself. This is not my story to tell, and I do not feel free to share the details of that death. I can say that the shock and grief among the family members the woman left behind her continue to reverberate inside me. I was not her child. What I experienced I experienced through my friend, which is to say I did not suffer as she did. My suffering was for her. I am also close to two couples who lost children to suicide. One was a teenager, the other a young adult. These deaths caused agony in the parents, states of wretchedness so extreme it is hard to understand how they found the strength to go on. They did find that fortitude, but going on is not the same as “getting over it” or “finding closure”—those appalling American expressions often used in cases that involve horrible deaths. The wounds endure.
Suicide touches everyone. It is startlingly common. The global statistic most often cited is that more than a million people kill themselves every year. The discipline called suicidology, a discipline made from several disciplines, poses essential questions: What is it about human beings that makes suicide possible? Can we prevent it, and if so, how? Is it ever a reasonable course of action? What is being attacked when someone turns violently against himself or herself? There is now a vast literature on the subject, but as in many fields of study, ideas come in and out of fashion depending on the historical period and intellectual climate. The questions are exigent. If the patient who was able to joke about his own attempts to do away with himself demonstrates anything, it is that there are people who survive their own self-destructive acts and are glad they did. How many others might have been glad if they’d had the chance?
For centuries in the West, self-murder was a problem of ethics, not pathology. The emphasis began to shift in the seventeenth century, and by the nineteenth, suicide had become an illness. In 1828, the English physician George Man Burrows wrote, “A propensity to self-destruction, like any other peculiar delusion, is but a symptom of deranged intellect, and can only be viewed as a feature of melancholia.”1 This statement, which connects suicidal desires to what we now call depression, has a contemporary ring to it. And yet, not all people suffering from major depression are suicidal and not all suicidal people are depressed. More people with temporal lobe epilepsy kill themselves than people with other forms of epilepsy and at rates far higher than nonepileptic people, but we cannot conclude from this that either depression or epilepsy causes suicide.2
As I made my way through countless papers and books on the subject, I read hundreds upon hundreds of times that more than 90 percent of all people who kill themselves have a mental disorder, but there was never a note explaining where this statistic comes from. Is this an American estimate, a European one? My efforts to find the source for this bit of received knowledge yielded nothing. How can anyone actually know this statistic? Psychiatric diagnosis is not an exact science. Criticism of the DSM as a purely descriptive text that ignores etiology and shifts its categories as ideological winds blow is hardly new. In his book November of the Soul (1991), George Howe Colt mentions a Harvard study in which doctors were given case studies of people who had killed themselves. When the physicians examined the same narratives but without mention of the ultimate suicide, the highest estimate of mental illness was 22 percent. When the end of the story was provided, the estimate leapt to 90 percent.3 In this instance, the last chapter appears to have rewritten those that came before it. This is not to say that mental illness does not make many people more vulnerable to suicide, but rather that repeating a number without foundation can create a dubious truth.
A Woman Looking at Men Looking at Women Page 50