Work and Love
If thinking involves feeling and movement, as well as symbols, if the early proto-dialogues between infant and adult are important to how thought develops, if there is a reciprocal relation between spontaneous organic growth and experience, if meaning begins in early life with the establishment of motor-sensory-emotional rhythms, a kind of metrics of being, which remain with us and are part of even the most sophisticated forms of intellectual and creative achievement, then the classical cognitive computational theory of mind (CTM) is impossible. If we have a shared physiological preconceptual connection to other people, which means no person is an isolated monad, and if the imagination itself is predicated on those early, vital relations with other people, which then evolve into conceptual and symbolic forms over time, then the premise of CTM is also wrong. I have argued elsewhere that the origin of narrative lies in the early exchanges between self and other, that all forms of creativity—artistic and scientific—are inseparable from our moving, sensing bodies.376
If a person sticks to the straight-and-narrow concerns of a single field or a field within a field, he will pose very different questions and come up with very different answers from the person who roams about in various disciplines or just takes different perspectives into consideration within the same field. If there is one thing scientists now agree on, it is that there is an unconscious and that most of what our minds do is done unconsciously. Their conception of the unconscious is not Freudian for the most part, and there are many disputes about how this great underground reality works, but no one is arguing anymore that it does not exist or that it can be ignored.
Since scientists and scholars have minds, they also have unconscious minds, and that unconscious has an influence on what each one of us believes and perceives. William James freely acknowledged the connection between a person’s temperament and his ideas. In James’s view, there were tough-minded philosophers and tender-minded ones, and they would be forever at odds with each other.377 Character plays an important role in the ideas a person embraces, and, James argued, this will never change as long as people continue to think. Some people lean toward the hard and others toward the soft, and they do so for reasons both conscious and unconscious. The hard might be described as atomic, mechanical, rational, and wholly intelligible. The soft, on the other hand, is gemmule-like, corporeal, emotional, and more ambiguous. The former is usually coded as masculine in our culture and the latter as feminine, although it is obvious that one need not be identified as either male or female to hold views of either kind.
Furthermore, every person has a story, a shaping narrative, if you will. I cannot tell you how many times I have met neurologists who suffer from migraine or have family members with brain damage. I have met numbers of psychiatrists and psychoanalysts who grew up with mentally ill parents or siblings or had difficult childhoods themselves. I have met people who have devoted their lives to suicidology because a loved one killed him- or herself. I have also met neuroscientists whose personalities mirror their work. Those who are stiff, withheld, and socially unforthcoming seem to produce work narrow in scope and rigorous in method. They are generally uninterested in literature and philosophy and find it hard to see their importance in general. Others who are warm and amiable produce work that is broader in scope and more prone to speculation. They are also more likely to mention work that has been done in other disciplines. I am not making a comment about quality. There is superb narrow work and superb broad work. Usually, although not always, personal stories are suppressed in published work by these scientists, but it would be foolish not to acknowledge that emotional events in every life influence not just the work a person chooses to do but how she does her work. The reasons some ideas attract us and others repel us are by no means always conscious.
Descartes lost his mother at an early age and became a thinker who gave birth to the truth out of the lonely chamber of his own head. Surely this is not without significance. Margaret Cavendish was a female philosopher in the seventeenth century, a time when women simply did not publish books on natural philosophy, although a number of women wrote and engaged in philosophical debates, especially in the form of correspondences. Is Cavendish’s position as a woman in that culture at that time unrelated to her idea that “man” is not the only creature in the universe possessed of reason? Isn’t it reasonable to acknowledge that her marginalized position gave her a perspective most of her philosophical peers at the time could not share, but also insights to which they were blind? Like many children of his class in England, John Bowlby saw little of his parents and was raised by a nanny whom he loved. According to most accounts, she left the household when he was four years old, and he suffered from the loss. At eleven, Bowlby was sent to boarding school, where he was not happy. In private, he admitted that his childhood experiences had influenced him. He understood himself to “have been sufficiently hurt but not sufficiently damaged.”378 Presumably, he meant by this that the pains of his childhood gave him a way into questions of attachment without ruining him as a person.
Objective methodologies are important. Studies that are replicated over and over again may be said to have “proven” this or that fact, but as Bateman’s fruit fly study shows, sometimes results that conform to expectation are so welcome they become true for hundreds of scientists until proved wrong or at least problematic. Furthermore, ongoing research in science is forever coming up with conflicting results, as is easily seen in spatial rotation studies, but that is just a single example. There are hundreds of others. And research that is inconclusive or comes up with no result usually remains obscure. For all the publicity her twin study received, Myrtle McGraw’s originality was ignored for years. Then again, her tone was careful and thoughtful, and she didn’t loudly advertise a conclusion that supported either heredity or experience. On top of that, she was a woman. The genius of women has always been easy to discount, suppress, or attribute to the nearest man.
When a person cares about her work, whether she is a poet or a physicist, doesn’t her labor involve identification and attachment? In every discipline, without exception, human beings attach themselves to ideas with a passion that is not dissimilar to our love for other people. If you have spent most of your life on Virginia Woolf or have been immersed for thirty years in the writings of John Dewey, you may well inflate their importance in the history of literature and ideas, at least in the eyes of people who have spent their lives on other texts or have no interest in books at all. If you have made a career in evolutionary psychology, arguing that human beings have evolved hundreds of discrete problem-solving mental modules, is it surprising if you discount evidence that suggests those modules may not exist? If your entire career rests on the assumption that the mind is a computational device, are you likely to abandon this idea just because there are people hard at work writing papers in other fields who insist you are wrong? A neuroscientist who has devoted herself to a particular part of the brain, say the insula or the hippocampus or the thalamus, is bound to have a somewhat aggrandized feeling about that particular brain area, just as someone who has been working on the “connectome” will be devoted to the idea that the complete mapping of a creature’s neural connections will be invaluable to the future of science. Could it be any other way?
I am deeply attached to the novel as a form of almost enchanted flexibility. I believe in it, and, unlike many people, I think reading novels expands human knowledge. I also believe it is an extraordinary vehicle for ideas. I make my living writing novels. My attraction to phenomenology and psychoanalysis, both of which explore lived experience, fits nicely with my interest in literature. The novel is a form that addresses the particularity of human experience in one way or another. Phenomenology investigates consciousness from a first-person point of view, and psychoanalysis is a theory that includes the minutiae of a patient’s day-to-day life. If you couple the inevitable attachment many of us feel for our work, if we are lucky enough to feel attached to it, with a need to “advance” in a field of choice, then it is hardly odd that people who care about their work form attachments to its content that cannot be described as objective. These passions are, in fact, subjective, but they are also intersubjective because no one, not even the novelist, works entirely alone. She sits in a room of her own and writes, but she is in that room with others, not only the real people who have shaped her unconscious and conscious imagination, but also fictive people and the voices of hundreds of people now dead who left their words in the books she has read.
Human beings are animals with hearts and livers and bones and brains and genitals. We yearn and lust, feel hunger and cold, are still all born from a woman’s body, and we all die. These natural realities are inevitably framed and understood through the culture we live in. If each of us has a narrative, both conscious and unconscious, a narrative that begins in the preverbal rhythms and patterns of our early lives, that cannot be extricated from other people, those to whom we were attached, people who were part of shaping the sensual, muscular, emotional rhythms that lie beneath what become fully articulated narratives intended to describe symbolically the arc of a singular existence, then each of us has been and is always already bound up in a world of others. Every story implies a listener, and we learn how to tell stories to make sense of a life with those others. Every story requires both memory and imagination. When I recall myself at six walking to school, I am not really back in my six-year-old body. I must travel back in time and try to imagine what I felt like then. When I imagine the future, I rely on the patterns of the past to frame what may happen next Thursday. When I invent a character, I use the same faculty. I draw on that continuum of memory and imagination. Human beings are predictive, imaginative creatures who navigate the world accordingly.
Could it be that the language we have to speak about what we are has itself become intractable? How far have we come from Descartes and Hobbes and Cavendish and Vico? How are we to think of minds and bodies or embodied minds or bodies with brains and nervous systems that move about in the world? In what way are these biological bodies machines? Does what we call mental life emerge from a developing organism, or is matter shot through with mind, as some physicists and panpsychist philosophers have argued? Is the whole person more than her parts, or can she be broken down like a car engine? Exactly how do we understand an individual’s borders in relation to what is outside him? Is it possible to have a theory about the mind or the world or the universe that doesn’t leave something out? Should we turn away from things we can’t explain?
When I think of these questions, I travel back to childish thoughts, to when I lay in the grass and watched the clouds and thought how strange it was to be alive, and I placed my hand on my chest to feel my heart beat and counted until I got bored. Sometimes I would say a word to hear it move from inside my head to my mouth and then outside me as sound. Sometimes I would feel I was floating and leaving my body behind. I liked to go to a place behind my family house where the fat roots of a tree protruded from the steep banks above a creek and curled to form a chair where I could sit and meditate on the same questions I have meditated on in this essay, albeit from inside a much younger, naïve self, who lived in another time and place. My recollection of those reveries stays alive in me only from my current perspective in an ever-moving present. Over and over, I thought how strange it was to be a person, to see through eyes, and smell through something that poked out of the middle of my face and had holes in it. I would wiggle my fingers and stare at them amazed. Aren’t tongues awfully odd? Why am I “I” and not “you”? Are these not philosophical thoughts? And don’t many children have them? Isn’t good thinking at least in part a return to early wonder? Every once in a while, I tried to imagine being nowhere—that is, never having been anywhere. For me, it was like trying to imagine being no one. I still wonder why people are so sure about things. What they seem to share is their certainty. Much else is up for grabs.
Coda
I do not know who put me into the world, nor what the world is, nor what I am myself. I am terribly ignorant about everything. I do not know what my body is, or my senses, or my soul, or even that part of me that thinks what I am saying, which reflects about everything and about itself, and does not know itself any better than it knows anything else.379
Blaise Pascal—mathematician, physicist, religious thinker—wrote these words in his Pensées, a collection of notes for a work he did not live to write, but which were published in 1669, seven years after his death. Pascal knew a lot. He invented an early calculating machine, the syringe, the hydraulic press, and a roulette machine, and he pioneered an early version of the bus system. His work on barometric pressure resulted in Pascal’s law. He contributed theorems to geometry and binomial mathematics. Nevertheless, his claim to ignorance must be taken seriously. The domains of ignorance he mentions—about the soul or psyche, about the sensual body, as well as about the nature of reflective self-consciousness, that part of a person that can think about the world around himself and about himself and his own thoughts—remain mysterious in ways general relativity does not.
This can be hard to accept because if anything seems to exist on a high and rarefied plane it is physics. After all, what could be more important than puzzling out the secret laws of the universe? And yet, the physicists who have entered into the consciousness debates do not have one answer; they have different answers. Many biologically oriented scientists point to their own hands-on, close-up research that appears to fly in the face of timeless mathematical reduction. To borrow an image from Cavendish’s world: The worm- and fish-men are in conflict with the spider-men.
It is true that since the seventeenth century most people have lived in an age of science and have reaped the benefits and lived the nightmares of its discoveries. The “mind,” however, has been an especially bewildering concept, one fought over fiercely for centuries now. Computational theory of mind has deep roots in the history of mathematical abstraction in the seventeenth century and its “misplaced concreteness,” as Whitehead called it, mistaking an abstraction or model for the actuality it represents. With its mind-body split and its prejudice against the body and the senses, this tradition also harbors, sometimes more and sometimes less, strains of misogyny that have infected philosophy since the Greeks. The brain science that adopted the computer as its model for the mind cannot explain how neural processes are affected by psychological ones, how thoughts affect bodies, because the Cartesian split between soul and body continues to thrive within it.
These scientists have ended up in a peculiar place. Descartes related his rational immaterial soul to God. The immaterial soul of the present appears to be disembodied information. Some of the AI scientists who have embraced the latter kind of soul have been led step by step to its logical conclusion: an imminent supernatural age of immortal machines. Computation has become increasingly sophisticated and ingenious, but I believe computational theory of mind as it was originally understood in cognitive science will eventually breathe its last breath, and the science historians of the future will regard it as a wrong turn that took on the qualities of dogma. I could be wrong, but I have read nothing that leads me to believe otherwise. Personally, I think the corporeal turn is a move in the right direction.
But then I, too, am a situation, the product of years of particular experiences that include reading and writing and thinking and loving and hating, of seeking and finding and losing and seeking again. I did not make myself but was made in and through other people. I cannot begin to understand myself outside my own history, which includes my whiteness and femaleness and class and privileged education, as well as my tallness and the fact that I like oatmeal, but also myriad elements that I will never be able to name, bits and pieces of a life lived but long forgotten or sometimes half remembered in the way dreams are, with no guarantee that it was actually like that at all.
I am still a stranger to myself. I know I am a creature of unconscious biases and murky, indefinable feelings. Sometimes I act in ways I can’t comprehend at all. I also know that my perception of the world is not necessarily another person’s perception of it, and I often have to work to discover that alien perspective. Other times I seem to feel what another person feels so well, it is almost as if I have become him. Some fictional characters are much more important to me than real men and women. Every discipline has its own myths and fictions, for better and for worse. Many words slide in meaning depending on their use. The words “genes,” “biology,” “information,” “psychological,” “physiological” change so often, depending on their contexts, that confusion is bound to result.
My own views have been and are subject to continual revision as I read and think more about the questions that interest me. Openness to revision does not mean a lack of discrimination. It does not mean infinite tolerance for rank stupidity, for crude thinking, or for ideology and prejudice masquerading as science or scholarship. It does not mean smiling sweetly through inane social chatter about genes, hardwiring, testosterone, or whatever the latest media buzz has on offer. It means reading serious texts in many fields, including the arts, that make unfamiliar arguments or inspire foreign thoughts you resist by temperament, whether you are a tough-minded thinker or a tender-minded one, and allowing yourself to be changed by that reading. It means adopting multiple perspectives because each one has something to tell you and no single one can hold the truth of things. It means maintaining a lively skepticism accompanied by avid curiosity. It means asking questions that are disturbing. It means looking closely at evidence that undermines what you thought had been long settled. It means getting all mixed up.