Simone Weil wrote, “Doubt is a virtue of the intelligence.”380 As with every other principle, enshrining doubt as the highest principle in thought may become merely another excuse for intolerance, but I believe there are forms of doubt that are virtuous. Doubt is less attractive than certainty to most people. The kind of doubt I am thinking of doesn’t swagger. It doesn’t shake its finger in your face, and it doesn’t go viral on the Internet. Newspapers do not write about it. Military parades do not march to tunes of doubt. Politicians risk mockery if they admit to it. In totalitarian regimes people have been murdered for expressing doubt. Although theologians have understood its profound value, religious fanatics want nothing to do with it. The kind of doubt I am thinking of begins before it can be properly articulated as a thought. It begins as a vague sense of dissatisfaction, a feeling that something is wrong, an as-yet-unformed hunch, at once suspended and suspenseful, which stretches toward the words that will turn it into a proper question framed in a language that can accommodate it. Doubt is not only a virtue in intelligence; it is a necessity. Not a single idea or work of art could be generated without it, and although it is often uncomfortable, it is also exciting. And it is the well-articulated doubt, after all, that is forever coming along to topple the delusions of certainty.
III
* * *
WHAT ARE WE?
* * *
LECTURES ON THE HUMAN CONDITION
Borderlands: First, Second, and Third Person Adventures in Crossing Disciplines
* * *
We must, in general, be prepared to accept the fact that a complete elucidation of one and the same object may require diverse points of view which defy unique description.
—Niels Bohr, 1929
EVERY theoretical construct, every system of ideas, every intellectual map created to explain who and what human beings are is vulnerable at the site of incision—the place where we sever one thing from another. Without such dissections, there can be no formal thought, no discrete disciplines, no articulations of human experience. These separations delineate the borders between inside and outside, me and you, up and down, here and there, true and false. They become part of every scholarly life. As in a jigsaw puzzle, however, the lines of suture change depending on your historical moment and your field of study. For many years now, I have found my mental life parsed in multiple ways for the simple reason that I have been immersed in disciplines that not only have distinct vocabularies but may rest on different paradigms and employ different methodologies for understanding the big question I care most about: What are we? This question frequently changes shape and becomes other questions: What is a person, a self? Is there a self? What is a mind? Is a mind different from a brain?
Our taxonomies, our categories, our truths vary. The objective or third-person truth of the neuroscientist is not the subjective first-person truth of the artist. At a conference last year, a neuroscientist friend of mine articulated the difference in this comment: Artistic truths, he said, are inevitably “squishy.” Scientific truth, on the other hand, is hard, tough, verifiable, and rigorous, to which I replied, “And often muddled by dubious epistemological assumptions.” Rules for knowing—how we can know what we know—lie beneath every edifice we call a discipline. Knowing turns on perspective, first or third person, as well as notions of what is hard and soft. What is certain is that if we want to do the interdisciplinary dance, we must dislodge ourselves from a fixed place and begin to jump across borders and adopt alien views.
In his novel Life: A User’s Manual, Georges Perec tells many stories within stories about the residents of an apartment building in Paris. One of them is about an anthropologist, Marcel Appenzzell, who goes off alone in 1932 to study a people in Sumatra, the Kubus. Despite grotesque deprivations that result in his near starvation, the determined Appenzzell pursues the Kubus for years. Although he is convinced that they are not a nomadic people, he chases them continually as they pull up stakes and move into increasingly uninhabitable, mosquito-infested parts of the island’s interior. At last, the “cruel and obvious truth” dawns on the poor researcher.1 The Kubus are running away from him. At once comic and tragic, Perec’s narrative is a parable about perception. The anthropologist does not hover over his domain as a god might—looking down on an already-given reality. Like so many researchers, Appenzzell has forgotten his own role in the story, has left himself out, as if he were not occupying any space, as if he were an invisible Mr. Nobody.
The third-person discourse of much academic writing belongs to an authoritative Professor Nobody. We all know someone is there, but she or he has disappeared from the text except as an author or authors—a name or list of names. The voice from the clouds is a deeply embedded convention in the novel as well, which takes the form of the omniscient narrator, but the novel asks for the reader’s leap of faith, which the science paper does not. The absence of the “I” or “we” in academic writing—whether in the sciences or the humanities—is a bid to cleanse the text of subjective taint, of “squishiness.” This cleansing is achieved not by becoming omniscient but through an intersubjective consensus among those who know the rules and are playing the same game. In the “hard” sciences, “control” is the operative word. Terms must be defined and agreed upon, methods clearly elucidated, and if all goes well, say, in an ideal experiment, results can be replicated again and again. The game is played as if there is an objective reality that can be accurately perceived without implicating the perceiver. I am not arguing against the scientific method or the third person. The restrictions of Mr. Nobody’s model, his game of objectivity, have brought us a world of medicines and technology most of us depend upon. But when the object of study is subjectivity, what does it mean to take an objective view?
David Chalmers is the Australian philosopher, working in the Anglo-American analytical tradition, who coined the term “the hard problem” in consciousness studies. The hard problem is the gap between the first-person experience of mind-brain-self and an objective third-person view of a working mind or brain. For the Analyticals, as I call them, the question turns on the problem of qualia—the subjective experience of sensation and feeling, usually framed as “what it is like to be” inside a particular person or creature. In a paper from 1989, Chalmers wrote, “It is extremely difficult to imagine how a physical explanation of brain architecture could solve these problems [inner experience, qualia].” He then goes on to distinguish between “hard-line reductionists,” those who believe that everything about the mind can eventually be explained by and reduced to third-person brain science, and the soft-line reductionists, those like Chalmers himself, who believe further explanation is needed to understand inner reality. “Most hard-line reductionists,” he writes, “probably regard soft-line reductionists as unbearably ‘wimpy,’ in rather the same way that an atheist would regard an agnostic.” For the third-person hard-liners, the defenders of first-person qualia appear, Chalmers writes, “soft, squishy, and mystical.”2
It is patently obvious that studying a brain-mind from the outside is not the same as having a brain-mind on the inside of one’s skull and looking out at the world. From the inside, we are totally unaware of the neuronal, chemical processes that go on in our brains, and, from the outside, thoughts, feelings, and perceptions are invisible. As Georg Northoff and Alexander Heinzel point out in a paper, “The Self in Philosophy, Neuroscience, and Psychiatry: An Epistemic Approach,” the third-person approach of neuroscience is nonphenomenal. It remains outside immediate experience, subjective time, and the lived body because it is not grounded in any actual human perspective. Unlike the first and second person, in a third-person view there are no qualia. They write, “The self appears so different in the FPP [first-person perspective], SPP [second-person perspective] and TPP [third-person perspective] that it may even be said that we are not talking about the same thing. Thus the question arises of who is talking about the real self. We believe that is one of the reasons why it is so difficult to combine and integrate the different sciences in transdisciplinary theories.” The authors decide to put the question of ontology—the philosophical problem of being—on the shelf, and then continue, “The concepts of self should be considered as relative to the respective perspective” and there should be no “superiority or inferiority” among them.3
I agree with Northoff and Heinzel that an evenhanded approach has much to offer. It acknowledges that theoretical models are just that—frames for viewing, which alter what is seen. The problem is that a hierarchy exists, and it involves underlying metaphors of “tough” and “squishy.” In our world, the disciplines considered hard have an implicit, if not explicit, superiority. The ideas that attract one person and not another cannot be divorced from his or her temperament. And temperament, I would add, belongs to the realm of feeling. Every one of us is attracted to ideas that confirm a gut feeling about how things are, and that gut feeling is inevitably subjective, not objective, because it belongs to a particular body and its reality. After all, human beings are not born poets or engineers or literature professors or mathematicians. They are drawn toward a kind of work or way of thinking for reasons that are often not fully conscious but have strong emotional meanings. A deep fear of or tolerance for squishiness may surely be considered one of them.
In his book Theory and Practice, Jürgen Habermas takes the third-person, value-neutral, positivistic philosophy of science to task with savage irony. In the following passage, his targets are specific writers who have attempted to modify their positivism, but his words may stand as a criticism of hard science in general and its view of subjectivity. He writes, “But the result of its labors is monstrous enough: from the mainstream of rationality the pollutants, the sewage of emotionality, are filtered off and locked away hygienically in a storage basin—an imposing mass of subjective value qualities.”4 Through this trope, emotion as sewage, Habermas identifies a Western prejudice that posits a hierarchy of objectivity over subjectivity, mind over body, head over bowels, reason over emotion, order and cleanliness over mess and dirt, and, of course, the masculine over the feminine. It is within this intellectual heritage that we must situate Simone de Beauvoir’s striking formulation in The Second Sex, “In masculine hands logic is often violence.”5
For Chalmers, the third person is simple and transparent, the first person murky and impenetrable. This is rather fascinating because he clearly doesn’t consider his own first-person position reliable as evidence of any kind, but that is because he is a product of a particular tradition. Most contemporary thinkers are parochial. They rarely leave the arguments of their close compatriots for those of foreigners. The escape from subjectivity, however, is problematic. How do we cleave the object in the world from the subject who perceives it? Where is the line drawn? Can a line be drawn? The subject/object problem is an old and fraught one, and many a philosophy student can recite the narrative, which may be said to begin with Descartes’s extreme doubt about knowing and himself, which is exploded by Hume, then reconfigured by Kant in his answer to Hume, and after that returns in a new form in German idealism and is further transformed in phenomenology with its supreme focus on the first person in the work of Husserl, who influenced Maurice Merleau-Ponty but also Heidegger, who then had considerable influence on poststructuralist, posthumanist continental thinking.
Another tradition that began with Gottlob Frege and that metamorphosed into various trends in Anglo-American analytical philosophy took issue with Kant’s idea that “without the sensible faculty no object would be given to us.”6 Frege did not believe that logic was a product of how the human mind worked. Rather, he believed in a realm of logic that has its own objective reality. In this theoretical model, the “I” has a relation to an external “it,” logic, and the task is to carve up that reality into truthful categories that create sharp boundaries or joints that may never have been seen before but nevertheless are already there, lying in wait, prior to any scientific investigation.
Behaviorism washed its hands of subjectivity altogether, arguing that there was no need to study or ponder the inner life of a person. The only thing that was needed was to look from the outside at behavior. Behaviorism is now out of fashion, although its tenets haunt much of science. The foundations of these ideas are very different. At one extreme lie analytical philosophy and much of natural science, in which the first person is an unreliable, murky cave. At the other, in phenomenology and the arts, the first person is a pregiven reality of all human experience and must be explored.
Another parallel but vital story, one I can tell only from a great distance because my knowledge extends no further, begins with one revolution, Newtonian physics, and is transformed in another, quantum physics: A stable theoretical frame viewed from nowhere is replaced by one that wobbles and implicates the observer in the observed. The view from nowhere becomes the view from somewhere. There are those who believe quantum theory is relevant to understanding brain function and others who don’t. The battles over these myriad positions and sliding borders continue, and you will be relieved to know I won’t try to solve these questions in an absolute way. Were I to make the attempt, I am afraid I would find myself as lost as poor Appenzzell in the wilds of Sumatra and would never return from the field.
At the very least, we can say a pronominal stance is crucial to what appears before us—the thing we see. While writing this essay, I have changed pronouns several times, moved from “I” to “we,” as well as to “she” and “he,” often without fully conscious deliberation. My “I” can be purely rhetorical or deeply personal. By using the first person, however, I always imply the second person. I am speaking to someone out there, a general “you,” but a you nevertheless. The “I” carries its own ghostly interlocutor. In the nineteenth century, Wilhelm von Humboldt (1767–1835) wrote:
There lies in the primordial nature of language an unalterable dualism, and the very possibility of speech is conditioned by address and response. Even thinking is essentially accompanied by the inclination toward social existence and one longs for a Thou who will correspond to his I.7
Writing about dialogue and what he called “the between,” which he considered an ontological reality, Martin Buber restates this position: “Humboldt knew exactly through what process the fact of the Thou in the I is established: through the I becoming a Thou to another I.”8 In Problems in General Linguistics, 1966, Émile Benveniste reiterates a similar dialectic of person. “Consciousness of self,” he writes, “is only possible if it is experienced by contrast. I use I only when speaking to someone who will be a you in my address. It is this condition of dialogue that is constitutive of person, for it implies that reciprocally I becomes you in the address of the one who in turn designates himself as I.”9 Reciprocity of person might be described as a fundamental linguistic hinge, but the flexibility required to use this axis of discourse is acquired late. Initially, young children refer to themselves by their proper names in the third person, perfectly understandable in terms of human development because proper names are static and pronouns are mobile.
For Benveniste, the third person is nonperson, because the third person cannot enunciate. Corresponding to this distinction is his idea of personal discours, located on the I-you axis as opposed to third-person histoire, in which, he writes, “there is then no longer even a narrator . . . No one speaks here; the events seem to narrate themselves.”10 Benveniste’s linguistic boundary is not drawn between self and other but rather between a first- and second-person lived reality and what remains outside it—the nonperson or Mr. Nobody who narrates from nowhere.
We are inexorably led to the fundamental question: What does saying “I” and “you” have to do with who and what we are? For Benveniste, “Ego is he who says ego,” and language is responsible for subjectivity “in all its parts.”11 This situates the linguist in a twentieth-century Continental tradition in which the subject is constituted by signs. Michel Foucault is a brilliant elucidator of this position, a mode of thought that posits a world in which the body is an entity created by the discourses of history, a body made of words. However, as Lynda Birke points out, “The body, for all its apparent centrality in Foucault’s work, disappears as a material entity.”12 In her book Giving an Account of Oneself, Judith Butler articulates a postmodern position: “Indeed, when the ‘I’ seeks to give an account of itself, an account that must include the conditions of its own emergence, it must, as a matter of necessity, become a social theorist.”13
I agree with Butler that the “I” is profoundly shaped by our moment in history and its social conventions, that our relations to our own bodies, crucially to what has been called gender, are bound up in intersubjective cultural creations that for better and for worse become us. For example, when metaphors of hard and soft are applied to a person, a discipline, a theory, or a text, we are binding those persons, disciplines, theories, and texts in meanings that have a long and complex social history, and to deny this seems absurd.
The insight that the self is a cultural fiction overlaps with that of the analytical philosopher Daniel Dennett. Dennett does not believe in either qualia or the self. For him, we are made of the fictions we spin or, rather, that spin us. These create what he calls “a narrative center of gravity,” an illusion of selfness. Dennett, highly influenced by behaviorism, remains far from the tradition of Foucault, Butler, and French theory, but in Consciousness Explained, he recognizes that his ideas bear a certain resemblance to this alien thought. He seems to have arrived at this revelation not by reading the thinkers themselves but through the English novelist David Lodge. In Lodge’s novel Nice Work, an academic, Robyn, a fan of Jacques Derrida, espouses “semiotic materialism,” which her creator, Lodge, in an ironic rephrasing of Heidegger’s famous dictum “Language speaks man,” reformulates as “You are what speaks you.” With evident good humor, Dennett writes, “Robyn and I think alike—and of course we are both, by our own accounts, fictional characters of a sort, though of a slightly different sort.”14