The Blank Slate: The Modern Denial of Human Nature


by Steven Pinker


  This idea can make sense of other kinds of variability across cultures. Many anthropologists sympathetic to social constructionism have claimed that emotions familiar to us, like anger, are absent from some cultures.19 (A few anthropologists say there are cultures with no emotions at all!)20 For example, Catherine Lutz wrote that the Ifaluk (a Micronesian people) do not experience our “anger” but instead undergo an experience they call song. Song is a state of dudgeon triggered by a moral infraction such as breaking a taboo or acting in a cocky manner. It licenses one to shun, frown at, threaten, or gossip about the offender, though not to attack him physically. The target of song experiences another emotion allegedly unknown to Westerners: metagu, a state of dread that impels him to appease the song-ful one by apologizing, paying a fine, or offering a gift.

  The philosophers Ron Mallon and Stephen Stich, inspired by Chomsky and other cognitive scientists, point out that the issue of whether to call Ifaluk song and Western anger the same emotion or different emotions is a quibble about the meaning of emotion words: whether they should be defined in terms of surface behavior or underlying mental computation.21 If an emotion is defined by behavior, then emotions certainly do differ across cultures. The Ifaluk react emotionally to a woman working in the taro gardens while menstruating or to a man entering a birthing house, and we do not. We react emotionally to someone shouting a racial epithet or raising the middle finger, but as far as we know, the Ifaluk do not. But if an emotion is defined by mental mechanisms—what psychologists like Paul Ekman and Richard Lazarus call “affect programs” or “if-then formulas” (note the computational vocabulary)—we and the Ifaluk are not so different after all.22 We might all be equipped with a program that responds to an affront to our interests or our dignity with an unpleasant burning feeling that motivates us to punish or to exact compensation. But what counts as an affront, whether we feel it is permissible to glower in a particular setting, and what kinds of retribution we think we are entitled to, depend on our culture. The stimuli and responses may differ, but the mental states are the same, whether or not they are perfectly labeled by words in our language.
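
  To make the “if-then formula” metaphor concrete, here is a minimal sketch, purely illustrative and not drawn from the book or from Ekman’s or Lazarus’s work: a single, universal program whose triggers and sanctioned responses are supplied by the surrounding culture. The names (Culture, affect_program, and the example triggers) are invented for the illustration.

```python
# Toy sketch of an "affect program": the if-then core is the same everywhere;
# what counts as an affront and what retaliation is permitted are learned.
from dataclasses import dataclass

@dataclass
class Culture:
    affronts: set[str]                # culturally defined triggers
    sanctioned_responses: list[str]   # culturally permitted reactions

def affect_program(event: str, culture: Culture) -> list[str]:
    """Universal core: if the event counts as an affront, motivate retaliation."""
    if event in culture.affronts:            # what counts as an affront varies
        return culture.sanctioned_responses  # how one may retaliate varies
    return []                                # the underlying computation does not

ifaluk = Culture({"breaking a taboo", "acting cocky"},
                 ["frown", "gossip", "demand a fine"])
western = Culture({"racial epithet", "raised middle finger"},
                  ["glower", "demand an apology"])

print(affect_program("acting cocky", ifaluk))   # ['frown', 'gossip', 'demand a fine']
print(affect_program("acting cocky", western))  # [] -- same program, different triggers
```

  The point of the sketch is only that the stimuli and responses sit in the parameters, not in the program itself, which is one way of cashing out the claim that the mental states are the same across cultures.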

  And as in the case of language, without some innate mechanism for mental computation, there would be no way to learn the parts of a culture that do have to be learned. It is no coincidence that the situations that provoke song among the Ifaluk include violating a taboo, being lazy or disrespectful, and refusing to share, but do not include respecting a taboo, being kind and deferential, and standing on one’s head. The Ifaluk construe the first three as similar because they evoke the same affect program—they are perceived as affronts. That makes it easier to learn that they call for the same reaction and makes it more likely that those three would be lumped together as the acceptable triggers for a single emotion.

  The moral, then, is that familiar categories of behavior—marriage customs, food taboos, folk superstitions, and so on—certainly do vary across cultures and have to be learned, but the deeper mechanisms of mental computation that generate them may be universal and innate. People may dress differently, but they may all strive to flaunt their status via their appearance. They may respect the rights of the members of their clan exclusively or they may extend that respect to everyone in their tribe, nation-state, or species, but all divide the world into an in-group and an out-group. They may differ in which outcomes they attribute to the intentions of conscious beings, some allowing only that artifacts are deliberately crafted, others believing that illnesses come from magical spells cast by enemies, still others believing that the entire world was brought into being by a creator. But all of them explain certain events by invoking the existence of entities with minds that strive to bring about goals. The behaviorists got it backwards: it is the mind, not behavior, that is lawful.

  A fifth idea: The mind is a complex system composed of many interacting parts. The psychologists who study emotions in different cultures have made another important discovery. Candid facial expressions appear to be the same everywhere, but people in some cultures learn to keep a poker face in polite company.23 A simple explanation is that the affect programs fire up facial expressions in the same way in all people, but a separate system of “display rules” governs when they can be shown.

  The difference between these two mechanisms underscores another insight of the cognitive revolution. Before the revolution, commentators invoked enormous black boxes such as “the intellect” or “the understanding,” and they made sweeping pronouncements about human nature, such as that we are essentially noble or essentially nasty. But we now know that the mind is not a homogeneous orb invested with unitary powers or across-the-board traits. The mind is modular, with many parts cooperating to generate a train of thought or an organized action. It has distinct information-processing systems for filtering out distractions, learning skills, controlling the body, remembering facts, holding information temporarily, and storing and executing rules. Cutting across these data-processing systems are mental faculties (sometimes called multiple intelligences) dedicated to different kinds of content, such as language, number, space, tools, and living things. Cognitive scientists at the East Pole suspect that the content-based modules are differentiated largely by the genes;24 those at the West Pole suspect they begin as small innate biases in attention and then coagulate out of statistical patterns in the sensory input.25 But those at both poles agree that the brain is not a uniform meatloaf. Still another layer of information-processing systems can be found in the affect programs, that is, the systems for motivation and emotion.

  The upshot is that an urge or habit coming out of one module can be translated into behavior in different ways—or suppressed altogether—by some other module. To take a simple example, cognitive psychologists believe that a module called the “habit system” underlies our tendency to produce certain responses habitually, such as responding to a printed word by pronouncing it silently. But another module, called the “supervisory attention system,” can override it and focus on the information relevant to a stated problem, such as naming the color of the ink the word is printed in, or thinking up an action that goes with the word.26 More generally, the interplay of mental systems can explain how people can entertain revenge fantasies that they never act on, or can commit adultery only in their hearts. In this way the theory of human nature coming out of the cognitive revolution has more in common with the Judeo-Christian theory of human nature, and with the psychoanalytic theory proposed by Sigmund Freud, than with behaviorism, social constructionism, and other versions of the Blank Slate. Behavior is not just emitted or elicited, nor does it come directly out of culture or society. It comes from an internal struggle among mental modules with differing agendas and goals.
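
  The habit-versus-supervision example (a Stroop-like task) can also be pictured schematically. The sketch below is my own toy illustration under the assumptions stated in the paragraph above, not a model taken from the cited work; the function names are invented.

```python
# Toy sketch of one module overriding another: a "habit system" that wants to
# read the printed word, and a "supervisory attention system" that suppresses
# that habit when the stated task is to name the ink color instead.

def habit_system(word: str, ink: str) -> str:
    # The overlearned, automatic response: pronounce the printed word.
    return word

def supervisory_attention_system(word: str, ink: str, task: str) -> str:
    habitual = habit_system(word, ink)
    if task == "name the ink color":
        return ink       # override the habitual response with the task-relevant one
    return habitual      # otherwise let the habit through

print(supervisory_attention_system("RED", ink="green", task="read the word"))       # RED
print(supervisory_attention_system("RED", ink="green", task="name the ink color"))  # green
```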

  The idea from the cognitive revolution that the mind is a system of universal, generative computational modules obliterates the way that debates on human nature have been framed for centuries. It is now simply misguided to ask whether humans are flexible or programmed, whether behavior is universal or varies across cultures, whether acts are learned or innate, whether we are essentially good or essentially evil. Humans behave flexibly because they are programmed: their minds are packed with combinatorial software that can generate an unlimited set of thoughts and behavior. Behavior may vary across cultures, but the design of the mental programs that generate it need not vary. Intelligent behavior is learned successfully because we have innate systems that do the learning. And all people may have good and evil motives, but not everyone may translate them into behavior in the same way.

  THE SECOND BRIDGE between mind and matter is neuroscience, especially cognitive neuroscience, the study of how cognition and emotion are implemented in the brain.27 Francis Crick wrote a book about the brain called The Astonishing Hypothesis, alluding to the idea that all our thoughts and feelings, joys and aches, dreams and wishes consist in the physiological activity of the brain.28 Jaded neuroscientists, who take the idea for granted, snickered at the title, but Crick was right: the hypothesis is astonishing to most people the first time they stop to ponder it. Who cannot sympathize with the imprisoned Dmitri Karamazov as he tries to make sense of what he has just learned from a visiting academic?

  Imagine: inside, in the nerves, in the head—that is, these nerves are there in the brain… (damn them!) there are sort of little tails, the little tails of those nerves, and as soon as they begin quivering… that is, you see, I look at something with my eyes and then they begin quivering, those little tails… and when they quiver, then an image appears… it doesn’t appear at once, but an instant, a second, passes… and then something like a moment appears; that is, not a moment—devil take the moment!—but an image; that is, an object, or an action, damn it! That’s why I see and then think, because of those tails, not at all because I’ve got a soul, and that I am some sort of image and likeness. All that is nonsense! Rakitin explained it all to me yesterday, brother, and it simply bowled me over. It’s magnificent, Alyosha, this science! A new man’s arising—that I understand…. And yet I am sorry to lose God!29

  Dostoevsky’s prescience is itself astonishing, because in 1880 only the rudiments of neural functioning were understood, and a reasonable person could have doubted that all experience arises from quivering nerve tails. But no longer. One can say that the information-processing activity of the brain causes the mind, or one can say that it is the mind, but in either case the evidence is overwhelming that every aspect of our mental lives depends entirely on physiological events in the tissues of the brain.

  When a surgeon sends an electrical current into the brain, the person can have a vivid, lifelike experience. When chemicals seep into the brain, they can alter the person’s perception, mood, personality, and reasoning. When a patch of brain tissue dies, a part of the mind can disappear: a neurological patient may lose the ability to name tools, recognize faces, anticipate the outcome of his behavior, empathize with others, or keep in mind a region of space or of his own body. (Descartes was thus wrong when he said that “the mind is entirely indivisible” and concluded that it must be completely different from the body.) Every emotion and thought gives off physical signals, and the new technologies for detecting them are so accurate that they can literally read a person’s mind and tell a cognitive neuroscientist whether the person is imagining a face or a place. Neuroscientists can knock a gene out of a mouse (a gene also found in humans) and prevent the mouse from learning, or insert extra copies and make the mouse learn faster. Under the microscope, brain tissue shows a staggering complexity—a hundred billion neurons connected by a hundred trillion synapses—that is commensurate with the staggering complexity of human thought and experience. Neural network modelers have begun to show how the building blocks of mental computation, such as storing and retrieving a pattern, can be implemented in neural circuitry. And when the brain dies, the person goes out of existence. Despite concerted efforts by Alfred Russel Wallace and other Victorian scientists, it is apparently not possible to communicate with the dead.

  Educated people, of course, know that perception, cognition, language, and emotion are rooted in the brain. But it is still tempting to think of the brain as it was shown in old educational cartoons, as a control panel with gauges and levers operated by a user—the self, the soul, the ghost, the person, the “me.” But cognitive neuroscience is showing that the self, too, is just another network of brain systems.

  The first hint came from Phineas Gage, the nineteenth-century railroad worker familiar to generations of psychology students. Gage was using a yard-long spike to tamp explosive powder into a hole in a rock when a spark ignited the powder and sent the spike into his cheekbone, through his brain, and out the top of his skull. Phineas survived with his perception, memory, language, and motor functions intact. But in the famous understatement of a co-worker, “Gage was no longer Gage.” A piece of iron had literally turned him into a different person, from courteous, responsible, and ambitious to rude, unreliable, and shiftless. It did this by impaling his ventromedial prefrontal cortex, the region of the brain above the eyes now known to be involved in reasoning about other people. Together with other areas of the prefrontal lobes and the limbic system (the seat of the emotions), it anticipates the consequences of one’s actions and selects behavior consonant with one’s goals.30

  Cognitive neuroscientists have not only exorcised the ghost but have shown that the brain does not even have a part that does exactly what the ghost is supposed to do: review all the facts and make a decision for the rest of the brain to carry out.31 Each of us feels that there is a single “I” in control. But that is an illusion that the brain works hard to produce, like the impression that our visual fields are rich in detail from edge to edge. (In fact, we are blind to detail outside the fixation point. We quickly move our eyes to whatever looks interesting, and that fools us into thinking that the detail was there all along.) The brain does have supervisory systems in the prefrontal lobes and anterior cingulate cortex, which can push the buttons of behavior and override habits and urges. But those systems are gadgets with specific quirks and limitations; they are not implementations of the rational free agent traditionally identified with the soul or the self.

  One of the most dramatic demonstrations of the illusion of the unified self comes from the neuroscientists Michael Gazzaniga and Roger Sperry, who showed that when surgeons cut the corpus callosum joining the cerebral hemispheres, they literally cut the self in two, and each hemisphere can exercise free will without the other one’s advice or consent. Even more disconcertingly, the left hemisphere constantly weaves a coherent but false account of the behavior chosen without its knowledge by the right. For example, if an experimenter flashes the command “WALK” to the right hemisphere (by keeping it in the part of the visual field that only the right hemisphere can see), the person will comply with the request and begin to walk out of the room. But when the person (specifically, the person’s left hemisphere) is asked why he just got up, he will say, in all sincerity, “To get a Coke”—rather than “I don’t really know” or “The urge just came over me” or “You’ve been testing me for years since I had the surgery, and sometimes you get me to do things but I don’t know exactly what you asked me to do.” Similarly, if the patient’s left hemisphere is shown a chicken and his right hemisphere is shown a snowfall, and both hemispheres have to select a picture that goes with what they see (each using a different hand), the left hemisphere picks a claw (correctly) and the right picks a shovel (also correctly). But when the left hemisphere is asked why the whole person made those choices, it blithely says, “Oh, that’s simple. The chicken claw goes with the chicken, and you need a shovel to clean out the chicken shed.”32

  The spooky part is that we have no reason to think that the baloney-generator in the patient’s left hemisphere is behaving any differently from ours as we make sense of the inclinations emanating from the rest of our brains. The conscious mind—the self or soul—is a spin doctor, not the commander in chief. Sigmund Freud immodestly wrote that “humanity has in the course of time had to endure from the hands of science three great outrages upon its naïve self-love”: the discovery that our world is not the center of the celestial spheres but rather a speck in a vast universe, the discovery that we were not specially created but instead descended from animals, and the discovery that often our conscious minds do not control how we act but merely tell us a story about our actions. He was right about the cumulative impact, but it was cognitive neuroscience rather than psychoanalysis that conclusively delivered the third blow.

  Cognitive neuroscience is undermining not just the Ghost in the Machine but also the Noble Savage. Damage to the frontal lobes does not only dull the person or subtract from his behavioral repertoire but can unleash aggressive attacks.33 That happens because the damaged lobes no longer serve as inhibitory brakes on parts of the limbic system, particularly a circuit that links the amygdala to the hypothalamus via a pathway called the stria terminalis. Connections between the frontal lobe in each hemisphere and the limbic system provide a lever by which a person’s knowledge and goals can override other mechanisms, and among those mechanisms appears to be one designed to generate behavior that harms other people.34

  Nor is the physical structure of the brain a blank slate. In the mid-nineteenth century the neurologist Paul Broca discovered that the folds and wrinkles of the cerebral cortex do not squiggle randomly like fingerprints but have a recognizable geometry. Indeed, the arrangement is so consistent from brain to brain that each fold and wrinkle can be given a name. Since that time neuroscientists have discovered that the gross anatomy of the brain—the sizes, shapes, and connectivity of its lobes and nuclei, and the basic plan of the cerebral cortex—is largely shaped by the genes in normal prenatal development.35 So is the quantity of gray matter in the different regions of the brains of different people, including the regions that underlie language and reasoning.36

  This innate geometry and cabling can have real consequences for thinking, feeling, and behavior. As we shall see in a later chapter, babies who suffer damage to particular areas of the brain often grow up with permanent deficits in particular mental faculties. And people born with variations on the typical plan have variations in the way their minds work. According to a recent study of the brains of identical and fraternal twins, differences in the amount of gray matter in the frontal lobes are not only genetically influenced but are significantly correlated with differences in intelligence.37 A study of Albert Einstein’s brain revealed that he had large, unusually shaped inferior parietal lobules, which participate in spatial reasoning and intuitions about number.38 Gay men are likely to have a smaller third interstitial nucleus in the anterior hypothalamus, a nucleus known to have a role in sex differences.39 And convicted murderers and other violent, antisocial people are likely to have a smaller and less active prefrontal cortex, the part of the brain that governs decision making and inhibits impulses.40 These gross features of the brain are almost certainly not sculpted by information coming in from the senses, which implies that differences in intelligence, scientific genius, sexual orientation, and impulsive violence are not entirely learned.

 
