Behave: The Biology of Humans at Our Best and Worst


by Robert M. Sapolsky


  The clearest human mastery of symbolism comes with our use of language. Suppose you are being menaced by something and thus scream your head off. Someone listening can’t tell if the blood-curdling “Aiiiii!” is in response to an approaching comet, suicide bomber, or Komodo dragon. It just means that things are majorly not right; the message is the meaning. Most animal communication is about such present-tense emotionality.

  Symbolic language brought huge evolutionary advantages. This can be seen even in the rudiments of symbolism in other species. When vervet monkeys, for instance, spot a predator, they don’t generically scream. They use distinct vocalizations, different “protowords,” where one means “Predator on the ground, run up the tree!” and another means “Predator in the air, run down the tree!” Evolving the cognitive capacity to make that distinction is mighty useful, as it prompts you to run away from, rather than toward, something intent on eating you.

  Language pries apart a message from its meaning, and as our ancestors improved at this separation, advantages accrued.5 We became capable of representing past and future emotions, as well as messages unrelated to emotion. We evolved great expertise at separating message from reality, which, as we’ve seen, requires the frontal cortex to regulate the nuances of face, body, and voice: lying. This capacity creates complexities that no one else—from slime mold to chimp—deals with in life’s Prisoner’s Dilemmas.

  The height of the symbolic features of language is our use of metaphor. And this is not just flowery metaphors, as when we declare that life is a bowl of cherries. Metaphors are everywhere in language—we may literally and physically be “in” a room, but we are only metaphorically inside something when we are “in” a good mood, “in” cahoots with someone, “in” luck, a funk, a groove,* or love. We are only metaphorically standing under something when we “understand” it.*6 The renowned cognitive linguist George Lakoff of UC Berkeley has explored the ubiquity of metaphor in language in books such as Metaphors We Live By (with philosopher Mark Johnson) and Moral Politics: How Liberals and Conservatives Think (where he demonstrates how political power involves controlling metaphors—do you favor “choice” or “life”? are you “tough on” crime, or does your “heart bleed”? are you loyal to a “fatherland” or a “motherland”? and have you captured the flag of “family values” from your opponent?). For Lakoff, language is always a metaphor, transferring information from one individual to another by putting thought into words, as if words were shopping bags.7

  Symbols, metaphors, analogies, parables, synecdoche, figures of speech. We understand that a captain wants more than just hands when ordering all of them on deck, that Kafka’s Metamorphosis isn’t really about a cockroach, and that June doesn’t really bust out all over. If we are of a certain theological ilk, we see bread and wine intertwined with body and blood. We learn that the orchestral sounds constituting the 1812 Overture represent Napoleon getting his ass kicked when retreating from Moscow. And that “Napoleon getting his ass kicked” represents thousands of soldiers dying cold and hungry, far from home.

  This chapter explores the neurobiology of some of the most interesting outposts of symbolic and metaphorical thinking. It makes a key point: these capacities evolved so recently that our brains are, if you will, winging it and improvising on the fly when dealing with metaphor. As a result, we are actually pretty lousy at distinguishing between the metaphorical and literal, at remembering that “it’s only a figure of speech”—with enormous consequences for our best and worst behaviors.

  We start with examples of odd ways our brains handle metaphor, and the behavioral manifestations of those oddities; some have been introduced previously.

  FEELING SOMEONE ELSE’S PAIN

  Consider the following: You stub your toe. Pain receptors there send messages to the spine and on up to the brain, where various regions kick into action. Some of these areas tell you about the location, intensity, and quality of the pain. Is it your left toe or right ear that hurts? Was your toe stubbed or crushed by a tractor-trailer? These various pain-ometers, the meat and potatoes of pain processing, are found in every mammal.

  As we first learned in chapter 2, the frontal cortical anterior cingulate cortex (ACC) also plays a role, assessing the meaning of the pain.8 Maybe it’s bad news: your painful toe signals the start of some unlikely disease. Or maybe it’s good news: you’re going to get your fire-walker diploma because the hot coals only made your toes throb. As we saw in the last chapter, the ACC is heavily involved in “error detection,” noting discrepancies between what is anticipated and what occurs. And pain from out of nowhere surely represents a discrepancy between the pain-free state you anticipated and the painful reality.

  But the ACC does more than just tell you the meaning of a painful toe. As we saw in chapter 6, put a subject in a brain scanner, make them think they’re tossing a Cyberball back and forth with two other players, and then make them feel excluded—the other two stop throwing the ball to them. “Hey, how come they don’t want to play with me?” And the ACC activates.

  In other words, rejection hurts. “Well, yeah,” you might say. “But that’s not like stubbing your toe.” But as far as those neurons in the ACC are concerned, social and literal pain are the same. And as proof that the former is rooted in sociality, there is no ACC activation if the subject believes the ball isn’t being thrown to them because of a glitch connecting them to the other two subjects’ computers.

  And the ACC can take things a step further, as we saw in chapter 14. Receive a mild shock, and there’s activation of your ACC (along with activation of the more mundane pain-ometer regions). Now instead watch your beloved get shocked in the same way. Pain-ometer brain regions are silent, but the ACC activates. For those neurons, feeling someone else’s pain isn’t just a figure of speech.

  Moreover, the brain intermixes literal and psychic pain.9 The neurotransmitter substance P plays a central role in communicating painful signals from pain receptors in skin, muscles, and joints up into the brain. It’s got pain-ometer written all over it. And remarkably, its levels are elevated in clinical depression, and drugs that block the actions of substance P can have marked antidepressant properties. Stubbed toe, stubbed psyche. Moreover, there is activation of the cortical parts of pain networks when we feel dread—anticipating an impending shock.

  Furthermore, the brain becomes literal when we do the flip side of empathy.10 It’s painful watching a hated competitor succeed, and we activate the ACC at that time. Conversely, if he fails, we gloat, feel schadenfreude, get pleasure from his pain, and activate dopaminergic reward pathways. Forget “Your pain is my pain.” Your pain is my gain.

  DISGUST AND PURITY

  This is our familiar domain of the insular cortex. If you bite into rancid food, the insula activates, just as in every other mammal. You wrinkle your nose, raise your upper lip, narrow your eyes, all to protect mouth, eyes, and nasal cavities. Your heart slows. You reflexively spit out the food, gag, perhaps even vomit. All to protect yourself from toxins and infectious pathogens.11

  As humans we do some fancier things: Think about rancid food, and the insula activates. Look at faces showing disgust, or subjectively unattractive faces, and the same occurs. And most important, if you think about a truly reprehensible act, the same occurs. The insula mediates visceral responses to norm violations, and the more activation, the more condemnation. And this is literally visceral, not just metaphorically so—for example, when I heard about the Sandy Hook Elementary School massacre, “feeling sick to my stomach” wasn’t a mere figure of speech. When I imagined the reality of the murder of twenty first-graders and the six adults protecting them, I felt nauseated. The insula not only prompts the stomach to purge itself of toxic food; it prompts the stomach to purge the reality of a nightmarish event. The distance between the symbolic message and the meaning disappears.12

  The linking of visceral and moral disgust is bidirectional. As shown in a number of studies, contemplating a morally disgusting act leaves more than a metaphorical bad taste in your mouth—people eat less immediately afterward, and a neutral-tasting beverage drunk afterward is rated as having a more negative taste (and, conversely, hearing about virtuous moral acts made the drink taste better).13

  In chapters 12 and 13 we saw the political implications of our brains intermixing visceral and moral disgust—social conservatives have a lower threshold for visceral disgust than do social progressives; the “wisdom of repugnance” school posits that being viscerally disgusted by something is a pretty good indicator that it is morally wrong; implicitly evoking a sense of visceral disgust (e.g., by sitting in close proximity to a foul odor) makes us more socially conservative.14 This is not merely because visceral disgust is an aversive state—inducing a sense of sadness, rather than disgust, doesn’t have the same effect; moreover, moralizing about purity, while predicted by people’s propensity toward feeling disgust, is not predicted by propensities toward fear or anger.*

  The physiological core of gustatory disgust is protection against pathogens. The core of the intermixing of visceral and moral disgust is a sense of threat as well. A socially conservative stance about, say, gay marriage is not just that it is simply wrong in an abstract sense, or even “disgusting,” but that it constitutes a threat—to the sanctity of marriage and family values. This element of threat is shown in a great study in which subjects either did or didn’t read an article about the health risks of airborne bacteria.15 All then read a history article that used imagery of America as a living organism, with statements like “Following the Civil War, the United States underwent a growth spurt.” Those who read about scary bacteria before thinking about the United States as an organism were then more likely to express negative views about immigration (without changing attitudes about an economic issue). My guess is that people with a stereotypically conservative exclusionary stance on immigration rarely feel disgust at the idea that people elsewhere in the world would want to come to the United States for better lives. Instead there is a sense of threat from the rabble, the unwashed masses, to that nebulous entity, the American way of life.

  How cerebral is this intertwining of moral and visceral disgust? Does the insula get involved in moral disgust only if it’s of a particularly visceral nature—blood and guts, coprophagia, body parts? Paul Bloom suggests this is the case. In contrast, Jonathan Haidt feels that even the most cognitive forms of moral disgust (“He’s a chess grand master and he shows off by beating that eight-year-old in three moves and reducing her to tears—that’s disgusting”) are heavily intertwined.16 In support of that, something as unvisceral as getting a lousy offer in an economic game activates the insula (a lousy offer from another human, rather than a computer, that is); the more insula activation, the greater the likelihood of the offer being rejected. Amid this debate, it is clear that the intertwining of visceral and moral disgust is, at the least, greatest when the latter taps into core disgust. To repeat a neat quote from Paul Rozin, introduced in chapter 11, “Disgust serves as an ethnic or out-group marker.” First you’re disgusted by how Others smell, a gateway to then being disgusted by how Others think.

  Of course, insofar as metaphorically being dirty and disorderly = bad, metaphorically being clean and orderly = good.*17 Just consider the use of the word “neat” in the previous paragraph. Similarly, in Swahili the word safi, meaning “clean” (from kusafisha, “to clean”), is used in the same slangy metaphorical sense as “neat” in English. Once while in Kenya, I was hitching a ride to Nairobi from somewhere out in the boondocks and got to chatting with a local teenager who was curious about me. “Where are you going?” he asked. Nairobi. “Nairobi ni [is] safi,” he said wistfully about the far-off metropolis. How are you going to keep them down on the farm once they’ve seen the neatness of Nairobi?

  Literal cleanliness and orderliness can release us from abstract cognitive and affective distress—just consider how, during moments when life seems to be spiraling out of control, it can be calming to organize your clothes, clean the living room, get the car washed.18 And consider how the displaced need to impose cleanliness and order runs and ruins the lives of people suffering from the archetypal anxiety disorder, obsessive-compulsive disorder. The ability of literal cleanliness to alter cognition was shown in one study. Subjects examined an array of music CDs, picked ten that they liked, and ranked them in order of liking; they were then offered a free copy of one of their midrange choices (number five or six). Subjects were distracted with some other task and then asked to rerank the ten CDs. And they showed a common psychological phenomenon, which was to now overvalue the CD they’d been given, ranking it higher on the list than before. Unless they had just washed their hands (ostensibly to try a new brand of soap), in which case no reranking occurred. Clean hands, clean slate.

  But beginning much further back than the “social hygiene” movement of the turn of the twentieth century, being metaphorically neat, pure, and hygienic could be a moral state as well—cleanliness was not just a good way to avoid uncontrolled diarrhea, dehydration, and serious electrolyte imbalance, but was also ideal for cozying up to a god.

  One study was built around the phenomenon of visceral disgust making people harsher in their moral judgments. The authors first replicated this effect, showing that watching a short film clip of something physically disgusting made subjects more morally judgmental—unless they had washed their hands after watching the film. Another study suggests that the washing decreases emotional arousal, as it decreased the diameter of subjects’ pupils.19

  We intertwine physical and moral purity when it comes to our own actions. In one of my all-time favorite psychology studies, Chen-Bo Zhong of the University of Toronto and Katie Liljenquist of Northwestern University demonstrated that the brain has trouble distinguishing between being a dirty scoundrel and being in need of a bath. Subjects were asked to recount either a moral or an immoral act in their past. Afterward, as a token of appreciation, the researchers offered the volunteers a choice between the gift of a pencil and a package of antiseptic wipes. And the folks who had just wallowed in their ethical failures were more likely to go for the wipes. Another study, showing the same effect when people were instructed to lie, demonstrated that the more adversely consequential the lie was presented as being, the more washing subjects did. Lady Macbeth and Pontius Pilate weren’t the only ones to at least try to absolve their sins by washing their hands, and this phenomenon of embodied cognition is referred to as the “Macbeth effect.”20

  This effect is remarkably concrete. In another study subjects were instructed to lie about something—with either their mouths (i.e., to tell a lie) or their hands (i.e., to write down a lie).21 Afterward, remarkably, liars were more likely to pick complementary cleansing products than control subjects who communicated something truthful: the immoral mouth-ers were more likely to pick a mouthwash sample; the immoral scribes, hand soap. Furthermore, as shown with neuroimaging, when contemplating mouthwash versus soap, those who had just spoken a lie activated parts of the sensorimotor cortex related to the mouth (i.e., the subjects were more aware of their mouths at the time); those who had written the lie activated the cortical regions mapping onto their hand. Embodied cognition can be specific to parts of the body.

  Another fascinating study showed the influence of culture on the Macbeth effect. The studies just cited were carried out with European or American subjects. When the same is done with East Asian subjects, the urge afterward is to wash the face, rather than the hands. If you are going to save face, it should be a clean one.22

  Finally, most important, this intermixing of moral and physical hygiene affects the way we actually behave. That original study on contemplating one’s moral failings and the subsequent desire to wash hands included a second experiment. As before, subjects were told to recall an immoral act of theirs. Afterward subjects either did or didn’t have the opportunity to clean their hands. Those who were able to wash were less likely to respond to a subsequent (experimentally staged) request for help. In another study merely watching someone else wash their hands in this situation (versus watching them type) also decreased helpfulness afterward (although to a lesser extent than the subject washing).23

  Many of our moments of prosociality, of altruism and Good Samaritanism, are acts of restitution, attempts to counter our antisocial moments. What these studies show is that if those metaphorically dirtied hands have been unmetaphorically washed in the interim, they’re less likely to reach out to try to balance the scales.

  REAL VERSUS METAPHORICAL SENSATION

  Then there are ways in which we confuse literal with metaphorical sensation.

  A brilliant study by John Bargh of Yale concerned haptic sensations (I had to look the word up—haptic: related to the sense of touch). Volunteers evaluated the résumés of supposed job applicants; crucially, the résumé was attached to a clipboard of one of two weights. When subjects held the heavier clipboard, they tended to judge candidates as more “serious” (while clipboard weight had no effect on other perceived traits). When you next apply for a job, hope that your résumé will be attached to a heavy clipboard. How else would the evaluator figure out that you can appreciate the gravity of a situation and deal with weighty matters, rather than being a lightweight?24

  In the next study subjects assembled a puzzle with pieces that were either smooth or rough as sandpaper, then observed a socially ambiguous interaction. Handle the rough puzzle pieces and the interactions were rated as less coordinated, smooth, or successful (it’s not clear, however, if those subjects were more likely, at home that evening, to use coarse language in describing their rough day).

 
