
The Anatomy of Violence


by Adrian Raine


  Let’s start with lying. And please do not protest your innocence any further, because as Mark Twain rightly put it, “everybody lies—every day, every hour, awake, asleep, in his dreams, in his joy, in his mourning.”57 You do lie—honestly you do. So how do we probe your antisocial mind? What instruments can we use to detect when people are telling whoppers?

  “Oh, Agent Starling, you think you can dissect me with this blunt little tool?” Hannibal Lecter in the classic thriller The Silence of the Lambs had a point, and Clarice Starling, the FBI agent interviewing him, should have known better. The paper-and-pencil questionnaire tools she was using on the serial killer Hannibal “the Cannibal” Lecter in his prison cell have been traditionally used by forensic specialists to probe the minds of murderers. But they have been ineffective in revealing much that is fundamentally wrong with psychopaths like Lecter. After all, psychopaths have been known to tell a white lie or two about themselves, so do you really think they will tell the truth in a simple questionnaire? We need something far sharper than a blunt pencil and paper questionnaire to learn when people are fibbing.

  A big fat sixty-ton magnet of the type used in MRI does not sound very sharp, but it’s not a blunt tool. When it comes to discerning truth from fiction, it’s as sharp as a razor. My academic friends Tatia Lee, at Hong Kong University, Sean Spence, at the University of Sheffield,58 and Dan Langleben, at the University of Pennsylvania, are a triumvirate of pioneering scholars who each independently stumbled onto a sublime truth about lying—the prefrontal cortex is critical.

  Tatia Lee took normal individuals—just like you—and put them into a scanner. She then gave them tasks during which they had to either tell the truth or lie. Sometimes they lied about themselves, just as we do in life. So a question might be, “Were you born in Darlington?” “Yes,” I would say. “No,” you would say. We are telling the truth. And while that is happening, Tatia collects data on what the brain is doing. Then she reverses the situation. “Were you born in Darlington?” “No,” I say. “Yes,” you say. This is an autobiographical lie—much like when you lie to a friend about whether you are free to meet up tonight.

  In another task, subjects completed a simple memory test in which a three-digit number—like 714—was quickly followed by either the same or a different set of digits. The subject had to say whether the two sets were the same or different. Sometimes they were instructed to tell the truth, while at other times they had to deliberately lie and feign memory impairment—just as some people feign injury after a claimed accident in order to profit from medical insurance.

  No matter what the task was, Tatia found that lying was consistently associated with increased activation in the prefrontal cortex as well as in areas of the parietal cortex.59 At just the same time as Tatia was doing her work in Hong Kong, Sean Spence60 and Dan Langleben61 independently found essentially the same pattern of findings in England and the United States—results that span three continents and cultures. In stark contrast, telling the truth was not associated with any increase in cortical activation.

  What’s going on here? The bottom line to deceit is that this antisocial act is a complex executive function requiring a lot of frontal lobe processing. Telling the truth is actually very easy. Telling tall tales is much harder and demands far more processing resources and brain activation. Deception involves theory of mind. When I lie to you about where I was at eight p.m. on Wednesday, January 27, I need to have an understanding of what you know about me—and what you do not know. Was I really celebrating my birthday with my family? I need to have a sense of what you think is plausible, and what is not. For this “mind reading” we need to recruit a number of brain regions that form connections between the frontal cortex and subregions of the temporal and parietal lobes.

  Yesterday it was paper-and-pencil tools. Today it’s becoming brain-imaging paraphernalia. By combining brain-imaging methodology with machine learning—similarly new and sophisticated statistical techniques—Dan Langleben and Ruben Gur, at the University of Pennsylvania, have been demonstrating accuracy rates upward of 88 percent in detecting deception. The disconcerting question is, How much longer will our lying minds remain stubbornly private to the latest investigative lie-detection tools? The current view is that lie detection based on functional imaging is not sufficiently developed for use in courts of law,62 although that could conceivably change in the future. For now, however, let’s turn to another antisocial arena that we frequently find ourselves caught up in and conflicted by—making moral decisions.

  COMPARING YOUR MORAL BRAIN WITH THE ANTISOCIAL BRAIN

  You know cannabis is illegal, but you’ve taken it anyway. You know you should not download movies from the Internet, but you persist in breaking copyright laws. And now you are reporting your taxes and wondering if you should nudge up those tax-deductible charitable contributions a hundred or two.

  We’ve all had those moments of being torn between right and wrong—between heaven and hell. The devil and the angel are battling it out hell-for-leather inside our hot heads, beating out the eventual choice with hammer and tongs. You’ve wondered what on earth to do.

  But you’ve never wondered what’s going on inside your brain during these moments, have you? That’s what a lot of social scientists and philosophers have been pondering for over a decade. And now we have some fairly clear-cut answers.

  It goes like this. We slot you inside a brain scanner and present you with a series of moral dilemmas using visual scripts. We’ll start with what is called a “personal” moral dilemma—one that’s really up close and personal. This one could almost have been plucked from a page in the life of Phineas Gage, a railway worker whom you’ll meet in a later chapter. You’re standing on a footbridge looking down on a railway track. Below you, farther back along the track, is a runaway trolley that is about to plow into a group of five unsuspecting railway men working farther ahead on the track. Standing next to you is a rather corpulent gentleman.

  Here’s the deal. If you do nothing, five innocent men are going to die right before your eyes. Alternatively, you can push the big bloke off the bridge. He’s a goner, but his big body will block the runaway trolley and save the lives of five men. What do you do?

  You only have two choices. You are out there on that bridge, hearing the death rattle of the oncoming trolley and envisioning the gory carnage that will occur. No, you’re not allowed to throw yourself off the bridge instead—saint that you are. You’re just not big enough to block the trolley. Calling out to the railway workers won’t work either.

  Put this book down and reflect on your decision—to do nothing or to push the man off the bridge.

  It’s difficult, isn’t it? And we can push and pull our minds in different directions. Are you really going to stand idly by and let five innocent men die? Look, the obese guy is likely to die early from heart disease anyway—why not give his life a dignified and worthwhile ending by saving five innocent men?

  Then again, isn’t it sort of wrong to kill? But at the same time it’s five for one—surely you cannot ignore those odds? This dilemma is damned difficult—it’s very personal and involves a high degree of conflict.

  Josh Greene, an amazing philosopher and neuroscientist at Harvard, published the first study to describe what happens at a neural level during personal moral dilemmas like this.63 Compared with more “impersonal” moral dilemmas that do not bring you face-to-face with someone else, these personal dilemmas produce increased activation in a circuit that comprises the medial prefrontal cortex, the angular gyrus, the posterior cingulate, and the amygdala. This makes sense, as these brain areas contribute to complex thinking and to the ability to step outside of yourself and evaluate the bigger social picture.

  But let’s get back to how you actually processed the dilemma. I’m not as interested in exactly what decision you came to as I am in how you felt. Wasn’t it awkward? Didn’t you feel uncomfortable? You may have even physically squirmed in your seat a bit just as one undergraduate student did in my class earlier this week when I described this dilemma. This is where that amygdala and other limbic activation comes in, contributing to the emotional “conscience” component of moral decision-making alongside some subregions of the prefrontal cortex.

  What your actual answer was is not entirely uninteresting either. About 85 percent of you felt you could not bring yourselves to push that man off the bridge. About 15 percent, however, would have sacrificed him. These numbers come from large-scale surveys of moral dilemmas. In contrast, if you put the same question to patients who have lesions to the ventral prefrontal cortex—people who, as we’ll see later, are more psychopathic than the rest of us—that “push-him-off” rate triples to about 45 percent.64

  If these same patients with ventral prefrontal lesions are hiding in a cellar with other villagers from invading troops above, and their baby starts crying, they are three times more likely to smother the baby to prevent the enemy from finding and killing everyone. This is a high-conflict dilemma. They are making a utilitarian moral decision—the greatest good for the greatest number.

  Don’t worry too much if you chose to push the man off the bridge or smother your own baby. The eighteenth-century English philosopher Jeremy Bentham, who espoused utilitarianism, would have been proud of you. It does not necessarily mean you have a frontal brain lesion or that you are a psychopath—although you may have a slightly different way of thinking about life than others.

  Josh Greene was not able to image the ventral prefrontal cortex back in 2001 when he conducted his groundbreaking study, due to what we call “susceptibility artifact,” but many other studies have replicated and extended Greene’s findings and shown activation of this region during moral-dilemma tasks.65, 66 The ventral prefrontal region is critical for making “appropriate” moral decisions—or at least passive decisions that result in no harm to others.

  We’ll come back to morality very soon, but here I want to recap where we stand with our murderous minds. I’ve been arguing that the prefrontal cortex and limbic system are misfiring in violent offenders. We also found that our murderers had poorer functioning in the angular gyrus. We’ve seen that other studies of antisocial individuals reveal abnormalities in the posterior cingulate, the amygdala, and the hippocampus, while others document abnormal functioning in the superior temporal gyrus in violent offenders,67 psychopaths,68 and antisocial individuals.69

  Let’s now compare this hit list of brain areas in antisocials to the hit list activated when normal people contemplate a moral dilemma. What are the areas most commonly activated across studies in moral tasks? They are none other than the polar/medial prefrontal cortex, the ventral prefrontal cortex, the angular gyrus, the posterior cingulate, and the amygdala.70 There is an undeniable degree of overlap.

  Let me make the point visually for you. Figure 3.5, in the color-plate section, puts together these two sets of findings—the antisocial brain and the moral brain—to create a neural model of morality and antisociality. The top scan slices the brain right down the middle from front to back—you can see the nose on the left. The middle scan slices the brain head-on. The bottom slice is a bird’s-eye view looking down on the brain. Brain regions implicated in both offending and moral decision-making are colored yellow. Areas found to be abnormal only in offenders are colored in red, and areas linked only to moral-judgment tasks are colored in green.

  You can see that there are substantial areas of overlap between antisocial/psychopathic behavior and making moral judgments. Brain regions common to both include the ventral prefrontal cortex, the polar/medial prefrontal areas, the amygdala, the angular gyrus, and the posterior superior temporal gyrus.

  It’s not a perfect match by any means. Furthermore, while the posterior cingulate is activated during moral judgment, evidence implicating this region in antisocial behavior is sparse to date, although studies have indeed found abnormalities in the posterior cingulate in psychopaths,71 impulsively aggressive patients,72 and spouse-abusers.73 Nevertheless, there are commonalities we cannot ignore. Some parts of offenders’ brains critical for thinking morally just don’t seem to be functioning very well.

  JOLLY JANE’S VOLUPTUOUS BRAIN

  We have been learning what brain areas are activated when normal people make moral decisions. But what happens in the brains of psychopaths when given the same moral dilemmas?

  Historically, psychopaths have been viewed as “morally insane.” On the outside they seem normal, and can even be very pleasant, sociable, and likeable. Ted Bundy is a classic example of a serial killer who had a charismatic personality that allowed him to lure young female victims into his deadly trap.74 Yet when it comes to having a sense of morality, there is something missing in psychopaths. Here we’ll take a closer look at what this “moral insanity” is like from a real-world case. What exactly is broken in the brains of psychopaths at the moral level?

  My sister Roma was a nurse. My wife, Jianghong, is a nurse. My cousin Heather is a nurse. So allow me to pick the case of a nurse for our discussion of a breakdown in the moral brain. “Jolly” Jane Toppan cheerfully killed at least thirty-one people in Massachusetts during a six-year period, from 1895 to 1901. Like Randy Kraft, she was not caught for several years. Nicknamed “Jolly Jane” by hospital staff and patients due to her gregarious and happy demeanor, she became one of the most successful private nurses in Cambridge.

  Jolly Jane liked to live life to the full. Like many serial killers, she enjoyed experimenting in her modus operandi and exploring her life-or-death power over others. Like many modern-day female offenders, she particularly took pleasure in experimenting with drugs—but in an unusual way. One of her greatest excitements in life was to see life itself slowly sucked out of the patients she cared for. She would first inject them with an overdose of morphine. She would then sit patiently with them, gazing into their eyes almost like a lover, observing the moment when their pupils contracted and their breath shortened.75 Just when they were about to sink into a coma, Jane would revive them with a jab of atropine—an alkaloid extracted from deadly nightshade. It blocks the activity of the vagus nerve. This causes the contracted pupils to dilate, the slowing heart to beat rapidly, the cooling body to sweat, and shaking spasms to overcome the patient. Eventually they would die, but not before Jane had her high from observing their eyes dilate and watching their bodies contort in a slow death.

  As with Randy Kraft, the only insight we have into what else Jolly Jane would get up to during these murderous moments comes from the dramatic testimony of the one individual to survive an attack. Amelia Phinney was a thirty-six-year-old patient hospitalized with a uterine ulcer in 1887. Jolly Jane attentively floated around her like Florence Nightingale. The good nurse gave her patient a drink purportedly to help her pain—to Amelia it tasted bitter. Then Amelia felt her throat dry up, her body turn numb, and her eyes become heavy. She felt herself sinking into sleep.

  At that point she became aware of something unusual—Jane was pulling back the bedsheets and getting into bed with her. Jolly Jane stroked her hair, kissed her face, and cuddled up to her. After a period of carnal embraces, Jane jumped onto her knees to peer deeply into her patient’s pupils. She then gave Amelia another drink—presumably atropine to reverse the physiological symptoms of morphine. At that critical point, Jane abruptly disengaged. Amelia was aware of Jane dashing quickly out of the room—presumably because she heard someone approaching.

  So Amelia Phinney lived to tell the tale, but not immediately. The experience was so utterly bizarre that she assumed it must have been a dream brought on by her illness. Like Joey Fancher, who testified against Randy Kraft only long after his own attack, Amelia kept her bizarre story to herself. It came to light fourteen years later, after Jolly Jane’s arrest in 1901.76 So, like Randy Kraft, a serial killer who could have been caught far earlier, she continued on her killing spree.

  Unlike many other female serial killers, who frequently kill for monetary gain, Jane was not profiting from her murders. The killings did, however, give her what she herself termed “voluptuous delight”—a shorthand nineteenth-century term for a sexual turn-on. Today she would be called a lust serial killer—which is very unusual for a female. Yet if Jane needed her sexual turn-ons, weren’t there other ways for a nurse to get such worldly pleasures? How could she morally justify her actions given the awful loss of innocent life?

  It seemed almost motiveless malignity. It doesn’t morally make much sense. And in fact, this is essentially how Jane herself sums it all up:

  When I try to picture it, I say to myself, “I have poisoned Minnie Gibbs, my dear friend. I have poisoned Mrs. Gordon. I have poisoned Mr. and Mrs. Davis.” This does not convey anything to me, and when I try to sense the condition of the children and all the consequences, I cannot realize what an awful thing it is. Why don’t I feel sorry and grieve over it? I cannot make any sense of it.77

  Jane could never understand herself. Nor could those who knew her. After her arrest a deluge of letters poured in attesting that she was a compassionate, dedicated, and caring professional. She could not have committed these heinous deeds. If you look at her picture, in Figure 3.6, and peer into her eyes, can’t you too see a gentle, kindhearted, motherly nurse?

  Jane racked her mind for the cause of her crimes. She could gaze longingly into the eyes of her dying victims and experience her voluptuous delight while watching their agony. She knew what she was doing. She knew she was killing. Jane was utterly perplexed when at her trial in 1902 she was found not guilty by reason of insanity. To her mind, she could not possibly be insane because she knew full well what she was doing.78 She truly could not make sense of it.

  But I feel I can. And I literally mean feel. Jane knew cognitively what was moral behavior and what was not. Of course she could tell right from wrong at a thinking, cognitive level. But she did not have the feeling of what is moral. She could not empathize emotionally with the human suffering that resulted from her actions. She couldn’t grieve or even feel sorry for her victims. I strongly suspect it was because she had a defective amygdala and ventral prefrontal cortex. She lacked the feeling for what is moral.

 
