
The Disordered Mind

by Eric R. Kandel


  TREATING PEOPLE WITH ANXIETY DISORDERS

  At present, the two main classes of treatment for anxiety disorders are medication and psychotherapy. Both decrease activity in the amygdala, but they do so in different ways.

  As we learned in chapter 3, depression is commonly treated with drugs that increase the concentration of serotonin in the brain. The same antidepressants are effective in treating 50 to 70 percent of people with generalized anxiety disorder because they lessen worry and guilt, feelings that are associated with depression. However, the drugs do not work nearly as well for people with specific fear-related disorders. For them, psychotherapy has proven much more effective. PTSD, for example, can be managed with cognitive behavioral therapy, including prolonged exposure therapy and virtual reality exposure therapy.

  Recently, Edna Foa and others have shown that prolonged exposure therapy works particularly well for people with fear-related disorders.4 This form of psychotherapy essentially teaches the brain to stop being afraid by reversing learned fear associations in the amygdala. If we were to try to extinguish fear in LeDoux’s rats, for example, we would present the animals with the tone over and over again—but without the electric shock. Eventually, the synaptic connections underlying the fear association would weaken and disappear, and the rats would no longer cringe in response to the tone.

  While exposing a person to the cause of his or her fear only a few times can actually exacerbate the fear, proper use of exposure therapy can extinguish or inhibit it. Sometimes this involves exposing patients to a virtual experience, which is useful when real-life exposure would be impractical, such as riding an elevator a hundred times. Virtual exposure has proven almost as effective as its real-world counterpart.

  Barbara Rothbaum, director of the Trauma and Anxiety Recovery Program at Emory University, is a pioneer in virtual reality exposure therapy. She began by fitting Vietnam veterans who had chronic PTSD with a helmet that displays one of two virtual scenarios: a landing zone or the inside of an in-flight helicopter. She then followed the patients’ reactions on a monitor and talked to the patients as they re-experienced traumatic events. When this therapy proved effective, she extended it to other patients.5

  Another approach is to erase a terrifying memory entirely. As we learned in chapter 5, short-term memory results when existing synaptic connections are strengthened, but long-term memory requires repeated training and the formation of new synaptic connections. In the interim, while a memory is being consolidated, it is sensitive to disruption. Recent studies have revealed that a similar sensitivity occurs when a memory is retrieved from long-term storage; that is, memories become unstable for a short period after they have been retrieved.6 Thus, when a person recalls a memory that evokes the fear response (or, in the case of a rat, when it is re-exposed to the tone), the memory is destabilized for several hours. If during that time the storage processes in the brain are perturbed, either behaviorally or with a drug, the memory often does not go back into storage properly. Instead, it is erased or made inaccessible. The rat is no longer afraid, and the person feels better.

  Alain Brunet, a clinical psychologist at McGill University in Montreal, studied nineteen people who had been suffering from PTSD for several years.7 (Their traumas included sexual assaults, car crashes, and violent muggings.) Participants in the treatment group received propranolol, a drug that blocks the action of noradrenaline, a neurotransmitter released in response to stress that triggers our fight, flight, or freeze response. After giving them the dose, Brunet asked them to write a detailed description of their traumatic experience. While the participants were remembering the awful event, the drug suppressed the visceral aspects of their fear response, thereby containing their negative emotions. As James was the first to suggest, minimizing the body’s emotional response can also minimize our conscious awareness of the emotion.

  One week later, the patients returned to the lab and were asked to remember the traumatic event once again. Participants who had not received propranolol exhibited high levels of arousal consistent with anxiety (for example, their heart rate spiked suddenly), but those given the drug had significantly lower stress responses. Although they could still remember the event in vivid detail, the emotional component of the memory, located in the amygdala, had been modified. The fear wasn’t gone, but it was no longer crippling.

  Emotions do more than affect our behavior; they also affect the decisions we make. We accept that we sometimes make hasty decisions in response to our feelings. But surprisingly, emotion plays a role in all of our decisions, even moral ones. In fact, without emotion, our ability to make sound decisions is impaired.

  EMOTION IN DECISION MAKING

  William James was one of the first scientists to propose a role for emotion in decision making. In his 1890 textbook, The Principles of Psychology, he launched into a critique of the “rationalist” account of the human mind. “The facts of the case are really tolerably plain,” he wrote. “Man has a far greater variety of impulses than any other, lower animal.”8 In other words, the prevailing view of humans as purely rational creatures, defined “by the almost total absence of instincts,” was mistaken. James’s principal insight, however, was that our emotional impulses aren’t necessarily bad. In fact, he believed that the preponderance of habits, instincts, and emotions in the human brain is an essential part of what makes our brain so effective.

  Scientists have recorded several powerful demonstrations of the importance of emotion in decision making. In his book Descartes’ Error, Antonio Damasio describes the case of a man named Elliot.9 In 1982 a small tumor was discovered in the ventromedial prefrontal cortex region of Elliot’s brain. The tumor was removed by a team of surgeons, but the resulting damage to his brain changed his behavior dramatically.

  Before the operation, Elliot had been a model father and husband. He held an important management job in a large corporation and was active in his local church. After the surgery, Elliot’s IQ stayed the same—he still tested in the ninety-seventh percentile—but he exhibited several profound flaws in decision making. He made a string of reckless choices and started a series of businesses that quickly failed. He got involved with a con man and was forced into bankruptcy. His wife divorced him. The IRS began investigating him. Eventually, he had to move in with his parents. Elliot also became quite indecisive, especially when it came to minor details such as where to eat lunch or what radio station to listen to. As Damasio would later write, “Elliot emerged as a man with a normal intellect who was unable to decide properly, especially when the decision involved personal or social matters.”10

  Why was Elliot suddenly incapable of making good personal decisions? Damasio’s first insight occurred while talking to Elliot about the tragic turn his life had taken. “He was always controlled,” Damasio writes, “always describing scenes as a dispassionate, uninvolved spectator. Nowhere was there a sense of his own suffering, even though he was the protagonist.… I never saw a tinge of emotion in my many hours of conversation with him: no sadness, no impatience, no frustration.”11

  Intrigued by this emotional deficit, Damasio hooked Elliot up to a machine that measures the activity of the sweat glands in the palms of the hands. (Whenever we experience strong emotions, our skin is literally aroused and our palms start to perspire.) Damasio then showed him various photographs that would normally trigger an immediate emotional response: a severed foot, a naked woman, or a house on fire. No matter how dramatic the picture, Elliot’s palms never got sweaty. He felt nothing. Clearly, the surgery had damaged an area of the brain that is essential for processing emotion.

  Damasio began to study other people with similar patterns of brain damage. They all appeared perfectly intelligent and showed no deficits on any conventional cognitive tests, yet they all suffered from the same profound flaw: they didn’t experience emotion and therefore had tremendous difficulty making decisions.

  MORAL DECISION MAKING

  The first indication of a link between moral functions and the brain dates back to 1848 and the famous case of Phineas Gage, which we touched on in chapter 1. Gage, a railroad worker, was handling explosives when a terrible accident occurred: an iron bar was driven through his skull. The bar entered the base of his skull and came out at the top, damaging his brain severely (fig. 8.7). A local physician took excellent care of him, and Gage recovered physically to an amazing degree. Within days, he was able to walk and talk and function effectively. Within a few weeks he was back on the job. But Gage had changed dramatically.

  Before the accident, Gage had been the foreman of the crew. He was absolutely reliable. He could always be counted on to do the job and do it well. After the accident, he was completely irresponsible. He never showed up on time. He became obscene in his language and his behavior. He paid no attention to his fellow workers. He had lost any sense of moral judgment.

  Many years after Gage’s death, Hanna and Antonio Damasio, using Gage’s skull and the iron bar, reconstructed the pathway through his brain (fig. 8.7). They realized that the prefrontal cortex was damaged, particularly the underside, where the ventromedial prefrontal cortex and the orbitofrontal cortex are located—regions that are extremely important for emotion, decision making, and moral behavior.

  Figure 8.7. Phineas Gage with the iron bar that injured his brain (left); reconstruction of the iron bar’s pathway through Gage’s brain (right)

  Joshua Greene, an experimental psychologist, neuroscientist, and philosopher at Harvard, has made use of a fascinating puzzle known as the “trolley problem” to study how emotion affects our moral decision making.12 The trolley problem has numerous variations, but the simplest poses two dilemmas (fig. 8.8). The switch dilemma goes like this:

  A runaway trolley whose brakes have failed is approaching a fork in the track at top speed. If you do nothing, the trolley will stay to the right, where it will run over five travelers. All five of them will die. However, if you divert the trolley to the left—by flipping a switch—the trolley will hit and kill one traveler. What do you do? Are you willing to intervene and change the path of the trolley?

  Figure 8.8. The runaway trolley problem: the switch dilemma (top) and the footbridge dilemma (bottom)

  Most people agree it is morally permissible to divert the trolley. The decision is based on simple arithmetic: it’s better to kill fewer people. Some moral philosophers even argue that it is immoral not to divert the trolley, since such passivity leads to the death of four additional people. But what about this scenario, the footbridge dilemma?

  You are standing on a footbridge over the trolley track. You see a trolley racing out of control toward five travelers. All five travelers will die unless the trolley can be stopped. Standing next to you on the footbridge is a large man. He is leaning over the railing, watching the trolley hurtle toward the travelers. If you give the man a push, he will fall over the railing and into the path of the trolley. Because he is so big, he will stop the trolley from killing the travelers. Do you push the man off the footbridge? Or do you allow five travelers to die?

  The facts are the same in both scenarios: one person must die in order for five people to live. If our decisions were perfectly rational, we would act identically in both situations. We’d be as willing to push the man as we are to divert the trolley. Yet almost nobody is willing to push another person onto the tracks. Both decisions lead to the same violent outcome, yet most people view one as moral and the other as murder.

  Greene argues that pushing the man feels wrong because the killing is direct: we are using our body to hurt his body. He calls this a personal moral decision. In contrast, when we switch the trolley onto a different track, we aren’t directly hurting someone else. We are just diverting the trolley: the ensuing death seems indirect. In this case, we are making an impersonal moral decision.

  What makes this thought experiment so interesting is that the fuzzy moral distinction—the difference between personal and impersonal moral decisions—is built into our brain. It doesn’t matter what culture we live in or what religion we subscribe to: the two trolley scenarios trigger different patterns of activity in the brain. When Greene asked study participants whether they should divert the trolley, their conscious decision-making machinery was turned on. A network of brain regions assessed the various alternatives and sent the verdict onward to the prefrontal cortex, and the participants chose the clearly superior option. Their brains quickly realized that it was better to kill one person than five.

  However, when participants were asked whether they would be willing to push a man onto the track, a separate network of brain regions was activated. These regions are associated with the processing of emotions, both for ourselves and for others. People in the study couldn’t justify their moral decisions, but their certainty never wavered. Pushing a man off a bridge just felt wrong.

  Such research reveals the surprising ways in which our moral judgments are shaped by our unconscious emotions. Even though we can’t explain these urges—we don’t know why our heart is racing or why our stomach feels queasy—we are nevertheless influenced by them. While feelings of fear and stress can lead to aggression, the fear of harming someone else can keep us from engaging in violence.

  Studies of other people with brain damage similar to Elliot’s and Gage’s—that is, damage to the ventromedial prefrontal cortex—suggest that this part of the brain is very important for integrating emotional signals into decision making. If it is, then we might expect these people to make very different kinds of decisions in Greene’s trolley problem. They might view it as essentially an accounting question. Five lives for one? Sure, use the oversized man to stop the trolley. In fact, when faced with this dilemma, people with damage to the ventromedial prefrontal cortex are four or five times more likely than ordinary people to say “Push the guy off the footbridge” in the name of the greater good.

  This finding underscores the theory that different kinds of moralities are embedded in different systems in the brain. On the one hand we have an emotional system that says, “No, don’t do it!” like an alarm bell going off. On the other hand we have a system that says, “We want to save the most lives, so five lives for one sounds like a good deal.” In ordinary people these moralities compete, but in people with Gage’s kind of brain damage one system is knocked out and the other is intact.

  THE BIOLOGY OF PSYCHOPATHIC BEHAVIOR

  What about psychopaths, people who would have no difficulty deciding to push somebody off the footbridge? Research on psychopathy indicates that it is primarily an emotional disorder with two defining features: antisocial behavior and lack of empathy for other people. The first can result in horrendous crimes, the second in lack of remorse for those crimes.

  Kent Kiehl at the University of New Mexico drives a mobile fMRI machine to prisons to scan the brains of prisoners, many of whom are psychopathic, as indicated by their scores on a standardized checklist. He wants to see whether moral reasoning, or lack of it, can be used to understand the mind of the psychopath—and whether understanding the mind of the psychopath can improve our understanding of moral reasoning.

  Greene’s theory would predict that psychopaths don’t have the emotional response that says pushing the man off the footbridge just feels wrong. They would be likely to go with the numbers, one life for five. But psychopaths are not like people with brain damage; psychopaths work very hard to seem normal, to blend in. To capture what they’re really thinking, Kiehl watches not only what the prisoners do but how quickly they do it. For example, a psychopath may be able to hide an emotional reaction to a stimulus—a word or visual image—but he can’t do it quickly, and brain imaging will capture his initial reaction.

  Using brain imaging, Kiehl found that psychopathic inmates have more gray matter in and around the limbic system than do non-psychopathic inmates or non-inmates. The limbic system, which includes the amygdala and the hippocampus, comprises the brain regions involved in processing emotions. Moreover, the neural circuitry connecting the limbic system to the frontal lobes of the cortex is disrupted in psychopathic inmates. Kiehl notes that several studies have found less activity in those circuits when psychopathic prisoners engage in emotional processing and moral decision making.13

  If psychopathic behavior is based in biology, what does this mean for free will, for individual responsibility? Do these built-in neural processes lead inexorably to certain decisions, or does our conscious sense of morality, our cognitive mental function, have the last word?

  This question is becoming increasingly salient in the criminal justice system. Judges look to psychologists and neuroscientists for help in understanding the value and limitations of scientific findings. They want to know if the findings are highly reliable, what they mean in terms of behavior, and how they should be used in a court of law to improve the fairness of the judicial system. The U.S. Supreme Court, for example, recently ruled that a mandatory sentence of life in prison without parole for juvenile offenders is unconstitutional. The justices pointed to findings from brain science indicating that adolescents and adults use different parts of their brain to control behavior.

  Most neuroscientists think we should be held responsible for our actions, but the opposing argument has some validity. Should people with brain damage that leaves them incapable of making appropriate moral judgments be treated the same way as people who can make moral judgments? What neuroscience reveals about this question is going to affect our legal system and the rest of our society in the decades to come.

  Studies of psychopaths are likely to have a major impact not only on our understanding of how people can be influenced to make appropriate decisions but also on the development of new kinds of diagnoses and new kinds of treatments. Research suggests that both genes and environment contribute to psychopathy, as they do to other disorders. In his continuing search for biomarkers of psychopathy, Kiehl has recently expanded his brain-imaging studies to include young people who show signs of psychopathic traits.14 This is important because not everyone with psychopathic traits becomes a violent criminal. If scientists could identify children who have a tendency toward psychopathy, they might be able to develop behavioral therapies to head off future violent behavior. If a malfunction of some region of the brain is identified, perhaps some other region of the brain could be encouraged to take over and suppress violent aspects of behavior.

 
