Behave: The Biology of Humans at Our Best and Worst

by Robert M. Sapolsky


  This sets us up for a positive feedback loop. As noted, with the onset of stress, the amygdala indirectly activates the glucocorticoid stress response. And in turn glucocorticoids increase amygdala excitability.

  Stress also makes it harder to unlearn fear, to “extinguish” a conditioned fear association. This involves the prefrontal cortex, which causes fear extinction by inhibiting the BLA (as covered in chapter 2); stress weakens the PFC’s hold over the amygdala.72

  Recall what fear extinction is about. You’ve learned to fearfully associate a light with a shock, but today the light keeps coming on with no shock. Extinction is not passively forgetting that light equals shock. It is the BLA actively learning that light no longer equals shock. Thus stress facilitates learning fear associations but impairs learning fear extinction.

  Sustained Stress, Executive Function, and Judgment

  Stress compromises other aspects of frontal cortical function. Working memory is disrupted; in one study, prolonged administration of high glucocorticoid levels to healthy subjects impaired working memory into the range seen after frontal cortical damage. Glucocorticoids accomplish this by enhancing norepinephrine signaling in the PFC so much that, instead of causing aroused focus, it induces chicken-with-its-head-cut-off cognitive tumult, and by enhancing disruptive signaling from the amygdala to the PFC. Stress also desynchronizes activation in different frontocortical regions, which impairs the ability to shift attention between tasks.73

  These stress effects on frontal function also make us perseverative—in a rut, set in our ways, running on automatic, being habitual. We all know this—what do we typically do during a stressful time when something isn’t working? The same thing again, many more times, faster and more intensely—it becomes unimaginable that the usual isn’t working. This is precisely where the frontal cortex makes you do the harder but more correct thing—recognize that it’s time for a change. Except for a stressed frontal cortex, or one that’s been exposed to a lot of glucocorticoids. In rats, monkeys, and humans, stress weakens frontal connections with the hippocampus—essential for incorporating the new information that should prompt shifting to a new strategy—while strengthening frontal connections with more habitual brain circuits.74

  Finally, the decreased frontal function and increased amygdaloid function during stress alter risk-taking behavior. For example, the stress of sleep deprivation or of public speaking, or the administration of high glucocorticoid levels, shifts people from protecting against losses to seeking bigger gains when gambling. This involves an interesting gender difference—in general, major stressors make people of both genders more risk taking. But moderate stressors bias men toward, and women away from, risk taking. In the absence of stress, men tend toward more risk taking than women; thus, once again, hormones enhance a preexisting tendency.75

  Whether one becomes irrationally risk taking (failing to shift strategy in response to a declining reward rate) or risk averse (failing to respond to the opposite), one is incorporating new information poorly. Stated most broadly, sustained stress impairs risk assessment.76

  Sustained Stress and Pro- and Antisociality

  During sustained stress, the amygdala processes emotional sensory information more rapidly and less accurately, dominates hippocampal function, and disrupts frontocortical function; we’re more fearful, our thinking is muddled, and we assess risks poorly and act impulsively out of habit, rather than incorporating new data.77 This is a prescription for rapid, reactive aggression; stress and acute administration of glucocorticoids increase such aggression in both rodents and humans. We have two familiar qualifications: (a) rather than creating aggression, stress and glucocorticoids increase sensitivity to social triggers of aggression; (b) this occurs most readily in individuals already predisposed toward aggression. As we will see in the next chapter, stress over the course of weeks to months produces a less nuanced picture.

  There’s an additional depressing reason why stress fosters aggression—because it reduces stress. Shock a rat and its glucocorticoid levels and blood pressure rise; with enough shocks, it’s at risk for a “stress” ulcer. Various things can buffer the rat during shocks—running on an exercise wheel, eating, gnawing on wood in frustration. But a particularly effective buffer is for the rat to bite another rat. Stress-induced (aka frustration-induced) displacement aggression is ubiquitous in various species. Among baboons, for example, nearly half of aggression is this type—a high-ranking male loses a fight and chases a subadult male, who promptly bites a female, who then lunges at an infant. My research shows that within the same dominance rank, the more a baboon tends to displace aggression after losing a fight, the lower his glucocorticoid levels.78

  Humans excel at stress-induced displacement aggression—consider how economic downturns increase rates of spousal and child abuse. Or consider a study of family violence and pro football. If the local team unexpectedly loses, spousal/partner violence by men increases 10 percent soon afterward (with no increase when the team won or was expected to lose). And as the stakes get higher, the pattern is exacerbated: a 13 percent increase after upsets when the team was in playoff contention, a 20 percent increase when the upset is by a rival.79

  Little is known concerning the neurobiology of displacement aggression blunting the stress response. I’d guess that lashing out activates dopaminergic reward pathways, a surefire way to inhibit CRH release.*80 Far too often, giving an ulcer helps avoid getting one.

  More bad news: stress biases us toward selfishness. In one study subjects answered questions about moral decision-making scenarios after either a social stressor or a neutral situation.* Some scenarios were of low emotional intensity (“In the supermarket you wait at the meat counter and an elderly man pushes to the front. Would you complain?”), others high intensity (“You meet the love of your life, but you are married and have children. Would you leave your family?”). Stress made people give more egoistic answers about emotionally intense moral decisions (but not milder ones); the more glucocorticoid levels rose, the more egoistic the answers. Moreover, in the same paradigm, stress lessened how altruistic people claimed they’d be concerning personal (but not impersonal) moral decisions.81

  We have another contingent endocrine effect: stress makes people more egoistic, but only in the most emotionally intense and personal circumstances.* This resembles another circumstance of poor frontal function—recall from chapter 2 how individuals with frontal cortical damage make reasonable judgments about someone else’s issues, but the more personal and emotionally potent the issue, the more they are impaired.

  Feeling better by abusing someone innocent, or thinking more about your own needs, is not compatible with feeling empathy. Does stress decrease empathy? Seemingly yes, in both mice and humans. A remarkable 2006 paper in Science by Jeffrey Mogil of McGill University showed the rudiments of mouse empathy—a mouse’s pain threshold is lowered when it is near another mouse in pain, but only if the other mouse is its cagemate.82

  This prompted a follow-up study that I did with Mogil’s group involving the same paradigm. The presence of a strange mouse triggers a stress response. But when glucocorticoid secretion is temporarily blocked, mice show the same “pain empathy” for a strange mouse as for a cagemate. In other words, to personify mice, glucocorticoids narrow who counts as enough of an “Us” to evoke empathy. Likewise in humans—pain empathy was not evoked for a stranger unless glucocorticoid secretion was blocked (either after administration of a short-acting drug or after the subject and stranger interacted socially). Recall from chapter 2 the involvement of the anterior cingulate cortex in pain empathy. I bet that glucocorticoids do some disabling, atrophying things to neurons there.

  Thus, sustained stress has some pretty unappealing behavioral effects. Nonetheless there are circumstances where stress brings out the magnificent best in some people. Work by Shelley Taylor of UCLA shows that “fight or flight” is the typical response to stress in males, and naturally, the stress literature is predominantly studies of males by males.83 Things often differ in females. Showing that she can match the good old boys when it comes to snappy sound bites, Taylor framed the female stress response as being more about “tend and befriend”—caring for your young and seeking social affiliation. This fits with striking sex differences in stress management styles, and tend-and-befriend most likely reflects the female stress response involving a stronger component of oxytocin secretion.

  Naturally, things are subtler than “male = fight/flight and female = tend/befriend.” There are frequent counterexamples to each; stress elicits prosociality in more males than just pair-bonded male marmosets, and we saw that females are plenty capable of aggression. Then there’s Mahatma Gandhi and Sarah Palin.* Why are some people exceptions to these gender stereotypes? That’s part of what the rest of this book is about.

  Stress can disrupt cognition, impulse control, emotional regulation, decision making, empathy, and prosociality. One final point. Recall from chapter 2 how the frontal cortex making you do the harder thing when it’s the right thing is value free—“right thing” is purely instrumental. Same with stress. Its effects on decision making are “adverse” only in a neurobiological sense. During a stressful crisis, an EMT may become perseverative, making her ineffectual at saving lives. A bad thing. During a stressful crisis, a sociopathic warlord may become perseverative, making him ineffectual at ethnically cleansing a village. Not a bad thing.

  SOME IMPORTANT DEBUNKING: ALCOHOL

  No review of the biological events in the minutes to hours prior to a behavior can omit alcohol. As everyone knows, alcohol lessens inhibitions, making people more aggressive. Wrong, and in a familiar way—alcohol evokes aggression only in (a) individuals prone to aggression (for example, mice with lower levels of serotonin signaling in the frontal cortex and men with the oxytocin receptor gene variant less responsive to oxytocin are preferentially made aggressive by alcohol) and (b) those who believe that alcohol makes you more aggressive, once more showing the power of social learning to shape biology.84 Alcohol works differently in everyone else—for example, a drunken stupor has caused many a quickie Vegas wedding that doesn’t seem like a great idea with the next day’s sunrise.

  SUMMARY AND SOME CONCLUSIONS

  Hormones are great; they run circles around neurotransmitters, in terms of the versatility and duration of their effects. And this includes affecting the behaviors pertinent to this book.

  Testosterone has far less to do with aggression than most assume. Within the normal range, individual differences in testosterone levels don’t predict who will be aggressive. Moreover, the more an organism has been aggressive, the less testosterone is needed for future aggression. When testosterone does play a role, it’s facilitatory—testosterone does not “invent” aggression. It makes us more sensitive to triggers of aggression, particularly in those most prone to aggression. Also, rising testosterone levels foster aggression only during challenges to status. Finally, crucially, the rise in testosterone during a status challenge does not necessarily increase aggression; it increases whatever is needed to maintain status. In a world in which status is awarded for the best of our behaviors, testosterone would be the most prosocial hormone in existence.

  Oxytocin and vasopressin facilitate mother-infant bond formation and monogamous pair-bonding, decrease anxiety and stress, enhance trust and social affiliation, and make people more cooperative and generous. But this comes with a huge caveat—these hormones increase prosociality only toward an Us. When dealing with Thems, they make us more ethnocentric and xenophobic. Oxytocin is not a universal luv hormone. It’s a parochial one.

  Female aggression in defense of offspring is typically adaptive and is facilitated by estrogen, progesterone, and oxytocin. Importantly, females are aggressive in many other evolutionarily adaptive circumstances. Such aggression is facilitated by the presence of androgens in females and by complex neuroendocrine tricks for generating androgenic signals in “aggressive,” but not “maternal” or “affiliative,” parts of the female brain. Mood and behavioral changes around the time of menses are a biological reality (albeit poorly understood on a nuts-and-bolts level); in contrast, pathologizing these shifts is a social construct. Finally, except for rare, extreme cases, the link between PMS and aggression is minimal.

  Sustained stress has numerous adverse effects. The amygdala becomes overactive and more coupled to pathways of habitual behavior; it is easier to learn fear and harder to unlearn it. We process emotionally salient information more rapidly and automatically, but with less accuracy. Frontal function—working memory, impulse control, executive decision making, risk assessment, and task shifting—is impaired, and the frontal cortex has less control over the amygdala. And we become less empathic and prosocial. Reducing sustained stress is a win-win for us and those stuck around us.

  “I’d been drinking” is no excuse for aggression.

  Over the course of minutes to hours, hormonal effects are predominantly contingent and facilitative. Hormones don’t determine, command, cause, or invent behaviors. Instead they make us more sensitive to the social triggers of emotionally laden behaviors and exaggerate our preexisting tendencies in those domains. And where do those preexisting tendencies come from? From the contents of the chapters ahead of us.

  Five

  Days to Months Before

  Our act has occurred—the pulling of a trigger or the touching of an arm that can mean such different things in different contexts. Why did that just happen? We’ve seen how, seconds before, that behavior was the product of the nervous system, whose actions were shaped by sensory cues minutes to hours before, and how the brain’s sensitivity to those cues was shaped by hormonal exposure in the preceding hours to days. What events in the prior days to months shaped that outcome?

  Chapter 2 introduced the plasticity of neurons, the fact that things alter in them. The strength of a dendritic input, the axon hillock’s set point for initiating an action potential, the duration of the refractory period. The previous chapter showed that, for example, testosterone increases the excitability of amygdaloid neurons, and glucocorticoids decrease excitability of prefrontal cortical neurons. We even saw how progesterone boosts the efficacy with which GABA-ergic neurons decrease the excitability of other neurons.

  Those versions of neural plasticity occur over hours. We now examine more dramatic plasticity occurring over days to months. A few months is enough time for an Arab Spring, for a discontented winter, or for STDs to spread a lot during a Summer of Love. As we’ll see, this is also sufficient time for enormous changes in the brain’s structure.

  NONLINEAR EXCITATION

  We start small. How can events from months ago produce a synapse with altered excitability today? How do synapses “remember”?

  When neuroscientists first approached the mystery of memory at the start of the twentieth century, they asked that question on a more macro level—how does a brain remember? The answer seemed obvious: a memory was stored in a single neuron, and a new memory required a new neuron.

  The discovery that adult brains don’t make new neurons trashed that idea. Better microscopes revealed neuronal arborization, the breathtaking complexity of branches of dendrites and axon terminals. Maybe a new memory requires a neuron to grow a new axonal or dendritic branch.

  Knowledge emerged about synapses, neurotransmitter-ology was born, and this idea was modified—a new memory requires the formation of a new synapse, a new connection between an axon terminal and a dendritic spine.

  These speculations were tossed on the ash heap of history in 1949, because of the work of the Canadian neurobiologist Donald Hebb, a man so visionary that even now, nearly seventy years later, neuroscientists still own bobblehead dolls of him. In his seminal book, The Organization of Behavior, Hebb proposed what became the dominant paradigm. Forming memories doesn’t require new synapses (let alone new branches or neurons); it requires the strengthening of preexisting synapses.1

  What does “strengthening” mean? In circuitry terms, if neuron A synapses onto neuron B, it means that an action potential in neuron A more readily triggers one in neuron B. They are more tightly coupled; they “remember.” Translated into cellular terms, “strengthening” means that the wave of excitation in a dendritic spine spreads farther, getting closer to the distant axon hillock.

  Extensive research shows that experience that causes repeated firing across a synapse “strengthens” it, with a key role played by the neurotransmitter glutamate.

  Recall from chapter 2 how an excitatory neurotransmitter binds to its receptor in the postsynaptic dendritic spine, causing a sodium channel to open; some sodium flows in, causing a blip of excitation, which then spreads.

  Glutamate signaling works in a fancier way that is essential to learning.2 To simplify considerably, while dendritic spines typically contain only one type of receptor, those responsive to glutamate contain two. The first (the “non-NMDA”) works in a conventional way—for every little smidgen of glutamate binding to these receptors, a smidgen of sodium flows in, causing a smidgen of excitation. The second (the “NMDA”) works in a nonlinear, threshold manner. It is usually unresponsive to glutamate. It’s not until the non-NMDA has been stimulated over and over by a long train of glutamate release, allowing enough sodium to flow in, that this activates the NMDA receptor. It suddenly responds to all that glutamate, opening its channels, allowing an explosion of excitation.
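  The linear-versus-threshold contrast between the two receptor types can be sketched in a few lines of code. This is a toy illustration only, not a biophysical model; the function name, the threshold, and all the numbers are arbitrary values chosen to make the nonlinearity visible.

```python
# Toy sketch of the two glutamate receptor types described above.
# The "non-NMDA" receptor responds linearly: each smidgen of glutamate
# lets in a smidgen of sodium. The "NMDA" receptor is silent until the
# accumulated depolarization crosses a threshold, then responds strongly.

def postsynaptic_response(glutamate_pulses, threshold=5.0):
    """Return total excitation after a train of glutamate pulses."""
    depolarization = 0.0
    for pulse in glutamate_pulses:
        # Non-NMDA: a linear smidgen of excitation per smidgen of glutamate.
        depolarization += pulse
        # NMDA: unresponsive below threshold; above it, an explosion
        # of additional excitation in response to the same glutamate.
        if depolarization >= threshold:
            depolarization += pulse * 10.0
    return depolarization

# A short train of glutamate release barely excites the spine;
# a long train recruits the NMDA receptors partway through.
short_train = postsynaptic_response([1.0] * 3)  # stays below threshold
long_train = postsynaptic_response([1.0] * 8)   # crosses threshold, explodes
```

Running this, the eight-pulse train produces far more than eight-thirds the excitation of the three-pulse train, which is the point: the NMDA receptor makes the synapse's response to a sustained train qualitatively, not just quantitatively, different.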

 