
Behave: The Biology of Humans at Our Best and Worst


by Robert M. Sapolsky


  In chapter 13 we consider how neither the most burning emotional capacity for empathy nor the most highfalutin moral reasoning makes someone likely to actually do the brave, difficult thing. This raises a subtle limitation of adolescent empathy.

  As will be seen, one instance where empathic responses don’t necessarily lead to acts is when we think enough to rationalize (“It’s overblown as a problem” or “Someone else will fix it”). But feeling too much has problems as well. Feeling someone else’s pain is painful, and people who do so most strongly, with the most pronounced arousal and anxiety, are actually less likely to act prosocially. Instead the personal distress induces a self-focus that prompts avoidance—“This is too awful; I can’t stay here any longer.” As empathic pain increases, your own pain becomes your primary concern.

  In contrast, the more individuals can regulate their adverse empathic emotions, the more likely they are to act prosocially. Related to that, if a distressing, empathy-evoking circumstance increases your heart rate, you’re less likely to act prosocially than if it decreases it. Thus, one predictor of who actually acts is the ability to gain some detachment, to ride the wave of empathy rather than be submerged by it.

  Where do adolescents fit in, with their hearts on their sleeves, fully charged limbic systems, and frontal cortices straining to catch up? It’s obvious: a tendency toward empathic hyperarousal that can disrupt acting effectively.29

  This adolescent empathy frenzy can seem a bit much for adults. But when I see my best students in that state, I have the same thought—it used to be so much easier to be like that. My adult frontal cortex may enable whatever detached good I do. The trouble, of course, is how that same detachment makes it easy to decide that something is not my problem.

  ADOLESCENT VIOLENCE

  Obviously, the adolescent years are not just about organizing bake sales to fight global warming. Late adolescence and early adulthood are when violence peaks, whether premeditated or impulsive murder, Victorian fisticuffs or handguns, solitary or organized (in or out of a uniform), focused on a stranger or on an intimate partner. And then rates plummet. As has been said, the greatest crime-fighting tool is a thirtieth birthday.

  On a certain level the biology underlying the teenaged mugger is similar to that of the teen who joins the Ecology Club and donates his allowance to help save the mountain gorillas. It’s the usual—heightened emotional intensity, craving for peer approval, novelty seeking, and, oh, that frontal cortex. But that’s where similarities end.

  What underlies the adolescent peak in violence? Neuroimaging shows nothing particularly distinct about it versus adult violence.30 Adolescent and adult psychopaths alike show reduced sensitivity to negative feedback in the PFC and the dopamine system, less pain sensitivity, and less amygdaloid/frontal cortical coupling during tasks of moral reasoning or empathy.

  Moreover, the adolescent peak of violence isn’t caused by the surge in testosterone; harking back to chapter 4, testosterone no more causes violence in adolescents than it does in adult males. Furthermore, testosterone levels peak during early adolescence, while violence peaks later.

  The next chapter considers some of the roots of adolescent violence. For now, the important point is that an average adolescent doesn’t have the self-regulation or judgment of an average adult. This can prompt us to view teenage offenders as bearing less responsibility than adults for criminal acts. An alternative view is that even amid poorer judgment and self-regulation, there is still enough of both to merit equivalent sentencing. The former view has prevailed in two landmark Supreme Court decisions.

  In the first, 2005’s Roper v. Simmons, the Court ruled 5–4 that executing someone for crimes committed before age eighteen is unconstitutional, violating the Eighth Amendment ban on cruel and unusual punishment. Then in 2012’s Miller v. Alabama, in another 5–4 split, the Court banned mandatory life sentences without the chance of parole for juvenile offenders, on similar grounds.31

  The Court’s reasoning was straight out of this chapter. Writing for the majority in Roper v. Simmons, Justice Anthony Kennedy said:

  First, [as everyone knows, a] lack of maturity and an underdeveloped sense of responsibility are found in youth more often than in adults and are more understandable among the young. These qualities often result in impetuous and ill-considered actions and decisions.32

  I fully agree with these rulings. But, to show my hand early, I think this is just window dressing. As will be covered in the screed that constitutes chapter 16, I think the science encapsulated in this book should transform every nook and cranny of the criminal justice system.

  A FINAL THOUGHT: WHY CAN’T THE FRONTAL CORTEX JUST ACT ITS AGE?

  As promised, this chapter’s dominant fact has been the delayed maturation of the frontal cortex. Why should the delay occur? Is it because the frontal cortex is the brain’s most complicated construction project?

  Probably not. The frontal cortex uses the same neurotransmitter systems as the rest of the brain and uses the same basic neurons. Neuronal density and complexity of interconnections are similar to the rest of the (fancy) cortex. It isn’t markedly harder to build frontal cortex than any other cortical region.

  Thus it is not that the brain “couldn’t” grow a frontal cortex as fast as the rest of the cortex; even if it could, it probably “wouldn’t.” Instead I think there was evolutionary selection for delayed frontal cortical maturation.

  If the frontal cortex matured as fast as the rest of the brain, there’d be none of the adolescent turbulence, none of the antsy, itchy exploration and creativity, none of the long line of pimply adolescent geniuses who dropped out of school and worked away in their garages to invent fire, cave painting, and the wheel.

  Maybe. But this just-so story must accommodate behavior evolving to pass on copies of the genes of individuals, not for the good of the species (stay tuned for chapter 10). And for every individual who scored big time reproductively thanks to adolescent inventiveness, there’ve been far more who instead broke their necks from adolescent imprudence. I don’t think delayed frontal cortical maturation evolved so that adolescents could act over the top.

  Instead, I think it is delayed so that the brain gets it right. Well, duh; the brain needs to “get it right” with all its parts. But in a distinctive way in the frontal cortex. The point of the previous chapter was the brain’s plasticity—new synapses form, new neurons are born, circuits rewire, brain regions expand or contract—we learn, change, adapt. This is nowhere more important than in the frontal cortex.

  An oft-repeated fact about adolescents is how “emotional intelligence” and “social intelligence” predict adult success and happiness better than do IQ or SAT scores.33 It’s all about social memory, emotional perspective taking, impulse control, empathy, ability to work with others, self-regulation. There is a parallel in other primates, with their big, slowly maturing frontal cortices. For example, what makes for a “successful” male baboon in his dominance hierarchy? Attaining high rank is about muscle, sharp canines, well-timed aggression. But once high status is achieved, maintaining it is all about social smarts—knowing which coalitions to form, how to intimidate a rival, having sufficient impulse control to ignore most provocations and to keep displacement aggression to a reasonable level. Similarly, as noted in chapter 2, among male rhesus monkeys a large prefrontal cortex goes hand in hand with social dominance.

  Adult life is filled with consequential forks in the road where the right thing is definitely harder. Navigating these successfully is the portfolio of the frontal cortex, and developing the ability to do this right in each context requires profound shaping by experience.

  This may be the answer. As we will see in chapter 8, the brain is heavily influenced by genes. But from birth through young adulthood, the part of the human brain that most defines us is less a product of the genes with which you started life than of what life has thrown at you. Because it is the last to mature, by definition the frontal cortex is the brain region least constrained by genes and most sculpted by experience. This must be so, to be the supremely complex social species that we are. Ironically, it seems that the genetic program of human brain development has evolved to, as much as possible, free the frontal cortex from genes.

  Seven

  Back to the Crib, Back to the Womb

  After journeying to Planet Adolescence, we resume our basic approach. Our behavior—good, bad, or ambiguous—has occurred. Why? When seeking the roots of behavior, long before neurons or hormones come to mind, we typically look first at childhood.

  COMPLEXIFICATION

  Childhood is obviously about increasing complexity in every realm of behavior, thought, and emotion. Crucially, such increasing complexity typically emerges in stereotypical, universal sequences of stages. Most child behavioral development research is implicitly stage oriented, concerning: (a) the sequence with which stages emerge; (b) how experience influences the speed and surety with which that sequential tape of maturation unreels; and (c) how this helps create the adult a child ultimately becomes. We start by examining the neurobiology of the “stage” nature of development.

  A BRIEF TOUR OF BRAIN DEVELOPMENT

  The stages of human brain development make sense. A few weeks after conception, a wave of neurons are born and migrate to their correct locations. Around twenty weeks, there is a burst of synapse formation—neurons start talking to one another. And then axons start being wrapped in myelin, the glial cell insulation (forming “white matter”) that speeds up action potentials.

  Neuron formation, migration, and synaptogenesis are mostly prenatal in humans.1 In contrast, there is little myelin at birth, particularly in evolutionarily newer brain regions; as we’ve seen, myelination proceeds for a quarter century. The stages of myelination and consequent functional development are stereotypical. For example, the cortical region central to language comprehension myelinates a few months earlier than that for language production—kids understand language before producing it.

  Myelination is most consequential when enwrapping the longest axons, in neurons that communicate the greatest distances. Thus myelination particularly facilitates brain regions talking to one another. No brain region is an island, and the formation of circuits connecting far-flung brain regions is crucial—how else can the frontal cortex use its few myelinated neurons to talk to neurons in the brain’s subbasement to make you toilet trained?2

  As we saw, mammalian fetuses overproduce neurons and synapses; ineffective or unessential synapses and neurons are pruned, producing leaner, meaner, more efficient circuitry. To reiterate a theme from the last chapter, the later a particular brain region matures, the less it is shaped by genes and the more by environment.3

  STAGES

  What stages of child development help explain the good/bad/in-between adult behavior that got the ball rolling in chapter 1?

  The mother of all developmental stage theories was supplied in 1923 by Jean Piaget, whose clever, elegant experiments revealed four stages of cognitive development:4

  Sensorimotor stage (birth to ~24 months). Thought concerns only what the child can directly sense and explore. During this stage, typically at around 8 months, children develop “object permanence,” understanding that even if they can’t see an object, it still exists—the infant can generate a mental image of something no longer there.*

  Preoperational stage (~2 to 7 years). The child can maintain ideas about how the world works without explicit examples in front of him. Thoughts are increasingly symbolic; imaginary play abounds. However, reasoning is intuitive—no logic, no cause and effect. This is when kids can’t yet demonstrate “conservation of volume.” Identical beakers A and B are filled with equal amounts of water. Pour the contents of beaker B into beaker C, which is taller and thinner. Ask the child, “Which has more water, A or C?” Kids in the preoperational stage use incorrect folk intuition—the water line in C is higher than that in A; it must contain more water.

  Concrete operational stage (7 to 12 years). Kids think logically, no longer falling for that different-shaped-beakers nonsense. However, generalizing logic from specific cases is iffy. As is abstract thinking—for example, proverbs are interpreted literally (“‘Birds of a feather flock together’ means that similar birds form flocks”).

  Formal operational stage (adolescence onward). Approaching adult levels of abstraction, reasoning, and metacognition.

  Kid playing hide-and-seek while in the “If I can’t see you (or even if I can’t see you as easily as usual), then you can’t see me” stage.

  Other aspects of cognitive development are also conceptualized in stages. An early stage occurs when toddlers form ego boundaries—“There is a ‘me,’ separate from everyone else.” A lack of ego boundaries is shown when a toddler isn’t all that solid on where he ends and Mommy starts—she’s cut her finger, and he claims his finger hurts.5

  Next comes the stage of realizing that other individuals have different information than you do. Nine-month-olds look where someone points (as can other apes and dogs), knowing the pointer has information that they don’t. This is fueled by motivation: Where is that toy? Where’s she looking? Older kids understand more broadly that other people have different thoughts, beliefs, and knowledge than they, the landmark of achieving Theory of Mind (ToM).6

  Here’s what not having ToM looks like. A two-year-old and an adult see a cookie placed in box A. The adult leaves, and the researcher switches the cookie to box B. Ask the child, “When that person comes back, where will he look for the cookie?” Box B—the child knows it’s there and thus everyone knows. Around age three or four the child can reason, “They’ll think it’s in A, even though I know it’s in B.” Shazam: ToM.

  Mastering such “false belief” tests is a major developmental landmark. ToM then progresses to fancier insightfulness—e.g., grasping irony, perspective taking, or secondary ToM (understanding person A’s ToM about person B).7

  Various cortical regions mediate ToM: parts of the medial PFC (surprise!) and some new players, including the precuneus, the superior temporal sulcus, and the temporoparietal junction (TPJ). This is shown by neuroimaging; by ToM deficits when these regions are damaged (autistic individuals, who have limited ToM, have decreased gray matter and activity in the superior temporal sulcus); and by the fact that if you temporarily inactivate the TPJ, people don’t consider someone’s intentions when judging them morally.8

  Thus there are stages of gaze following, followed by primary ToM, then secondary ToM, then perspective taking, with the speed of transitions influenced by experience (e.g., kids with older siblings achieve ToM earlier than average).9

  Naturally, there are criticisms of stage approaches to cognitive development. One is at the heart of this book: a Piagetian framework sits in a “cognition” bucket, ignoring the impact of social and emotional factors.

  One example to be discussed in chapter 12 concerns preverbal infants, who sure don’t grasp transitivity (if A > B, and B > C, then A > C). Show a violation of transitivity in interactions between shapes on a screen (shape A should knock over shape C, but the opposite occurs), and the kid is unbothered, doesn’t look for long. But personify the shapes with eyes and a mouth, and now heart rate increases, the kid looks longer—“Whoa, character C is supposed to move out of character A’s way, not the reverse.” Humans understand logical operations between individuals earlier than between objects.10

  Social and motivational state can shift cognitive stage as well. Rudiments of ToM are more demonstrable in chimps who are interacting with another chimp (versus a human) and if there is something motivating—food—involved.*11

  Emotion and affect can alter cognitive stage in remarkably local ways. I saw a wonderful example of this when my daughter displayed both ToM and failure of ToM in the same breath. She had changed preschools and was visiting her old class. She told everyone about life in her new school: “Then, after lunch, we play on the swings. There are swings at my new school. And then, after that, we go inside and Carolee reads us a story. Then, after that . . .” ToM: “play on the swings”—wait, they don’t know that my school has swings; I need to tell them. Failure of ToM: “Carolee reads us a story.” Carolee, the teacher at her new school. The same logic should apply—tell them who Carolee is. But because Carolee was the most wonderful teacher alive, ToM failed. Afterward I asked her, “Hey, why didn’t you tell everyone that Carolee is your teacher?” “Oh, everyone knows Carolee.” How could everyone not?

  Feeling Someone Else’s Pain

  ToM leads to a next step—people can have different feelings than me, including pained ones.12 This realization is not sufficient for empathy. After all, sociopaths, who pathologically lack empathy, use superb ToM to stay three manipulative, remorseless steps ahead of everyone. Nor is this realization strictly necessary for empathy, as kids too young for ToM show rudiments of feeling someone else’s pain—a toddler will try to comfort someone feigning crying, offering them her pacifier (and the empathy is rudimentary in that the toddler can’t imagine someone being comforted by different things than she is).

  Yes, very rudimentary. Maybe the toddler feels profound empathy. Or maybe she’s just distressed by the crying and is self-interestedly trying to quiet the adult. The childhood capacity for empathy progresses from feeling someone’s pain because you are them, to feeling for the other person, to feeling as them.

 
