The Trouble with Testosterone

by Robert M. Sapolsky


  But there was another reason as well, which didn’t become apparent until later. One morning, when Hobbes was concentrating on whom to hassle next and paying no attention to us, I managed to put a dart into his haunches. Months later, when examining his blood sample in the laboratory, we found that Hobbes had among the highest levels of cortisol in the troop and extremely low lymphocyte counts, less than one-quarter the troop average. (Ruto and Fatso, conveniently sitting on the sidelines, had three and six times as many lymphocytes as Hobbes had.) The young baboon was experiencing a massive stress response himself, larger even than those of the females he was traumatizing, and certainly larger than is typical of the other, meek transfer males I’ve studied. In other words, it doesn’t come cheap to be a bastard twelve hours a day—a couple of months of this sort of thing is likely to exert a physiological toll.

  As a postscript, Hobbes did not hold on to his position. Within five months he was toppled, dropping down to number three in the hierarchy. After three years in the troop, he disappeared into the sunset, transferring out to parts unknown to try his luck in some other troop.

  All this only reaffirms that transferring is awful for adolescents—whether they’re average geeks opting for the route of slow acceptance or rare animals like Hobbes who try to take a troop by storm. Either way it’s an ordeal, and the young animals pay a heavy price. The marvel is that they keep on doing it. Transferring may solve the inbreeding problem for a population, but what’s in it for the individual?

  Adolescent transfer is a feature of many social mammals, not just primates, and the mechanisms can differ. Sometimes transfer can arise from intrasexual competition—a fancy way of saying that the adolescents are driven from the group by a more powerful same-sex competitor. You usually see this in species like gazelles and impalas. The core social group consists of a single breeding male, a large collection of females, and their offspring. At any given point some of those male offspring are likely to be entering puberty. But since the breeding male typically doesn’t hold on to his precarious position for long, he is probably not the father of these adolescent males. He doesn’t view them as sons coming of age but as unrelated males growing into reproductive competitors. At their first signs of puberty he violently drives them out of the group.

  In primates, however, forced dispersion almost never happens. Something absolutely remarkable occurs: these adolescents choose to go—even though the move seems crazy. After all, they live in a troop surrounded by family and friends. They know their home turf, which trees are fruiting at what time of year, where the local predators lurk. Yet they leave these home comforts to endure parasites, predators, and loneliness. And why? To dwell among strangers who treat them terribly. It makes no sense, from the standpoint of the individual animal. Behaviorist theories state this more formally: Animals, including humans, tend to do things for which they are rewarded and tend not to do things for which they get punished. Yet here they are, leaving their comfortable, rewarding world in order to be amply dumped on far away. Furthermore, animals usually hate novelty. Place a rat in a new cage, give it a new feeding pattern, and it exhibits a stress response. Yet here young primates are risking life and limb for novelty. Old World monkeys have been known to transfer up to five times over their lifetime, or travel nearly forty miles to a new troop.

  Why should any individual in its right mind want to do this?

  I do not know why transfer occurs, but it is clearly very deeply rooted. Humans, in part because their diets are adaptable, are the most widely distributed mammal on earth, inhabiting nearly every godforsaken corner of this planet. Among our primate relatives, those with the least finicky of diets, such as baboons, are also among the most widely distributed beasts on earth (African baboons range from desert to rain forest, from mountain to savannah). Inevitably, someone had to be the first to set foot in each of those new worlds, an individual who transferred in a big way. And it is overwhelmingly likely it was a young individual who did that.

  This love affair with risk and novelty seems to be why the young of all our primate species are the most likely to die of accidents, doing foolhardy things while their elders cluck over how they told them so.2 And it is also the reason that the young are most likely to discover something really new and extraordinary, whether in the physical or the intellectual realm. When the novel practice of washing food in seawater was discovered by snow monkeys in Japan, it was a youngster who did so, and it was her playmates who picked up the adaptation; hardly any of the older animals did. And when Darwin’s ideas about evolution swept through academic primates in the mid-nineteenth century, it was the new, up-and-coming generation of scientists that embraced his ideas with the greatest enthusiasm.

  You don’t have to search far for other examples. Think of the tradition of near-adolescent mathematicians revolutionizing their fields, or of the young Picasso and Stravinsky inventing twentieth-century culture. Think about teenagers inevitably, irresistibly wanting to drive too fast, or trying out some new improvisatory sport guaranteed to break their necks, or marching off in an excited frenzy to whatever stupid war their elders have cooked up. Think of the endless young people leaving their homes, homes perhaps rife with poverty or oppression, but still their homes, to go off to new worlds.

  Part of the reason for the evolutionary success of primates, human or otherwise, is that we are a pretty smart collection of animals. What’s more, our thumbs work in particularly fancy and advantageous ways, and we’re more flexible about food than most. But our primate essence is more than just abstract reasoning, dexterous thumbs, and omnivorous diets. Another key to our success must have something to do with this voluntary transfer process, this primate legacy of feeling an itch around adolescence. How did voluntary dispersal evolve? What is going on with that individual’s genes, hormones, and neurotransmitters to make it hit the road? We don’t know, but we do know that following this urge is one of the most resonantly primate of acts. A young male baboon stands riveted at the river’s edge; an adolescent female chimp cranes to catch a glimpse of the chimps from the next valley. New animals, a whole bunch of ‘em! To hell with logic and sensible behavior, to hell with tradition and respecting your elders, to hell with this drab little town, and to hell with that knot of fear in your stomach. Curiosity, excitement, adventure—the hunger for novelty is something fundamentally daft, rash, and enriching that we share with our whole taxonomic order.

  FURTHER READING

  For a good general overview of intertroop transfer among primates, see A. Pusey and C. Packer, “Dispersal and Philopatry,” in B. Smuts, D. Cheney, R. Seyfarth, R. Wrangham, and T. Struhsaker, eds., Primate Societies (Chicago: University of Chicago Press, 1987), 250. For a very detailed analysis of the costs and benefits of transfer among baboons, see S. Alberts and J. Altmann, “Balancing Costs and Opportunities: Dispersal in Male Baboons,” American Naturalist 145 (1995): 279.

  For the story of Hobbes, in dauntingly technical detail, see S. Alberts, J. Altmann, and R. Sapolsky, “Behavioral, Endocrine and Immunological Correlates of Immigration by an Aggressive Male into a Natural Primate Group,” Hormones and Behavior 26 (1992): 167.

  The Solace of Patterns

  Juan Gonzales, Mar de Lagrimas, 1987–88; courtesy Nancy Hoffman Gallery, New York. Photo: Chris Watson

  A short time ago my father died, having spent far too many of his last years in pain and degeneration. Although I had expected his death and tried to prepare myself for it, when the time came it naturally turned out that you really can’t prepare. A week afterward I found myself back at work, bludgeoned by emotions that swirled around a numb core of unreality—a feeling of disconnection from the events that had just taken place on the other side of the continent, of disbelief that it was really him frozen in that nightmare of stillness. The members of my laboratory were solicitous. One, a medical student, asked me how I was doing, and I replied, “Well, today it seems as if I must have imagined it all.” “That makes sense,” she said. “Don’t forget about DABDA.”

  DABDA. In 1969 the psychiatrist Elisabeth Kübler-Ross published a landmark book, On Death and Dying. Drawing on her research with terminally ill people and their families, she described the process whereby people mourn the death of others and, when impending, of themselves. Most of us, she observed, go through a fairly well-defined sequence of stages. First we deny the death is happening. Then we become angry at the unfairness of it all. We pass through a stage of irrational bargaining, with the doctors, with God: Just let this not be fatal and I will change my ways. Please, just wait until Christmas. There follows a stage of depression and, if one is fortunate, the final chapter, serene acceptance. The sequence is not ironclad; individuals may skip certain stages, experience them out of order, or regress to earlier ones. DABDA, moreover, is generally thought to give a better description of one’s own preparation for dying than of one’s mourning the demise of someone else. Nevertheless, there is a broadly recognized consistency in the overall pattern of mourning: denial, anger, bargaining, depression, acceptance. I was stuck at stage one, right on schedule.

  Brevity is the soul of DABDA. A few years ago I saw that point brilliantly dramatized on television—on, of all programs, The Simpsons. It was the episode in which Homer, the father, accidentally eats a poisonous fish and is told he has twenty-four hours to live. There ensues a thirty-second sequence in which the cartoon character races through the death and dying stages, something like this: “No way! I’m not dying.” He ponders a second, then grabs the doctor by the neck. “Why, you little . . .” He trembles in fear, then pleads, “Doc, get me outta this! I’ll make it worth your while.” Finally he composes himself and says, “Well, we all gotta go sometime.” I thought it was hilarious. Here was a cartoon suitable to be watched by kids, and the writers had sneaked in a parody of Kübler-Ross.

  For sheer conciseness, of course, Homer Simpson’s vignette has nothing on the phrase itself, “DABDA.” That’s why medical students memorize the acronym along with hundreds of other mnemonic devices in preparation for their national board examinations. What strikes me now is the power of those letters to encapsulate human experience. My father, by dint of having been human, was unique; thus was my relationship to him, and thus must be my grieving. And yet I come up with something reducible to a medical school acronym. Poems, paintings, symphonies by the most creative artists who ever lived, have been born out of mourning; yet, on some level, they all sprang from the pattern invoked by two pedestrian syllables of pseudo-English. We cry, we rage, we demand that the oceans’ waves stop, that the planets halt their movements in the sky, all because the earth will no longer be graced by the one who sang lullabies as no one else could; yet that, too, is reducible to DABDA. Why should grief be so stereotypical?

  Scientists who study human thought and behavior have discerned many stereotyped, structured stages through which all of us move at various times. The logic of some of the sequences is obvious. It is no surprise that infants learn to crawl before they take their first tentative steps, and only later learn to run. Other sequences are more subtle. Freudians claim that in normal development the child undergoes the invariant transition from a so-called oral stage to an anal stage to a genital stage, and they attribute various aspects of psychological dysfunction in the adult to an earlier failure to move successfully from one stage to the next.

  Similarly, the Swiss psychologist Jean Piaget mapped stages of cognitive development. For example, he noted, there is a stage at which children begin to grasp the concept of object permanence: Before that developmental transition, a toy does not exist once it is removed from the child’s sight. Afterward, the toy exists—and the child will look for it—even when it is no longer visible. Only at a reliably later stage do children begin to grasp concepts such as the conservation of volume—that two pitchers of different shapes can hold the same quantity of liquid. The same developmental patterns occur across numerous cultures, and so the sequence seems to describe the universal way that human beings learn to comprehend a cognitively complex world.

  The American psychologist Lawrence Kohlberg mapped the stereotyped stages people undergo in developing morally. At one early stage of life, moral decisions are based on rules and on the motivation to avoid punishment: actions considered for their effects on oneself. Only at a later stage are decisions made on the basis of a respect for the community, where actions are considered for their effects on others. Later still, and far more rarely, some people develop a morality driven by a set of their own internalized standards derived from a sense of what is right and what is wrong for all possible communities. The pattern is progressive: People who now act out of conscience invariably, at some earlier stage of life, believed that you don’t do bad things because you might get caught.

  The American psychoanalyst Erik Erikson discerned a sequence of psychosocial development, framing it as crises that a person resolves or fails to resolve at each stage. For infants, the issue is whether one attains a basic attitude of trust toward the world; for adolescents, it is identity versus identity confusion; for young adults, intimacy versus isolation; for adults, generativity versus stagnation; and for the aged, peaceful acceptance and integrity versus despair. Erikson’s pioneering insight that one’s later years represent a series of transitions that must be successfully negotiated is reflected in a quip by the geriatrician Walter M. Bortz II of Stanford University Medical School. Asked whether he was interested in curing aging, Bortz responded, “No, I’m not interested in arrested development.”

  Those are some of the patterns we all are reported or theorized to have in common, across many settings and cultures. I think such conceptualizations are often legitimate, not just artificial structures that scientists impose on inchoate reality. Why should we share such patterning? It is certainly not for lack of alternatives. As living beings, we represent complex, organized systems—an eddy in the random entropy of the universe. When all the possibilities are taken into account, it is supremely unlikely for elements to assemble themselves into molecules, for molecules to form cells, for vast assemblages of cells to form us. How much more unlikely, it seems, that such complex organisms conform to such relatively simple patterns of behavior, of development, or of thought.

  One way of coming to grips with the properties of complex systems is through a field of mathematics devoted to the study of so-called cellular automata. The best way of explaining its style of analysis is by example. Imagine a long row of boxes—some black, some white—arranged to form some initial pattern, a starting stage. The row of boxes is to give rise to a second row, just below the first. The way that takes place in a cellular automaton is that each box in the first row is subjected to a set of reproduction rules. For example, one rule might stipulate that a black box in the first row gives rise to a black box immediately below it in the next row only if exactly one of its two nearest neighbors is black. Other rules might apply to a black box flanked by two white boxes or two black boxes. Once the set of rules is applied to each box in the first row, a second row of black and white boxes is generated; then the rules are applied again to each box in the second row to generate a third row and so on.

  Metaphorically, each row represents one generation, one tick of a clock. A properly programmed computer could track any possible combination of colored boxes, following any conceivable set of reproduction rules, down through the generations. In the vast majority of cases, somewhere down the line it would end up with a row of boxes all the same color. After that, the single color would repeat itself forever. In other words, the line would go extinct.
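The scheme described above can be sketched in a few lines of Python. The essay specifies a rule only for black boxes, so the treatment of white boxes and the white padding at the row's edges are illustrative assumptions here; the particular rule chosen (a box turns black when exactly one of its two nearest neighbors is black) happens to correspond to what is known as Wolfram's rule 90.

```python
def step(row, rule):
    """Apply the reproduction rule to every box in a row.

    Boxes are 1 (black) or 0 (white); the row is padded with white
    boxes at both ends, an assumption about edge behavior.
    """
    padded = [0] + row + [0]
    return [rule(padded[i - 1], padded[i], padded[i + 1])
            for i in range(1, len(padded) - 1)]

def example_rule(left, center, right):
    # A box becomes black when exactly one of its two nearest
    # neighbors is black (illustrative; equivalent to rule 90).
    return 1 if left + right == 1 else 0

def run(start, rule, generations):
    """Generate successive rows until the line 'goes extinct'
    (all boxes the same color) or the generation limit is reached."""
    history = [start]
    row = start
    for _ in range(generations):
        row = step(row, rule)
        history.append(row)
        if len(set(row)) == 1:  # one uniform color: extinction
            break
    return history

# A single black box in the middle of a short row of white boxes.
start = [0, 0, 0, 1, 0, 0, 0]
history = run(start, example_rule, 10)
for row in history:
    print("".join("#" if box else "." for box in row))
```

Run on this short row, the pattern spreads for a few generations and then collapses into an all-white row, the "extinction" the text describes; on a longer row, or with a different rule, the same machinery can instead settle into one of the rare stable patterns discussed below.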

  Return now to my earlier question: How can it be, in this entropic world, that we human beings share so many stable patterns—one nose; two eyes; a reliable lag time before we learn object permanence; happier adulthoods if we become confident about our identities as adolescents; a tendency to find it hard to believe in tragedy when it strikes? What keeps us from following an almost infinite number of alternative developmental paths? The studies of cellular automata provide a hint.

  Not all complex patterns, it turns out, eventually collapse into extinction. A few combinations of starting states and reproduction rules beat the odds and settle down into mature stable patterns that continue down through the generations forever. In general, it is impossible to predict whether a given starting state will survive, let alone which pattern it will generate after some large number of generations. The only way to tell is to crank it through the computer and see. And when you do this, there is a surprise—only a small number of such mature patterns are possible.

  A similar tendency in living systems has long been known to evolutionary biologists. They call it convergence. Among the staggering number of species on this planet, there are only a few handfuls of solutions to the problem of how to locomote, how to conserve fluids in a hot environment, how to store and mobilize energy. And among the staggering variety of humans, it may be a convergent feature of our complexity that there are a small number of ways in which we grow through life or mourn its inevitabilities.

  In an entropic world, we can take a common comfort from our common patterns, and there is often consolation in attributing such patterns to forces larger than ourselves. As an atheist, I have long taken an almost religious solace from a story by the Argentine minimalist Jorge Luis Borges. In his famous short story “The Library of Babel,” Borges describes the world as a library filled with an unimaginably vast number of books, each with the same number of pages and the same number of letters on each page. The library contains a single copy of every possible book, every possible permutation of letters. People spend their lives swimming through this ocean of gibberish, searching for the incalculably rare books whose random arrays of letters form something meaningful, searching above all else for the single book (which must exist) that explains everything, the book that gives the history of all that came before and all that will come, the book containing the dates of birth and of death of every human who would ever wander the halls of the library. And of course, given the completeness of the library, in addition to that perfect book, there must also be one that convincingly disproves the conclusions put forth in it, and yet another book that refutes the malicious solipsisms of the second book . . . plus hundreds of thousands of books that differ from any of those three by a single letter or a comma.
