Last Ape Standing: The Seven-Million-Year Story of How and Why We Survived


by Chip Walter


  How the Epigenome Changes Your Brain

  So while the codes in our DNA are set for life depending on what our parents pass along to us, we still have plenty of room to deviate from the precise commands of those genes. Thanks to the epigenome, events and the physical and psychological environments in which we live during our childhoods can modify the expression of some genes that affect brain development. Some of these amendments can be temporary; others can change us for the rest of our lives.

  Study after study, for example, has found that children exposed to high stress are more likely to suffer mental illnesses later in life, including generalized anxiety and serious depression. High childhood stress has also been shown to modify how a person later handles adversity in adolescence and adulthood. When we are frightened, our adrenal glands release adrenaline, which focuses our attention, increases our heart rate, and prepares our bodies to either fight or flee, handy reactions when your life is on the line. But chronic fear and stress—the kind that continues relentlessly—can corrode us because intense, ongoing awareness of the fight-or-flight kind wears us out. In children, an epigenome exposed to constant stress tends to make them more sensitive to even minimal stress throughout their lives, and more likely to feel anxious when others might not feel the least bit nervous. Poor nutrition or toxic substances can affect epigenomes related to brain development during childhood in ways that blunt brain function later. Together these forces can gang up to have a kind of psychological domino effect that spills into our physical health to make us more susceptible to ailments like asthma, hypertension, heart disease, and diabetes.

  On the other hand, positive experiences—warmth, stability, security, love, and the joy that comes from play—can create equally powerful, but entirely positive, results. Your genes write the basic blueprint of what is personally possible, or impossible. They set the boundaries of who you are physically, psychologically, socially, and intellectually, but your epigenome etches the finer details of your personality—the ways you handle others, your fears and joys, your intellectual and emotional prowess, personal talents, confidence, proclivities for optimism or pessimism, and your annoying (not to mention altogether charming) quirks. Together they influence whether, when, and how your personal set of genes builds the capacity for thought, emotional control, and a whole bushel of other future skills. Exactly what route the timing and depth of their effect takes depends on the infinitely complex molecular interactions that constitute your world and your “self.” No matter what, the result is that you come out of it all as unique as a snowflake.

  In case the connection has eluded you, it’s our neotenous nature, our long childhoods, that makes our epigenome so attuned to the influences of our personal experience during the first seven years of life. Because we are born early and our brain development extends well beyond the womb, neuronal networks that in other animals would never have been susceptible to change remain open and flexible, like the branches of a sapling. Although other primates enjoy these “sensitive periods,” too, they pass rapidly, and their circuits become “hardwired” by age one, leaving them far less touched by the experiences of their youth. This epigenetic difference helps explain how chimpanzees, remarkable as they are, can share 99 percent of our DNA, but nothing like the same level of intellect, creativity, or complexity.6

  As productive and interesting as all the goofy openness and flexibility of our toddlerhood is, it also creates a problem. It’s not sustainable. Unless we hope to be a race of primates suffering from terminal cases of attention deficit disorder, the time eventually comes for our lively cerebral growth to be curbed. It’s the biological equivalent of fishing or cutting bait. We can’t afford to record, one way or another, every experience throughout our lives. The costs are too high. Our minds would grow so flexible they would become floppy, and so cluttered they would be incapable of focus. Besides, not every new experience is useful (sitting in traffic, for example). There are also physical limits to how big a brain can grow, though we Homo sapiens have certainly pushed the boundaries. Finally, brains, being greedy organs, devour immense quantities of energy for their size, especially in childhood. A growing toddler’s cerebral appetite gobbles up as much as 85 percent of all the energy that its body requires each day. Over a lifetime that would be insupportable.

  No, at some point the brain must make some tough biological choices by locking into and holding on to the influences the epigenome has expressed, while it somehow brings the sweeping array of connections it generates during childhood under control.

  In the case of the epigenome, this process is relatively simple, if anything that happens in the human brain can be called simple. Despite the wild partying your youthful cerebral cortex undertakes, your genes are still ultimately in charge and dictate when different areas must calm down and mature. Genes decide when the sensitive periods of different cerebral circuits end, and when they end, that’s that. It’s interesting that it happens this way, almost as if each sector were a different brain, each with different genetic rules, which just happen to be expressed within the same skull (which in many ways is precisely the case, since different parts of the brain evolved at different times).

  Neural circuits that analyze color, shape, and motion, for example, mature in the visual cortex long before higher-level functions develop to comprehend facial expressions, or the shape and meaning of frequently used objects, a glass, a fork, or a toy, for example.

  For its part, the auditory cortex first learns to recognize simple sounds, then later comprehends the meaning of those noises as words in strings of language that in turn help us make decisions or digest a great novel. The same process is true of other areas that handle physical and cognitive capabilities. Once these parts of the brain mature, the chances of changing them drop precipitously. Not that this is the absolute end of the epigenome’s work. Throughout life, some matured areas will keep interconnecting to record revisions, mistakes, and the additions of knowledge that make us smarter, even wiser. But generally, it is during your childhood that the brain takes what the world has to offer and makes a bet that what its circuits have recorded is representative enough to handle what life will bring in the future. Following our childhoods, memory becomes the most effective way for us to change our behavior.7 (In a way, memory is evolution’s way of allowing you and me to remain in a permanently “sensitive period,” always open to change, regardless of age.)

  Controlling the other way that personal experience changes us—that would be the rampant connections our young brains make—is also related to the epigenome, but different from closing down sensitive periods. Remember, many of the pathways are created based on what we are exposed to as we grow, from music and language to sports and social interactions. You might assume that together they confer immense evolutionary benefits. They do, up to a point. But again there are limits to what we can handle. Too many connections make for a cluttered cortex.

  The solution to this overabundance is a kind of intracranial evolutionary competition. Just as the organisms in an ecosystem are “selected out” if they can’t find a niche where they can make their living, connections in our brains that are not used much after they are formed die out as well, unmasked as extravagances that have no place in the neurological ecosystems of our personal experience.

  You can look at the first, rapid interconnections the brain makes based on experience as something like a matrix of dirt paths branching from your neurons, tentative explorations of this or that destination. The more often you undergo an experience—listen to music, catch a ball, hear a language, or are subjected to scary or stressful situations—the more often the path is walked and the more grooved it becomes. Paths may even become, metaphorically, paved or built into interstate highways, autobahns of thought and experience, because they are traveled so often. The highways remain for life, but in time, if the dirt paths and the back roads aren’t traveled much, they disappear from lack of use. Only the well-traveled roads survive.

  The synaptic routes we build, or not, deeply affect our perception and sense of reality. One of the most dramatic examples of this is the experience in the 1950s of anthropologist Colin Turnbull, who was researching the BaMbuti Pygmies of the dense Ituri forests of central Africa. During his research he and Kenge, a BaMbuti tribesman he had come to know, traveled to another part of Africa characterized by broad plains, as opposed to the dense jungle Kenge had grown up in.

  One day the two men stood overlooking the expansive grasslands. Kenge pointed his finger at a herd of water buffalo and asked Turnbull, “What insects are those?” At first Turnbull couldn’t figure out what Kenge meant, then realized that he was referring to the buffalo. To this man, who had never before seen so much distance between anything, the buffalo appeared not small because they were far away, but small because, like an insect, they were, well, simply small. Turnbull realized that because the BaMbuti grow up in dense forests they never develop the ability to “see” or comprehend distance. When Turnbull told Kenge that the insects were buffalo, Kenge roared with laughter and told Turnbull not to tell such stupid lies. Given the world he grew up in, seeing things that were far away was a visual extravagance for Kenge, and so those connections to the visual cortex were pruned away or, perhaps, never made at all.

  If you were blindfolded between the ages of three and five, the same biology would be at work for you except in this case you would grow up entirely and forever blind. Not because your eyes are incapable of sight, but because the synaptic connections your brain made for sight before you were blindfolded would have been pruned away from lack of use by age five, the time when the visual cortex “hardwires” itself. The pathways would never have become paved highways because they were never used. Once that part of the brain locks in, there is no known way to restore sight. The pathways are gone. Children who suffer from amblyopia (lazy eye) often end up blind in their weak eye for the same reason. If the eye is rarely used, its connections to the visual cortex atrophy and die, even though the eye itself is perfectly functional.8

  The most universal example of how our brains discard unused synaptic connections is language. Within five months or so of birth we all become capable of babbling every one of the sounds required to speak any of the sixty-three hundred languages humans utter throughout the world, and probably many that have long been extinct. Up to about age seven, the last year of childhood, children rapidly learn to speak in whatever tongue they are exposed to. If they encounter several, they will adopt them all with ease because to their brains the separate languages aren’t separate at all; they are one language that simply has more words and rules.

  Acquiring fluency in different languages later in life becomes more difficult because the neural circuits that help us master the sounds, the accents, and the grammar of those languages were never formed or have largely evaporated from lack of use. Even if you do learn a new language and get the grammar and the vocabulary right, it is nearly impossible to drop the accent you bring with you from your mother tongue. Henry Kissinger, for example, has been speaking English since age fifteen, when his family migrated to the United States, yet he still speaks it with a strong German accent because English was not his first language.

  If the cerebral Rubicon theory is accurate, humans like Homo ergaster began being born “early” one million years ago. Their brains were now roughly three quarters the size of ours, so they weren’t arriving in the world at a fetal stage as delicate as yours or mine, but the increased size of their heads was pushing them out of the womb sooner, and extending their childhoods. This meant they exited the womb uncompleted, a work in progress, hormonally primed to grow vast new farms of neurons and synapses, an amalgamation of their parents’ genetic donations, but editable, more than any other creature up to that time, by their personal experience and the forces of the environment they faced.

  There is no way to overestimate how important this was to our evolution. This was the birth of human childhood itself, and the beginning of the wild and complicated processes that explain how you or I can be born in Fargo, North Dakota, learn to speak fluent French in Paris, develop wit like Woody Allen’s, or become as reclusive as Howard Hughes, all while still plumbing the intricacies of subjects as wildly different as calculus, Mozart, and baseball. This began the trend that has, in many ways, made children of all of us for the entire course of our lives, neurologically nimble enough that we can keep learning, changing, and overriding the primal commands of our DNA. As we age, our brains may lose some of their youthful pliability and grow more brittle, but they also become more stable, deeper, and broader. Or as anthropologist Ashley Montagu put it, “Our uniqueness lies in always remaining in a state of development.”

  If we stand back and gaze thoughtfully at the whole vista of human evolution, it grows clearer that the longer a childhood lasts, the more individualized the creatures that experience it become. It is the foundation of the thing we call our personalities, the unique attributes that make you, you, and me, me. Without it, we would be far more similar to one another, and far less quirky and creative and charming. Our childhoods bestow upon us the great variety of interests and personalities and talents that the seven billion of us display all around the world every day, from Barack Obama to Lady Gaga to Itzhak Perlman. This diversity led, slowly, to a new line of “early born” humans living along the shores of Lake Turkana who were developing an unparalleled ability to adapt to the world around them. Something different was afoot: a species that was becoming, to borrow the phrase of Jacob Bronowski, “not a figure in the landscape, but a shaper of the landscape.”

  Improvements, however, have a way of creating new challenges. Now that a series of surprising and unintended events had taken human evolution into entirely new territory, our ancestors found themselves caught in a strange, runaway feedback loop that would favor the arrival of increasingly large–brained, increasingly intelligent, and increasingly helpless babies. All good, you would think. Except that every one of them would require more care and long periods of time to grow up. That would shake the social lives of Africa’s gracile primates to the core and lead to yet another profoundly important twist in our evolutionary story.

  Chapter Four

  Tangled Webs—

  The Moral Primate

  Morality, like art, means drawing a line someplace.

  —Oscar Wilde

  In 2005 England found itself mesmerized by the gruesome murder of fifty-seven-year-old businessman Kenneth Iddon. Each Sunday, Mr. Iddon would drive to nearby Deanwood Golf Club to play snooker with his friends, then return home around midnight. On February 1, 2004, before he could get out of the car in his driveway, prosecutors said, three men bludgeoned him, dragged him into his garage, repeatedly stabbed him, and finally killed him by severing his carotid artery. It all happened while his wife and stepson were in the family house nearby. Others in the suburban neighborhood later reported they heard cries for help, yet Lynda, Mr. Iddon’s wife, and Lee Shergold, her thirty-one-year-old son by a previous marriage, said they never heard a thing.

  They denied hearing any cries for help, the local prosecutor charged, because both Mrs. Iddon and her son had hired the three men who killed Mr. Iddon. They wanted his money, prosecutors said, all of it, not simply what Lynda Iddon might get in a divorce settlement. The irony was that when Mr. Iddon’s will was read, he had left nothing to his wife. His entire fortune was bequeathed to his twenty-two-year-old daughter, Gemma. Neither Lynda nor Lee inherited a dime.

  It’s an old human story. Greed, hatred, envy, and violence. We have, in case you haven’t read today’s newspaper, been known to commit acts we call immoral. We find them abhorrent and disturbing, yet, given our immense numbers, we actually show our ugly sides relatively little. One of the reasons we call attention to the terrible things we do, and are horrified by them, is that the majority of us don’t do them. We are the only animals that even struggle with the idea of morality because we are the only truly ethical animal.

  Our moral tendencies are apparently so thoroughly wired into our psyche that they even reveal themselves in young children. For years psychologists—from Sigmund Freud to Jean Piaget to Lawrence Kohlberg—denied that infants or toddlers could have any sense of right or wrong. The traditional view has long been that babies arrive without the slimmest grasp of empathy, fairness, or other similarly moral sentiments. But recent experiments show otherwise.

  At Yale University, psychologists Paul Bloom, Karen Wynn, and Kiley Hamlin placed infants between the ages of five and twelve months in front of a simple morality play where three puppets were throwing a ball. As the babies watched, one puppet rolled the ball to another puppet on the right, then that puppet promptly rolled it back. Next the center puppet rolled the ball to a third puppet on the left, who took the ball and, instead of rolling it back, ran off with it.

  The infant audiences didn’t cotton to this sort of behavior. Later, when they were presented with the two puppets to which the ball had been rolled, each with a pile of treats, and were asked to take a treat from one of them, they invariably took it from the “naughty” puppet that had absconded with the ball. One one-year-old went so far as to smack the offending puppet on the head, raising the question: is violence a proper response to an immoral act?!

 
