Pandora's Seed


by Spencer Wells

Even in today’s postindustrial countries, the process of specialization continues, with subcultures in the same society speaking entirely different languages during most of their waking hours. Would the average tort lawyer understand a seminar on subatomic physics? How many chemists would be able to contribute meaningfully to an academic conference on literary theory? And when was the last time you worked on your own car? We tolerate this because of the benefits it provides—from the point of view of society, it is better to have experts focus on a limited aspect of human endeavor and excel than to have generalists spend their time as dilettantes, moving from one task to another as their interests dictate. While this has produced an obvious bounty in the form of advanced technology, for instance, it has left many people feeling out in the cold, excluded from the decision-making process about their own lives.

  FIGURE 23: SCENE FROM THE TUCKERS FACTORY IN BIRMINGHAM, ENGLAND, 1910. (PHOTO COURTESY OF PAUL TUCKER.)

  In the final chapter of this book I will discuss how this alienation has been a driving factor in the rise of fundamentalism during the late twentieth and early twenty-first centuries. For now, though, we need to turn from the origin of art and job specialization to the curious nature of modern-day stress and its impact on our hunter-gatherer psyches.

  THE DRONE OF MODERN LIFE

  Cars rush by outside your window, a horn blaring occasionally. The refrigerator hums in the corner of the kitchen, and the heat coming out of a duct over your head whooshes softly. Bills sit stacked on the counter, insistently waiting to be opened. A television—perhaps one of several in the house—blares advertisements from the next room, and Internet pop-up ads interrupt your attempts to check on your retirement investments. The cacophony reaches its peak when your spouse’s cell phone rings, vibrating along the tabletop like some sort of angry digital dervish. The din of the outside world goes on all around us, even while we attempt to focus on our “real” lives.

  We are constantly surrounded by surreptitious stimuli—so much so that we take them all for granted. We are used to the notion that advertisements saturate our lives—exposure estimates for the average American range from several hundred to several thousand every day—as promoters try to sell us everything from life insurance to an enhanced sex life. Data flows at us from every direction. Information is ubiquitous and, with the rise of the Internet and broadband connectivity, more easily accessible than ever. But even things we might not think of as intrusive bombard our subconscious minds with stimuli. Inadvertently, the machines we have created to improve our lives may actually be causing some degree of psychological harm.

  The journalist Toby Lester, in an article published in the Atlantic magazine in 1997, pointed out that ours is the first generation to live in an environment where background sounds from machines saturate our lives. He described the various tones he hears at work, made up of noise from the heater, the computer fan, and the telephone: “My office plays a curious combination of intervals, one joyous and stable (do-mi), another devilish and inimical (do-fa-sharp), and the third (mi-fa-sharp) emotionally neutral. The overall result is an ambiguous chord that, at its upper end, begs for resolution.”

  According to music theorists, who have assigned moods to certain intervals and chords, this combination of notes is particularly dissonant. The medieval Catholic Church called the interval generated by Lester’s office machines, the augmented fourth or tritone, the diabolus in musica (devil in music), and it creates a strong feeling of unease in listeners. As Lester asks later in the piece, “Could this ambiguity and tension be one reason I so often feel on edge?” Perhaps—and it certainly provides an impetus to cherish the silence that is so rare in modern life.
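
  The tension has a simple arithmetic basis. The do-to-fa-sharp interval spans six semitones, and in modern equal temperament each semitone multiplies a note’s frequency by the twelfth root of two, so the tritone’s frequency ratio is 2^(6/12) = √2 ≈ 1.414. Consonant intervals sit near simple whole-number ratios (the octave is exactly 2:1, the perfect fifth almost exactly 3:2); √2 approximates no small whole-number ratio, which is one standard account of why the interval begs so insistently for resolution.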

  This same sort of constant stimulation exists in our other sensory realms as well. We are all bombarded visually on a daily basis, and some of us experience regular assaults on our senses of smell and touch too (think of a crowded subway car on a hot summer day). Our lives are now lived in a state that could be called “stream of subconsciousness,” as we subliminally lurch from one unrelated (and usually unwanted) stimulus to the next like floating dust particles buffeted by the random forces of air currents. Some people seem to thrive on constant overstimulation, whether dissonant or not, but most of us react rather badly to it. The end result is chronic stress of the kind that drove Michael Douglas’s character William Foster over the edge in the 1993 film Falling Down. Endless Los Angeles traffic, feelings of inadequacy and powerlessness brought on by divorce and job loss, the odd combination of isolation and crowding that characterizes much of modern life—all combine to drive this character into a psychotic state, and he rampages around the city in a violent attempt to right the wrongs he perceives.

  While most of us will never go this far in our reaction to the stresses of modern life, internally we are fighting the same sort of battle. A large body of scientific evidence shows that chronic stress damages everything from our psychological health to our sexual performance to our immune systems. In particular, long-term stress diminishes our sensitivity to glucocorticoid hormones, which are involved in regulating inflammatory responses. In essence, our immune systems become overstimulated, fooled into thinking that there is a chronic infection or some other assault on our bodies, and the normal process by which we recognize the end of an infection is damaged. The effect increases not only our chances of catching a cold but also the likelihood of clogged arteries and autoimmune diseases.

  Surprisingly, though, short-term stress is actually good for the immune system. Studies of skydivers, for instance, have revealed that the spike in their adrenaline levels increases the number of immune cells known as natural killer cells. These cells typically form the first line of defense against infections, before the rest of the immune system (antibodies and so on) has had a chance to kick into gear. Such a response would have been highly adaptive for our hunter-gatherer ancestors, whose regular exposure to short jolts of adrenaline from the fight-or-flight response would have primed their immune systems for action.

  The impact of stress caused by overcrowding and geographic confinement—typically something that occurred only after the transition to agriculture—can be seen in some late Paleolithic sites. In these locations the population was forced to make use of a limited resource and lacked the typical hunter-gatherer response to such a situation: to leave. One of these unusual sites was found along the Nile River, as described by Clive Gamble in his excellent survey of the Paleolithic, Timewalkers:

  The Nile (upper Paleolithic) sites show the social consequences of being tethered to such a limiting resource. Intensification in subsistence can pose other problems for survival, especially when it involves living together for longer periods. Hunters and gatherers usually cope with conflict and disagreements by walking away from the problem. However, this is not always possible. Wendorf and Close have uncovered a good deal of evidence for violence along the Nile which is not found elsewhere at 18,000 BP [years ago] in the more fertile North African refuges of the Maghreb and Cyrenaica…in the Jebel Sahaba graveyard at least 40 percent, regardless of age and sex, had met a violent death.

  The incredibly rich Nile environment allowed the population to grow large enough that such strife was inevitable. It is mirrored at other Upper Paleolithic sites with abundant aquatic resources, such as those in Japan and along the Pacific coast of northwestern North America. Here, as we saw in Chapter 2, large group sizes led to social hierarchies, which eventually led to fighting over limited resources. Again, it is only when you have something to fight over, and no recourse but to fight, that warfare becomes common. It simply doesn’t make evolutionary sense to waste resources in frequent or protracted battles unless you have no other choice.

  The reasons for this stressed response to living in large groups are complicated, and may have something to do with an intriguing pattern discovered by the evolutionary psychologist Robin Dunbar. Dunbar’s analysis showed that the average group size among apes and Old World monkeys (the species closest to us evolutionarily) is related to the size of their brains. The larger the brain, the larger the group size, presumably because the increased neural connectivity in larger brains allows individuals to keep track of more social connections. Most species in his analysis had average group sizes of between 5 and 50 individuals. When he extrapolated the resulting line out to a human-sized brain, he found that we would be predicted to have group sizes on the order of 150. This number turns out to be remarkably close to the size of all sorts of natural groups that humans belong to, ranging from companies in the military to traditional Hutterite farming communities in Canada. It also turns out to be the average size of hunter-gatherer bands.

  FIGURE 24: RELATIONSHIP BETWEEN THE NEOCORTEX RATIO AND GROUP SIZE IN MONKEYS AND APES, AS DESCRIBED BY ROBIN DUNBAR. HUMANS HAVE A PREDICTED GROUP SIZE OF 148.

  Dunbar explains that the reason humans are ideally suited to groups of this size isn’t that we can’t remember more people than this—that number runs to around 2,000—but that 150 is the maximum number of meaningful social relationships a person can keep track of. It turns out that 150 is also the average number of people of whom we would feel comfortable asking a favor, as well as the average number of friends and family members to whom British people sent Christmas cards in the 1990s (before the widespread use of email). Dunbar’s result implies that our social structure is hardwired into our biology. When a hunter-gatherer band exceeds this number, two things typically happen. Either the group splits or, if that is impossible, it has to find some way of maintaining order by instituting the types of social structures we have in the modern world—governments, religions, laws, police forces, and so on. While the first option was open to our Paleolithic ancestors, the second is our only recourse in the post-Neolithic world.
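
  To make the extrapolation concrete, here is a minimal sketch of the calculation in Python. The regression coefficients are the ones Dunbar reported in his 1992 analysis, and the human neocortex ratio of roughly 4.1 is an approximate published figure; the snippet illustrates the arithmetic behind the number in Figure 24, not a reproduction of his full dataset.

      import math

      # Dunbar's regression across monkey and ape species, with both
      # group size and neocortex ratio on log10 scales (coefficients as
      # reported in his 1992 analysis; treat them as approximate):
      #   log10(group_size) = 0.093 + 3.389 * log10(neocortex_ratio)
      def predicted_group_size(neocortex_ratio):
          return 10 ** (0.093 + 3.389 * math.log10(neocortex_ratio))

      # A human neocortex ratio of roughly 4.1 yields the prediction of
      # about 148 shown in Figure 24.
      print(round(predicted_group_size(4.1)))  # -> 148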

  Despite the ameliorating influence of governments—what Thomas Hobbes called the Leviathan, the threatening state entity that he believed kept our “animal” impulses in check in a civilized society—there is still some psychological baggage associated with living in groups of far more than 150 people. First, it is no longer possible to treat other group members the way we would treat them in a smaller group. We begin to dehumanize one another, and our behavior becomes decidedly unnatural. Think of standing in an elevator with strangers, where everyone tries to avoid eye contact and seems unnaturally interested in the floor number or the message they just received on their BlackBerry. No hunter-gatherer would think of not talking to other members of the group encountered in such a closed social situation, yet a typical city dweller often pretends that other people don’t exist; to do otherwise would be overwhelming. Imagine engaging socially with every person you were in close proximity to on an average urban workday—the mind boggles at the number of possible interactions. Our brains simply wouldn’t be able to handle it (not to mention our schedules), so we have developed the coping mechanism of acting as if those people aren’t there, and they become yet another part of the background noise of modern life.

  This psychological mismatch between the densely populated, noisy agricultural world and the sparsely populated hunter-gatherer one is almost certainly one reason for the unease felt by many people. Along with the other “noisy” aspects of modern life, such excessive background social stimulation is very likely part of the reason we see increasing levels of mental illness in most societies. According to the report mentioned at the end of the last chapter, the WHO expects that by 2020 mental illness will be the second most important cause of disability and mortality worldwide. Surveys conducted in Europe and the United States show that in any given year more than a quarter of the population has symptoms that would qualify for a diagnosis of mental illness (though most cases are never diagnosed), the most common being anxiety disorders.

  This trend is reflected in the increasing use of prescription psychoactive drugs. People have always enjoyed altering their states of consciousness, with the help of substances ranging from alcohol to cannabis to the potent psychedelic ayahuasca used in the Amazon basin, but this is the first time in history that we are routinely drugging ourselves in order to appear normal. According to the Centers for Disease Control, antidepressants such as Prozac and Paxil are now the most prescribed drugs in the United States—more so than drugs for hypertension, high cholesterol, or headaches. The stimulant Ritalin, used to control attention deficit/hyperactivity disorder, is part of the daily routine of around 10 percent of American boys. Overprescription by doctors is certainly contributing to this widespread medicating, but there is clearly an underlying problem that is making us feel psychologically unwell. I would argue that this, too, is part of the continuing fallout from the Neolithic population explosion.

  INTO THE FUTURE

  As we hurtle into the twenty-first century with our Neolithic baggage in tow, we are clearly still adapting to this new culture, which dates back only 10,000 years or so. The twin burdens of disease and unease that we accept as part of modern life will certainly produce profound changes in our medical systems as chronic disorders and mental illness become more and more common. New drugs will be discovered and prescribed, and we will become ever more used to living pharmacologically enhanced lives. Will we be able to find a drug to treat everything that ails us? Probably not—but the pharmaceutical companies will definitely keep trying.

  There is another intriguing possibility, though, one that has become imaginable only in the past decade. This audacious new technology offers the hope of not simply treating a disease, reacting to its symptoms, and trying, in an often hit-or-miss fashion, to cure it. Rather, we are told, disease itself may become a thing of the past—we will be able to anticipate rather than simply react, prevent rather than treat, and when treatments are necessary, we will be able to target them with unprecedented specificity. This technology also, ominously, offers the possibility of eradicating diseases forever, both in ourselves and in future generations. It is perhaps the most potent force ever to be harnessed in the name of medicine, and it offers us a chance to, once and for all, mold our biology to suit the new culture we have created—to remake ourselves in the image of post-Neolithic society. It is the field of genomics, and the brave new world it promises is where we’re headed next.

  Chapter Five

  Fast-Forward

  [Eugenics] must be introduced into the national conscience, like a new religion. … What Nature does blindly, slowly and ruthlessly, man must do providently, quickly and kindly.

  —FRANCIS GALTON,

  Essays in Eugenics, 1909

  DERBYSHIRE

  I was still groggy from my overnight flight, and as I piloted my rental car onto the M1, outside London, I had to keep reminding myself to drive on the left. I merged into the morning traffic headed toward “The North” (as the signs proclaimed in their very British way), pushed the small car up to seventy-five miles per hour, found a radio station playing classic rock hits from the 1960s and ’70s, and settled in for a two-hour drive. The trip through the English countryside (albeit seen in small glimpses from the motorway) gave me a chance to ponder what I had come here to do. My other travels in researching this book had been motivated by some sort of academic interest, and in that sense this trip was different. Rather than being part of an effort to understand the scientific or technical methods behind the forces I have been discussing, this journey was much more personal. I had come here to talk to a family that had inadvertently found itself at the cutting edge of modern technology’s remodeling of humanity. And my research would occur in a most unlikely place—around the kitchen table in Michelle and Jayson Whitaker’s cozy nineteenth-century Derbyshire cottage.

  The Whitakers were thrust into the international spotlight in 2002, when the United Kingdom’s Human Fertilisation and Embryology Authority (HFEA) turned down their request to create a “designer baby,” as the child was dubbed by the press. The Whitakers’ first child, Charlie, had been born three years before. Although he seemed reasonably healthy at birth, by the age of three months it was very clear that something was wrong. Charlie was diagnosed with an extremely rare genetic disease known as Diamond-Blackfan anemia (DBA)—only a few dozen other cases had been found in the entire country—which meant that his blood was unable to carry enough oxygen to allow him to grow normally. The only known treatment was regular blood transfusions, but unfortunately the increase in red-blood-cell death would place enormous stresses on his liver and kidneys. Although they started these treatments immediately, and Charlie’s symptoms improved, the Whitakers knew that eventually, like all other Diamond-Blackfan patients, Charlie would probably die unless a bone marrow donor could be found.

  The idea was to destroy Charlie’s own faulty bone marrow through chemotherapy, rendering him incapable of producing blood cells. Although this was incredibly dangerous—it’s like acquiring AIDS by another means and leaves the body tremendously vulnerable to infection—it was necessary for the next step, the transplant. If a donor could be found whose immune system antigens matched Charlie’s, then a sample of that person’s bone marrow could be injected into Charlie’s body. The hope was that these healthy cells would take up residence in Charlie’s bones, eventually creating a new immune system—and healthy blood cells—that would cure Charlie’s disease. However, unlike leukemia patients, who can search through the registry of millions of potential marrow donors for a match, Charlie was left out in the cold.

  Even with a perfect match between donor and recipient, the chance of a successful transplant between two unrelated people is only about 15 percent, meaning that 85 percent of the time the transplant will fail. Such a risk is worth taking, however, if the patient has terminal leukemia—even a 15 percent chance is better than dying. In the case of a nonterminal disease such as DBA, though, the marrow registries don’t consider the chances of success worth the risk involved in the procedure, and such transplants are not allowed. If, however, a matching sibling is the donor, the likelihood of success increases to a much more favorable 85 percent. The problem, of course, is finding a donor in the same family. Since Charlie was Michelle and Jayson’s only child, their sole hope was to have another baby who could serve as a donor.
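
  Even then, the odds are a matter of simple Mendelian arithmetic. The immune system antigens used for matching are inherited essentially as a block, or haplotype, from each parent, and each parent has two to pass on; a brother or sister therefore has only a 1/2 × 1/2 = 1/4 chance of inheriting the same pair as Charlie. A new baby conceived naturally would thus have been more likely than not to be another mismatch, which is why the Whitakers turned to reproductive technology in the first place.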

 
