The Lying Stones of Marrakech

by Stephen Jay Gould


  The deepest error of this third category lies in the reductionist, and really rather silly, notion that we can even define discrete, separable, specific traits within the complex continua of human behaviors. We encounter enough difficulty in trying to identify characters with clear links to particular genes in the much clearer and simpler features of human anatomy. I may be able to specify genes “for” eye color, but not for leg length or fatness. How then shall I parse the continuous and necessarily subjective categories of labile personalities? Is “novelty seeking” really a “thing” at all? Can I even talk in a meaningful way about “genes for” such nebulous categories? Have I not fallen right back into the errors of Davenport’s search for the internal scarlet letter W of wanderlust?

  I finally realized what had been troubling me so much about the literature on “genes for” behavior when I read the Times’s account of C. R. Cloninger’s theory of personality (Cloninger served as principal author of the Nature Genetics editorial commentary):

  Novelty seeking is one of four aspects that Dr. Cloninger and many other psychologists propose as the basic bricks of normal temperament, the other three being avoidance of harm, reward dependence and persistence. All four humors are thought to be attributable in good part to one’s genetic makeup.

  The last line crystallized my distress—“all four humors”—for I grasped, with the emotional jolt that occurs when all the previously unconnected pieces of an argument fall suddenly into place, why the canny reporter (or the scientist himself) had used this old word. Consider the theory in outline: four independent components of temperament, properly in balance in “normal” folks, with each individual displaying subtly different proportions, thus determining our individual temperaments and building our distinct personalities. But if our body secretes too much, or too little, of any particular humor, then a pathology may develop.

  But why four, and why these four? Why not five, or six, or six hundred? Why any specific number? Why try to parse such continua into definite independent “things” at all? I do understand the mathematical theories and procedures that lead to such identifications (see my book The Mismeasure of Man), but I regard the entire enterprise as a major philosophical error of our time (while I view the mathematical techniques, which I use extensively in my own research, as highly valuable when properly applied). Numerical clumps do not identify physical realities. A four-component model of temperament may act as a useful heuristic device, but I don’t believe for a moment that four homunculi labeled novelty seeking, avoidance of harm, reward dependence, and persistence reside in my brain, either vying for dominance or cooperating in balance.

  The logic of such a theory runs in uncanny parallel—hence the clever choice of “humor” as a descriptive term for the proposed modules of temperament—with the oldest and most venerable of gloriously wrong theories in the history of medicine. For more than a thousand years, from Galen to the dawn of modern medicine, prevailing wisdom regarded the human personality as a balance among four humors—blood, phlegm, choler, and melancholy. Humor, from the Latin word for “liquid” (a meaning still preserved in designating the fluids of the human eye as aqueous and vitreous humor), referred to the four liquids that supposedly formed the chyle, or digested food in the intestine just before it entered the body for nourishment. Since the chyle formed, on one hand, from a range of choices in the food we eat and, on the other hand, from constitutional differences in how various bodies digest this food, the totality recorded both innate and external factors—an exact equivalent to the modern claim that both genes and environment influence our behavior.

  The four humors of the chyle correspond to the four possible categories of a double dichotomy—that is, to two axes of distinction based on warm–cold and wet–dry. The warm and wet humor forms blood; cold and wet generates phlegm; warm and dry makes choler; while cold and dry builds melancholy. I regard such a logically abstract scheme as a heuristic organizing device, much like Cloninger’s quadripartite theory of personality. But we make a major error if we elevate such a scheme to a claim for real and distinct physical entities inside the body.

  In the medical theory of humors, good health results from a proper balance among the four, while distinctive personalities emerge from different proportions within the normal range. But too much of any one humor may lead to oddness or pathology. As a fascinating linguistic remnant, we still use the names of all four humors as adjectives for types of personality—sanguine (dominance of the hot–wet blood humor) for cheerful people, phlegmatic for stolid folks dominated by the cold–wet humor of phlegm, choleric for angry individuals saddled with too much hot–dry choler, and melancholic for sad people overdosed with black bile, the cold–dry humor of melancholia. Does the modern quadripartite theory of personality really differ in any substantial way from this older view in basic concepts of number, balance, and the causes of both normal personality and pathology?

  In conclusion, we might imagine two possible reasons for such uncanny similarity between a modern conception of four components to temperament, and the old medical theory of humors. Perhaps the similarity exists because the ancients had made a great and truthful discovery, while the modern version represents a major refinement of a central fact that our ancestors could only glimpse through a glass darkly. But alternatively—and ever so much more likely in my judgment—the stunning similarities exist because the human mind has remained constant throughout historical time, despite all our growth of learning and all the tumultuous changes in Western culture. We therefore remain sorely tempted by the same easy fallacies of reasoning.

  I suspect that we once chose four humors, and now designate four end members of temperament, because something deep in the human psyche leads us to impose simple taxonomic schemes of distinct categories upon the world’s truly complex continua. After all, our forebears didn’t invoke the number four only for humors. They parsed many other phenomena into schemes with four end members—four compass points, four ages of man, and four Greek elements of air, earth, fire, and water. Could these similarities of human ordering be coincidental, or does the operation of the human brain favor such artificial divisions? Carl G. Jung, for reasons that I do not fully accept, strongly felt that division by four represented something deep and archetypal in human proclivities. He argued that we inherently view divisions by three as incomplete and leading onward (for one triad presupposes another for contrast), whereas divisions by four stand in optimal harmony and internal balance. He wrote: “Between the three and the four there exists the primary opposition of male and female, but whereas fourness is a symbol of wholeness, threeness is not.”

  I think that Jung correctly discerned an inherent mental attraction to divisions by four, but I suspect that the true basis for this propensity lies in our clear (and probably universal) preference for dichotomous divisions. Division by four may denote an ultimate and completed dichotomization—a dichotomy of dichotomies: two axes (each with two end members) at right angles to each other. We may experience four as an ultimate balance because such schemes fill our mental space with two favored dichotomies in perfect and opposite coordination.

  In any case, if this second reason explains why we invented such eerily similar theories as four bodily humors and four end members of temperament, then such quadripartite divisions reflect biases of the mind’s organization, not “real things” out there in the physical world. We can hardly talk about “genes for” the components of such artificial and prejudicial parsings of a much more complex reality. Interestingly, the greatest literary work ever written on the theory of humors, the early-seventeenth-century Anatomy of Melancholy by the English divine and scholar Robert Burton, properly recognized the four humors as just one manifestation of a larger propensity to divide by four. This great man, who used the balm of literature to assuage his own lifelong depression, wrote of his condition: “Melancholy, cold and drie, thicke, blacke, and sowre … is a bridle to the other two hot humors, bloode and choler, preserving them in the blood, and nourishing the bones: These foure humors have some analogie with the foure elements, and to the foure ages in man.”

  I would therefore end—and where could an essayist possibly find a more appropriate culmination—with some wise words from Montaigne, the sixteenth-century founder of the essay as a literary genre. Perhaps we should abandon our falsely conceived and chimerical search for a propensity to wander, or to seek novelty (perhaps a spur to wandering), in a specific innate sequence of genetic coding. Perhaps, instead, we should pay more attention to the wondrous wanderings of our mind. For until we grasp the biases and propensities of our own thinking, we will never see through the humors of our vision into the workings of nature beyond. Montaigne wrote:

  It is a thorny undertaking, and more so than it seems, to follow a movement so wandering as that of our mind, to penetrate the opaque depths of its innermost folds, to pick out and immobilize the innumerable flutterings that agitate it.

  19

  Dolly’s Fashion and Louis’s Passion

  NOTHING CAN BE MORE FLEETING OR CAPRICIOUS THAN fashion. What, then, can a scientist, committed to objective description and analysis, do with such a haphazardly moving target? In a classic approach, analogous to standard advice for preventing the spread of an evil agent (“kill it before it multiplies”), a scientist might say, “quantify it before it disappears.”

  Francis Galton, Charles Darwin’s charmingly eccentric and brilliant cousin, and a founder of the science of statistics, surely took this prescription to heart. He once decided to measure the geographic patterning of female beauty. He therefore attached a piece of paper to a small wooden cross that he could carry, unobserved, in his pocket. He held the cross at one end in the palm of his hand and, with a needle secured between thumb and forefinger, made pinpricks on the three remaining projections (the two ends of the cross bar and the top). He would rank every young woman he passed on the street into one of three categories, as beautiful, average, or substandard (by his admittedly subjective preferences)—and he would then place a pinprick for each woman into the designated domain of this cross. After a hard day’s work, he tabulated the relative percentages by counting pinpricks. He concluded, to the dismay of Scotland, that beauty followed a simple trend from north to south, with the highest proportion of uglies in Aberdeen, and the greatest frequency of lovelies in London.

  Some fashions (body piercings, perhaps?) flower once and then disappear, hopefully forever. Others swing in and out of style, as if fastened to the end of a pendulum. Two foibles of human life strongly promote this oscillatory mode. First, our need to create order in a complex world begets our worst mental habit: dichotomy (see chapter 3), or our tendency to reduce a truly intricate set of subtle shadings to a choice between two diametrically opposed alternatives (each with moral weight and therefore ripe for bombast and pontification, if not outright warfare): religion versus science, liberal versus conservative, plain versus fancy, “Roll Over Beethoven” versus the “Moonlight” Sonata. Second, many deep questions about our loves and livelihoods, and the fates of nations, truly have no answers—so we cycle through the presumed alternatives of our dichotomies, one after the other, always hoping that, this time, we will find the nonexistent key to an elusive solution.

  Among oscillating fashions governed primarily by the swing of our social pendulum, no issue can claim greater prominence for an evolutionary biologist, or hold more relevance to a broad range of political questions, than genetic versus environmental sources of human abilities and behaviors. This issue has been falsely dichotomized for so many centuries that English even features a mellifluous linguistic contrast for the supposed alternatives: nature versus nurture.

  As any thoughtful person understands, the framing of this question as an either-or dichotomy verges on the nonsensical. Both inheritance and upbringing matter in crucial ways. Moreover, an adult human being, built by interaction of these (and other) factors, cannot be disaggregated into separate components with attached percentages (see chapter 18 for detailed arguments on this vital issue). Nonetheless, a preference for either nature or nurture swings back and forth into fashion as political winds blow, and as scientific breakthroughs grant transient prominence to one or another feature in a spectrum of vital influences. For example, a combination of political and scientific factors favored an emphasis upon environment in the years just following World War II: an understanding that Hitlerian horrors had been rationalized by claptrap genetic theories about inferior races; and the heyday of behaviorism in psychology. Today, genetic explanations have returned to great vogue, fostered by a similar mixture of social and scientific influences: a rightward shift of the political pendulum (and the cynical availability of “you can’t change them, they’re made that way” as a bogus argument for reducing government expenditures on social programs); and an overextension to all behavioral variation of genuinely exciting results in identifying the genetic basis of specific diseases, both physical and mental.

  Unfortunately, in the heat of immediate enthusiasm, we often mistake transient fashion for permanent enlightenment. Thus, many people assume that the current popularity of genetic determinism represents a permanent truth, finally wrested from the clutches of benighted environmentalists of previous generations. But the lessons of history suggest that the worm will soon turn again. Since both nature and nurture can teach us so much—and since the fullness of our behavior and mentality represents such a complex and unbreakable combination of these and other factors—a current emphasis on nature will no doubt yield to a future fascination with nurture as we move toward better understanding by lurching upward from one side to another in our quest to fulfill the Socratic injunction: know thyself.

  In my Galtonian desire to measure the extent of current fascination with genetic explanations (before the pendulum swings once again and my opportunity evaporates), I hasten to invoke two highly newsworthy items of recent times. The subjects may seem quite unrelated—Dolly the cloned sheep, and Frank Sulloway’s book on the effects of birth order upon human behavior*—but both stories share a curious common feature offering striking insight into the current extent of genetic preferences. In short, both stories have been reported almost entirely in genetic terms, but both cry out (at least to me) for a radically different reading as proof of strong environmental influences. Yet no one seems to be drawing (or even mentioning) this glaringly obvious inference. I cannot imagine that anything beyond current fashion for genetic arguments can explain this puzzling silence. I am convinced that exactly the same information, if presented twenty years ago in a climate favoring explanations based on nurture, would have elicited a strikingly different interpretation. Our world, beset by ignorance and human nastiness, contains quite enough background darkness. Should we not let both beacons shine all the time?

  CREATING SHEEP

  Dolly must be the most famous sheep since John the Baptist designated Jesus in metaphor as “lamb of God, which taketh away the sin of the world” (John 1:29). She has certainly edged past the pope, the president, Madonna, and Michael Jordan as the best-known mammal of the moment. And all this brouhaha for a carbon copy, a Xerox! I don’t mean to drip cold water on this little lamb, cloned from a mammary cell of her adult mother, but I remain unsure that she’s worth all the fuss and fear generated by her unconventional birth.

  When one reads the technical article describing Dolly’s manufacture (I. Wilmut, A. E. Schnieke, J. McWhir, A. J. Kind, and K. H. S. Campbell, “Viable offspring derived from fetal and adult mammalian cells,” Nature, February 27, 1997, pages 810–13), rather than the fumings and hyperbole of so much public commentary, one can’t help feeling a bit underwhelmed, and left wondering whether Dolly’s story tells less than meets the eye.

  I don’t mean to discount or underplay the ethical issues raised by Dolly’s birth (and I shall return to this subject in a moment), but we are not about to face an army of Hitlers or even a Kentucky Derby run entirely by genetically identical contestants (a true test of the skills of jockeys and trainers!). First, Dolly breaks no theoretical ground in biology, for we have known how to clone in principle for at least two decades, but had developed no techniques for reviving the full genetic potential of differentiated adult cells. (Still, I admit that a technological solution can pack as much practical and ethical punch as a theoretical breakthrough. I suppose one could argue that the first atomic bomb only realized a known possibility.)

  Second, my colleagues have been able to clone animals from embryonic cell lines for several years, so Dolly does not rank as the first mammalian clone, but rather as the first clone from an adult cell. Ian Wilmut and his coworkers also cloned sheep from cells of a nine-day embryo and a twenty-six-day fetus—with much greater success. They achieved fifteen pregnancies (though not all proceeded to term) in thirty-two “recipients” (that is, surrogate mothers for transplanted cells) for the embryonic cell line, and five pregnancies in sixteen recipients for the fetal cell line, but only Dolly (one pregnancy in thirteen tries) for the adult cell line. This experiment cries out for confirming repetition. (Still, I allow that current difficulties will surely be overcome, and cloning from adult cells, if doable at all, will no doubt be achieved more routinely as techniques and familiarity improve.)*

  Third, and more seriously, I remain unconvinced that we should regard Dolly’s starting cell as adult in the usual sense of the term. Dolly grew from a cell taken from the “mammary gland of a 6-year-old ewe in the last trimester of pregnancy” (to quote the technical article of Wilmut et al.). Since the breasts of pregnant mammals enlarge substantially in late stages of pregnancy, some mammary cells, though technically adult, may remain unusually labile or even “embryolike,” and thus able to proliferate rapidly to produce new breast tissue at an appropriate stage of pregnancy. Consequently, we may only be able to clone from unusual adult cells with effectively embryonic potential, and not from any stray cheek cell, hair follicle, or drop of blood that happens to fall into the clutches of a mad Xeroxer. Wilmut and colleagues admit this possibility in a sentence written with all the obtuseness of conventional scientific prose, and therefore almost universally missed by journalists: “We cannot exclude the possibility that there is a small proportion of relatively undifferentiated stem cells able to support regeneration of the mammary gland during pregnancy.”

 
