
Triumphs of Experience: The Men of the Harvard Grant Study


by Vaillant, George E.


  Another compelling explanation for our aberrant findings is that many American studies linking religious observance to physical health come out of the so-called Bible Belt, where agnostics are, at least statistically, also social outliers. In our sample of highly educated men centered in the northeastern United States, high religious involvement was not the cultural norm; more important, it was not deeply tied in to other sources of social supports.24 In other words, in samples where healthy social adjustment usually includes clear religious involvement, such involvement is likely to correlate with warm relationships, social supports, and good physical health. Evidence of that correlation, however, does not reflect a direct causal relationship between religious involvement and health.

  V. THE IMPORTANCE OF MATERNAL GRANDFATHERS

  The mean age at death of the College men’s grandparents (an 1860 birth cohort) was seventy-one years, and the mean age at death of their parents (an 1890 birth cohort) was seventy-six years. By historical standards, such ancestral longevity is remarkable, and comparable with that predicted for some contemporary European birth cohorts.25 The likelihood of adventitious ancestral death from poor medical care, hazardous occupation, poor nutrition, or infection was lower in the Grant Study sample than in less socioeconomically favored groups. This reduction in environmentally mediated mortality improved our chances of identifying genetically mediated effects on longevity.

  While examining the effects of ancestral longevity on sustained good physical and mental health, we noted a marked and unexpected association between the age at death of maternal grandfathers and the mental health of their grandsons.26 The age at death of the other five ancestors (mother, father, maternal grandmother, and both paternal grandparents) was associated with virtually nothing in the men’s lives except, modestly, with longevity. But the associations with the maternal grandfather’s (MGF’s) age at death were extraordinary. For example, while the average longevity of the other five ancestors was not relevant to the men’s scores in the Decathlon of Flourishing, the maternal grandfathers of the men with the highest flourishing scores lived nine years longer than those of the men with the lowest scores—a significant difference. The average age at MGF death for the 147 Study men who never saw a psychiatrist was seventy; for the thirty-two men who made one hundred or more visits to a psychiatrist it was sixty-one—a very significant difference—but the age at death of the other five relatives made no difference as to psychiatric visits. By contrast, the maternal grandfathers of upper-class men lived only three years longer than those of men from blue-collar families.

  In 1990, two board-eligible psychiatrists blind to other ratings were given the complete records of the sixty-one men in the Study who, by age fifty, manifested objective evidence of sustained psychosocial impairment (that is, psychiatric hospitalization, scoring in the bottom quartile of psychosocial adjustment at age forty-seven, or having used tranquilizers or antidepressants for more than one month). Many of these men were dependent on alcohol. They were rated for eight correlates of depression, the DSM-III criteria for major depressive disorder not yet having been developed. These eight correlates were (1) serious depression for two weeks or more by self-report, (2) diagnosis of clinical depression at some point in the man’s life by a non-Study clinician, (3) use of antidepressant medication, (4) psychiatric hospitalization for reasons other than alcohol abuse, (5) sustained anergia or anhedonia, (6) neurovegetative signs of depression (e.g., early morning awakening or weight loss when depressed), (7) suicidal preoccupations, attempts, or completions, and (8) evidence of mania.

  Of the sixty-one, thirty-six men were categorized as alcoholic or personality disordered. The remaining twenty-five included twelve men categorized by only one rater as having major depressive disorder and thirteen whom both raters agreed manifested major depressive disorder. These twenty-five men met at least three—and an average of five—criteria of major depressive disorder. The other thirty-six men met an average of only one-tenth as many criteria.

  As a contrast, we identified fifty undistressed men who through the age of sixty had never reported evidence of alcohol abuse, had made no visits to a psychiatrist, and had not used psychotropic drugs more than one day a year on average over twenty years. (This stipulation covered the contingency, say, of a man’s having had to take Librium briefly during a surgical admission, or Ambien to recover from jet lag before an important overseas conference.) In addition, these men were classified in college as having well-integrated personalities, and the Study had never assigned them a psychiatric diagnosis. They were the antithesis of the depressed men.

  Table 10.3 shows our four diagnostic groups: undistressed, alcoholic/personality disordered, major depressive disorder, and “intermediate”—that is, the men who didn’t fall into any of the first three. The mean age at death of the MGFs of the twenty clearly depressed men was sixty; that of the MGFs of the fifty-eight undistressed men was seventy-five—a very significant difference. The age at death of the MGFs of the ten men with the highest anxiety scores on the NEO was fifty-seven, and that of the MGFs of the ten men with the lowest anxiety scores was eighty-three, an even greater distinction.
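
  The group comparisons above are simple two-sample tests of difference in mean MGF age at death. Below is a minimal sketch of such a test; the individual ages are synthetic (the Study’s raw data are not reproduced here), with group means and sizes chosen to mirror the figures in the text.

```python
import random
from math import sqrt
from statistics import mean, stdev

# Welch's two-sample t statistic: the difference in group means divided
# by the standard error of that difference (unequal variances allowed).
def welch_t(a, b):
    return (mean(a) - mean(b)) / sqrt(stdev(a) ** 2 / len(a) + stdev(b) ** 2 / len(b))

random.seed(0)
# Synthetic MGF ages at death: the depressed group centered near 60 and
# the undistressed group near 75, as the text reports.
depressed    = [random.gauss(60, 10) for _ in range(20)]
undistressed = [random.gauss(75, 10) for _ in range(58)]

diff = mean(undistressed) - mean(depressed)
t = welch_t(undistressed, depressed)
print(f"mean difference = {diff:.1f} years, t = {t:.2f}")
```

  With a fifteen-year gap between group means and standard deviations near ten, the t statistic lands far beyond conventional cutoffs, which is what the “very significant” (p < .001) label in Table 10.3 reflects.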

  This unexpected discovery—that of the six ancestors, it was only the maternal grandfather’s longevity that was associated with affective disorder in the grandsons—is consistent with a linkage mediated via the X chromosome. X-linked disorders such as hemophilia, color blindness, and baldness can come through the mother’s father, because the single X chromosome a grandson receives from his mother may be the one she received from her father. Such illnesses tend to skip the mother, who has a second, usually unaffected X chromosome to protect her. For half a century, researchers have speculated that there might be an X-chromosome linkage in the etiology of depression.27 But this hypothesis has been confirmed only inconsistently in the search for a specific gene for bipolar disorder.28 It seems clear that bipolar disorder and major depressive disorder are genetically heterogeneous—that the genes for the two disorders are different.

  Table 10.3 Mean Age at Death of Maternal Grandfather and of Other Primary Ancestors in the Four Affective Disorder Categories

  Very significant = p < .001; Significant = p < .01; NS = not significant.

  To establish firm evidence for X-linked transmission of affective disorder would require full genealogical analysis of the affected families with knowledge of presence or absence of affective disorders in both male and female subjects over three or four generations—evidence that we do not have. Our findings do indicate, however, that for unknown—dare I say mysterious—reasons, early death of maternal grandfathers predicts an increased incidence of affective disorder in the grandsons. Still more exciting is that long-lived maternal grandfathers predict unusual psychological stability in their grandsons—evidence that positive mental health may be in part genetic. The association of very long-lived maternal grandfathers with men scoring low on the NEO anxiety score is particularly intriguing.

  My own guess is that someday soon an accomplished geneticist with a larger study, better-characterized maternal grandfathers, and complete DNA analyses will win the Nobel Prize for explicating this phenomenon. At present it must be considered a preliminary finding, of interest only to the curious. Still, it is provocative, and it would not have been discovered but for sixty years of follow-up. And it’s a perfect example of how a seductive gleam can wink out unexpectedly from unruly heaps of unsorted longitudinal data, to be revealed eventually by Time’s assay as either brass or true gold.

  11

  SUMMING UP

  All’s well that ends well; still the fine’s the crown;

  Whate’er the course, the end is the renown.

  —WILLIAM SHAKESPEARE

  LEARNING FROM LIFETIME STUDIES does not stop until the lives have been fully lived—and not even then, because archives of prospective data are an invitation and an opportunity to go back and ask new questions time and time again, even after the people who so generously provided the answers are gone.

  Each time the Study of Adult Development was threatened with extinction—in 1946, in 1954, in 1971, and in 1986—the grant-makers asked, “Hasn’t the Study been milked dry?” There was a time when I thought that after the College men reached sixty-five and retired, there was nothing more to do but watch them die. Yet the Study always survived—to teach, to surprise, and to give. Granting agencies must be selective, it is true, but they must be selective the way foresters are. Longitudinal studies are the redwood forests of psychosocial studies. Fallen branches and felled trees are useful in the short run. Faithfully maintained and imaginatively harvested, the older the forest gets, the more it is worth. But once cut down it can never be restored.

  Obviously, the point of the Grant Study’s seventy-five years is not that it gratified the narcissism of a ten-year-old who wanted the most powerful telescope in the world, nor that it helped to resolve the grief of a boy whose father died too young. I am a real part of this Study, but a small one. It is very much worth noting that so many provocative findings have come out of it—findings that could have been discovered by no other means—that it has taken ninety-five different authors to elucidate them.

  It is certainly appropriate to assess the cost-to-benefit ratio of a follow-up study that endured for more than seven decades at a cost of twenty million dollars. And in some ways that is easy to do. The bean-counter in me points out that granting agencies have paid, on average, only $10,000 in awards per peer-reviewed—and sometimes highly cited—book or journal article. That’s not an exorbitant cost as these things go. But the Study’s three greatest contributions, which justify its cost and give meaning to the extraordinary generosity, patience, and candor of the men who exposed their entire lives in the interests of science, are less easily subject to financial valuation.
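
  The bean-counting can be checked on the back of an envelope. Both figures below come from the text; the implied publication count is arithmetic only, not an actual tally of the Study’s bibliography.

```python
# Back-of-the-envelope check of the cost figures quoted above.
total_cost = 20_000_000        # dollars, over seven-plus decades
cost_per_publication = 10_000  # average grant dollars per peer-reviewed item

implied_publications = total_cost / cost_per_publication
print(implied_publications)    # implied order of magnitude, not a real count
```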

  The first contribution is the absoluteness of the Study’s demonstration that adult development continues long after adolescence, that character is not set in plaster, and that people do change. Even a hopeless midlife can blossom into a joyous old age. Such dramatic transformations are invisible to pencil-and-paper explorations or even ten-year studies of adult development.1

  Second, in all the world literature there is no other study of lifetime alcohol abuse as long and as thorough as this one.

  Third, the Study’s identification and charting of involuntary coping mechanisms has given us at once a useful clinical tool, a route to empathy for initially unlikable people, and a powerful predictor of the future. Without this long Study of real lives, the importance of the maturation of defenses would still be out of fashion, dismissed as a relic of failed psychoanalytic metaphysics. In this final chapter I will briefly review each of these three contributions.

  DEVELOPMENT IS LIFELONG

  Adult development is a lifelong process. To investigate it properly means lifelong study—for which even seventy years is not really quite long enough. Many students of adult development, most brilliantly Warner Schaie in the Seattle Longitudinal Study, have tried to speed up the process by studying several groups of varying ages concurrently over ten or even twenty years.2 This method worked in Schaie’s study of the rise and fall of intelligence, which can be studied in populations. But it cannot work for the study of personality, which is unique to individuals. The study of old age in particular requires patience and empathy, not the restless intolerance of frightened fifty-year-olds such as I was in 1986. Without persistence, endurance, and restraint, we learn little about the delicate processes of growth that endure even as life is fading.

  Piaget’s and Spock’s delineations of child development changed parenting forever. Erikson’s appreciation of adult development as dynamic growth rather than decay was likewise a major paradigm shift.3 But without empirical evidence to back it up, it gave rise to more theory and speculation than knowledge. The Grant Study has changed that. Four major empirical works have come out of the Study under my name—Adaptation to Life (1977), Wisdom of the Ego (1993), Aging Well (2002), and this final summary volume—which clarify the second half of life a decade at a time. As the men went on maturing, as hypotheses were tested and retested over time, the books have become progressively less theoretical and more evidence-based. Future studies and wiser authors will take us further still. But the optimum (to borrow a concept from Arlie Bock) study of lifetimes will always require a hundred years of toil, generations of dedicated investigators, and stories of real lives supported by statistical verification. As I keep reminding myself, what people say doesn’t mean much. It’s what they do that predicts the future. It was the facts of people’s long-term love relationships, not their belief systems, that showed us what we needed to know first about their capacity to love, and then about their mental health.

  The realities of adult development cannot be discerned through speculation, through biography (by definition retrospective), or even through diaries (by definition biased). Yet theorists of adult development have had to depend upon just such sorts of materials, so scarce have alternatives been. The very distinguished Terman and Berkeley longitudinal studies rarely published case studies. Before the Grant Study, accurate individual records of adult development were rare.

  Sadly, the bellwether publication in the field, the Journal of Adult Development, while rich in theory and informative about given stages of adult development, pays almost no attention to real lives or to maturational changes, which are full of paradox. This can be seen in the quixotic truth that at the same time as the Grant Study was demonstrating incontrovertibly that adult development continues throughout the lifespan, it was offering support for the contradictory view of William James and other nay-sayers of whose work I once made light. Lifetime studies hoist us investigators with the very petards that we delight to deploy against our challengers.

  Let me illustrate that apparent contradiction, which lies in the vulnerability of all belief systems. In sitting down to write this book, I believed that it was only the tracking of behavior over time that could predict the future, not inventories or questionnaires or any of psychology’s other myriad “instruments.” Paul Costa and Robert McCrae believed that because a person’s NEO has not changed over thirty years, his character has not changed either.4

  I had no problem with the first half of that contention; indeed, we found that among the College men the NEO did not change very much over forty-five years (see Chapter 4).5 I even found, in running the numbers, that the NEO traits of Neuroticism (negatively) and Extraversion (positively) were significantly predictive of some Decathlon outcomes. But I did have trouble with the second half—that the lack of change in the NEO meant a lack of change in personality. The NEO traits—uncomfortably like the elegant and empirically derived anthropologic measurements with which this book began—are static. They do not address processes of growth, any more than a masculine body build, which may well endure over a lifetime, predicts the kind of officer its owner will grow to become. The NEO deals with personality as reflected in multiple-choice questions, and although I tried, I could not correlate any of its traits with the five mature defenses (altruism, sublimation, humor, suppression, and anticipation) that do change over time, and in their changing exert such an important influence on the outcomes of real lives.

  But having hoisted Costa on his own petard—for failing to recognize the difference between static measurements and dynamic processes—I proceeded to hoist myself on mine. It occurred to me that if Extraversion (positively) and Neuroticism (negatively) could predict the Decathlon to a limited degree, it might be interesting to create a theoretical composite value: Extraversion minus Neuroticism. That is, what would happen if I removed from consideration the thwarting effects that Neuroticism had on Decathlon success? What happened was that the Extraversion minus Neuroticism values correlated at least significantly with every Decathlon event for which data was available, rivaling adaptive style and childhood in predictive power. As illustrated in Table 11.1, Extraversion minus Neuroticism estimated at age twenty-one could predict the Decathlon measured with data gathered forty to sixty years afterward, and it could predict it as well as adaptive style assessed twenty years later! My ship of fine theories had crashed upon the rocks, but truth was well served.
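
  Mechanically, the composite is nothing more than a difference score correlated with an outcome. The sketch below uses synthetic trait scores on a NEO-style scale and a synthetic flourishing outcome; every number and variable name is illustrative, not the Study’s data.

```python
import random
from statistics import mean, stdev

# Pearson correlation coefficient, computed from scratch.
def pearson_r(x, y):
    mx, my = mean(x), mean(y)
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y)) / (len(x) - 1)
    return cov / (stdev(x) * stdev(y))

random.seed(1)
n = 168  # the number of men included in Table 11.1

# Synthetic NEO-style trait scores (T-score scale: mean 50, sd 10).
extraversion = [random.gauss(50, 10) for _ in range(n)]
neuroticism  = [random.gauss(50, 10) for _ in range(n)]

# The composite described in the text: Extraversion minus Neuroticism.
composite = [e - ne for e, ne in zip(extraversion, neuroticism)]

# A synthetic flourishing outcome built to load positively on E and
# negatively on N, mimicking the direction of effects the text reports.
flourishing = [0.3 * e - 0.3 * ne + random.gauss(0, 8)
               for e, ne in zip(extraversion, neuroticism)]

print(f"r(E - N, flourishing) = {pearson_r(composite, flourishing):.2f}")
```

  The point is purely mechanical: when one trait facilitates an outcome and the other thwarts it, subtracting the second from the first concentrates both effects into a single predictor, which is why a difference score can outperform either trait alone.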

  The moral of this story is that they were right and I was right. My rightness did not preclude theirs; theirs did not preclude mine. My rightness was not complete, and theirs wasn’t either, and the whole is greater than the sum of its parts. There’s more than one path to the top of a mountain, and there are prizes enough for everyone: Woods, Soldz, Bowlby/Vaillant, Costa/McCrae. But note that it was only in the context of lifelong study that the question of changing adaptive style could even be tested. Context is everything, and lifetime studies provide it most generously.

  Table 11.1 Statistical Strength of Association of Alternative Predictors of Flourishing*

  * 168 men were included in this table. Only the classes of 1942–1944 were rated for defenses, and some men have been lost to death.

 
