Elderhood

by Louise Aronson


  Sometimes when teaching, I ask the assembled students or practicing health professionals what proportion of older Americans live in nursing homes. The answers usually range from 20 to 80 percent, many times higher than the actual figures of about 3 to 4 percent overall and 13 percent among the oldest old. In reality, most people over age sixty-five are content, active, and living independently. But we rarely acknowledge how well old age can and often does go, with years and decades offering new opportunities for work, fun, family, leisure, learning, and contributing. Instead, in everyday life, our attention is directed at baldness, stooped posture and slowed pace, wrinkles and canes and hearing aids. In medicine, we work with a biased sample. When older people are doing well, and when they are ill but otherwise fairly well, we think of them as middle-aged, if we think of them at all. That leaves “old” associated with the common, disquieting extreme: old people who are ill, disabled, or almost dead. Perhaps because the effects of age are visible in even the healthiest older adults, and because most people do become ill or disabled in some way before death, we reduce the last, decades-long phase of life to a single, noxious state, despite clear evidence of its joys and variety.

  The English literary humanist William Hazlitt described prejudice as “the child of ignorance.”11 This comment rings true for certain sorts of prejudice. But we all have parents and grandparents, or friends and mentors, who are old. Sometimes, it seems, prejudice is born less of ignorance than of fear and dread. I am inclined to go with Voltaire, who said, “We are all formed of frailty and error; let us pardon reciprocally each other’s folly.”12 Obviously, we must do better.

  While elimination of prejudice is utopian, recent advances in the rights, achievements, and medical care of systematically marginalized groups offer precedents for how we can reduce bias and improve care. The first step toward a less ageist health system is acknowledging the problem. As Allport pointed out: “If a person is capable of rectifying his erroneous judgments in the light of new evidence he is not prejudiced … A prejudice, unlike a simple misconception, is actively resistant to all evidence that would unseat it.”13 Medicine has a problem, and so does our larger culture.

  No one disputes that older adults differ physiologically from younger adults, and abundant evidence shows that older people vary widely in their health, functional status, life priorities, and medical preferences. Yet, in the UK, health policy sometimes assumes old age inevitably brings incapacity and denies care known to preserve health and independence on the basis of age, not functional status. This phenomenon is known as “undertreatment”: depriving someone of a treatment with a high likelihood of health benefit. In the United States, the opposite, called “overtreatment,” is the norm. Older patients are often cared for as if they were just like younger patients; drugs and treatments developed in studies of middle-aged adults are given to old patients irrespective of age, other medical conditions, incapacity, or life expectancy. Neither approach makes sense. One discriminates on the basis of age; the other denies the impact of age. Both lump all older people into a single monolithic category in complete disregard for the diversity of health and function that increases with advancing age. Both are forms of ageism. The old may differ from the young, and the care of older patients may differ from that of younger patients, but older people are no less deserving of high-quality medical care.

  Ageism within medicine is a manifestation of a larger problem. It is well known among those who care for old people that you endanger a center or program if you use the words aging or geriatric in its title. If you want patients, funders, institutional support, and referrals from colleagues, you must replace those terms with wellness and longevity. In other words, old people themselves, individual and institutional donors, medical centers, health systems, and health professionals all demonstrate a strong preference for euphemisms over terms that are more precise, more inclusive, and not inherently negative. Within geriatrics, we entertain ongoing debates about changing our name (to “complexivists” or “transitionalists”), largely because geriatrics has “negative connotations.” But surely a profession dedicated to the care of old people should not reject association with old age! Imagine pediatricians changing their specialty name so as to distance themselves from children, or surgeons starting to call themselves interventionalists. It’s absurd.

  At the same time, age phobia is understandable. Even people who remain healthy and active as they age will experience changes in strength, endurance, and appearance, and a significant proportion, though not all, will have increasing illness and disability. But these facts belie the more complex reality of the longest, most varied period of our lives. Some people become frail in their sixties while others remain healthy past one hundred. Much of what we accept as fact is actually a “glass half-empty” interpretation, an assumption that all age-related changes are for the worse. To be sure, the aging body disappoints and frustrates. Yet each stage has pros and cons. After all, it’s those of us between youth’s substandard thinking and judgment and old age’s physical frailty who pour billions of dollars into stress-relieving activities and products.

  Part of what makes old age hard is that we fight it, rather than embracing it as one stage in a universal trajectory. We also fail to properly acknowledge its upsides: the decreases in family and work stress or the increases in contentment, wisdom, and agency that accompany most years of old age. Sometimes people—from those working in medicine to families of older adults—attribute the bad outcomes of our flawed, biased approaches to old age in medicine and society to biological destiny. Sometimes that’s true, but at least as often, it’s not.

  6. TEEN

  EVOLUTION

  In a 2016 interview about his memoir, Bruce Springsteen, age sixty-six, was asked by fifty-six-year-old New Yorker editor David Remnick, “Why now?”

  Springsteen let out a long breath, making an “oof” sound, and chuckled. “I wanted to do it before I forgot everything, you know.”

  Remnick laughed heartily. The audience watching the live interview cheered and clapped.

  “So it’s getting a little edgy with some of that,” Springsteen added, “so I thought now was the time.”

  When that interview took place, Springsteen was coming off a sold-out tour, playing exceptionally long sets—over three hours of continuous, highly physical singing and cavorting. Night after night, he played in cities around the world. When his book came out a month later, it topped bestseller lists, and Springsteen launched a new tour, or rather two tours: one book, one concert. By virtue of either his age or his stamina, a case could be made for Springsteen as still decidedly middle-aged, but the artist himself clearly felt that whatever old was had begun for him, and he saw, or thought he could see, where it was headed. Somehow neither he nor the New Yorker editor, ten years his junior, recognized the irony of positioning Springsteen partway down a ruinous spiral when the career details they were discussing suggested not just a new high point but a remarkable addition to his artistic skill set. After decades of renown as a musician, he was now also recognized as a talented writer, a fact that introduced new options and opportunities for his future.

  A writer doesn’t have to jump up and down or dance along a stage and into an adoring crowd. Then again, not all musicians do that either. Springsteen could sit at a piano, or on a chair cradling his guitar, or with just a microphone and a small spotlight, the audience’s entire focus on his face, words, song. That would not be a traditional Springsteen concert, but would it be worse, or just different? Would it tarnish his legacy and shrink his audience, or expand it, showing range and adaptability? He’s had ballad albums before (Tunnel of Love). The point is that Springsteen has options, as many people do, though his are significantly different from most people’s. A different sort of concert, perhaps playing a modified or different sort of music, is just one of Springsteen’s options. He also could sit at home with a mouse and keyboard, or a pen and paper, or a voice recorder, or an assistant taking dictation, and he could write. Such transitions are often framed as devolution, but that’s only the case if the frame is constructed from static expectations. Build it instead with an understanding of the human life cycle, and it looks more like evolution: a gradual process in which something develops into a different form.

  If not quite three score and ten, Springsteen was certainly within the long-accepted territory of “old.” For two thousand to three thousand years, from the time of Socrates and the Athenian Empire in the West, and much earlier in the Middle East and Asia, old age has been defined as beginning around age sixty or seventy. In the United States, sixty-five became the federal demarcation line between middle and old age with the launch of the Social Security program in 1935. The group that developed the program, the President’s Committee on Economic Security, chose sixty-five partly because it was consistent with data on prevailing retirement ages at the time and partly because it was the age already selected by half the existing state pension systems (the other half used seventy). Although retirement norms, longevity, and actuarial outcomes have changed since the 1930s, sixty-five has endured in many minds as either a strict divide or a marker of having entered the transition zone headed toward old.

  For most people, early, middle, and advanced old age are significantly different. In our current conceptualization of old age, physical degradation and lost options are its sine qua non. That’s why, until those things become overwhelming, many people don’t think of themselves as old, even when most younger people would swiftly and definitively put them in that category. When people arrive at the stereotypical version of old, they sometimes no longer feel like themselves, although for most of us the transition to old happens gradually over decades, beginning around age twenty. The changes are both positive and negative, though we tend to focus on the latter. Those losses and diminutions are imperceptible at first, then easy to disregard, then possible to work around, and, finally, blatant.

  Springsteen signaled that he was aware of the negative changes in his own mind and body. Once you reach a certain age, it’s hard not to ask: Will my mind go first, or my body? Will they both go, or will I get lucky? When will it happen and how quickly?

  Aging begins at birth. In childhood, the changes are dramatic. In those early decades, the fact that living and aging are synonymous is lost, couched first in the language of child development and then forgotten in the busyness and social milestones of young adulthood. After a friend moved to another state, I didn’t see her infant for nine months, at which point I found her with a toddler, not a baby. Stages of child development are predictable and universal across cultures, except in cases of grave illness or disability. As we move through the life span, the boundaries between stages get blurred. Although people debate whether life begins at conception or birth, childhood starts with a big breath upon emergence from the womb, its beginning uniform. Its end is less clear. A ten-year-old is always a child, but eighteen-year-olds can be teenagers or young adults, depending on their behavior. Some people achieve physical, emotional, and intellectual maturity in their teens, others in their twenties. Females tend to get there before males. Still, most people become adults in the same several-years-long window of time.

  With the arrival of the twenties, development seems to abate, taking on the imperceptible pace of hair growth or melting glaciers. The changes that defined us as we moved from infant to kid and teen to adult appear to stop. But unseen or unnoticed is not the same as not happening. Change continues throughout life—physically, functionally, and psychologically. At some point, we cross into the territory of “middle age” and discover aging isn’t just a characteristic of that mythical land called old. Sometimes the evolution is welcome, bringing a greater comfort with self, a deep-seated confidence and greater security about what is and has been. At the same time, accumulating physical changes collude in ways that can complicate, distress, and impoverish. A person’s identity can feel challenged.

  Even in the decades when change seems slow, almost irrelevant, it is present, significant, ongoing. In my thirties, I had the straight white teeth of a person fortunate enough to have had braces in her teens and dentistry throughout life. By my early forties, my little front bottom teeth began to overlap as if so much time had elapsed that they’d forgotten their training at the hands of metal braces, headgear, neck gear, rubber bands, and retainers. As they overlapped, I saw along their edges the imprint of decades of morning coffee, the occasional glass of red wine, and the erosion of daily eating and drinking. Yet my dentist says my teeth look great. She can tell I faithfully brush and floss. What she really means, I know, is that they look great for someone in her fifties, not that they look as good as they once did or great in the absolute sense. At some point the caveat, the conditional clause, goes unspoken.

  At the age of apparent aging, the once distant land called “old” no longer seems foreign or exotic to me. Daily, my joints offer protests. Sometimes one has a solo; more often there’s a noisy blur of voices, the new background music accompanying my every movement. I regularly switch among my three pairs of multifocal glasses, each with a different function. I have a faulty gene, a history of cancer, and seven visible surgical scars, and am now missing several nonessential body parts. These days, when something goes wrong in my body, I don’t just consider how it might be fixed; I worry that fixing it won’t be possible and that my new debility will not only endure but beget a cascade of injuries and additional disabilities. In my head, I hear the childhood song about how the foot bone’s connected to the leg bone, the leg bone’s connected to the hip bone, and so on. Although it’s not yet clear how it will go down, I can now imagine me = old, even if I still sometimes register my relentless progress toward citizenship in that vast territory with surprise.

  Those physical changes are real but tell only part of the story. For me, the rest of the saga goes something like this: though I have yet to take up permanent residency in old age, I have acquired an intimate familiarity with its culture and customs, and I’m looking forward to it. I imagine its early years, and if I’m lucky, decades, much like the best parts of midlife: the solid sense of who I am and how I want to spend my time, the decreased volume of the sorts of ambitions easily confused with the hollow vanity of social recognition, the greater time and energy for generosity and attention to others, the confidence to stick to my convictions, the exciting new goals and profound sense of life satisfaction. Similar sentiments are found with aging the world over.1

  It may be that after the great celebrations of childhood milestones, we feel surprised and uneasy about our quieter progression through later turning points. A friend in his late thirties thought it absurd that his peers didn’t want him to refer to himself as middle-aged when he so obviously was. I looked at him and agreed; he’s far from old, and also clearly no longer young—he’s somewhere in between. Out the other end of adulthood, my mother says aging isn’t really that bad until you hit eighty, then there’s a nosedive. She said this as we had dinner at the assisted living facility where she moved because of my now-deceased father’s needs, not her own. Seconds later, frustrated because we hadn’t been brought water, she jumped up, grabbed our glasses, and darted across the dining room to fill them. She wasn’t her old self, but she didn’t seem to me like a person in a steep downward plunge. Yet, to her, a threshold had been passed into a territory of greater risk and vulnerability.

  By far, the least fixed dividing line separates adulthood and old age. With good health and good luck, some people don’t seem to be or see themselves as making that transition until their late seventies and occasionally later still. By contrast, major stressors such as homelessness, poverty, or incarceration can cause accelerated aging, making others “old” in their fifties, with cellular changes and risks of chronic disease and death akin to those of more fortunate people many decades their senior. And still, use of the word old for fifty-somethings requires quotation marks. We define age sometimes as a definite place in life’s chronology, other times as a bio-psycho-social state, and mostly as an amalgam of the two. Using that logic, a frail seventy-two-year-old is called old while a marathon-running seventy-two-year-old executive is not. In reality, both are old, and even if the executive continues her current activities in her eighties, she’ll be “old.”

  Because aging is a long, stealthy process, a person’s arrival at old age is less a switch thrown than a series of ill-defined thresholds crossed, the transition often first noted by others. Most people over thirty, and certainly those forty and older, will recall the first time “mister” or “lady,” “ma’am” or “sir,” meant them. As our third decade of life cedes to our fourth, aging seems to accelerate. By the time the fifth decade fades into the sixth, the accumulation of physical changes that define adult development moves from inconsequential to subtly manifest: the crow’s-feet or balding pate and tricky right knee, the friends with cancer, the talk among your peers about ill and dying older relatives. By the ebbing of the sixth decade, if not sooner, the changes are undeniable. Not long after that, they become conspicuous, each decade seemingly more profoundly marked than its predecessor. On a daily basis, nothing seems to change, but look back a year or five or ten, and the transformation is pronounced.

  There have always been old people. Egyptian hieroglyphs from 2800 B.C. depict a bent person leaning on a staff. For over nine hundred years beginning in 775 B.C., the Greeks put forth an array of theories about aging. As their ruins attest, the ancient Greeks had systems, roadways, and efficient processes for sewage removal. Hygiene was good and most hard labor was done by slaves. Aristotle may well have noticed that the slaves, with their long hours of physical work, poor access to food, and constant exposure to the elements, aged more quickly than the citizens in his circle. He suggested that aging occurred because of the loss of pneuma, an internal heat or vital spirit that was gradually consumed over time. Because there was a finite amount, older people had less, which made them more vulnerable to disease, and although slaves spent theirs more quickly than scholars, everyone eventually ran out.

 
