Elderhood

by Louise Aronson


  Neurologists speak of localizing the lesion, finding the anatomical change that explains a patient’s signs and symptoms. What lesion was responsible for my burnout, I wondered, and if there were many, did one in particular predominate? This was among the several questions I considered with the psychologist I paid out of pocket to see. Others included: Do I still want to be a doctor? And, if I don’t, what else can I do that would provide not just income but health insurance to two middle-aged people with “preexisting conditions”?

  My physical ailments were more easily treated than my psychological ones. Although I began to function again, I sensed my mental improvement was not because anything significant had changed but because I’d built up my reserves enough to again feign being fine.

  A few months after going on leave, back at work but still steering clear of many activities that had contributed to my crisis, I had an epiphany. The system is created by people, and I was furious at some of them. Could they not see how much of our health care is unnecessarily costly, inefficient, prejudicial, and malevolent? Or did they see and not care? Maybe, I thought, burnout wasn’t fundamentally about any of the things that appear in all the studies and articles about it. Maybe it was really about sympathy.

  More precisely, maybe burnout was a by-product of a lack of sympathy in our health care system and its leaders. I felt no compassion in the official responses to the burnout epidemic generally or to my particular distress. In its place, I was left to feel that I had failed, that I was weak and defective, that I was disposable, and that my concerns and needs were unseemly.

  Underappreciated in discussions of burnout is the impact of other people’s responses to those who are struggling. There are simple things that too often don’t happen. Sincere—not automatized, not institutional—acknowledgment of moral distress. Avoiding obvious, facile solutions that any intelligent, struggling person would have already tried and either found unhelpful or in violation of their core values and interests. Taking action toward changes that might simultaneously mitigate the damage and show kindness. Allowing for flexibility and differences in people, whether they are doctors or patients, nurses or families.

  With the right support, burned-out physicians might continue to practice; without it, they go on leave, stop seeing patients, commit suicide. Most weeks of 2017 and 2018 have brought new articles about burnout and new lists of nonclinical jobs doctors can do. Those efforts are steps in the right direction, but we also must address how our bosses and system have responded to the burnout crisis. It’s hurtful and disappointing to be kicked when you’re down, to be offered not sympathy and succor but insults and disrespect, to have colleagues you trusted and admired smile at you while rubbing salt into the wounds you showed them after gathering up what remained of your small courage. Trust is like Dresden in World War II or New Orleans after Hurricane Katrina; you can rebuild after the devastation, but it will never be the same.

  10. SENIOR

  AGES

  In 1960, the bespectacled, royalist-anarchist director of France’s Institute of Applied Research for Colonial Fruits in Paris made a radical claim that had nothing to do with bananas, papayas, or jackfruit. “In medieval society,” Philippe Ariès wrote in his classic, Centuries of Childhood: A Social History of Family Life, “the idea of childhood did not exist.”1

  As is often the case, the French took this news with more equanimity than the Americans. When the book appeared in the United States two years after its French release, the “discovery of childhood” inspired considerable excitement and equally impassioned criticism. Language was partly to blame: where Ariès used the French word sentiment,2 implying both a concept and a sense of feeling, idea appears in the English version of the book, losing the fuller meaning of his text. He wasn’t arguing that children hadn’t existed before the seventeenth century but that social, political, and economic changes ushered in an era in which childhood was newly recognized as a distinct and valued life phase. That he did not approve of this change3 was also lost in translation.

  Ariès’s claims and methods remain controversial. Yet his work helped make the institution of family a topic worthy of scholarly attention. He popularized the use of more varied sorts of historical evidence, and his work led to greater awareness that human experience of different life stages varies depending on where and when people live. In history, as in science, what is looked at determines what is seen and known. More traditional historical data—birth, death, and tax records, inventories, property transactions, and the like—tell one story, while letters, newspapers, art, literature, and textbooks tell another. Ariès’s biased presentation of such data notwithstanding, it seems likely that the truth includes both those stories.

  Take a person’s age. These days, we all know how old we are, but the practice of knowing for certain one’s birth date and age didn’t become universal in Western societies until the eighteenth century. Before that, some people knew their exact ages—Greek and Roman writers were precise, and presumably accurate, about how old they were—and most did not. A person was referred to as “a youth” or “an old person” based on how they looked and acted, not on how many years had elapsed since their birth. A forty-year-old might earn either label.

  We don’t need to span centuries to find transformed ideas about life phases. In the early 1950s, when my mother was in her early twenties, my grandfather worried that she was becoming an old maid. As her friends married, my grandparents watched with growing alarm as my mother dated and broke up with a variety of perfectly acceptable suitors. They were relieved when, at the ripe old age of twenty-four, she and my father became engaged. One generation later, I can count only a small handful of friends who got married in their twenties. For most of us, it happened in our thirties. My family’s social circumstances didn’t change over those decades; what was considered normal did.

  Widen the lens further, and the changes are even more dramatic. If my mother and I had been born in Europe in the latter Middle Ages or early Renaissance, we might have married at twelve. In those days, menarche equaled maturity, and the notion of adolescence, much less a female young adulthood of higher education, career advancement, and intimate relationships not leading to marriage, did not exist. By our early thirties, had we lived that long, we would have been grandmothers, not the parent of two small children, as my mother was, or a still-unmarried doctor, as I was. What’s normal depends not only on when you live but where and who you are in that place and time.

  The human brain naturally makes categories.4 Chinese, Iranian, and Greek authors write of boys, men, and old men. Because the age distribution of our species changed so quickly in recent years, our language and institutions for the years after fifty or sixty haven’t caught up. Nor have they accurately captured the variety of who we are or maximized the individual and social potential of people across a newly enlarged canvas of human life.

  The French seem to have an aptitude for naming life stages. In the 1970s they began educational and engagement programs for retired people that were called Les Universités du Troisième Age, or Universities of the Third Age. Both concept and phrase jumped the pond to England before the term Third Age was generalized by the historian Peter Laslett, who felt it helped fill “the perennial need for a term to describe older people, a term not already tarnished.”5 Laslett also advanced what he himself termed a radical notion: that the Third Age is the apogee of personal life. He added that the stages, while usually sequential, are not divided by birthdays and supposed that a person could be in the First or Second and Third Ages simultaneously, if that person reached their apogee in youth (as female gymnasts do, for example) or while also working and raising a family. But the Third Age “emphatically” could not overlap with the Fourth.6 Here, then, is where Laslett’s schema falls apart: he defined the first two categories by customary age-related activities, the third by personal realization and a particular set of behaviors that transcend age, and the fourth by biology. There can be no clarity, and no justice and equity, among age groups when they are identified using different metrics.

  The Third Age and the Fourth Age can differ in chronological age—young-old and old-old—but are primarily distinguished by health, activities, and consumer roles. People in the Third Age are “aging successfully,” while those in the Fourth are frail and dependent. Laslett called the postwork, postchildren life phase the “crown of life”7 and the “time of personal self-realization and fulfillment.” The Third Age, he argues, is made up of the years and decades recently added to the human life span. It should be used for the “founding, shaping, sustaining, and extending” of duties and institutions. He defines five challenges addressed by the Third Age: recognizing our changed demographics; supporting large numbers of people no longer required to work; cultivating attitudes and morale in the face of inaccurate stereotypes; developing an outlook, institutions, and organizations to give purpose to all those added years; and coping with the problem of the Fourth Age. Although he doesn’t fully explore the influences of a person’s economic and social situation on their ability to enjoy a Third Age, he does acknowledge the risk of foisting on the Fourth Age all the prejudices that now universally attach to anyone over sixty. He asserts that the goal of delineating the two ages was instead to make the most and best of each.

  Some have argued that more attention8 has been paid to the Third Age than the Fourth, but this discounts geriatric medicine: to its credit and detriment, my specialty has traditionally paid much more attention to the Fourth than the Third.

  Third Agers are active participants in our mass consumer society. Although many are partly or completely retired, they retain agency—in fact, that ability to act for oneself is one of its two defining features. The most fortunate Third Agers buy anti-aging products and join gyms and social clubs. They travel and volunteer. The Third Age, then, is more a set of behaviors and attitudes,9 a lifestyle particular to middle-class and wealthy members of our consumerist culture and sociohistorical time. It’s the grown-up version of the 1960s emphasis on youth, beauty, choice, and self-expression. It’s also a continued effort to define oneself and one’s peers as something other than “old.”

  Not everyone past middle age is part of the Third Age, and many such people have agency but use it in ways that don’t fall under the Third Age concept. The Third Age can seem universal because it comprises the people who are most likely to write, speak, and create the art and marketing that have defined it.

  Laslett saw the Fourth Age as biologically determined and timeless. Live long enough at any time in our species’ history, and you would enter it and be subjected to its inevitable decline and “ignominy.”10 Chris Gilleard and Paul Higgs argue that the Fourth Age is the product of “the combination of a public failure of self-management and the securing of this failure by institutional forms of care.”11 The result is “a location stripped of the social and cultural capital that is most valued”12 in society. They assert that “the appearance of a fourth age … has been contingent upon developments in health and social policy13 during the course of the twentieth century.” It has also been the “bitter fruit” of Third Age efforts to create an image of older people as attractive, useful, and relevant.

  Laslett’s goal was to counter the “hostile and demeaning descriptions of the elderly which have denied them their status14 and their self-respect.” That is a worthy goal, though only insofar as it helps all older people achieve status and self-respect. If instead it allows the younger and fitter old to gain those essentials at the expense of those who are neither, then such efforts are counterproductive. In Matthew 12:25 Jesus says, “Every kingdom divided against itself is brought to desolation,” words echoed by Abraham Lincoln on June 16, 1858, in his “A house divided against itself cannot stand” speech. Similarly, old age divided between Third Agers and Fourth Agers is unsustainable; indeed, it has been of little benefit over its half century as a concept, except to offer false succor to those in the Third Age, followed by worsened degradation when they reach the Fourth. Prejudicial segregation breeds degradation, and we value some lives or parts of lives over others to our own peril. The most fundamental consideration must be the moral one: Will we treat all human beings as human beings regardless of differences, or treat some as lesser beings? The unattainability of absolute equality is no excuse for the ruthless devaluation of individuals or social groups.

  Advanced old age invokes negative associations: repugnance at bodily aging, fear of loss of function, a position of poverty and societal inferiority, and a sense of having moved beyond the realm of experience and agency inhabited by most people we count as human. From that place, a person can neither define nor assert themselves in reliable ways. Perhaps they can articulate preference but maybe not, and certainly they can’t make happen most things they want. In all-too-common worst-case scenarios, all they can do is scream or cry, try to sleep away the hours, kick or bite—and when those things happen, they are called “bad” or “difficult.” They are punished, abandoned, or institutionalized, and ignored, tied up, or sedated. Even people who have made arrangements in advance for a phase of such profound debility cannot be sure their wishes will be respected. How they are seen and treated—really, every aspect of their lives—is controlled by others. The only escape is death.

  I have used the word they for people in this state, for people in the Fourth Age. In some ways, this is accurate—I am not (yet) such a person but likely will be. Most of “we” will become “they” in the future for days or weeks, months or years, unless we find a new way to think about and address the Fourth Age, one that itself becomes routine and institutionalized, structural and universal. One that is both innovative and unprecedented. We think we can do this by manipulating the biology of old age, and maybe we can; but just in case those efforts don’t succeed, why not apply similar attention, funding, and creativity to our experience of the Fourth Age as well? Even if the Fourth Age is really a black hole as Gilleard and Higgs claim, a place visible only by its impact on other places, we could work to see and make that impact more positive with the expectation that the reflection remains accurate.

  My mother says she’d rather be dead than live in very old age with significant dementia or disability. If she even nears that state and anything else happens health-wise, she doesn’t want that other thing treated. She doesn’t say “even if it kills me”; she says she doesn’t want treatment in hopes that it will kill her. And she worries that she will arrive at that state and not get sick, that she will linger in a body that resembles her but isn’t quite her. It doesn’t matter that she might not know the difference. She finds that prospect horrifying, for herself and for us, her family, and she thinks the costs of her care at that point would be better spent on someone with the ability to appreciate it. I think of people who are happily demented but can think of far more who are in a state that might be called “lingering” in a life without any evident benefits. Most express misery; almost all appear to be suffering. But some families don’t see it that way, and some religions advocate the sanctity of life in all situations, which makes policymaking tough, even as it makes discussing this phase of life critically important.

  My father said he’d never want to live if he had dementia, but then he had dementia and was happy to be alive. “I’ve had a good life,” he’d say, proud and self-satisfied, holding forth from his position as center of attention in his hospital bed and, in his usual good-natured way, forgetting all the bad parts, “but I wouldn’t mind more.” Yes, he said to various procedures and surgeries. Definitely do that. But when he got to the point that may have been the point he had been referring to when he said he didn’t want to live with dementia—the point where he was quite clearly no longer having fun—he also could no longer articulate things like how he felt or discuss huge abstract concepts, like the meaning of life or when a person might have passed a threshold where they lose what matters most. It’s likely that my mother is thinking of both my father’s last, bad year and also the few before that and their impact on the rest of the family. That caregiving was hard in so many ways, but it was also the important, meaningful work that defines a family. I would do it all again, with love and without hesitation. My mother knows that, and knows I’ll do the same for her, and still she fervently hopes it won’t come to that.

  PATHOLOGY

  In the spring of 2013 I was invited to speak at a conference organized by a physician and a medical humanities scholar. The daylong program looked at how everything from disruptive technologies to storytelling could transform medicine in the coming decades. At the pre-dinner reception, a tall, lean man with graying hair approached me, introduced himself, and stuck out his hand. His name sounded familiar, but after a day’s travel and a few sips of wine, I couldn’t place it.

  The conversation became awkward. He reminded me who he was—a cultural historian who focused on aging, the head of a terrific local medical humanities program, and the editor of at least one anthology on my office bookshelf. Unfortunately, I hadn’t so much read his work as heard about it. At best, I’d read an article or introduction he’d written. In my scant free time, I didn’t crave the pedantry of a scholarly social science text but the beautiful sentences of literature.

  Our conversation plummeted from awkward to mortifying. It became clear that although I knew his name, I didn’t know his work, that he could see I didn’t, and that he was surprised and disappointed. After several minutes of pleasantries, we moved on to mingle with others.

  A few weeks later, two books arrived at my office. There were so many reasons why, if I was who I thought myself to be and portrayed myself as, I should have read his work. I sent what I hoped was a suitably gracious and grateful thank-you e-mail. Still feeling guilty, I shelved the books.

 
