Elderhood


by Louise Aronson


  Across the country, health systems, communities, and in some cases entire states have added and touted programs and policies that give people greater control over their deaths. Strategies range from promoting discussions of values and preferences at the end of life to expanding services that support death at home and passing assistance-in-dying laws. Many of the movement’s leaders are people who started their careers in aging and then moved on to dying. Most medical centers now offer palliative care services, and hospitals have special rooms for dying patients that resemble the homelike birthing rooms introduced for childbirth nearly a half century ago.

  It’s not that we’ve cured dying, moving it into the history books alongside smallpox or polio. Too many people still lack access to these improved approaches, and some deaths remain hard, even painful, despite the best efforts of highly skilled care teams. Yet recent end-of-life care advances represent meaningful systemic and attitudinal responses to stories and evidence of widespread, pointless suffering at the end of life. Newer approaches to death move beyond the medicalized, institutional, uniform method that developed as medical progress prevented or delayed certain sorts of deaths. Ironically, what’s old is new again. The key step underlying many of the last two decades’ death “innovations” is the acknowledgment of death’s inevitability. More and more often, when it becomes clear that the high-tech approach will only extend and deepen suffering, the focus shifts to optimizing comfort and the patient’s and family’s unique experience of dying. Many more deaths take place at home now than fifty years ago, though not nearly as many as a century ago, before death became a medical, rather than a life, event.

  The media often adds confusion to American death conversations by reporting deaths from what they call “natural causes.” They don’t mean an earthquake or hurricane. Invariably, the dead person was old, or old enough. When a younger person dies under similar circumstances—in bed overnight, say—an investigation is launched. People whisper about drugs or suicide. The word tragedy is used. Both biology and philosophy are in play here. Technically, it’s natural to die, one of life’s few universal requirements. Our assignation of natural to some deaths and not others is strongly connected to our notions of old age.

  In the early 1500s Leonardo da Vinci defined a natural death as one resulting from a lack of nourishment, as blood vessels thickened and closed off with age. Although not exactly how we’d describe it today, his description of advanced old age fairly accurately captures one mechanism responsible for the gradual shutting down of organs and functions known as senescence. Three hundred years later, the American physician Benjamin Rush recognized that often what was written off as senescence was in fact one or more specific diseases masked by the more obvious biological changes of a long life.2 In his 1793 Account of the State of the Body and Mind in Old Age, with Observations on Its Diseases and Their Remedies, he wrote that “few persons appear to die of old age.3 Some one of the diseases … generally cuts the last thread of life.”

  In the modern world, where getting things done is prioritized, death can seem more manageable than old age. After all, aging takes place in mundane ways over many years or decades, while death usually has a horizon of days to months. I suspect that’s why people are often more comfortable with death than aging. Death is a sprint and aging a marathon. Certainly, that attitude is common among many in medicine, where increasing numbers of trainees and clinicians of all stripes have enthusiastically embraced palliative care. While this may seem a small issue within one profession, medical trends usually reflect larger cultural tendencies. Palliative care has transformed medicine by creating a group of professionals whose primary skill set is the management of physical and existential suffering.

  But it has also allowed other doctors to abdicate those territories, as if all clinicians in specialties with patient contact should not be proficient in such fundamental skills. How can it be that we have a medical system where most doctors are allowed to outsource to colleagues common, critical tasks such as having difficult conversations, relieving pain, and supporting dying patients? The answer is that as medical knowledge has expanded, we have parsed sickness not only into organs and diseases but into their subcategories. Doctors specialize not just in the heart but in rhythm disturbances, not just in the gastrointestinal tract but in hepatitis, not just in ophthalmology but in the retina.

  In many ways, this makes sense. It’s easier both practically and psychologically to focus on one thing, and to understand it well, than to manage not only that but also someone’s hip replacement, diabetes, low vision, and heart disease. It’s also the antithesis of what the health system claims to be focused on in this moment in history: “patient-centered care.”4 When Paula, who had early dementia, chronic obstructive lung disease, and a leg ulcer, fell and hit her head, in addition to her lung doctor and skin doctor and vascular surgeon, she was told she needed one neurologist for her dementia, another for her traumatic brain injury, and a third to manage her seizures.

  Palliative care not only divorces death from the diseases that cause it, it divorces dying from aging, as if those conditions are not inextricably intertwined. Most people who die are old, yet at my top medical center and elsewhere, the leading palliative care specialists unabashedly distance their work from aging and old age. Before we die, we live, and since most of us will live not just to old age but in it for decades, living there comfortably, meaningfully, and with as much ability to do useful things for ourselves and others as possible matters too. Dying as well as possible at any age requires care that takes into account a person’s concerns, physiology, and context, all of which vary significantly with age. That so many palliative care doctors for adults claim their work has as much to do with pediatrics as geriatrics speaks to the depth of ageism in medical culture and to the self-defeating tendency of tribes—think Sunnis and Shiites or Tutsis and Hutus—with more in common than not to wage battle against each other instead of against the larger forces orchestrating their competition.

  Ask people how they’d like to die, and most will say they hope to not wake up on the morning of the day on which a medical catastrophe would force them over the cliff from a fulfilling life into a barren gorge dominated by illness and grave disability. There are two problems with that wish. The first is that it’s almost always impossible to see the cliff edge until you’re already falling off it. The second is that many people who fall off terrifying-looking cliffs find lives well worth living.

  Francisco Gomes was seventy-nine when he began tripping over his own feet and bumping into furniture. His daughter accused him of being drunk when she brought her family to visit. He had been an alcoholic in her childhood, and she couldn’t believe he’d fallen off the wagon after twenty years of sobriety. It took him an hour to convince her he hadn’t been drinking, at which point she dropped her kids with a friend and drove him to the emergency department, where a CAT scan showed a brain tumor. Three months later, the tumor was out, and Francisco took up residence in a hospital bed in his daughter’s living room. He couldn’t walk, but his arms and mind were fine. He could get out of bed only with the help of a lift device, and sitting in a wheelchair wore him out.

  Three years later Francisco was still there, living not only with his daughter’s family but holding court from a bed in the living room. In the intervening years, their apartment had become the center of the neighborhood. Francisco read to kids from buildings up and down the block and helped them with their homework after school. He taught the mailman, an immigrant from China, better English, and their families became friends, throwing Sunday parties featuring both tortillas and rice.

  When she was diagnosed with esophageal cancer, Maggie Gillespie was still running the shop that had been in her family for decades, as well as volunteering in her grandson’s fourth-grade class. For years, she’d made clear that if she had a dire prognosis, she wanted comfort care only. The problem was that although her tumor was extensive, it was localized, and there was a chance that with removal of most of her esophagus and local radiation, she might be cured, but she’d never be able to swallow or eat again. She agreed to a feeding tube as long as she retained the right to have it removed if things looked bleak. The surgery and early radiation treatments left her so weak and ill that she moved to the nursing home. That was where I met her seven months later. Since she wasn’t my patient, our first meeting took place on her day of discharge, a Saturday when I was the only covering physician. When Maggie approached me and introduced herself, I thought she was a patient’s daughter.

  She laughed at my confusion and lifted up her blouse to show me her new, permanent feeding tube. “I never thought I’d want one of these,” she said, “but I also always thought if I needed one I’d be totally out of it instead of just the same old me who can do everything but eat.”

  Sometimes we can’t imagine or predict what we will be able to put up with as we move through life. For example, a 2004 study of healthy people found that most said they would rather not have medical interventions5 to prolong a low-quality life. However, dying people who were experiencing low quality of life almost unanimously told researchers they would use any available medical interventions to prolong their lives, even if just by a few days.

  Among the frail octogenarians, nonagenarians, and centenarians I have cared for, sources of life meaning vary widely. While I could argue that there are as many types of meaning as unique individuals, it’s also true that those of us who do this work have noted common themes that transcend culture and social class and can frequently be boiled down to comfort, function, and relationships.

  Almost always, by the time those three things are gone, or become too difficult to access, so is a patient’s ability—physical, cognitive, or both—to control their lives and communicate their preferences. Also often lost at that stage is a physician’s ability to determine with sufficient certainty that death is what the patient truly does or doesn’t want, that there is no coercion by family or friends either toward or away from death, and that the patient’s sources of meaning haven’t shifted.

  Those caveats give me pause. Yet my years of geriatrics practice tell me that once most physical and sensory abilities are lost, whether the brain is failing too or simply trapped, unable to access other people, books, food, even television and most everything else, more often than not people are hoping for death, even as that prospect may still scare them. Like most things, death in old age is both similar to and different from death earlier in life.

  If we return to the notion that we tell some stories more often and accurately than others, there seems to be another reason for the current popularity of death compared with aging in public programs and discussions. The challenges and opportunities of death have been featured in films, bestselling books, TED talks, newspapers, blogs, and websites. Aging is finally getting more attention, too, but until recently it got much less, and too much of the sort that’s better described as catastrophic than transformative. Death, with its abbreviated trajectory, finiteness, and, depending on one’s beliefs, mystical or religious associations, lends itself well to romance, while aging, its longer and messier cousin, tends more toward realism. In literature, romance refers to extraordinary exploits that take place in mysterious and exotic settings and require honorable, dutiful actions to help those in distress. By contrast, realism offers a faithful portrait of life. Given a choice between romance and realism, many people choose the former. For me, among the great joys of a career in geriatrics, one of the few fields that doesn’t outsource our patients to palliative care as death nears, is that I get both.

  HUMAN

  Thomas Kuhn’s landmark treatise, The Structure of Scientific Revolutions, was one of the most influential books of the twentieth century. Its scholarly citations put Kuhn’s influence well ahead of the century’s other famous thinkers, including Michel Foucault and Sigmund Freud. Although the book is about science, in the decades since its publication, Kuhn’s notions of paradigms and paradigm shifts have become foundational to how we see and assess the world in all sectors of life.

  According to Kuhn, progress is not gradual and cumulative but occurs in revolutionary fits and starts. A paradigm, or widely accepted framework for understanding an important problem—a problem like health care—is primed for shift or overturn by periods of upheaval, uncertainty, and angst. As these crises worsen, revealing more and more flaws in the standard approach, people begin exploring different ways of thinking about the problem. Revolution occurs when enough people accept that the current paradigm is inadequate and reject it in favor of a new one.

  Maybe some people are happy with the twentieth century’s “normal science,” the science-and-technology-are-the-best-answer-to-every-problem medical paradigm. Certainly there are many people in many parts of the world who have it far worse than most Americans do. But that doesn’t mean that we don’t have serious problems. And it doesn’t mean that it isn’t time for radical transformation. Our current medical paradigm’s science-first approach has reaped huge benefits for individuals and society, but it has also had disturbing unintended consequences. We have costs no country can manage as a result of this paradigm’s payment, care, education, and research systems that favor novelty and discovery over implementation of the proven and dissemination of the useful, and high-tech procedural interventions and fields over preventative, social, and relational solutions. We have astronomical rates of patient bankruptcy and dehumanization, a distribution of specialists out of line with societal needs, a demoralized workforce, and epidemics of health disparities and illness caused by our current approach to medical care.

  If we need wellness centers, then “health care” is only addressing sickness. If we need programs, clinics, and special funding lines for women’s health, to name just one example, then the system is not set up to care for over half the population. If our professional schools need special courses and deans to address issues of diversity and disparities, our core curricula are not adequately addressing those issues and patients of all backgrounds. If we feel the need to use catchphrases like6 “patient-centered care,” what exactly is medicine? Shouldn’t patients always be the focus of health care? Something is missing in the current system and its underlying paradigm. Something important.

  The writer Jenny Diski, who died of cancer in 2016, described the problem from a patient’s perspective in her final book, In Gratitude. Of making treatment decisions, she says: “Everything is presented to me statistically, as probabilities. I can’t find the right question to break through that, to talk about the cancer that is me and mine, what it is, how it is, how it and I are with each other. Something that pans in on the singularity.” And of her radiation treatment, Diski says, “I didn’t doubt their ability to get me into position and to run the programme. But other things about the radiotherapy—such as my experience of it—seemed less skilfully thought-through … My dignity was left at the door of the treatment room each day, not because my breasts were revealed, but because as soon as I entered I became a loose component, a part the machine lacked, that had to be slotted into place to enable it to perform its function.”7

  This is what medical practice is like now too. Everything we do—indeed, our meaning and value to our institutions and to the health care system itself—amounts to numbers: from statistics reflecting productivity or guideline adherence to billing and costs—numbers signifying value, as if what matters in health and life can be numerically expressed.

  Medical centers and medical education break people down into their components: bones on floor three, joints on floor eight, hearts in this course, prostates in that one. The scientific approach requires control, and to gain control you break things down until they become manageable. We probably also do it because parsing the body and diseases into categories makes our lives easier. Humans are complex and messy. Dealing with a whole one is slow, sometimes fraught, unpredictable, and often uncertain. It’s easier not to, which is what we teach medical trainees. This “hidden curriculum” is to medicine what side effects are to drugs: it’s built into the structure. You can’t separate the benefits from the harms without rethinking your entire approach.

  Not long ago, I sat in on a much-touted “case-based learning” session at a medical school I was visiting, an institution rapidly acquiring a reputation for educational innovation. In a small workshop room equipped with a communal table, whiteboard, and video monitor to which one student’s computer could be attached so all could see what she wrote on behalf of the group, second-year students asked smart questions about three clinical cases related to their coursework. They refined insightful queries about what was going on with the relevant organ systems, how certain drugs worked, and which symptoms had clinical import.

  But in two hours of case discussions, only one student murmured a comment in response to the pathos of a patient’s situation. After only the first year of what would be at minimum a seven-year process, these medical students had learned to ignore the evident suffering described in their hypothetical cases—a young boy’s fear as he gasped for breath during an asthma attack, a healthy middle-aged man’s sudden loss of limb and livelihood, and an older woman’s repeated vomiting as a result of food poisoning.

  After the session, I commented on this total focus on the pathophysiology and pharmacology to the course director, who had been lauded by my hosts as their best and most creative teacher. He said the curriculum was designed to encourage focus. It wasn’t possible to teach pathophysiology and clinical care simultaneously, he said, adding that such an approach was too confusing and overwhelming for the students. They needed these fundamentals first. The patient stuff, he said, was addressed elsewhere in the curriculum. He relayed all of this with evident thoughtfulness, commitment to his learners and to medicine, and without irony.

 
