I rested my hand on my father’s arm to get his attention and said, “Dad, how much would you mind if I did a rectal?”
We doctors do many things that are otherwise unacceptable. We are trained not only in how to do such things but in how to do them almost without noticing, almost without caring, at least in the ways we might care in different circumstances or settings. A rectal exam on one’s father is exactly the same as other rectal exams—and also completely different. Luckily for me, my father was a doctor too. When I asked my crazy question, he smiled.
“Kid,” he replied, “do what you have to do.”
I found gloves and lube. I had him roll on his side. Afterward, I took my bloody gloved finger out into the hallway to prove my point.
I realize that walking to the nurses’ station holding aloft one’s bloody, gloved hand is not an optimal tactic from a professionalism standpoint—but it worked. A nurse followed me back into my father’s room, saw my panicked mother holding a bedpan overflowing with blood and clots, and called for help. Within seconds, the room filled, and minutes later, when the ICU team showed up, I stood back, a daughter again.
In retrospect, what is most interesting is how much more comfortable I felt performing an intimate procedure on my father than demanding the attention of the professionals assigned to care for him. Abiding by the unspoken rules of medical etiquette, I had quieted my internal alarms for more than two hours. Instead, I had considered how doctors and nurses feel about so-called difficult families. I had prioritized wanting us to be seen as a “good family” over being a good doctor-daughter.
Although many physicians would have made different choices than I did, the impetus for my decisions lay in a trait of our medical culture. When we call patients or families “good,” or at least spare them the “difficult” label, we are rewarding acquiescence. Too often, this “good” means you agree with me and don’t bother me and let me be in charge of what happens and when. This definition runs counter to what we know about truly good care as a collaborative process. From the history that so often generates the diagnosis to the treatment that is the basis of care or cure, active participation of patients and families is essential to optimal outcomes.
Most patients and families who are considered high-maintenance, challenging, or both are simply trying their best to manage their own or their loved ones’ illness. That we sometimes feel besieged or irritated by these advocates speaks to opportunities for improvement in medical culture. The physician and nurse representatives of that culture would benefit from a lens shift, seeing more vocal patients and families as actively engaged, presenting new, potentially important information, and expressing unmet needs. That won’t happen unless the health care system begins valuing and rewarding the time that clinicians spend talking to patients and families.
Many years later, the most vivid image I have of that night is not my father wobbling in the bathroom surrounded by cold, hard tile or a mustard-yellow bedpan filling with bright red blood. The image is this, a worst-case might-have-been scenario had I not been there: my parents, sleepy, snuggled together at the top of the gurney, my mother resting her head against my father’s chest, their eyes closed, faces relaxed. His systolic blood pressure, usually 130, dropping to 80, then 70. The monitors turned off or ignored. The lights dim. A short nap and they’d feel better. A little rest and maybe it would be time to go home.
OUTSOURCED
By the time I got back, it was too late.
When I had left two weeks earlier, Neeta was in the hospital after a fall and hip fracture surgery. A few days later she’d been discharged to a skilled nursing facility—one I wouldn’t have chosen. There they’d managed her delirium with sedating drugs, and she’d barely eaten or begun physical therapy. Now she was bedbound with a huge pressure sore, malnutrition, and a wound infection. Her only option became hospice care.
Already I missed how, before the fracture, Neeta would age over the course of a home visit: telling me when I arrived that she was ninety-two, then ninety-five, and ninety-seven, and so on, until she was over one hundred years old by the time I left. I loved the way she had called me “my love” or “darling.” She remembered she liked me if not my name. I had looked forward to her jokes.
“Why did the police arrest the belt?” she asked with a grin as I set up for a blood draw. “Because it held up some pants!”
Her son and daughters—who lived on the same block as her, who had continued working after they might have retired in order to provide her with years of in-home care twenty-four hours a day—were heartbroken that she was dying. “How could this happen?” her son asked when I called. “I thought the hospital and surgery were the dangerous parts.”
There were so many ways to answer his question. At nearly every step in her illness, Neeta had received care that was typical but not what she needed—nor was it optimal for a frail older adult.
Increasing numbers of hospitals offer orthopedic and geriatric co-management of elderly hip fracture patients. The surgeons repair the hip; the geriatric medicine specialists take care of everything else, from co-morbid conditions to care and life priorities. With this approach, patients get to the operating room sooner, have fewer unnecessary tests, are less likely to experience delirium, leave the hospital sooner, and are more likely to be walking and living meaningful lives46 a year after their fractures.
But what happened at the hospital was just one part of what went wrong. I had to keep myself from reminding her son that I’d warned him. Before leaving for vacation, Neeta’s son had said they were thinking about the facility where she ended up because it was located near their neighborhood.
“That way we can see her before and after work,” he had said.
Rather than repeating the bad rumors about the place he mentioned, I had lauded two better places. I should have been more emphatic about the consequences of choosing the wrong nursing home and stressed that no matter where Neeta ended up, they should not just hand over her care to the facility, assuming everything possible would be done as well as it could be, or even well enough. I should have explained that even when I fought to get my own father into one of the better places in the city after his surgeries, we remained with him, taking turns to ensure one of us was there 24/7 initially and then hiring someone to help overnight once the worst was over. I had spent enough time in hospitals and nursing homes to know that if we didn’t, bad things were likely to happen.
Neeta’s family also assumed the hospital doctors and discharge planners were equipped to offer the needed guidance. Speak human-to-human to most health professionals, and they seem well aware that most nursing homes make a casket look inviting. But with quality of hospital care and optimal resource utilization linked to patient discharge, hospitals put steady pressure on staff to discharge patients after a set diagnosis-dependent period of time. Since those doctors get little training in outpatient medicine, geriatrics, medical transportation, and home care, nursing homes are often the best and easiest way for them to discharge people who aren’t ready to go home. The trouble is, doctors don’t know much about nursing homes either.
A 2017 medical journal article looked at how hospital-based doctors handle transfers to nursing homes.47 The authors described pressure to speed up discharges, a tendency to use skilled nursing facilities as “safety nets,” the absence of a decision-making system or framework for matching patients and places, and little knowledge about the quality or patient outcomes of local facilities. This common reality explains how Neeta ended up at a low-quality nursing facility despite my warnings and her family’s best intentions.
In A Woman’s Story, the French writer Annie Ernaux described a key transition in her mother’s life, a series of events that happen every day to older people. On a warm day, her mother fainted and was taken to “the medical service of the old people’s home,” an option not available in the United States. After rehydration, food, and a few days, her mother felt back to normal (not realizing that her normal had become less than what it had been) and, Ernaux writes, insisted on leaving: “‘Otherwise,’ she said, ‘I’ll jump out the window.’48 According to the doctor, she could no longer be left on her own. He advised me to put her into an old people’s home.” The author takes her mother home. The subsequent events culminate in this comment: “And here her story stops for there was no longer a place for her49 in society.”
But it didn’t stop. The pages that follow detail confusion (mother), traffic accidents (daughter), fury (both), forgetting, hallucinations, strange eating habits, and a hospitalization where “the nurses had to tie her to her chair because she kept trying to escape50 from the ward.” From there her mother moved to the nursing home unit, a modern building behind the hospital that sounded lovely and smartly organized. There, her daughter found her one evening, “already asleep at half past six, lying across the rumpled sheets in her slip. Her knees were up, showing her private parts. It was very warm in the room … Within a few weeks, she lost her self-respect.”51
She may also have been drugged. Likely, she was bored. Those who can often refer to themselves not as “residents” or “patients” but as inmates. Walk the halls of most nursing homes, and you will see people “parked” in the hallways, sleeping or staring or screaming. The odor tends to be anything but homelike. Ditto the food. The people working there wear uniforms, not clothes. They are paid very little. Some are there because they enjoy working with older people; many more because they need a paycheck and this is their best option. Much of the time, neither those who live at the “home” nor the staff want to be there. Old people abandoned by society to such places often earn labels such as “noncompliant” or “difficult,” as if resisting unwanted care or trying to leave lockup in a regimented nonhome represents something other than rational self-preservation.
Neeta ended up in that sort of place, a place that essentially killed her. It looked and smelled like a subpar hospital and bore no resemblance to a home. People were tied down, vocalizing, ignored—not always, but often enough, and how often is too often? If you love a person, or believe human beings in need should be treated well, it’s hard not to answer Never.
People often assume that, until recently, advanced old age was different: old people remained at home cared for by their families. In fact, institutions providing basic care to older adults have existed for thousands of years. In some ways, they were needed more in the past than they are today, since, for most of human history, living into old age often meant outliving one’s children. Notably, while approaches have varied over time, old-age institutions have almost always come in one of two flavors: sympathy and antipathy.
Prior to the advent of retirement and pensions in the late nineteenth century in Europe and the twentieth century in the United States, anyone who wasn’t rich—most people—and who didn’t have family or friends to support them had to keep working. When illness or advanced age made that impossible, they became impoverished and homeless. In some eras, they suffered and died in that state. In others, they were thrown into workhouses or poorhouses with criminals and the mentally ill. The inability to work was seen as a sign of poor character, no matter a person’s age. Often, institutions got people off the streets but housed them in cold, dirty, crowded facilities with a small amount of food so living there wouldn’t appeal to “vagrants.” At other times, religious organizations or governments built facilities in an attempt to respond with compassion to the needs of their oldest citizens. These patterns recurred across time and countries.
During the Byzantine Empire—a period renowned for its efficiency—a system of homes called gerocomeia was established, beginning in Constantinople.52 Rather than being tucked out of sight as inconsequential, residents were visited annually by the emperor in recognition of the potential political power of older adults. In the early days of Christianity, monasteries often provided food, shelter, and care for the old and infirm. This practice of offering hospitality led to the name and institutions we know as hospitals, though initially they were more custodial than medical, more like modern-day nursing homes. From these initial local acts, the Byzantine emperors, church, and benefactors began building nursing homes alongside or near monasteries throughout the empire. This ensured that no matter where a person lived when they became ill or old or disabled, they could get help.
Although old age always eventually brought the need for assistance, when religion dominated, old age itself had spiritual value. Old people were closer to God. When the state took over from the church, advanced old age transitioned from a spiritually valued life stage to a social problem. Separation of frail older people from the rest of society was used as a means of control or a form of punishment, and it was often an act of expediency. Putting many people with similar needs in one place focused resources even as it facilitated systematic segregated dehumanization. The state managed people to civilize them, preserve the social order for everyone else, and demonstrate its capacity to handle all classes of citizens. Having the streets full of poor or old people, dirty and hungry, implied a failure of governance.
Comparing England with France over centuries illustrates how governments can affect the quality of older lives for better or worse. In England, the church-based system worked well until a new administration changed everything for reasons that had nothing to do with old people. During the Reformation, King Henry VIII dissolved the monasteries in an effort to ensure the domination of Protestantism over Catholicism. With the monasteries went the nursing homes. Not until the Poor Laws53 decades later did the British state again support local parishes in taking responsibility for their impoverished and elderly citizens.
The French didn’t set up their hospitality institutions, or “hospices,” until after a royal edict in the mid-1600s, when conditions in England were already in decline. The hospices were not initially hospitals in the sense of the word today—that didn’t happen until after the French Revolution in 1789—but institutions that simultaneously served as prisons, insane asylums, and residential homes for the disabled of all ages and the elderly. The grouping of older adults with criminals, the mentally ill, and the chronically disabled was telling. While people in those categories are now housed separately (not that the conditions are mutually exclusive), our general attitudes toward them are largely unchanged. Each group is seen as costly, burdensome, and of little social value. From the start, their treatment has raised questions about whether sequestration enabled the focus, help, and care they needed or served other purposes. While many have argued for the segregation of criminals to protect the public, the segregation of those with mental illness or debility helped families by transferring custodial and care responsibilities to the state. If these institutions offered protection, it was the sort that allowed everyone outside them to live as if the people inside did not exist. People could go about their days without the “lesser” and “burdensome” others visibly present, pretending that they themselves would never end up in similar straits. In country after country, more and more families abdicated their responsibilities.
Sometimes, too, the so-called afflicted refused familial shelter and care, ending up on the streets. Again, we see how little changes. In the 1970s, in the wake of revelations of abuse and neglect in mental institutions, deinstitutionalization became the standard in the United States. Walk down the streets of San Francisco or any other American city today, and you will see people huddled in doorways, under overpasses, in sidewalk tents, and on traffic islands. We still haven’t found a satisfactory solution to a situation that transcends centuries and cultures. It’s hard not to suspect that our approach is fatally flawed.
In the twentieth century, with the huge progress in medical diagnosis and treatment, old age became medicalized, offering new ways to see old people as “problems.” Suddenly it wasn’t just the poor who ended up in nursing homes but anyone whose body was seen as requiring ongoing management. Those institutions, designed for societal undesirables, retained much of the workhouse era’s attitude toward residents in their structures and systems. No one saw much point in investing in or truly attending to the lives of people who were, as a nonagenarian once said to me, “past their use-by dates.”
And then, once again, the options changed for reasons ranging from good intentions to profiteering. Recent decades have brought a shift from public to private institutions and a flourishing of alternative facilities, from small mom-and-pop board-and-care homes to high-end assisted living centers that somewhat resemble a college dorm or cruise ship. This has made nursing homes even more repellent. It has also created new challenges. Ancient, frail people with resources and options increasingly spend the last months or years of their lives in assisted living, places generally not set up with them in mind. The less moneyed end up in unsafe living conditions54 without adequate food, help, or human contact, or spend down until they become eligible for Medicaid, and then only certain sorts of places will have them. Once there, they are subject to the very “deprivatization of experience”55 for which old age is reviled.
Although facilities are often the stuff of horror stories, the same is true of some old people’s lives with their families. In both situations, vulnerable people are behind closed doors with more able and powerful people. As much as the bodily changes of old age, that fate has made the final years of old age fearsome. The worst could happen to anyone. And it does: elder neglect and abuse occur in people of every social class, background, and geography. Today, there are more choices than ever for where to spend one’s old age but still relatively few that hold much appeal. Meanwhile, we close our eyes, individually and collectively, avoiding our friends and loved ones once they are debilitated, until one day we find “they” are us and it’s too late to do anything about it.
A photo I sometimes use in teaching shows a hallway in a nursing home with a line of older women in wheelchairs. Heads hang down, are held in hands, eyes are closed. Is this old age, or a way to cope with a certain brand of old age, one no more appealing to old, frail people than to the rest of us?