Elderhood


by Louise Aronson


  When it comes to death, patients and families often don’t know what to expect; not having been to medical or nursing school, they rely on their nurses and doctors for guidance. Ironically, despite the medicalization of dying, most doctors have little training in death.

  Medicine still largely sees death as its adversary, instead of positioning itself as a tool to help ease that inevitable transition. Education about how to talk with patients and families about difficult decisions, bad news, and death only became standard in medical schools in the 2010s. It still isn’t a required part of residency training in most specialties or subspecialties.

  When we arrived at Cathy’s home, it was clear she had entered the phase known as “active dying.” Fortunately, it didn’t take much to make Cathy comfortable. It took recognizing that she could no longer swallow and moving to liquid medications that could be absorbed through her gums. It took knowing that giving her pills or food would only cause choking and suffocation, and that she no longer needed them. It took knowing to fold a flat bedsheet in half under her torso so she could easily be repositioned without any of her family members hurting her or themselves. It took knowing how to change a diaper on an adult by rolling her to one side or another. It took knowing that she probably was not thirsty but that her dry mouth and lips were uncomfortable, and glycerin sponge mouth swabs and lip balm could make her feel better quickly.

  It took experience and comfort with death. Not an advanced medical degree. Not liking death. Not looking forward to it. Just understanding that it’s a defining part of life and approaching it accordingly.

  TECH

  On each housecall, I stay longer than I should, longer than I want to, and longer than planned for by our home visit scheduler. I can’t leave because Dot is holding my hand, or because she won’t stop talking—telling me, not for the first time, about the time Aunt Martha cut off all her hair and they called her a boy at school, or how her daddy lost his job and the lights went out and the babies cried and her mother lit pinecones and danced and made everyone laugh. Sometimes I can’t leave because Dot just has to show me one thing, but getting to that thing requires that she rise unsteadily from her chair, negotiate her walker through the cluttered kitchen and narrow hallway, and find whatever it is in the dim light of her bedroom when I know she can hardly see in the bright fluorescence of the kitchen where I usually examine her.

  I can, and do, write prescriptions for Dot’s many medical problems, but I have little to offer for the two conditions that dominate her days: loneliness and disability. She has a well-meaning, troubled daughter in a faraway state, a caregiver who comes twice a week, a friend who checks in on her periodically, and she gets regular calls from the Friendship Line.

  It’s not enough.

  And she, like most older adults—like most of us—doesn’t want to be “locked up in one of those homes.” What she needs is someone who is always there, who can help with the everyday tasks she now finds so challenging, and someone who will listen and smile and hold her hand. What she needs is a robot caregiver.

  In an ideal world, each of us would have one or more kind and fully capable human caregivers to meet our physical, social, and emotional needs as we age. In an ideal world, the many people who need jobs would be matched with the jobs in need of many people.24 But most of us do not live in an ideal world, and a reliable robot may be better than an unreliable or abusive person and better than what most people get, which is no one at all.

  Caregiving is hard work. More often than not, it is tedious, awkwardly intimate, physically exhausting, and emotionally challenging. Sometimes it is also dangerous or disgusting. Almost always it is 24/7 and unpaid or low-wage and has profound adverse health consequences for those who do it. It is women’s work and immigrants’ work, and it is work that we have made so undesirable and difficult that many people either can’t or won’t do it.

  Many countries have acknowledged this reality by investing in robot development. In Japan, where robots are considered iyashi, or healing, the health ministry launched a program designed to both meet workforce shortages and help prevent injuries to humans by promoting nursing care robots that assist with transfers. The robots help with mobility and lifting, and they are programmed to be emotionally expressive, polite, even charming. There are also “socially assistive robots” that do things such as lead exercise classes, even recognizing their regular attendees, greeting them by their names, and engaging them in conversation. A consortium of eight European companies and universities collaborated on a programmable, touchscreen-toting, humanoid-appearing “social companion” robot that offers reminders and encourages social activity, nutrition, and exercise. In Sweden researchers have developed a robot that looks like a standing mirror–cum–vacuum cleaner, monitors health metrics such as blood pressure and activity, and allows virtual doctor visits.

  Although investigators in the United States are developing robot caregiver prototypes as well, we have been slower to move in this direction. The reaction to robot caregivers in our press, in professional journals and conferences, and among some of my medical colleagues has included skepticism, concern, and occasionally outrage.

  As Jerald Winakur, a San Antonio internist and geriatrician, puts it, “Just because we digitally savvy parents toss an iPad at our kids to keep them busy and out of our hair, is this the example we want to set when we, ourselves, need care and kindness? When we need to know we are loved, that our lives have been worthwhile, that we will not be forgotten?”

  Robot caregivers raise fundamental questions about what and who matters in society, how societal priorities are created and reinforced, and how we define progress. Although Winakur mentions screens as babysitters, we already have abundant evidence on the harms of that approach25 to kids’ social, emotional, intellectual, and linguistic development. Tech has adverse health consequences in adults, too, including increases in insomnia, vision and hand disorders, anxiety, narcissism, distractibility, and the need for instant gratification.26

  Hesitation about robot caregivers among some health professionals is not because American medicine eschews robotics. We have robots to assist in surgery, and basic “walking” robots—usually faceless or, in children’s hospitals, with decorative humanoid features—that deliver medications and other supplies. Some long-term care facilities are testing robots that help with lifting or cleaning, and robots increasingly are used in rehabilitation after strokes and other debilitating events.

  Of course, a robot that carries linen along hospital corridors, or cleans out your arteries as you enjoy an anesthesia-induced doze, or even one that helps transfer you from bed to wheelchair, isn’t the same as a robot meant to be your friend and caregiver. For most of us, it makes sense that a robot might address certain physical and functional needs. But could a robot—a machine—possibly play a role in the most human and existential parts of our lives?

  My initial response was no way. Yet, while the jury is still out, it seems increasingly likely that the answer will be yes. Search YouTube, and you can watch elderly Japanese people with dementia smiling and chatting happily with a robot that looks like a baby seal and responds to petting and talking. You also can see developmentally delayed children doing therapy with a cute, colorful robot that also collects information about their performance.

  Walk down any street, or sit in a restaurant, or enter a workplace, and you cannot miss the ubiquitous people fully engaged with the machines in their hands or on their desks. Admittedly, some are interacting with other humans via their machines, but nevertheless the primary interaction is human-and-machine. Despite compelling protests that such interactions do not constitute meaningful, empathic relationships, they seem to provide stimulation and satisfaction to billions of people. Maybe you are one of them, reading this on a device.

  Those who say a robot cannot provide the same comfort and caring as another human being are not considering three important facts. First, not all humans provide comfort, care, and stress relief to their relatives or the people for whom they provide caregiving. Indeed, many have the opposite effect, sometimes despite good intentions, and other times in willful acts of negligence or abuse. Second, robot caregivers and human caregivers are not mutually exclusive. We are not choosing from a menu of two options but developing ways to use both the humans and the robots to optimize care. Robots must supplement, not replace, human care. Third, we do not have enough caregivers for the current numbers of older Americans. Of course, we could change that by providing a reasonable wage, education, training, rewards, and recognition for this critical work to make it more attractive and interesting to the many millions of people who need jobs or currently choose other types of work. With a rapidly aging population and declining birth rate, we need creative solutions to this urgent workforce crisis.

  In the next decade, scientists will refine current applications of robots and combine their physical assistance and social support functions to meet at least some of the complex needs of frail, older adults. According to James Osborne, director of the Quality of Life Technology Center at Carnegie Mellon, the current limitation is not the technology but finding a viable business model. Still, he adds, “I really expect there will be a robot helping me out when I retire. I just hope I don’t have to use all my retirement savings to pay for it.”

  In that new world, my patient Dot’s lonely life would be improved by a robot caregiver.

  Since the robot caregiver wouldn’t require sleep, it would be alert and available 24/7, perfect for Dot, who reads late into the night and wakes after noon. It would be there in case of crisis. Because Dot does sleep, the robot could do cleaning, laundry, cooking, and other household tasks during those hours. And when Dot awoke, she would be greeted by a kind, humanlike voice, a smile, and a “being” able to help her get out of bed and to the bathroom without injuries to either of them. After she washed her face, the robot might hand her a towel, wipe any water up off the floor so she wouldn’t slip, and make sure she was clean after she used the toilet. It would ensure she took the right medications in the right doses. At breakfast, the robot might cook a warm meal or bring in the freshly delivered meal and heat it for Dot as they chatted about the weather or news, both of which the robot would know or could provide by turning on its internal radio.

  And then, because Dot’s eyesight is failing, the caregiver robot would offer to read to her. Or maybe it would provide her with a large-print electronic display of a book, the lighting just right for Dot’s weakened eyes. “What does durian mean?” she might ask, and the robot would say it’s a Southeast Asian fruit that smells like old socks and tastes like perfume.

  “No wonder she’s making a face,” Dot might remark of the story’s heroine, and they would both laugh.

  After a while the robot would say, “I wonder whether we should take a break from reading now and get you dressed. Your daughter’s coming to visit today and we want to be ready.”

  This reality is both disturbing for the human abdication of social responsibility it represents and a portrait of a safer, more pleasant life than Dot’s current one. It’s perhaps not surprising that most of the engineers of robots of all kinds come from the demographic groups least likely to provide actual human-to-human care. The more we use their devices, the richer and more powerful they become. In society, their ascendance has paralleled increasing income inequality and social strife. In medicine, it has ushered in the era of burnout and a time when patients describe their pain and suffering to the side of their doctor’s face as he or she types, and types, and types, into the electronic medical record. We forget that technology is not necessarily mutually exclusive with compassion, equity, and justice—unless we allow it to be.

  When someone becomes ill or frail, they usually also become less public. Perhaps they are mostly homebound by sickness or fatigue or debilities. Perhaps it is a choice. But sometimes, too, getting out requires help, and that help isn’t forthcoming. Or going out provokes stares or their twin, the deflected gaze, so they stay home, sparing others discomfort and themselves shame and humiliation. We throw a party and don’t invite them—it would be too much trouble; they probably couldn’t manage anyway; it would show up their current embarrassing state. So often by the time we might, through our own hardship, learn to fully appreciate the unnecessary hurt we’ve caused others, it’s too late.

  Allowing us to further abandon caregiving roles isn’t the only risk of technological caregiving. Other technologies, many already in use as part of the “quantified self” movement,27 and more in the pipeline, often reinforce the paternalism and lost autonomy of old age. Tech companies, sometimes trying in good faith to address the concerns of adult children of frail or cognitively impaired old people—and sometimes preying on them in terrifying ways—have created an array of devices to alert family and caregivers to old people’s health status and activities. Some monitor pulse and blood pressure, blood glucose, and sleep patterns. Others check whether the person got out of bed or opened the refrigerator.

  Some of these actions violate the privacy and rights of old people in ways that would spark outrage if done to middle-aged adults. Different generations have different notions of privacy, so this may change with time, but those who are old now and who will become old in the next few decades tend to see distant monitoring of their body and behavior as a violation of their privacy. If an older person can access and understand the message—that is, if they are literate, digitally literate, and do not have dementia, as most older adults do not—then why is someone else being informed about what is usually considered personal health information and no one else’s business?

  We must distinguish between this sort of infantilization and benevolent help in a life stage when people cannot always adequately care for themselves. Too often younger people assume incapacity in old people until proved otherwise, instead of the other way around. Too often, too, we assume the young way of doing something is the best or only way. Adding to the confusion is the interpretation of adequate, which often enough is in the eye of the beholder. It may look one way to a person who has never taken medications except when she feels ill and another to her son who always follows directions. Here again, we hold old people to a different, higher, and sometimes unjust standard compared with younger adults.

  “You mean to tell me,” began a horrified, furious alcoholic former nurse, “that when I was sixty-four I could drink as much as I wanted and it was nobody’s business, and then overnight people can call Adult Protective Services if they don’t like how I choose to live?”

  Yes, I had to say. That’s just how it works. There are some good reasons for this, including that old age comes with physical and cognitive vulnerabilities not present in earlier decades and that acquired impairments in those functions rise significantly in the eighth decade of life onward.

  At the same time, the dividing line of sixty-five is historical and in many lives outdated. And there is the problem that reasonable people often disagree about the right approach to various situations. One person’s good decision is another’s bad decision. It’s hard to tease out judgments we may disagree with from ones that are simply wrong, and there’s also a huge gray area. A hard line can and should be drawn at harming others. But what about harming oneself? Often enough, drinking too much, eating too much, not washing, living in a filthy home, and taking risks are behaviors we allow once people reach adulthood. If a person has always lived a certain way, the sixty-fifth birthday seems an arbitrary moment in which to punish them for socially disapproved activities.

  Particularly as digital technology enables programming of household functions from afar and relaying of medical and personal information to family, caregivers, and health professionals, older adults will be at risk of losing autonomy and privacy. Too often tech innovations seem designed to assuage the anxieties of adult children at the expense of their parents. Innovators consider what old people actually need or want too infrequently. Whereas tech for younger adults focuses on self-monitoring and tracking, tech focused on older adults often pulls in one or more others, without controls that can enable or refuse sharing, or guidance about the sorts of conversations families need to have to best balance the needs and worries of different members. It also focuses on the far end of old age, doing little to increase tech access for old people in the first decades of that life stage or even acknowledging the large numbers of tech-using old people.28

  Some technology shows tremendous potential. In no age group are people consistently able to remember to take medications, especially when they are needed several times a day. Reminder systems make sense for old patients not only because they are more likely to take medications, and to take many medications, but also because they have a higher likelihood of cognitive impairment. Exercise and activity apps appear to motivate people fairly effectively, and it’s worth considering how regimens can be made useful across ages and generations and levels of fitness, as well as what rewards provide the most motivation.

  Of course, for some people, monitoring may simultaneously lessen privacy and increase safety and independence. If a person can’t quite manage on their own, but a device and the help of distant others allow them to stay at home and manage better and more safely than without the device, that may be an attractive option.

  Different people value privacy and safety differently. It’s well known that adult children generally put safety first, while their parents often are willing to take risks in exchange for remaining at home or retaining control over their bodies and lives. A patient of mine complained that his son wanted him to wear monitors and install grab bars because he’d fallen at home, but when the father told the son he thought his motorcycle riding was dangerous, the son said, “That’s my business.”
