In old age, as at all ages, tech has previously unimagined benefits and terrifying harms. Acknowledging the moral range of potential applications is the first step toward using it to improve our elderhood, not replace nursing homes with virtual control and manipulation.
MEANING
Zeke Emanuel—brother of both Chicago mayor Rahm Emanuel and the Hollywood talent agent Ari Emanuel, the latter made famous by the program Entourage—is an oncologist, bioethicist, and one of this country’s leading public physicians. In an Atlantic essay called “Why I Hope to Die at Seventy-Five,” Emanuel said he would stop most medical care at age seventy-five.29 He would do things like get a hearing aid and take pain medications but would not try to prolong his life with preventative or heroic medical treatments.
Emanuel isn’t going to kill himself. He has opposed euthanasia and physician-assisted suicide for decades. Instead, he argues that past a certain point, medical care that once might have been helpful becomes counterproductive, with treatments more likely to prolong time people don’t want and less likely to provide them with more of what they do. I, too, have seen this, again and again, as have so many doctors.30 He wrote:
Here is a simple truth that many of us seem to resist: living too long is also a loss. It renders many of us, if not disabled, then faltering and declining, a state that may not be worse than death but is nonetheless deprived. It robs us of our creativity and ability to contribute to work, society, the world. It transforms how people experience us, relate to us, and, most important, remember us. We are no longer remembered as vibrant and engaged but as feeble, ineffectual, even pathetic.
The article combines important truths about old age with a very particular worldview, and it has several blind spots. Emanuel appears to assume decline and disability cannot co-occur with contributions to “work, society, the world.” He cares so deeply about his legacy of public achievements that he denies the possibility of meaningful relationships with people who are or have become enfeebled and further devalues the majority of human lives, ones in which his notion of “legacy” is irrelevant.
In a later interview on the same topic, Emanuel claimed that if you talk to experts in Japan, you will learn that everyone has dementia by age one hundred.31 (Unless the Japanese are very different from Americans, this is false, although with age the brain, like everything else, changes.) If dementia terrifies him (as it does so many), and if, given his values, it’s his worst possible future outcome, then he’s entitled to hedge his bets, changing course medically as he ages in hopes of dying before he might get it. His approach both makes good sense and assumes a life worth living is always incompatible with cognitive decline or frailty. After decades spent caring almost exclusively for very old, frail people, I know three things: lives can have meaning despite significant decline and disability; different people draw the line in very different places as far as where they would like to die; and because of medicine’s shortsighted approach to “progress,” too many aged people are forced by medical “care” to go on once they’ve passed their natural and preferred thresholds.
For Zeke Emanuel, meaningfulness has to do with the ability to do a certain sort of work, the sort of work he has always done and values most. That’s fine—for him. In the interview, he notes that few people are productive in their work after age seventy-five, a comment that isn’t completely accurate and doesn’t adequately take into account how longer human lives have begun to change our relationship to work as we age. It also discounts the value, to society and in individual lives, of the sorts of work that often go unpaid or underpaid—so-called women’s work, most particularly caregiving and volunteering of all sorts. (A quarter of America’s forty million unpaid caregivers are themselves over age seventy-five, and most are women.) To Emanuel, “meaningful work” implies a paycheck and perhaps even an influence on the world. The sort of work he does, in other words, though not the sort done by most women and men. He also adopts the modern industrial notion that what counts most is productivity, raising the question of whether learning or art or relationship building qualify as productive activities.
His value judgments continue: “a life where the dominant thing is only fun or play, doing a crossword puzzle, reading a few books, seeing the grandkids once every month or something. That’s not a meaningful life. I don’t want that life. I don’t think anyone should find that life fulfilling.” Emanuel is entitled to his vision of his own life, but he gets himself into trouble with that last sentence. It judges others in ways that deprive them of what he is asserting for himself: the right to assign value to their own lives. It also discounts the daily reality of the majority of humans of all ages who are less economically and socially fortunate than he is, and of the millions who take pleasure in such lives. Disturbing, too, from a life course perspective is this statement: “I would challenge them whether that really is meaningful or what they have done is narrowed down what constitutes meaningfulness for them to accommodate their limitations, physical, cognitive and whatever.” If such adaptation were a problem, we’d all have to kill ourselves by age forty.
When someone with Emanuel’s authority and influence makes statements without also acknowledging his cultural vantage point or the social disparities and policy failures that have created the circumstances in which such sentiments seem reasoned and reasonable, he then constructs, allows, and enables the old age he wants to avoid—and not just for himself but for all the rest of us, especially those who aren’t in a position to judge or shape the lives of tens and ultimately hundreds of millions of their fellow citizens.
What Emanuel does to old age in his essay is what medicine does to old age in American society. Contrast that biology-as-destiny approach with the view of Linda Fried, the geriatrician who heads Columbia’s School of Public Health. Writing in the same magazine just a few months before Emanuel, she said:
Too many of my patients suffered from pain, far deeper than the physical, caused by not having a reason to get up in the morning. Many of my patients wanted to make a difference in the world but, finding no role for themselves, were treated as socially useless and even invisible.32
Fried echoes Marjory Warren, who transformed medicine with her mid-twentieth-century Middlesex hospital old age ward. Fried has worked to accomplish something similar, starting the Experience Corps and leading other societal and policy changes that take advantage of older people’s experience and skills while creating opportunities for them to do meaningful work. At all ages, biology is but one part of a human being’s experience of the world.
Adaptability is generally considered evidence of an open mind, creativity, and resilience. The anthropologist Margaret Clark reframed aging as an ongoing process of simultaneous adaptation—not only to one’s changing body but equally to one’s specific social and cultural situations.
Interviewing both healthy community-dwelling older adults and ones admitted to a psychiatric hospital for late-life mental health problems, Clark found the two groups agreed on personal goals in old age33: having independence, social adaptability, adequate personal resources, and the ability to cope with external threats and changes; maintaining significant and meaningful goals; and having the ability to cope with changes in self. Where the two groups differed was in how they thought they would achieve those goals. Healthy participants used values that would allow attainment even as they became frail (congeniality, using resources wisely, calm self-acceptance), while the hospitalized ones set themselves up for failure by judging attainment based on external factors including power, status, and recognition. Successful adaptation to old age, Clark concluded, required renouncing middle-aged and culturally dominant norms in favor of ones better suited to late-life abilities, resources, and roles.
The work of the medical anthropologist Sharon Kaufman further clarified the transition: “The old Americans I studied do not perceive meaning in aging itself; rather, they perceive meaning in being themselves in old age.”34
She explains that people continue to revise and create their identity, that people’s self-perceptions are independent of age, and that distress ensues when self-image and others’ perceptions are in conflict. She also notes that people are best able to maintain a sense of meaningful self when they continuously restructure their identity to unify who they were with who they now are. People often say they didn’t feel old until they had a fall or were hospitalized or several of their closest friends died in a short span of time. In response to those realizations, some people feel resigned or hopeless. Others reconstitute who and what they are.
Two patients of mine had to start using walkers at around the same time. Neither was particularly happy about it. Helena refused to go out. She didn’t want to be seen as old (something that seemed obvious to me, walker or not). Esther asked if we could reschedule our appointment. She was going to the movies with a friend; it took her a bit longer to get around, and she wanted to be sure to leave enough time. Later, she told me she now needed an aisle seat with a wall nearby to prop her walker and that I should definitely go see the movie. The aging body matters in what people can and cannot do, but identity, additions and modifications to how a person sees themselves, and social context are no less important in determining a person’s well-being.
We revise our behaviors, expectations, and self-image throughout life. In that regard, old age isn’t different from earlier stages.
IMAGINATION
It was to be my first commencement speech, so I did what most twenty-first-century people do when facing a situation that calls for insight, humor, and, especially, originality: I went to Google and YouTube to see what others had done.
That, as I later admitted to my audience, was a mistake. Some of the best graduation speeches have been given by people like Steve Jobs, J. K. Rowling, Ellen DeGeneres, and the Dalai Lama. Naturally, my reaction to those cultural icons’ funny, moving, insightful speeches was to check my e-mail and Twitter accounts and to realize that, although our dog appeared to be sleeping, any fool could tell he needed another walk. I couldn’t possibly work on my speech.
On the walk, I got lucky. Perhaps because exercise stimulates creativity, I realized that I should talk about the very thing those famous people, with their disparate accomplishments, had in common—not only with each other but also with most people who make a difference. They hadn’t succeeded simply because they were smart and hardworking. Success came because they saw the world in new, interesting ways. It came from imagination.
But I still had a problem. Although imagination had the right mix of import, surprise, and universality to make an ideal topic for a speech intended to launch young people into their adult lives, I worried that it would be a hard sell to my intended audience of graduating health professionals and medical school faculty members.
To some people, including many doctors and scientists, the need for imagination in medicine and science is obvious. It may be that such people themselves have powerful imaginations. Others, often also doctors and scientists, require more convincing. This isn’t because they lack the capacity for imagination so much as because they don’t always apply that word to their work when it’s warranted or because their own imaginations have grown weak from disuse in the years since they decided on their “serious” health science careers. Because imagination is hard to see or measure or test for, it has increasingly been associated solely with the humanities and arts, those second-class citizens of twenty-first-century life,35 and rarely discussed or cultivated during medical and scientific training. That parsimonious view of imagination couldn’t be further from the truth.
By imagination, I don’t mean fantasy or make-believe but something close to and necessary for creativity, insight, innovation, and empathy. For hard-core scientists and others who see themselves as not in need of imagination in their work or lives, it’s worth considering the wisdom of Albert Einstein, who said, “Imagination is more important than knowledge. For knowledge is limited to all we now know and understand, while imagination embraces the entire world, and all there ever will be to know and understand.”
If a person bases their work on knowledge alone, it will be limited; but if the same person engages their imagination, anything is possible. Imagination is the progenitor of hypotheses, new ideas, and original ways of seeing. It helps organize information, shaping what we think and feel about other people. Imagination isn’t just a tool used by writers, artists, chefs, designers, and advertisers. It’s the faculty or skill that led Steve Jobs to look at the ugly collections of metal and plastic then called computers and ask: Why can’t they also be beautiful, and fun, and small enough to fit in my jeans pocket? And imagination led Sidney Farber, a groundbreaking physician, in the 1940s to suppose that lessons learned in treating nutritional anemias might be applied to treating leukemias as well. Now most kids with leukemia are cured.
An intellectual use of imagination is essential for scientific and medical progress. But it’s not the only use of imagination that matters for health and in health care or for happy, successful lives.
Sometimes, it’s not even the one that matters most.
My cousin’s son, whom I’ll call Marc, is a college student who plans to go into medicine. He spent a recent summer working in a lab at a medical center where he also sometimes shadowed doctors. One August evening, at a family birthday dinner, Marc said he’d seen the most incredible thing in the emergency department that day. He was really excited, so we all put down our forks and leaned in to hear his story.
Apparently, a young man had been brought in from a local prison because he’d fallen from his bunk bed and could no longer move or feel his legs.
“It was the lower bunk,” Marc said, shaking his head.
The emergency department doctors stuck the patient with pins, poked and prodded his legs. Nothing. No response. “I can’t feel it,” the man kept saying. He was visibly upset. Then one doctor distracted him while another went behind him and jammed something really sharp into his back. He jumped, screeched, and moved his legs. He’d been faking. The doctors and nurses filed out of the room laughing.
“The whole thing took less than five minutes,” Marc said with a grin.
Now, this story has many lessons, and one of them is that you might not want to invite me to your birthday party. Here’s why: I was outraged. I told Marc the doctors had behaved unprofessionally and dangerously. I asked him to consider the harm they might have done if the patient had truly been injured. Next I suggested he imagine a different young man, perhaps a university student like himself rather than a prisoner, reporting a similar fall and complaint. How did he think that scenario would have played out? Might they have followed standard procedures rather than jumping to the conclusion that he was faking? I thought so.
Finally, I asked Marc to think of reasons why the patient might have faked an injury. Maybe he’d accidentally angered a gang member and feared for his life. Maybe he had a mental illness and a voice had told him to jump from the bunk and next time would tell him to do something far worse. And maybe, in a tough situation, he exhibited traits that someone paying attention might help him harness so when he left prison he would have the agency and know-how to put them to use building a life for himself on the right side of the law.
And maybe he was just a scammer. We’ll never know, since his doctors failed to engage their clinical, empathetic, and ethical imaginations.
But they weren’t the only ones. Hearing Marc’s story, I responded in just the same way, leading with my own dearly held biases and goals, and without thinking about the needs of the other people in the room, particularly the youngest one, who was telling his story to people he was supposed to be able to trust. I failed to use my imagination, and I should have known better.
I am an unlikely doctor. For the first twenty-some years of my life, I went to great lengths to avoid math and science. Early on in high school, I got permission to take one course over the normal load so I’d have as many As as other good students, even when I got a bad grade in algebra, which I was sure to get.
Years later, after the first day’s lectures at med school, I phoned home and explained to my parents that I understood about one of every four or five words uttered by my professors, and too often those words were articles or conjunctions. I suspected I might perform nearly as well if the instruction was being given in Cantonese, a language I do not speak.
I also soon discovered that I had all the wrong instincts. In my case-based learning small groups, the other students would all, swiftly and uniformly, offer up identical questions and next steps: How does that work? What’s the mechanism? What tests do we need? My responses were different: How are we going to tell his family? Or: What do we need to do to get her on the transplant list? The problem wasn’t that their reactions or mine were wrong, but clearly theirs were the ones desired by our course directors and teachers, and mine were not.
And then, I began—insidiously at first, but soon with increasing frequency—having similar responses to my peers. I asked the right questions and proposed the right next steps for scientifically rigorous clinical care. Thinking like a scientist was fun, and not just because I’d mastered a skill essential to my survival. I realized there had been entire sectors of life and thought that I’d been missing. What’s more, this new way of thinking gave me meaningful skills with which to make a difference in people’s lives.
I became a doctor, and that role has been one of the great pleasures of my life. But it has also been the source of some of my greatest frustrations. Because alongside medicine I learned that there were things a doctor did and things she did not do, things that should interest her and things that shouldn’t. Too many of the things in the didn’t and shouldn’t categories were the things I loved.