Elderhood
People’s experience of old age varies widely, and always has. For most, it’s not the facts of being old that bring suffering—often, quite the contrary—so much as the threat or reality of socially contrived accompaniments such as lack of purpose, poverty, exclusion, and isolation. That the answer to how to create a better old age lies in the gap between hard data and dearly held beliefs isn’t new. Plato’s Republic opens with the elderly Cephalus telling Socrates that some of his contemporaries
harp on the miseries old age brings. But in my opinion … they are putting the blame in the wrong place. For if old age were to blame, my experience would be the same as theirs, and so would that of all other old men. But in fact I have met many whose feelings are quite different.28
This is good news for old age. While human legs and skin, hearts and brains, change with age, they are pretty much the same now as they were in classical Greece. It’s our beliefs, feelings, actions, and policies that can and do change, and that we can shape to support and celebrate old age in all its varieties.
BACKSTORY
To cut into another human being’s body and remove or rearrange parts takes a confidence that I do not possess. To take charge in a crisis and issue orders to other highly trained people secure in one’s near-instantaneous assessment of a complex situation requires hubris, speed, and the psychological, emotional, and cognitive ability to trust that one understands the situation well enough and that well enough is good enough. I don’t have those things either, or, more accurately, not enough of them to want to make crisis management my career.
In kindergarten, I was playing four square with friends when our ball rolled into the middle of our public grade school’s huge asphalt playground, packed full of other kids. The smallest in my friend group, I usually held back until someone else went to fetch it, but this time they all looked at me. I walk-ran into the swirling fray of older children, and then, perhaps with the ball in hand and perhaps not (I don’t recall), I crouched down and pulled my coat over my head to protect myself from the blows of passing legs and bodies. The bigger kids weren’t trying to attack me—they just didn’t think about me—and as their team game moved across the playground, I felt vulnerable and unable to escape their threat. Luckily an older girl rescued me.
As a girl, climbing trees also scared me, but I thrilled in it too. The problem was that I couldn’t get down. “Jump!” my friends would call, or “Slide!”—strategies they used to move down quickly. I couldn’t or wouldn’t. Both seemed too fast, too dangerous, too risky, and too terrifying. ER docs, I have heard, are the ones most likely to parachute from planes. That holds no appeal for me, no thrill, only terror and a question: Why?
In high school, taking the PSATs, I was about halfway through when they called time. In those days (and still now), when people said a book was a quick read, I changed their hours into my days, then multiplied by two, three, or more, depending on the length and difficulty of the material. I read slowly. When I say this, sometimes people respond, “I do, too, but this one’s a quick read.” Not for me.
In college, I tried out for the new women’s rugby team, but found I could not run full force into another woman for sport. I’d like to think I could do it if the other person was trying to kill herself or others, but I’ve never been in a situation that allowed me to find out. It’s not that I never feel the urge for violence. As a child, such urges were almost always directed at my younger sister; as an adult, they take the form of diatribes and fantasies of retribution against people who have harmed my family, patients, friends, or vulnerable strangers, and on occasion people I feel have slighted me. But although I sometimes hit my sister when we were little, I’ve never actually been violent with another person as an adult, except if you acknowledge that certain actions and activities of doctors qualify as violence.
Some things scare me. Doing even a minor task badly disturbs me to a point that I’m told is extreme. Also, while I sometimes make mental connections quickly, I struggle with coordination and can be slow with decisions. If a person is trying to die and the doctor needs to be thinking through protocols plus the particulars of the situation while moving various parts of the patient’s body into position for procedures, all at the same time, I’m not the best doctor for that job. I’ve done it—any trained doctor has. But it doesn’t draw on my strengths or passions. I prefer having enough time to do a good job sorting through complexity. Also, where some feel a rush of affirming adrenaline that carries them for hours, I cannot separate my actions, however well executed, from the ethics of what was done, how it might have been done better, and its consequences in the lives of patients and families.
On the other hand, some things that scare other people don’t bother me a bit. For years, I packed a backpack and set off for places most Americans didn’t go, certainly not by themselves or with an equally young female friend. I went to the Thai-Cambodian border, to remote areas of Indonesia, Senegal, Mali, Guatemala, and Nicaragua. I took trains and buses across China to the Pakistani border, standing up to officials who told me there was no train and sleeping in spit-stained hallways when necessary. I hitchhiked through Scotland and hid my companion and myself from a group of drunk men who’d followed us from a bar to a local farmer’s field where no one would have heard our screams. On an island off the coast of Belize, I asked a young man about his family and future plans and told him about mine so he saw me and my friend as human beings and did not rob us as he did all the others on our boat that day.
Similarly, for years, I did housecalls, working on my patients’ turf without missing the power and control a doctor has in a hospital or clinic. I don’t need that, which may be why I haven’t owned a white coat since residency. In their homes, even during a doctor’s visit, people are human beings first and patients second. I love that. I went into neighborhoods and residences I would not have come to know had I not been a doctor. “If we see a gun,” I would tell residents rotating with me, “we keep driving.” Often, I learned more of clinical relevance from seeing how people lived than from hospital discharge notes or my physical exam. I’d rather creatively negotiate those complex realities over weeks or months or years in relationship with another human being than use a new surgical tool or adjust the dials of a machine. Like most people in a position to do so, I have chosen work that makes the most of my interests, values, and strengths.
Chances are this is not every geriatrician’s truth. But it’s a big part of mine, for worse sometimes, I know, but also I’d like to think, more often than not, for better.
Five months after my snap and burnout, I returned to work—sort of. I didn’t go back to my main leadership roles or to my housecalls clinic, though I got updates on my former patients from colleagues. I missed my patients, felt guilty about my hasty departure from their lives, and also was not yet ready to again take on responsibility for any part of other people’s lives. I was still sorting out my own life and medications and trying to figure out whether I could still be a doctor and remain healthy, mentally and physically.
For the rest of that academic year, instead of returning to the places and patterns that had led to my ill health and burnout, I worked part-time on grants and projects. In what had once been my clinical and administrative time, I did various teaching jobs and started this book. As months passed, I felt better and better except when I thought about the roles and systems that had contributed to my burnout. Clinical practice in the second decade of the twenty-first century seemed to demand too many of the things I didn’t enjoy or believe in and too few of the ones I did. I suspected that I would need to do what so many other doctors have done lately: stop practicing forever and find a completely different sort of work inside or outside medicine.
At all ages and stages, life changes. A year and a half after the snap, in the spring of 2017, I was feeling much better—really good, in fact—when a colleague told me about a job on a new hospital unit for old people. It seemed the perfect solution to my practical and moral quandaries about outpatient medicine—I could again care for patients, but those patients would be in hospitals, one of our health system’s preferred settings. It wasn’t an ideal solution, but it was an option that might make it possible for me to see patients without compromising their care or my values. They told me the job would open in the fall. The idea of stretching my clinical muscles again made me nervous but also, especially, excited.
LONGEVITY
We live decades longer than we did throughout most of human existence—and, in recent decades, mostly better, with less poverty29 than previous generations of older adults, and fewer years of disability30 for people without obesity or significant chronic disease. In 1750, only one in five Americans lived to age seventy; now more than four out of five do. This longer life expectancy, combined with dramatic declines in birth rates, has made old people a steadily increasing percentage of the population. In 1800 they accounted for 2 percent of the U.S. population; in 1970 they were 10 percent; in 2017 the number had climbed to 15 percent. As often happens with minority and feared populations, as the numbers of older people have grown,31 so, too, has society’s animosity toward them. Although almost one in six of us is old, aging remains the subject of jokes, fears, discrimination, and denial.
Part of this is the natural response to change, as well as mixed messages and legitimate concerns. We hear often of “senior moments” when a person can’t access the word or information they want. We only rarely hear of the equally if not more numerous moments when older people call on well-documented insight and emotional intelligence to make smart decisions. Conversely, we don’t blame age when a younger person can’t bring up the word that would complete their thought or sentence. Meanwhile, the Stanford economist John Shoven has made the case that “the current practice of measuring age as years-since-birth, both in common practice and in the law, rather than alternative measures reflecting a person’s stage in the life cycle32 distorts important behavior such as retirement, saving, and the discussion of dependency ratios.” He argues that if we define old age by the percentage of an age group that dies annually, old age may be getting older. Yet there is no evidence that the once anticipated “compression of morbidity,”33 a decrease in the number of years a person spends with disease and disability, has come to pass, despite our much-ballyhooed progress in health and medicine.
Shoven suggests we become old when we have a 2 percent or higher chance of death in the next year, and we become very old or elderly34 when that figure reaches 4 percent or higher. Using those criteria, in 1920 men became old in their mid-fifties and women in their later fifties. Now it’s age sixty-five for men and seventy-three for women—on average, with, as usual, whites doing best, blacks doing worst, and brown people in between. The implications of Shoven’s work are considerable. Is it reasonable to work for forty years and retire for thirty or forty? Almost certainly not, especially if we’re not “old,” and all the more so because purpose, social engagement, and money are key contributors to well-being. Working longer, even (perhaps especially) if we work different jobs or fewer hours in our older years than in our younger ones, is likely to increase our life satisfaction while decreasing our rates of chronic disease and disability. This is just one of the societal and public health interventions that, unlike disease treatments offered by medicine, might move us toward true compression of morbidity—in other words, toward lives that are both longer and healthier.35
We speak of the “silver tsunami” as if the unprecedented and permanent increase in both the numbers and proportion of older adults came about suddenly, without warning, and portends destruction and devastation for our society. But people in developed countries have panicked about their aging populations for a century. Americans worried in the 1930s and ’40s and again in the 1960s and ’70s, spawning ageist ads, books, movies, and lamentations (much like the ones of today) about how the aging population would ruin our country and the world. Fifty years ago, alongside movements to eliminate discrimination against women and African Americans, scholars and citizens’ groups like the Gray Panthers worked to enlighten the public about aging myths and to offer more positive images of old age. They advocated for the rights and needs of old people, just as so many groups today do, meeting similar support and resistance. At times that pushback was warranted, as when the activists seemed to assert that old people differed from the young only in age—an obvious falsehood. Those were the years, too, in which the notion of the Third Age originated, in part to signal the potential of the longer human life span but primarily to distinguish the functional old from their debilitated peers and elders. In their specifics and strategies, all these groups differed from similar efforts throughout human history, but not in their basic arguments and intentions.
All societies have included old people, even ones in their eighties, nineties, and hundreds. At times, older adults have represented a significant portion of the population in some countries, not the 15–20 percent or more we see today, but around one in ten.36 Even past societies with far less technology, money, and other resources than we have now often supported large numbers of older people. What is new are the numbers and proportion of older people in the population. In the late twentieth century, living into old age became the norm in developed countries. In the United States in 1900, the average life expectancy was forty-six years; by 2016 the average had reached age seventy-nine. If you make it to eighty, you have a good chance of making it to ninety or beyond. Still, although more and more people are living into their second century, it’s rare for humans to live past twelve decades. Anthropological evidence suggests our species’ life span hasn’t changed37 over at least the last ten thousand years.
Still, once upon a time, most people were young. If you created a graphic representation of the population, you saw a pyramid: lots of young people on the broad bottom of a triangular structure below progressively narrower bars for each subsequent period of life, indicating declining numbers of people alive at older ages. Lately, the pyramid has taken on a more rectangular shape, with increasingly similar numbers of citizens in most age groups.
The biggest increases in longevity in human history occurred over the twentieth century. Most people believe this was because of medical progress, but, more accurately, most of the credit goes to increased global wealth and public health: sanitation, better nutrition, and immunizations. You can still see the impact of those factors on world maps of longevity. In places that have those things, people live far longer than in the places without them, a reality that existed before most of the great advances of modern medicine. You can see it, too, among the subpopulations of the United States. While many of our poorest citizens have access via Medicaid to modern treatments not available in Afghanistan or sub-Saharan Africa, they still get sicker earlier and die younger for reasons of environment, social stress, and lack of access to healthy foods, hope, and opportunity.
The longest-lived humans today live in Okinawa in Japan, Sardinia in Italy, or Loma Linda in California, so-called blue zones.38 Okinawans eat healthy, low-calorie diets and maintain low normal body weights, with average BMIs around twenty. Isolated, inbred Sardinians seem to have a genetic advantage, since men are as likely as women to become centenarians, and people who move away in young adulthood still live exceptionally long lives. In Loma Linda not everyone achieves unusual longevity, just the large numbers of Seventh-day Adventists who outlive their neighbors by five to ten years. They abstain from alcohol, cigarettes, and other drugs, have strong spiritual lives, a close community, a vegetarian diet—and lower levels of stress hormones. Similar to worshippers of most faiths, they live longer than people who aren’t religious.39
In the United States there is much talk of a “baby boomer blip,” as if that generation were moving through an enduring population pyramid like a mouse through a snake, creating an aberrant bulge as it passes. But older people will make up a larger percentage of all populations for the foreseeable future. A more accurate image of population trends shows the admittedly abundant boomers as the lead edge of an enduring shift in who and how old we are as a species.
The old people of today must inform our future planning, but they cannot be our sole guides. Right now, most old people are white. They also have less education than boomers. Both of those things are changing: since 1985, the share of older Americans with college degrees has tripled, to about a third of sixty- to seventy-four-year-olds. Medically, “the old” are changing too. Members of the current oldest generation, the “greatest generation,” tend to minimize pain. Sometimes, to get them to admit to it, clinicians have to use euphemisms like discomfort or ache, and even then many are reluctant to take strong medications. Contrast this with the baby boomers, most of whom don’t hesitate to say what they’re feeling and what they need. And drugs don’t scare them: they were young in the 1960s. Because I have mostly cared for the oldest and frailest patients, I haven’t had to ask about use of cocaine, heroin, acid, or mushrooms in years. I expect that to change in the near future.
In response to this unprecedented demographic transformation, the organization and priorities of every sector of life have shifted. But fundamental change, and particularly change that upends established beliefs and societal institutions, is slow, even when the need for it is obvious.
Unlike life expectancy, which changes from year to year, the human life span (maximum longevity) seems fixed throughout history. Despite the claims made for the exceptional longevity of Russian Georgians or Bolivian mountaineers, there is no reliable record of any human surviving past the age of 122.40 Age-related mortality increases from maturity onward, then plateaus in advanced old age. And still people die—all of them.
There are scientists and think tanks working on that challenge, but the jury is still out on whether they will succeed, and on whether extending human lives further would constitute progress. On this one, for the time being, I’m inclined to agree with a comment the comedian Sarah Silverman made to Jeff Bezos the day his company introduced technology that could eliminate cashier jobs across the United States. The combined income of American cashiers that day was about $210 million, less than a tenth of the $2.8 billion Bezos made that day. In the announcement about the new technology, there was no mention of what would happen in the lives of the already poor and now potentially unemployed citizens affected by its introduction. Silverman tweeted: “Your scientists were so preoccupied with whether or not they could, they didn’t stop to think if they should.”41