Memory's Last Breath


by Gerda Saunders


  On days like this, I always amuse myself by thinking that by the time I overdose on an internet death cocktail—most likely procured with the assistance of my family, but without them actually giving me any help with ingesting it—I will have built up an impressive precedent of medicinal mishaps that should go a long way toward proving that my death was just a plain old miscalculation by a zombie-like, addle-pated woman.

  “Who am I?”

  For those of us acculturated to the Western perception of personhood, the unspoken assumption behind this question is that we have “a kernel of identity, a self,” for example, the self we are encouraged “to get in touch with” or “be true to.” When, in graduate school or otherwise, one is trotted through postmodern concepts of the self—whether from a philosophical, anthropological, Freudian, or literary critical perspective—one is quickly disabused of the idea that personhood indicates a single, separate, unified self. A concept that postmodernist scholars in all of these disciplines have in common is that

  the self divides the moment we start looking for it: There is the self we’re trying to find plus the self that is doing the looking plus the self within this game of hide and seek that is being played. Even the practice of placing an alarm clock out of reach in the bedroom implies that we have at least two selves—a responsible nighttime self and a lazy morning self.

  In this view, we are not born with a self that merely has to unfold through the years to reveal our true being; rather, our self is brought into being by the people we encounter on our life path. Our self is relational: our very self-image is constituted in interactions with people we have not chosen—not our biological or foster or adoptive parents, not our first caretakers outside our immediate family, and not the people in authority we come across in visits to the pediatrician or witchdoctor, Montessori school or madrassa, grocery store or street market, church or synagogue.

  According to the French psychoanalyst Jacques Lacan, who interpreted Freud’s writings in accordance with anthropological and linguistic developments that occurred after the father of psychoanalysis had died, the symbolic structure of our society—or, “the Other,” to use the phrase in its Lacanian meaning—brings to bear on us the customs, conventions, and language that define each individual self. The Other starts shaping a newborn before she has even had the opportunity to develop her own understanding of who she is. The Other, as represented by the family unit at the outset of a child’s life, determines the nationality she holds, the languages she speaks, the religious and other values she embraces. While, as an adult, the individual may choose to change her name or conduct her daily life in a language other than her mother tongue or leave behind her family’s religion and other values, the influence these factors had on her during her formative years cannot simply be erased. I came out as an atheist to myself when I was seventeen and to everyone else in my twenties, but have nevertheless been shaped by the narratives of the Bible and the white, puritan South African version of Christianity that I had taken in with my mother’s milk.

  To illustrate the fact that our self is brought into being by the Other, Lacan and other postmodernists refer to a self as a subject rather than an individual, since the term individual arose in the Renaissance to represent the idea that a person could deliberately fashion his own identity. An individual is the end result of a person’s deliberate and willful manipulation of cultural codes to perfect himself as the person he wants to be. The concept of the individual is equivalent to an idea still bandied about, a way of thinking that I associate with conservative social attitudes: that every individual in society should be able to pull herself up by her bootstraps regardless of poverty, social class, or other obstacles to educational and career achievement. Unlike the concept of the subject, this view negates the influence of societal forces on a person’s ability to access educational and career opportunities.

  Calling someone a “subject” rather than an “individual,” however, does not imply that the Other only holds back an individual by coercing her to follow communally shared expectations. On the contrary, a term introduced by anthropologist McKim Marriott in 1976, that of the dividual, illustrates a form of subjectivity characterized by a deep connection between single persons and their community—bonds achieved through the give and take between single persons and the Other rather than the top-down imposition of a mode of being on those persons by the Other.

  While the postmodern concept of the dividual dates from the late 1970s, the idea that a self is constituted through other selves is an ancient one. Before the Christian Era, it was embedded in the Buddhist teaching “that anything which depends for its state on external factors must change when those conditioning factors change (anitya), and if no part of that thing is immune from dependencies, then to identify any essential protected nucleus of self must be mistaken (anātman).” It entered Christianity when the apostle Paul wrote, “we, being many, are one body in Christ, and every one members one of another.” It arrived in science with the publication of Carl Linnaeus’s Systema Naturae in 1735, in which organisms are classified hierarchically into kingdoms, phyla, classes, orders, families, genera, and species based on their structural similarities. Linnaean taxonomy raised “the question: if each species was created separately, as a literal reading of the Bible would imply, why were there structural similarities across species, genera, classes, etc.? Why was Linnaeus’ classification sensible?”

  A century later, Charles Darwin answered the question Linnaeus’s work had raised with the publication of On the Origin of Species. Darwin proposed natural selection as the mechanism that made possible the evolution of trilobites to tree ferns and hadrosaurs to hominids from primitive life forms that appeared on earth 3.5 billion years ago. Once the concept of the dividual entered the realm of living organisms, it took just “one small step for a man, one giant leap for mankind” before it was extended to non-living material objects all through the universe. Dividuality rings from Carl Sagan’s praise song to the Universe in the TV series Cosmos: “The beauty of a living thing is not only the atoms that go into it, but the way those atoms are put together. The cosmos is also within us. We’re made of star stuff. We are a way for the cosmos to know itself.” It reverberates in Neil deGrasse Tyson’s desire “to grab people in the street and say, ‘Have you heard this? The molecules in my body are traceable to phenomena in the cosmos.’”

  In its route from Buddhism through Christianity, the biological sciences, postmodernist thinking, and astronomy, the “self” has become multiplicitous and heterogeneous and never complete—it is, rather, ceaselessly becoming.

  The answer to “Who am I?” has gained—or regained—cosmic proportions.

  There are those—ranging from conservative Christians to humanities scholars of all stripes—who don’t like this “self” that cannot be pinned down. The former believe that the postmodern self is “a revolution against the God of the Bible” by “people no longer willing to comply with God’s guidelines [who] want to be free to do as they please” and the latter that it leaves single persons “bereft of origin and purpose.”

  The postmodernist self resonates with the way I experience myself. I’m never done; always changing; dependent on external factors that include people, events, and the matter of which my body consists; and, more often than not, not under the sway of reason. In this framework, “Who am I?” can never be fully answered. “Know thyself” is a shadow on the wall of Plato’s cave. However, in a universe that can mathematically be described as the three-dimensional shadow caused by the collapse of a four-dimensional star that resulted in the Big Bang, a shadow gives us a lot to work with: in the same way that astronomers read the history of the universe down to when it was only a hundredth of a billionth of a trillionth of a trillionth of a second old, the postmodern subject can be read “through its discourse, its actions, its being with other selves, and its experience of transcendence.”

  In the meantime, I cling to my dividuality for self-ish reasons that would better befit a Renaissance individual. In The Village Effect: How Face-to-Face Contact Can Make Us Healthier, Happier, and Smarter, developmental psychologist and journalist Susan Pinker cites neuroscience findings showing that “those who avoid dementia have the most complex and integrated social network.”

  Dementia Field Notes

  11-10-2011

  Peter and I are visiting Marissa and Adam in Chicago. Last night Marissa, Peter, and I were in the living room examining life in our usual jokingly serious way. Our topic: what would happen to Peter if I died before him. Adam was doing homework at his desk in the corner.

  Gerda: You’d better get a second wife.

  Peter: The kids will be far too fussy to accept anyone as my second wife.

  Gerda: The kids will be all too happy if you have your own company, because otherwise they have to deal with you being lonely and miserable.

  Marissa: The problem is, Dad, that your second wife will have to be able to talk technology and be interested in your electronic toys—

  Gerda: Unlike me.

  Marissa: I have an idea. We should outsource Dad’s second wife to India.

  We all elaborated on the idea of Peter finding a tech-savvy immigrant from India or anywhere else where someone might be interested in a “green card” marriage.

  Gerda: Those women can still cook. Maybe you can come to an arrangement of a number of years in which she would cook and care for you as “payment” for the green card and then go on with her own life.

  Peter: I love Indian food. But what about sex?

  “Sex” pops Adam out of his homework. He swivels his chair all the better to not miss a word.

  Gerda: That can be in the contract or not. What do I care?

  Peter: Maybe I love Mexican food even more.

  Gerda: As long as you don’t exploit one of my fabulous undocumented former students.

  Peter: But Layla* told us that her father always says, “Why do you fall in love with an undocumented boy? You should fall in love with someone who can make you legal!”

  Marissa: Mom, I think your last testament should require that Dad’s second wife must be older than me.

  On the day his father finally died of Alzheimer’s disease, novelist Jonathan Franzen noted that “in the slow-motion way of Alzheimer’s, my father wasn’t much deader now than he had been two hours or two weeks or two months ago.” Franzen is by no means the only relative of a dementia sufferer who has thought of his loved one as, in effect, dead while he was still alive. “Undead,” to be succinct. Not surprisingly, then, both scholarly and popular writing frequently refer to persons with dementia as zombies.

  In an essay titled “The Living Dead? The Construction of People with Alzheimer’s Disease as Zombies,” political scientist Susan Behuniak draws attention to the use of “the ‘undead’ metaphor” for people with dementia, which, she argues, magnifies the stigmatization already visited on such patients by the biomedical model of the disease. This model, she claims, positions someone with dementia “as a non-person, i.e. one whose brain has been destroyed by the disease and who therefore no longer exists as a person but only as a body to be managed.”

  In a survey of scholarly and popular writings about dementia, Behuniak notes that dementia is referred to, for example, as “death before death,” “the funeral that never ends,” “the mind robber,” and “a terror-inspiring plague.” Such observations are frequently directly connected with the term zombie as well. She goes on to compile a list of characteristics shared by zombies and people who have dementia, which includes “exceptional physical characteristics, lack of self-recognition, failure to recognize others, cannibalization of human beings… [and] overwhelming hopelessness that makes death a preferred alternative than [sic] continued existence.” She concedes that these attributes are accurate in describing people with dementia, since they often have a disheveled and badly groomed physical appearance and shuffling walk and are given to obsessive wandering. In regard to zombie cannibalism, she cites examples of a dementia patient who refers to his disease as “‘the closest thing to being eaten alive slowly’” and another who says in relation to his caretakers that “‘the unique curse of Alzheimer’s Disease (AD) is that it ravages several victims for every brain it infects.’”

  Behuniak’s examples come from publications that, like her essay, advocate more humane treatment of dementia patients: nursing and gerontology journals and writings by AD sufferers themselves or their relatives. She emphasizes that the purpose of her essay is to show that language is a powerful social force that has been and is still used to set apart those among us who lack some essential aspects of what society believes “a full human being” should look like. The result is that those who do not qualify are deemed not worthy of the full self-determination to which “normal” people feel themselves entitled.

  Given my lifelong love affair with language and having taught the major civil rights battles of the past century in my Gender Studies courses, I am aware of the role language has played in the efforts of disenfranchised communities to gain full civil rights. Activists in the various movements have invariably regarded a campaign against language used derogatorily against their group as a necessary part of their fight. The almost total elimination from polite language of the n-word, “bitch,” “queer,” “cripple,” and so on attests to their success. Given the unruly nature of language, however, some have reclaimed the former derogatory words as badges of honor but always with the clear expectation that only some people have a “right” to these words. Since I myself am walking the teetering plank toward the neither-dead-nor-alive identity of someone with dementia, I have some things to say about Behuniak’s proposed banishment of “zombie” in this context. First, though, a confession: I know hardly anything about zombies through firsthand experience. I have never seen a whole zombie movie. The visual image the word evokes in my head derives from a handful of YouTube clips.

  How the topic of zombies came up during a discussion in one of my Gender Studies classes, I do not remember. However, what is clearly imprinted in my memory is that a young man at the back of the room, who hardly ever spoke, corrected me when I had apparently used vampires and zombies interchangeably. That I had made an impression on my student, albeit a negative one, became evident at the next meeting of the class. To encouraging hoots and clapping from the other students, he presented me with my own copy of The Zombie Survival Guide, which I of course accepted with all the grace I could muster.

  It had been a matter of pride for me to read—or at least skim—the whole book before the next class so that I could redeem myself with an informed comment or two. To my surprise, I found Max Brooks’s fiction more interesting than its literary forebears. It also fascinated me that George Romero’s genre-inventing film, Night of the Living Dead, had gone completely by me when it debuted in 1968, while I was in the middle of my BS degree at the University of Pretoria, and that Dead Alive, which came out in 1992 when I was in the middle of my English PhD, had passed me like a ship in the night. I had clearly not been on the zombie ward’s phone list.

  The book was a glimpse into a cult of whose existence I had never heard—lovers of the zombie genre who particularly delight in the humor and self-mockery of a kinder, gentler new zombiehood, in which even Disney’s Ariel and Snow White are zombie princesses. Such, then, is the background against which I declare myself a zombie princess in waiting. I put my dibs on the name “Princess Doña Quixote.”

  Setting aside my compensatory attempts at humor, I want to talk about something that bothered me in Behuniak’s essay, namely her characterization of the biomedical model of dementia as necessarily evil. First, her definition does not at all accord with those one finds in medical dictionaries and scholarly medical and psychiatric articles. For example, MedicineNet defines dementia as a disease that leads to “significant loss of intellectual abilities, such as memory capacity, that is severe enough to interfere with social or occupational functioning,” and the National Collaborating Centre for Mental Health as “a clinical syndrome characterised by global cognitive impairment, which represents a decline from previous level of functioning, and is associated with impairment in functional abilities and, in many cases, behavioural and psychiatric disturbances.”

  It seems to me that Behuniak’s misrepresentation and disapproval of the biomedical model is the result of a categorical confusion. Philosophy, and its application in the sciences when it comes to the meaning of words, distinguishes between two different kinds of definitions: descriptive definitions, which depict a state of affairs, and normative definitions, which delineate the ethical action that derives from a state of affairs. The definitions of the biomedical model that I reproduced above are descriptive and not normative. Normative expectations derived from a state of dementia, or how we should act toward people with dementia, abound in the very nursing and other medical literature from which Behuniak derives her examples of what a climate of care should look like in relation to dementia. Such normative definitions are known as biopsychosocial models and are similar to those advanced by Thomas Kitwood and the other person-centered theorists Behuniak cites: they advocate approaches to dementia that assume “the person is present and approaches AD as a condition shaped and defined by the social and interpersonal contexts rather than by neurological changes alone.”

  Despite the widespread adoption during the 1980s of the biopsychosocial model of “health”—including “health” in the context of brain-destroying diseases—by large medical organizations in the United States as well as in other countries where Western medicine is practiced, the change in language has not yet permeated enough of medical practice, though it has made many inroads. The problem in the case of dementia caretaking, therefore, can be ascribed to a lack of financial resources for training and for paying living wages as much as to negative language.

 
