Reclaiming Conversation


by Sherry Turkle


  I sat there and stared for I don’t know how long. Last year at this time I had an appointment scheduled with an orthopedist to sort out the weird things going on with my hands. Maybe I should have had a sense of what was to come but I didn’t. Months later my family’s life was changed. I was diagnosed with ALS. No treatment. No cure. Good luck. At the start of 2013 I had no idea.

  Sid did press the button to see Facebook’s version of 2013. Not surprisingly, it did not capture what the year had meant to him. There was no helping it: The timeline “had to treat some things a bit too matter-of-factly. A picture of my son’s first birthday and then a post sharing my ALS diagnosis with a nice musical transition in between.”

  Sid begins his letter to me by suggesting that no automatic system could ever understand a life with ALS—the way that event changes the meaning of everything that came before and everything that comes after. But then, in his letter, Sid backtracks. Perhaps what Facebook offered “did capture 2013 for me.” That year had been about intolerably fast cuts. It had been about moving too quickly from the normalcy of birthday cake to a doctor’s office and a death sentence. Facebook had captured this, “but it is too stark a contrast to comfortably watch. . . . I couldn’t watch the video knowing that the next post in the montage might be too big of a change in gears.”

  Sid’s experience illustrates the complexity of using the products of algorithms to think about the self. To understand what might be really important to Sid, you needed a person who could imagine living with a terminal illness. A person might understand that too-stark contrasts would be painful. But the machine-curated images did get Sid to think about his year in a new way. Facing death is about the surreal contrast between buying balloons for a birthday party and the certainty of nonexistence. The fast cuts of the Facebook postings got Sid thinking about that. The Facebook algorithm wasn’t written to have this effect. This is what a human being did with its results.

  Reading Sid’s email, I thought: It is never bad to have a new evocative object. What matters is how we use it. But the objects of our lives do constrain how we tell our stories. On Facebook and Twitter, we want to tell stories that others will like, ones that will be followed. I am often told that “Twitter is my memoir; Facebook is my way of keeping a diary,” but as Melissa’s case made so clear, shared journaling, like all publishing, leaves us vulnerable to the natural desire to gratify our readers. And when we use devices that track our physical state to provide clues for self-understanding, we work with another constraint: We try to find a narrative that fits our numbers.

  In one common arrangement, wearable technology collects data that track such things as our heart rate, respiration, perspiration, body temperature, movement, and sleep routines. These data can go straight to a display on our phones, where we can use them to work toward physical self-improvement. A readout of how many steps we’ve taken can encourage better exercise habits. In another kind of tracking app, physiological signs are used as windows onto our psychological state.
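  To make the mechanics concrete: a step readout reduces, at bottom, to sums over time-stamped samples and comparisons across windows. Here is a minimal sketch in Python; the data format is invented, and real devices expose their own interfaces.

```python
# A minimal sketch of the arithmetic behind a step readout: sum the
# samples in a window, then compare windows. The data format is invented;
# real devices expose their own interfaces.

from datetime import date, timedelta

# Hypothetical daily step counts keyed by date
steps = {
    date(2015, 3, 2): 4200,
    date(2015, 3, 3): 6100,
    # ... one entry per day of wear
}

def weekly_total(step_log, week_start):
    """Sum steps over the seven days starting at week_start."""
    return sum(step_log.get(week_start + timedelta(days=d), 0) for d in range(7))

this_week = weekly_total(steps, date(2015, 3, 2))
last_week = weekly_total(steps, date(2015, 2, 23))
print(f"This week: {this_week} steps ({this_week - last_week:+d} vs. last week)")
```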

  Here, the desire to wear a tracking device responds to the same impulse that had nearly everyone in my generation buying a mood ring. The difference is that even though the ring was fun, it had no authority. The new tracking devices come with substantial authority. We develop a view of ourselves (body and mind) that is tied to what measurement tells us.

  While some tracking applications use sensors to read your body for you, others ask you to report your mood or degree of focus or the fights you are having with your partner. Over time, there is a subtle shift. In some sense, “you” become the number of steps you walked this week compared to last. “You” become a lowered resting heart rate over the span of two months. You move to a view of self as the sum, bit by bit, of its measurable elements. Self-tracking does not logically imply a machine view of self, or the reduction of self-worth to a number, but it gets people in the habit of thinking of themselves as made up of measurable units and achievements. It makes it natural to ask, “What is my score?”

  In the 1980s, I wrote of the movement from the psychoanalytic to the computer culture as a shift from meaning to mechanism—from depth to surface. At that time, as computation gained ground as the dominant metaphor for describing the mind, there was a shift from thinking about the self as constituted by human language and history to seeing it as something that could be modeled in machine code.

  Today’s “quantified” or “algorithmic” self is certainly part of that larger story but adds something new. Instead of taking the computer as the model for a person, the quantified self goes directly to people and asks each of us to treat ourselves as though we were computational objects, subject to a printout of our ever more knowable states. The psychoanalytic self looks to history as it leaves traces in language; the algorithmic self to what it can track as data points in a time series.

  Numbers and Narration

  Numbers are seductive. People like thinking about themselves in terms of readouts and scores. This is not new. We have always been drawn to horoscopes, personality tests, and quizzes in magazines. Benjamin Franklin famously included a self-tracker in his autobiography, measuring himself on thirteen personal virtues every day. The difference now is that there is, as we say, “an app for that”—indeed, for all of that and more. More and more of our lives—body and soul—can be captured as data and fed back to us, analyzed by algorithm. And in the process, we are usually asked to treat both ourselves and the algorithm as black boxes.

  We see the frustration of having a number without a narrative in Trish, twenty-one, who uses an online journaling program called 750 Words. Every day, Trish writes 750 words and the program analyzes what she has written. It compares her daily writing to what she has written before and to the universe of all the other people writing. It rates her words—or as she sees it, it rates “her”—on maturity, on sexual content, on the violence in her writing, and on how much she swears. And it gives her a reading of her preoccupations. When I talk to Trish, she is confused. One day last week, 750 Words told her that her daily writing exercise had shown her to be preoccupied with death.
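  I do not know the internals of 750 Words, but the simplest version of this kind of analysis is dictionary-based word counting: rate a text by the fraction of its words that fall into hand-labeled categories. A hypothetical sketch:

```python
# A hypothetical illustration of dictionary-based text scoring, the
# simplest way a program could rate writing on a theme such as "death."
# The word lists are invented; 750 Words' actual method is not shown here.

CATEGORIES = {
    "death": {"death", "died", "dying", "funeral", "grave", "loss", "gone"},
    "swearing": {"damn", "hell"},  # abbreviated for the example
}

def score(text):
    """Return, for each category, the fraction of words that match it."""
    words = [w.strip(".,!?;:'\"").lower() for w in text.split()]
    total = len(words) or 1
    return {cat: sum(w in vocab for w in words) / total
            for cat, vocab in CATEGORIES.items()}

entry = "I felt the loss of that conversation, like something had died."
print(score(entry))  # {'death': 0.18..., 'swearing': 0.0}
```

  If the program works anything like this, Trish’s bafflement is easy to understand: a word like “loss,” used about a misunderstanding with a friend, can trip a “death” category without death ever being on the writer’s mind.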

  Trish is a study in contrasts. A competitive athlete and a philosophy student, she wants to go to drama school when she graduates from college. She became acquainted with feedback devices when she bought a Fitbit, a popular commercial product that provides readings on daily steps taken, calories burned, and sleep quality. From there, she became curious about programs that would give her other kinds of feedback. When I met her, she had spent six months working with 750 Words.

  On the day Trish was told that she was preoccupied with death, she had used her 750 words to describe a conversation with a friend that had left her feeling misunderstood. Trish said it felt good to write it down. But then, alone with the program’s readout, she felt frustrated. She didn’t understand what her misunderstanding with her friend had to do with death. She wanted to understand the algorithm.

  It’s shocking that I write about death more than others. Actually, I don’t mind the comparisons to the rest of the world’s writing. What’s hard are the comparisons to myself. It’s hard not to take it personally, so it gets me thinking. The death thing is really strange. It makes me wonder why it thinks that.

  What most frustrates Trish is that once the program gets her thinking, there is no place to take her thoughts or her objections. Trish says, “It’s not like the program is my therapist. There isn’t a relationship. I can’t talk to it about why it feels that way. I don’t feel that I’m thinking about death.” And even if 750 Words could tell her which words had “triggered” the program’s reactions, she is not sure that would help. Trish wants a conversation.

  Technology critic Evgeny Morozov points to the limitations of the kind of data that Trish has been left with. A narrative has been reduced to a number. And now the number seems like a result. Morozov fears that when you have your readout from the black box, you are tempted to stop there. You are pleased. Or you are upset.

  But as we become more sophisticated about the kinds of data that self-monitoring devices return to us, that first impulse need not be our last impulse. We can construct narratives around our numbers. Indeed, in Trish we see the impulse to do that. (“It makes me wonder why it thinks that.”) And in meetings of those who declare themselves to be part of the “quantified self movement,” people do bring in data from sensors and programs and attempt to construct stories around them.

  In this spirit, a recently divorced woman in her thirties posted a blog of self-reflection and called it “The Quantified Breakup.” In the days and months after her divorce, she tracked the number of texts she wrote and calls she made (and to whom), songs she listened to (classified happy or sad), places she went, unnecessary purchases and their costs. She tracked her sleeping and waking hours, when and for how long she exercised, ate out, and went to the movies. When did she cry in public and post on social media?

  Reading this material is arresting. Yet as I read her blog, it seems like raw data for another story about what the purchases and the tears and the songs mean. Does this experience bring her back to other times when she has felt alone? To other losses? What strategies worked then? What potential stumbling blocks can be determined from her history? What kind of support does she need? On the blog, there isn’t much of this kind of talk. But we do learn that when she tried online dating and met someone she liked, they exchanged “1,146 [texts] in the first four weeks alone, an average of 40.9 per day.” And then it was over. What can we make of this? What can she? The numbers of “The Quantified Breakup” need their narrative.
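  Her average, at least, is easy to verify, if we take “four weeks” to mean twenty-eight days:

```python
# Checking the blog's figure: 1,146 texts over four weeks (28 days)
texts, days = 1146, 4 * 7
print(round(texts / days, 1))  # 40.9
```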

  I have a similar reaction to quantified self enthusiasts who have a death in the family and numerically track their period of grief with the expressed intent of not wanting to skip over any part of the mourning process. The impulse is admirable, moving. But one wonders if in tracking their grief, they keep themselves too busy to feel it. Does taking our emotional pulse and giving it a number keep us on the feeling or does it distract us because, once categorized, we have done something “constructive” with the feeling and don’t have to attend to it anymore?

  Does tracking mourning help us mourn, or does it deflect us? If we feel we must start and end our story with a number, we limit the stories we tell.

  Natasha Dow Schüll, the anthropologist, is doing an ethnographic study of the meetings of the quantified self movement. At these meetings, members of the movement who have been “self-tracking” stand up to tell their stories. Schüll writes: “The defining activity of QS [the quantified self movement] is its Show and Tell events, in which individual self-trackers get onstage and tell a story about what they tracked, what they learned, etcetera.” Schüll is impressed by the QS Show and Tell. She asks, “Aren’t numbers just an element in a narrative process?”

  The answer for me is not simple. Numbers are an element in a narrative process, but they are not just an element. When we have a number, it tends to take on special importance even as it leaves to us all the heavy lifting of narrative construction. Yet it constrains that construction because the story we tell has to justify the number. Your quantified data history can provide material for constructing a story. But here, our language betrays us. We talk about the “output” from our tracking programs as “results.” But they are not results. They are first steps. But too often, they are first steps that don’t suggest second steps.

  For if the program’s results make no sense to us, we have no place to go. So when 750 Words gave Trish a “result” that baffled her (she doesn’t think she has morbidity on her mind), it provided no further guidance and no interlocutor. Trish is left puzzled, with no way to understand why, by the numbers, her words are associated with death.

  I talk about tracking and self-reflection with Margaret E. Morris, a psychologist at the Intel Corporation who for over a decade has worked on applications that help people record and visualize their emotional and physical health. When Morris considers her work over the years, she says that what strikes her about the feedback devices she has made is that “they are most powerful as a starting point, not an ending point,” and that “every one of them started a conversation.” In terms of making a difference to health and family dynamics, it was the conversation that brought about change.

  Morris says that sometimes the conversation was begun by a family member or friend. In one of Morris’s cases, a woman housebound by chronic illness was asked to report her moods to a mobile phone app. Several times a day, this program, called Mood Map, asked the woman to indicate her mood on a visual display. When she was sad, the program would suggest techniques drawn from cognitive behavioral therapy that might help her see things in a more positive light. In this case, it was the patient’s son who used the Mood Map to start a conversation. The technology gave him an opening to talk about his mother’s loneliness, something he had not been able to do on his own. Morris sums up: “To the extent that these technologies have an impact, it is because they spark conversations along the way.”
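  The sketch below is not Morris’s code; it is a hypothetical reduction of the pattern she describes, in which a low self-reported mood triggers a prompt drawn from cognitive behavioral therapy. The scale, the threshold, and the prompts are all invented.

```python
# A hypothetical reduction of the pattern Morris describes: sample a
# self-reported mood and, when it is low, offer a reframing prompt drawn
# from cognitive behavioral therapy. Scale, threshold, and prompts invented.

import random

CBT_PROMPTS = [
    "What evidence supports the thought behind this feeling?",
    "Is there a more balanced way to read the situation?",
    "What would you say to a friend who felt this way?",
]

def check_in(mood, low_threshold=4):
    """Mood on a 1-10 scale; return a reframing prompt if mood is low."""
    return random.choice(CBT_PROMPTS) if mood <= low_threshold else None

prompt = check_in(mood=3)
if prompt:
    print(prompt)
```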

  Performing for and Deferring to the Algorithm

  Linda, a thirty-three-year-old business student, is more enthusiastic than Trish about her experience with 750 Words because Linda sees the program as dispensing a kind of therapy. She began using the program when she was under stress, coping with academic pressure, life in a new city, and not having as much money as she did when she was working. As she tried to get her life in order, Linda wanted to know how she was doing, and 750 Words’ algorithms promised that they would report on how affectionate, happy, upset, anxious, or sad she was. But after a few weeks with the program, Linda is disgruntled: “Who wants to pour out your heart and find out that you’re a self-important introvert? Who wants to be told that you’re sadder than most other users? And not only that, but you’re not as happy as you were last week?”

  But Linda also sees an upside. She says that after two weeks of the program’s “constructive criticism,” the program has begun to “train” her. She now writes what she thinks the program would like to hear. She makes an effort to be upbeat and to talk more about other people in her 750 words. Linda says that according to the program, she’s not as self-important as she once was.

  I’m in a group where Linda discusses her relationship with 750 Words. The question comes up as to whether Linda’s approach is making her a better person. Sure, she is gaming the system, but maybe the system is gaming her—in a good way. Is this therapy? Is writing a positive version of your day every day a bad thing? Someone says, “I believe in the idea of ‘fake it till you make it.’” Research shows that the act of smiling itself triggers the release of chemicals associated with happiness. Linda believes that if she consciously talks more about other people, she may in fact become less self-absorbed. So what starts as an exercise in self-reflection ends up, at least for Linda, as behavioral therapy.

  Trish and Linda face the same dilemma: What to do if your feelings don’t match the readout. Cara, a college student who has been using an iPhone app called the Happiness Tracker, has a different problem. How much should you look to the “output” of a tracking program to clue you in on your feelings? Over several weeks, the Happiness Tracker has asked for Cara’s level of happiness as well as information about where she is, what she is doing, and who she is with. Its report: Her happiness is declining. There is no clear link to any one factor.
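  Mechanically, a report like this probably amounts to little more than a trend line plus a set of weak correlations. A minimal sketch, with invented data and factor names:

```python
# A minimal sketch of what "happiness is declining, no clear link to any
# one factor" might compute: a trend over time plus per-factor
# correlations. All data and factor names are invented.

from statistics import linear_regression, correlation

days         = [0, 1, 2, 3, 4, 5, 6, 7]
happiness    = [7, 7, 6, 6, 5, 6, 4, 4]   # self-reported, 1-10 scale
with_partner = [1, 0, 1, 0, 0, 1, 0, 1]   # context: 1 = together that day

slope, _ = linear_regression(days, happiness)
print(f"trend: {slope:+.2f} points per day")        # negative: declining

r = correlation(with_partner, happiness)
print(f"link to partner's presence: r = {r:+.2f}")  # near zero: no clear link
```

  A correlation near zero is what “no clear link” looks like from inside the program; the numbers cannot distinguish a partner who causes distress from one in whose safe company distressing conversations happen.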

  When she gets this result, Cara finds herself feeling less happy with her boyfriend. The app did not link him to her declining happiness, but she begins to wonder if he is the cause of her discontent. Uncertain of her feelings, she ends up breaking up with her boyfriend, in partial deference to the app. She says what she got from the tracker was like “a tipping point.” It felt like something external that “proved” she was not on the right path.

  In “happiness tracking,” a lot can get lost in translation. Everything depends on how you interpret what the app is telling you. If Cara had brought her “discontented” reading to a psychotherapist, she might have been asked if she and her boyfriend talked about difficult issues—not necessarily things to avoid, but things that upset her because they were painful to deal with. Perhaps it was with her boyfriend—because she felt safe—that she allowed stressful conversations to occur. That might be a good thing, not a bad thing. Perhaps the “discontented” reading was a sign that he was, on balance, a positive force even if his presence provoked feelings that the program registered as stress. Not all discontent is equal. Some of it brings us toward new understandings.

  As things played out, Cara’s “happiness tracker” didn’t lead to this kind of reflection. Indeed, she saw the number she got from the program as a “failing grade” and it sparked a desire to get a better one. It pushed her into action. But without a person with whom to discuss the meaning behind the number, without a methodology for looking at her current feelings in relation to her history, she was flying blind.

  Insights and Practices: The Psychoanalytic Culture

  As a psychologist, I was trained in a technology of talk, the conversational technology of the psychoanalytic tradition, which would suggest a different perspective to the unhappy Cara. These days, classical psychoanalysis has had many of its ideas taken up in non-classical treatments, usually referred to as “psychodynamic.” Here I’ll call them “talk therapy,” with the understanding that this is the kind of therapeutic conversation I mean. In contrast with technologies that propose themselves as quantitative mirrors of self, talk therapy offers interpretive strategies to understand your life story. Here I mention two to give a flavor of the kind of conversations that talk therapy encourages.

  A first strategy is not to take words literally but to have patience with them. Wait and see where words lead you if you let them take you anywhere. The therapist creates a space for a kind of conversation that encourages you to say what comes to mind without self-censorship. An algorithm asks for specifications. In talk therapy, one is encouraged to wander.

 
