Unfriending My Ex: And Other Things I'll Never Do
The constant distractions and all our time online are clearly affecting our brains and may even be leading to new challenges in learning. Matt Richtel’s 2010 New York Times article “Growing Up Digital, Wired for Distraction” featured a group of bright kids who were failing many of their classes because they did not have the attention span to finish the assignments, and in some cases even forgot to do homework. They were consistently plugged in—surfing the Web, texting, playing video games—and younger brains, which are still developing, get used to this behavior. Richtel wrote, “ ‘Their brains are rewarded not for staying on task but for jumping to the next thing,’ said Michael Rich, an associate professor at Harvard Medical School and executive director of the Center on Media and Child Health in Boston.” Dr. Rich and other experts are worried that staring at screens will rewire kids’ brains, with harmful and lasting effects. Teachers are concerned that their students can’t concentrate at all and that they are leaving high school with less-than-ideal reading, writing, and discussion skills. Some teachers are resorting to reading books aloud in class because students can’t focus long enough to read twenty pages of a chapter at night.
These detrimental effects are more obvious in developing brains but can be seen in adult brains as well. Nonstop distraction hinders productivity. According to a 2011 study by Cisco, 24 percent of college students and young professionals “experience three to five interruptions in a given hour, while 84 percent get interrupted at least once while trying to complete a project.” Further, a recent study of university students found that those who multitasked heavily in a variety of media—texting, instant messaging, Facebooking, and tweeting while at work or a social gathering—were less likely to process information in a meaningful way. They had slower response times, were more easily distracted by irrelevant information, were unable to switch tasks easily, and retained useless information in their short-term memory. In other words, they may not have been born with ADD, but it certainly seems as though they acquired it.
Perhaps we were never meant to multitask. After all, according to the aforementioned study of university students, “processing multiple incoming streams of information is considered a challenge for human cognition.” Further, psychiatrist and author Edward M. Hallowell describes multitasking as a “mythical activity in which people believe they can perform two or more tasks simultaneously as effectively as one.” It’s why your mom told you to turn off the TV while doing your homework, why some companies are now preventing their employees from using some social media sites, and why people have died while texting and driving. As Dr. Richard Cytowic explains on his Fallible Mind blog, “The same inefficiency that freezes up your computer bogs down a brain when it is forced to divide attention among multiple tasks . . . In a world of nonstop distraction, you may be able to juggle things for a while, but you can’t keep it up; it simply takes more energy and bandwidth than we have.”
Never giving our brains a break is dangerous; according to the New York Times article by Matt Richtel, scientists in California found that rats were able to develop permanent or long-lasting memories after experiencing something new only if they rested. No one likes to be compared to a rodent, but we all need to power down in order to process our experiences in a valuable way, to retain what we have learned and establish the memory. Other research shows that taking a quick rest will actually enhance our memory. As reported in Psychological Science, two groups of individuals listened to a story, after which one group played a video game and the other shut their eyes for about ten minutes. The study found that “memory can be boosted by taking a brief ‘wakeful rest’ after learning something verbally new and that memory lasts not just immediately but over a longer term.” Apparently, whatever we do in the short time after we learn something new will determine the quality of our memory. We don’t necessarily need to take a nap—we just need to take a break from all the noise. We need more Thoreau-inspired experiences. We need to find our own Waldens. A University of Michigan study revealed that walking in nature helped people learn more effectively than walking through a busy urban environment, which may mean that our brains get fatigued from an onslaught of information. I can tell which chapters of this book I wrote at my apartment in New York City versus the ones I wrote out in the country at my parents’ house. I notice that I have a harder time finding my voice in the chapters written in the oversaturated and bustling city. You’ll probably notice too. Being in the silence of the country allows me to relax just enough so that I actually absorb what I am writing and how it sounds. This type of downtime is essential for our brains to work better, but in a constant state of stimulation, we’re not allowing ourselves to have it.
In addition to making us less responsive to people we love and perhaps a bit dumber, our addiction also makes us do some pretty crazy things. Thirty percent of the people I talked to seemed alarmed when reading a sentence in which the word blackberry referred to a fruit, almost half knew how to drive with their knees so that they could text and drive, and just over 20 percent admitted to buying only fingerless gloves because it’s too hard to text while wearing regular gloves or mittens. One Christmas, I actually cut the fingers off a beautiful pair of cashmere gloves my mother bought me so that I could freely type on my phone during my wintertime commute. I am still disturbed by this, though apparently not disturbed enough to have refrained from specifically asking my parents for fingerless gloves the following Christmas.
I am admittedly one of those people who tend to lose things easily and frequently. I may lose my wallet five times in a year, but it’s almost impossible to lose something that I check every two to three minutes, so I finally arrived at the brilliant idea that if I actually turned my iPhone into a wallet, I wouldn’t lose anything. I attached an adhesive pocket to the back of my iPhone to serve as a wallet, and it has functioned that way for over a year now; I have yet to cancel any cards or take that arduous trip to the DMV to replace my license.
A few years ago, I sat on a panel at South by Southwest about teenage cell phone use in America. When one of the speakers mentioned that he missed the good ol’ days when people used to put down their phones during dinner and pay attention to their friends instead of texting or scrolling through Facebook, the room lit up with excited nods and chants of “Yes!” The audience included smartphone addicts like me, bloggers, and digital media professionals—basically all the kinds of people who annoy you at dinner because they can’t put down their devices. Yet all of us were agreeing enthusiastically that we hated how much our dinner companions and friends constantly ignored us. I wondered if some of the people nodding in staunch agreement were sort of guiltily admitting that they are often the ones who are too busy tweeting, Instagramming, e-mailing, Tumblring, Facebooking, BBMing, or Snapchatting to give their friends and family the attention they deserve—I know I was.
We hate ourselves for using these things so much, but we learn to live with the guilt—we are relieved instead of aggravated or insulted when others take out their phones at dinner, because it means we can too. It’s like when you want to cancel plans with someone but are dreading that awkward e-mail and then they send you a text canceling before you have the chance to! The best. That is how I feel when I see a friend take out her phone at dinner. What a relief. I can now reach for mine. We can remember when we were focused and attentive, and it bothers us, but that doesn’t mean we will stop.
While on the panel, I began to notice how the reactions differed throughout the audience. The group consisted mainly of people in my age group, between twenty-five and thirty-five, but there were also several teenagers, as well as a few people who were at least forty or fifty. When the complaints about tech and smartphone addiction were raised, those in their midtwenties and early thirties were by far the most passionate, responding as if we were all inmates of the same prison: we remembered what life had been like before, and we were dumbfounded by how we had let ourselves become captive to the devices that now run our lives. In contrast, the younger members of the audience seemed less annoyed, at times almost nonchalant and unaffected. I guess it makes sense if you consider that these digital natives haven’t known life any other way. But what really surprised me was that the older people in the room, those who had spent much more time in their lives without such technology, were just as affected by its hypnotizing pull.
I guess I shouldn’t have been surprised. Of all the people I talked into joining Foursquare (my parents, seven friends, and two coworkers), my dad was the one who became the most addicted. Foursquare is the location-based social media game that crowns a person “mayor” of a location once they have visited and “checked in” there more than anyone else. It works with your phone’s GPS, so you need to actually be at or very near the place where you are requesting to “check in.” When someone checks in more than you, Foursquare sends you an e-mail saying that you’ve been “ousted” as the mayor. The other day I was ousted from my mayorship of the Amanpulo resort in the Philippines. It destroys me that I will likely never get it back and that there is nothing I can do about it.
My father is a retired Wall Street sales trader with a serious competitive streak. He and my mother are happily married and live in Bridgehampton. They have the kind of connected relationship and home life I aspire to emulate. Nonetheless, thanks to Foursquare, he became wildly obsessed with becoming mayor of as many places in the Hamptons as possible. Most days he would wake up around six A.M. to play a round of golf, then drive through town, checking into Bobby Van’s, Candy Kitchen, Starbucks, Hampton Coffee Company (yes, that’s two coffee places), Pierre’s, and the bank, in addition to any other place he actually needed to be. He even became mayor of long-term parking at JFK International Airport for two months, a mayorship he felt particularly proud of. I think my favorite aspect of my father’s Foursquare addiction, however, came immediately after he realized that one of the perks of being the mayor of certain locations, like Starbucks, was that you got special deals. At the Bridgehampton Starbucks, my dad learned, his mayorship granted him one free coffee per day. He would strut into Starbucks, order coffee at the counter, and when the cashier asked him to pay, he would whip out his phone, say something weird like “not so fast,” and flash his Foursquare deal, winning his free coffee. It was straight out of a Seinfeld or Curb Your Enthusiasm episode. He had gamed the system. He had won.
My mom and I weren’t concerned about this new obsession; we were more amused—this was so in line with my dad’s personality, and we enjoyed teasing him about it. When I visited my parents shortly after introducing my father to the game, he took over our dinner conversation, venting his frustration that someone named “Ian Z.” was still mayor of Bobby Van’s. My dad just couldn’t seem to steal the mayoral title, even though he checked in at least three times a day. The day he finally became mayor was great: We had steak to celebrate, and Ian Z. sent my dad a friend request on Foursquare—maybe out of respect or maybe out of pure curiosity. Ian Z. must have felt the way Andre Agassi felt when Pete Sampras beat him: completely floored, humbled, and exhilarated. I thought that with this victory, my father’s tenth virtual mayoral title, his obsession would die down. I was wrong.
The next week, my father went to work out at the gym, where he was the mayor and was always greeted with open arms by its staff, who couldn’t seem to understand why a man who went to the gym only four times a week was mayor while they, who went every day, were not. Clearly they had no idea about his late-night and early-morning drive-bys. In any case, thirty minutes into the session, my dad’s trainer said, “Ray, I gotta ask you a question.”
My father, unsuspecting, said, “What’s up?”
“Well, the other day, I was heading into Citarella in Bridgehampton and I saw you drive into the parking lot, stop for about forty-five seconds, then pull out again and drive away. You weren’t checking in on Foursquare, were you? Because you know that’s cheating.”
My dad swore to quit Foursquare on the spot—well, as soon as he had stolen the last mayoral title (for Bridgehampton Cemetery—who wants to be mayor of dead people?) from his archnemesis, Ian Z. On November 2, 2010, my father became the mayor of the cemetery, and he quit Foursquare the next day. Even though I was happy he had the strength to quit, I was also helplessly and absurdly proud that my own dad had become the virtual mayor of all the restaurants and most of the bars I went to in the Hamptons.
Foursquare and its virtual victory quest took over many of my loved ones’ lives for a period, not just my dad’s. A few of my friends would go out at night even when they didn’t want to, just so they could check into places and reinstate their mayorships, or would travel miles out of the way just to get new Foursquare “badges.” If we were a few visits away from becoming mayor, we would aim to go to a specific part of town just to check into whatever bar, hotel, or restaurant we wanted to be mayor of. Sometimes it was for bragging rights; other times there were incentives, like prizes that were blatant marketing ploys. We were addicted to the faux connection, to the distraction.
Many think that, just like Friendster and Myspace before it, Foursquare is quickly becoming irrelevant. Now that people can link their Instagram and Foursquare accounts and tag locations on their photographs, there is little reason to sign directly into the Foursquare application. Like many of its predecessors and many that will follow, Foursquare was meaningless, pointless, and completely addictive while it lasted. But it has died down and may become obsolete, paving the way for newer, shinier social media like Instagram, Tinder, and Snapchat. One day those will be rendered obsolete as well, when we find something else we love more and chase it down onto the subway tracks.
3
Facebook Is Ruining My Life
As I was reading Walden and reacquainting myself with Henry David Thoreau’s thoughts on the joys of solitude, I tried to remember the last time I’d spent any time by myself—truly by myself, with only my thoughts to occupy my mind, no iPhone or iPad or computer to distract me.
I thought about the time right after Samantha, my long-term girlfriend, broke up with me, and how I had done everything possible to avoid confronting my feelings. The healthy reaction would have been to sit by myself and reflect, as I had in high school or the beginning of college when going through other difficult times. But instead of dealing with how I felt, I self-medicated by staying constantly connected: over the course of ten days I e-mailed all of my friends, signed on to Gchat, texted, tweeted, FaceTimed, and checked Facebook hundreds, perhaps thousands of times.
What had seemed like a blessing of distraction was a curse in disguise. I realized that I had not experienced anything like Thoreau’s idea of solitude in six years—since I first got a smartphone.
Alone time is a chance to contemplate what’s going on in my life or where I am mentally or emotionally. It’s a time to figure things out, when no third parties are interrupting or hijacking my thoughts. I think I used to be more secure when there was more bandwidth for alone time. Spending time with just me made me like me more. I got to know myself better, and so I would know how best to handle challenges, disagreements, and times of strife. The more time I spent anxiously typing away on my smartphone and being my virtual self on social media, the less close I felt to my core, the part of me that made the best decisions, the part of me that was truly the best I could be. I always loved Thoreau’s words “I have a great deal of company in my house; especially in the morning when nobody calls.” Thoreau was not a hermit; he just understood the importance of a divide between oneself and the world at large. “Individuals, like nations, must have suitable broad and natural boundaries,” he wrote. He complained once about a friendship, saying, “We meet at very short intervals, not having had time to acquire any new value for each other.” Sounds familiar. Everyone we’ve ever met in our lives is just a click away, and if we don’t want to think about something difficult, we can text; write an e-mail; check Facebook, Snapchat, or Instagram; scour YouTube; play a video game; make plans—we don’t have to be alone if we don’t want to be. True solitude has become uncomfortable for us.
It’s been said that Thoreau was the most content man alive because he had found balance and stability in total solitude. The ultimate transcendentalist—he believed in the goodness of man and nature—Thoreau lived a life without distraction (granted, this was 1845, long before the phonograph or the telephone) in natural surroundings next to Walden Pond in Massachusetts. Before Thoreau, many famous theorists and great religious figures sat in seclusion in order to connect with and speak to their spirit guides; prophets, sadhus, and yogis conducted their visionary experiences and trances in the desert, a cave, or some other place that allowed for absolute solitude.
Ralph Waldo Emerson, another transcendentalist, described how being alone could bring you a deeper appreciation of friends and society: “The soul environs itself with friends, that it may enter into a grander self-acquaintance or solitude; and it goes alone, for a season, that it may exalt its conversation or society.” Emerson believed in friendship, but he also valued solitude. We need our alone time in order to be functional and emotionally aware in our relationships, at work, and in friendships; that is how we can become better people and be introspective, self-analytical, and reflective—all those things that make us human.
An emerging body of research in the field of clinical psychology suggests that we should be spending more quality time alone. In an article titled “The Power of Lonely: What We Do Better Without Other People Around,” Leon Neyfakh states that “spending time alone, if done right, can be good for us—that certain tasks and thought processes are best carried out without anyone else around, and even the most socially motivated among us should regularly be taking time to ourselves if we want to have fully developed personalities and be capable of focus and creative thinking.” Proponents of solitude claim that if we want to get the most out of the time we spend with other people, we need to spend some time away from them too. The ol’ saying “Absence makes the heart grow fonder” is more deeply true than any of us ever understood.