The Winter of Our Disconnect


by Susan Maushart


  Q: Has The Experiment changed your relationships at all?

  SUSSY: Anni and I are like we used to be. We’re tight again.

  Q: Is that out of desperation, or . . .

  S: Probs! (Laughs)

  Q: Seriously, how’s it different?

  S: It’s like, we chillax. We tell each other stuff now, like we used to. She helps me. I help her. We play the dice game. We play our creepy little ring game....

  Q: What about Bill? How’s your relationship with him changed?

  S: Um, I want to kill him more, because of the sax. It’s just. Soooo. LOUD!

  We’ve got nothing against the Internet, but when people are surfing the Web, they’re missing the best part of life—being together! That’s why we created the first Web site devoted to helping people spend less time online and more time with each other. For starters, we’ve allocated just enough time to browse every link, but not a second more. So enjoy your three minutes, then get out there and make face time. Chop, chop. Time starts now.

  —“Make Facetime” promotion, Dentyne.com

  A website devoted to helping people spend less time online? Well, I guess I’ve heard of stranger things. Like that after-school show where the hosts are constantly urging kids to get outside and ride their bikes. Seriously, it makes Huxley’s Brave New World look like a press release for Pfizer.

  I, too, wanted to help people spend less time online, just like the folks who brought you minty-fresh breath. If only I’d thought of creating a website instead!

  When McCann Erickson created the “Make Facetime” campaign for Dentyne brand-owner Cadbury in September 2008, they were aiming straight for the kisser. The idea that the ads could induce under-twenties to swap their technology for a stick of gum and a good old-fashioned chin-wag was always going to be hard to swallow . . . not to mention impossible to digest. (“I think most college kids would roll their eyes,” commented one sociologist drily.) But the fact that it was tried at all is interesting. So is the site itself, which includes a couple of not-entirely-user-friendly social-networking utilities—well, they stumped this Digital Immigrant—and something called the Smiley Chamber of Doom, which shows animated emoticons being maimed and tortured. (I’m down with that. ☺) Oh, and it really does cut out after three minutes. Which is kind of amazing and also kind of annoying—especially if you happen to be taking notes.

  Closer to home, Dôme Coffee—the Australian-owned café chain—is running a series of magazine ads with a strikingly similar anti-tech/pro-talk theme. “One friend face to face beats 100 on Facebook,” reads a Confucius-like headline on a recent full-page ad. It features a photo of a cluttered lunch table and two broadly smiling women friends in their thirties . . . staring at mobile phone screens. (I show the ad to Suss. “See anything wrong with this picture?”

  “Is it something about feminism?” she asks, warily.)

  The information paradox—that the more data we have, the stupider we become—has a social corollary, too: that the more frantically we connect, one to another, the more disconnected our relationships become. We live in an age of frenzied “social networking,” where up to a quarter of us say we have no close confidante, where we are less likely than ever before to socialize with friends or family, where our social skills and even our capacity to experience empathy are undergoing documentable erosion.

  Our, quote-unquote, family rooms are docking stations now. We have five hundred or six hundred “friends,” and no idea who our next-door neighbors are. We affiliate with “communities” based on trivia—a mutual appreciation of bacon, a shared distaste for slow walkers. And doing so in a spirit of heavy-handed irony hardly ennobles the enterprise. We have sold out social depth for social breadth and interactive quality for interactive quantity to become what playwright Richard Foreman calls “pancake people”: “spread wide and thin as we connect with that vast network of information accessed by the mere touch of a button.”2

  Or at least that’s one side of the argument. There are others who argue that our social connectivity is not fraying at all, but simply undergoing some much-needed rewiring. They point to the growth of online communities—from social-networking utilities such as the fascinatingly telegraphic Twitter, to the entire virtual worlds of Second Life and World of Warcraft. They show how new media are bringing families together with instantaneous digital contact via text, sound, image—or all three at once. (“Have you Skyped Grammy and Grandpa to say thank you for that birthday money yet?”) They remind us that for Digital Natives, time spent grooming one’s online relationships on Facebook or Twitter alone amounts to a sizable part-time job. That the happy confluence of wireless Internet and portable media means they are never alone, never out of touch. “Only connect” is what these people do.

  So, are we really more connected and less alone than ever before? Perhaps the truth lies somewhere in the middle. But I don’t think so—and maybe that’s why it’s all so confusing. My own observations suggest that the truth lies at both extremes. “Information explosions blow things up,” remember? In this case, the land mine seems to have taken out the Via Media (literally, the middle way) altogether.

  We are both much, much better connected, and in clear and present danger of forgetting how to relate. Well, I guess that’s why they call it a paradox.

  Watching as my kids adjusted to the aftershocks of life without social media drove the point home again and again. Maybe a night on Facebook really has become the moral equivalent of standing around the piano singing show tunes. But while the quality of each experience, and the skills and habits they call into play, are both certifiably “social,” they also happen to be certifiably antithetical. Messaging, poking, posting, uploading, and gifting faux livestock to your friends can be absorbing, entertaining, even challenging. But “getting together” it ain’t. I knew that before, of course. But The Experiment just sort of turned up the volume—not just for me, but for all of us.

  The impact on our relationships as a family was even more dramatic, as we found ourselves “tuning in” to one another in unexpected ways. We lingered more around the dinner table—and talked. We watched the fire together—and talked. We pulled out old photo albums—and talked. We played board games—and talked. We climbed into one another’s beds and read the paper—and talked. Are you getting my drift? Realizing that, to quote Anni, “There are people here. Let’s talk to them!” came as an epiphany to all of us, I think. For me, that provoked guilt and delight in almost equal measure. But hey ... isn’t that what being a parent is all about?

  Conversation, studies show, is good for the brain. “No, duh!” as Sussy would say. But these days, it seems, some of us need convincing. According to UCLA neuroscientist Gary Small, talking to people face-to-face—as opposed to face-to-Facebook—provides “greater stimulation for our neural circuitry than mentally stimulating yet more passive activities,” including reading.3 A 2008 study found that subjects who’d spent ten minutes chatting with friends scored better on memory tests than those who’d spent the same amount of time watching TV or reading a book, and the same went for those who’d engaged in “intellectual activities,” in this case, solving puzzles.4 Think about it. More time spent in face-to-face conversations could mean your child remembers where he left his laptop charger.

  Online chatting, on the other hand, has been linked to symptoms of loneliness, confusion, anxiety, depression, fatigue, and addiction. Says Small, “The anonymous and isolated nature of online communication does not provide the feedback that reinforces direct human interaction.”5 A study published in the journal CyberPsychology and Behavior found that shy people spent significantly more time on Facebook than more outgoing individuals—although they had fewer “friends”—and enjoyed it more too. The possibility of “a reliance of shy individuals on online communication tools” concerned the researchers, psychologists at the University of Windsor in Ontario, Canada.6

  Half a world away, Japanese psychologist Tamaki Saito has coined the term hikikomori to describe a new breed of young social isolates. The Japanese Ministry of Health defines hikikomori as “individuals [80 percent are estimated to be male] who refuse to leave their parents’ house, and isolate themselves away from society and family in a single room for a period exceeding six months.” But the definition leaves out an important fact: Hikikomori are often, paradoxically, the most “connected” individuals in Japanese society.

  Many hikikomori sleep by day and spend their nights watching manga, gaming, and surfing the Net, surfacing only to sneak into the kitchen for food while the family sleeps. An entire industry has sprung up to address the phenomenon—from parent support groups to online counseling services—but the epidemic continues to rage. In July 2009, Osaka police attributed to hikikomori a spate of street attacks by “apparently troubled people venting their frustration on total strangers.”7 One young man admitted he didn’t care who he killed. “I’d grown tired of life,” was his only defense. Experts believe hikikomori may turn to violence because of their lack of social skills. “Once they come to be considered weird, they prefer to be alone rather than feeling awkward among other people,” explains Toyama University academic Yasuhiko Higuchi. “They then commit an extreme crime after magnifying their stressful thoughts and having no one to talk to.”8 Many hikikomori have failed to form a proper relationship with their parents, he adds. Uh-oh.

  Pre-Experiment, I mention the term hikikomori casually to Bill. He looks up briefly from his game—where a strapping youth is beating the cyber-crap out of a hulking avatar with incongruously girlish hair—and looks down again. “You’re pronouncing it wrong,” he mutters.

  Turned out Bill knew all about hikikomori. In fact, he’d watched an anime series about them. “Really? Where’d you get that from?” I asked. “Um, Japan,” came the reply, the “duh” unspoken but implied. He’d downloaded the show from a file-sharing site. “So, what do you think of them, then?” I asked, in that annoying faux naive manner beloved of therapists, parole officers, and mothers.

  “I think they’re cool,” he replied evenly. (Like most fifteen-year-old boys, he could spot a cautionary tale at twenty paces.)

  “Are you kidding?” I sputtered. “They’re mentally ill! They have no life!”

  He looked up once more, between body blows. Maybe he didn’t say, “It takes one to know one,” but it was there in his eyes.

  I was still enjoying intermittent eye contact with my children (although Sussy, aka Thunder Thumbs, was developing an alarming facility for texting while doing just about anything: talking, eating, walking, and more than once, I swear, during REM sleep) but you didn’t need to be a detective with the Osaka police force to notice that our opportunities for sustained sharing had become increasingly nasty, short, and brutish. Their online world had become “the point”—of existence, I mean—and every other kind of interaction constituted a tangent. An interruption. I was conscious of how often I approached them with words like, “Can you just pause that for a moment and . . .” or, “After you sign out, would you . . .” or, “I don’t need you to log off, but . . .” It was as if life, real life, were a game they’d lost interest in after the first couple of levels.

  Looking around our family room, at the children sitting frozen at their screens, I would be reminded of Swiss architect Max Frisch’s definition of technology: “The knack of so arranging the world that we don’t have to experience it.”

  Relating socially, whether one to one or in groups, seems so fundamental to human nature. The notion that we might need to practice these skills—to practice being human, really—seems odd to me, and perhaps to you too. But neuroscientific evidence reminds us that the pathways in the brain that facilitate interpersonal skills, empathy, and sound social instincts are created, not born. In the case of individuals “who have been raised on technology, these interpersonal neural pathways are often left unstimulated and underdeveloped,” observes one expert.9 Despite their higher IQs and bulging thumb muscles, in other words, The Young and the Listless do show deficits in basic social skills such as empathic listening, and interpreting and responding to nonverbal cues in conversation.

  Some observers have gone so far as to suggest technology may be driving us all toward a kind of social autism—wrapped safely but suffocatingly in our digital bubble wrap, uninterested in and/or threatened by the world outside, and supremely ill equipped to deal with it. Alarmist though it may sound, it’s not entirely far-fetched. In fact, recent research suggests there may even be a link between chronic technology use and clinical autism.

  Whatever the cause, autism rates have skyrocketed during the digital age. Today, according to figures from the European Union Disabilities Commission, autism afflicts one in every fifty-eight children—an increase of up to 500 percent since records have been kept. Many theories have been advanced to explain the epidemic; almost all have been disproved. One that has not is the theory that started out as a parent’s hunch.

  Michael Waldman, an economist at Cornell University, was devastated when his two-year-old son was diagnosed with autism spectrum disorder. But he was also skeptical. He’d noticed that since the birth of their second child a few months earlier, his son had been spending more and more time watching television. Privately, he wondered whether the boy’s socially phobic behavior was not a “disorder” at all, but simply an aggravated case of tuning in and . . . well, tuning out.

  Waldman placed restrictions on the child’s media habits and had him retested. When his “condition” improved and then disappeared entirely, it seemed like a miracle. But economists, thankfully, don’t believe in miracles. Waldman cast about for a way to study his hunch about a link between autism and television viewing. And finally the answer came to him: rainfall data. Stay with me on this one.

  Waldman reasoned that kids in rainier climates watch more TV—which is true, by the way—and therefore that regions with higher-than-average precipitation might also feature higher-than-average rates of autism spectrum disorder. He compared California, Oregon, and Washington—the rainiest states in the United States—with the rest of the nation, and he found his answer. There was more autism in these states. He then looked at only those families who had cable TV subscriptions in these high-precipitation regions, and the correlation was higher still.10

  When Waldman’s study was published in the November 2008 issue of the prestigious Archives of Pediatrics & Adolescent Medicine, it provoked a perfect thunderstorm of abuse and criticism. But the data remain standing. A 2009 article in the Journal of Environmental Health concedes Waldman’s point about the link between rainfall and autism rates, but is more equivocal about causes. Perhaps the real culprit was not TV at all, but vitamin D deficiency, or increased exposure to household cleaners?11

  No one, certainly not Waldman, would argue that television or any other medium “causes” autism, a dizzyingly complex disorder involving sensory, motor, and cognitive difficulties, as well as social ones. But the possibility that chronic media use may act as an environmental trigger for kids with an underlying genetic vulnerability is being taken seriously indeed.

  But we don’t need to drag autism in, kicking and screaming, to explain our kids’ empathy deficits. Let’s not forget, narcissism comes naturally to teenagers. There’s even a specific region of the teen brain that controls their tendency toward selfishness. Digital Immigrants make use of the prefrontal cortex when considering how their decisions will affect others. Natives use their temporal lobes, which are slower and less efficient. Their underdeveloped frontal lobes make teens feel invincible (“Pregnant? Me? As if!”). At the same time, they ensure impaired judgment about almost everything: from how to choose a phone plan to how to choose a boyfriend. Our kids won’t always be this clueless, neuroscientists promise. In theory at least, later brain development will enable them to delay gratification, to accurately assess risk, and—eventually—to consider the feelings of others.

  In other words, we can’t blame our kids’ digital distractions for all their ditziness. Nor is it necessarily true that kids either spend huge amounts of time with media or they engage in lots of nonmediated activities. The Kaiser Family Foundation’s Generation M study found just the opposite, in fact. Contrary to researchers’ expectations, it turned out that “heavy overall media users also tend to spend more time engaged in several non-media activities than do light and moderate media users.” Specifically, the 20 percent of eight- to eighteen-year-olds who were the biggest self-reported screen freaks were also the ones who spent the most time “hanging out with parents, exercising, and participating in other activities such as clubs, music, art, or hobbies.”12 Interesting. Especially given that the average amount of time kids spent on-screen in that study was 8.5 hours. You’ve got to wonder: When were the heavy users actually doing all that art and music and stamp collecting? In their sleep?

  At the age of fourteen, Joan of Arc was leading the French army to victory in the Hundred Years’ War. Sussy, also fourteen, struggles to change a fitted sheet. Eighteen-year-old Anni can be heard whimpering when she discovers the can of baked beans she wanted for lunch has no ring-pull. And my son the electronics whiz, who’s been putting together robots since he was eleven, claims he hasn’t quite gotten the hang of the dishwasher yet. Honestly. Who do these people think they are—somebody’s husband? And, more to the point, how did they get that way? Personally, I blame the guy who invented those Velcro shoe-fastener strips.

  Is it just in our household that teenagers struggle with skills and competencies that were once taken for granted by the smallest children? Evidently not. Some observers have suggested that Digital Natives—aka the Pull-Ups Generation—may be suffering from a kind of global life-passivity that goes way beyond garden-variety teen cluelessness. While acknowledging the universal truth that older generations inevitably view the younger ones snapping at their heels as degenerate, unmannerly, and incompetent—the technical term for this being “envy”—there does seem to be something new and scary going on here. In the United States, colleges have introduced undergraduate courses in basic life skills such as banking and doing laundry and ordering from a restaurant menu. (Remedial can-opening, anybody?)

 
