Because Internet

by Gretchen McCulloch

ashna: hello?

  dave-g it was funny

  how are u jatt

  ssa all

  kally you da woman!

  ashna: do we know eachother?. I’m ok how are you

  *** LUCKMAN has left channel #PUNJAB

  *** LUCKMAN has joined channel #punjab

  dave-g good stuff:)

  kally: so hows school life, life in geneal, love life, family life?

  jatt no we don’t know each other, i fine

  ashna: where r ya from?

  The conversation in this chatroom is really two interwoven conversational threads, one between ashna and Jatt (“do we know each other? how are you” “no we don’t, i fine”) and the other between kally and Dave-G (“only joking” “it was funny” “you da woman”). Public chatrooms had a niche appeal: the average internet user was more likely to encounter chat slightly later and with a person they already knew via instant messaging or chat apps, such as the first wave of ICQ, AOL Instant Messenger (AIM), or MSN Messenger, and the second wave of Gchat (which became Google Hangouts), Facebook Messenger, iMessage, and WhatsApp. But even between the same two people, overlap is common in chat—each person may introduce a topic around the same time, and then since they both hit send at once, they can both start responding to the opposite topic in parallel.

  What’s curious is that we’ve retained this essential paradigm of chat for nigh on four decades now, even as many individual chat platforms have come and gone. We’ve added new features on top (like better graphics support and the “is typing” indicator), but at their core, chat conversations still consistently happen in a stream, and with a high tolerance for multiple, interwoven message threads. Even the “is typing” indicator is a solid couple decades old. This is an eon in computer years: the mouse wasn’t even common yet in 1980, let alone laptops or touchscreens! And yet we know from the shared-boxes format of only a decade earlier that the initial format for chat wasn’t at all obvious. (Email, by comparison, is older than chat, but it also came with a clear postal analogue from the very beginning.) Chat is a format entirely of and for networked computers, predicated on the idea that you could have a real-time conversation between two or more connected screens. An offline equivalent to chat is barely even sensible, because the circumstances in which you’re able to have a real-time conversation with someone on a piece of paper but not able to see or speak to them are so limited. (Passing notes in class or during a meeting might be one exception, but even there you’re able to see the other person’s raised eyebrow or muffled giggle.)

  The chat format’s astonishing durability signals the true birth of a new form of communication. Chat is the perfect intersection of written and informal language. Let’s consider what we know about these formats. We can read faster than we can speak, and reading also lets us glance back and check something again, which means that writing naturally supports longer and more complex sentences: if you compare an essay and the transcript of a famous speech, the essay will have more subordinate clauses, while the speech will have more repetition. (If you’ve ever been forced to listen to a novice public speaker read an essay out loud, it’s not your fault you found it hard to follow.) As far as speaking in general goes, the more formal it is, the fewer interruptions it has. A public speaker can reasonably expect to hold the floor for their entire designated period, and anyone else who wishes to talk must either ask permission by raising a hand, or accept the title of “heckler.” But you can’t heckle a conversation that you’re part of already: that’s nonsensical, a back-and-forth is expected. Instead, our disparaging vocabulary goes in the other direction: a person who treats a conversation like a speech is long-winded. So when we look at informal writing, we should expect to find both a high information density and a lot of interruptions. Or in other words, exactly what people do in chat. Chat gets bonus extra interruptions in comparison to informal speech, because writing as a medium lets us handle those extra words. What the chatrooms discovered was that overlapping messages weren’t a bug, they were a feature.

  While emails and social media posts and website text can all lay claim to the title of informal writing by virtue of being unedited, chat is informal writing in its purest form. A post on social media doesn’t go through an editor, but when we know that hundreds or thousands of people may see it, it’s hard to say that we’re not editing ourselves at some level. An email goes to a controllable list of people, but with the amount of digital ink that’s been spilled on email etiquette, again, it’s hard to say that we’re not thinking through (perhaps in fact overthinking) our emails. But with chat, the audience is known and the time horizon is fast. The other person can literally see that you’re typing, so it’s better to just get something out there than worry about composing the perfect message. Chat isn’t as widely studied as those conveniently public tweets, but from what we do know, people use language more informally there than in public posts, using more creative respellings, expressive punctuation, acronyms, emoji, and so on: it’s the most hospitable environment for internet slang. It’s even still tolerant of typos: if you transpose some letters or your autocorrect fails you, you can self-correct in the next message and there’s not much time for misinterpretation.

  So popular is chat that it’s also expanded into domains that previously belonged to the email or the phone. Before smartphones, texting had been set up like a miniature email inbox, letting you read one message at a time, with separate screen views for received messages, sent messages, drafts, and composing a new message. When the screens got bigger and became touch-sensitive, the dominant model for texts became the chat stream rather than the email inbox. Reviews of the first-generation iPhone thought it worth noting that “as on many smart phones, a text message thread is displayed as one long conversation—a useful arrangement that allows you to pick which messages you’d like to answer.” When texts jumped over into a chat-style interface, chat became thoroughly ubiquitous: you can opt out of social media and use your email inbox purely for auto-generated confirmation emails, but if you use digital technology for communication at all, you’ll end up participating in some form of chat. So complete is this shift that a decade later I started hearing people using “text” as a generic term for chat in general, as in “text me on Twitter.”

  Many popular social apps are simply variations on a chat app that happened to gain a foothold in a particular area, like WhatsApp in much of the world outside North America, WeChat in China, and Line in Japan and Korea. The biggest change that the chat paradigm developed as smartphones caught on was the integration of multimedia. Snapchat and its imitators let you send a photo with words on top that disappears after a few seconds. WeChat, WhatsApp, and their imitators let you send a short audio clip which is integrated into the back-and-forth chat interface. Both can be more expressive, but can also be hard to use in dark or loud environments.

  Chat is also competing with email in professional contexts. For instance, Slack is a chat platform for talking with workplace teams. The first time I got to talk with my internet service provider’s tech support via chat rather than on the phone, it was a delight to be able to simply type in the correct spelling of my name and address rather than having to spell out each part aloud. With digital assistants that can set timers or respond to our queries about what the weather’s going to be tomorrow, chat is also becoming an interface for us to talk with the machine itself.

  The key feature of chat is its real-time nature, but what it means to be real-time has shifted as our internet norms have shifted. When the internet was a place for die-hard hobbyists to explore new people to talk to, in the days of chatrooms full of strangers, the room would announce your arrival to those who were already there (“________ has entered the chat”) and your departure to those who remained (“_________ has exited the chat”). When chat became a thing of people you already knew, but still tied to computers, the instant messaging program would display a “buddy list” of your contacts that told you whether they were online or not: AOL Instant Messenger would play the sound of a door opening or closing whenever someone arrived or left, while later programs like Gchat merely displayed a subtle green dot. When chat became mobile, the relevant information shifted again: we’re almost always around a device now, but we’re not always free to look at our messages. So chat stopped displaying whether someone was “present” and instead displayed whether someone had seen the latest message. “Read” indicators track smartphones almost exactly, starting with BlackBerry Messenger (BBM) in 2005 for die-hards and Apple’s iMessage in 2011 for the mainstream.

  Being real-time is also chat’s greatest weakness. You can set aside a time to batch-reply to emails or check social media, but chat requires a certain generic availability in order to be useful. Chats or text messages, which have become pretty much indistinguishable, have the potential to intrude on whatever else you’re doing, especially on a mobile device. But this isn’t the first time we’ve faced technological interruption: once again, we can draw insight from the early days of landline telephone use. Before the phone, letters only arrived at designated times of the day, and no one knew if and when you read them; only certain people lived close enough to drop by unexpectedly. But a phone call could arrive from anyone, anywhere, and the only thing you knew about the caller was that someone wanted you—urgently. Perhaps unsurprisingly, then, a 1992 survey found that the overwhelming majority of people would answer a ringing telephone even during a serious argument with their spouse. I tried replicating the survey myself twenty-five years later, and found the exact opposite result: people overwhelmingly wouldn’t pick up the phone during a serious discussion with a loved one. Even if nothing particular was going on, people often reported checking to see who was calling before deciding whether to answer. I didn’t find the expected age gap, between people who’d acquired their phone norms with landlines and those who oriented towards cellphones: rather, the people who reported a strong inclination to always answer a ringing phone were in their eighties and nineties, not their forties and fifties. Many people evidently adjusted their phone norms in the years after 1992, as caller ID became widely available.

  Phone calls have come to represent the halcyon days when people actually talked to each other, but at the time, they had their own communication problems. In the 1970s, 80s, and 90s, a major problem in business communications was that only one in four phone calls resulted in the desired conversation: too often, the person you wanted to reach was out of the office or already on the phone. At first, your only solution was to simply wait a while and try again, or at best, leave a message with another person or on voicemail, hoping that you’d be around when they called back—but if you weren’t, now you had to try and call back again. And so on. This “telephone tag” could stretch for days or even weeks. No wonder people picked up every time they heard the phone ring, even if it was just to say, “Sorry, it’s not a good time, can I call you back in an hour?” There was no real way of scheduling a phone call that didn’t involve a phone call.

  Internet and mobile devices changed this set of norms, not even a full century old itself. If you have a less intrusive way to establish whether someone’s available before putting them on the spot with a call, why not take advantage of it? But by the same token, chat has taken on the position that phone calls once had as the most probable way of reaching someone, which in turn means that we must be reachable there instead. (Although sometimes we don’t want to be: “butler lies” are the polite social fictions which we use to manage our availability in chat, like “Sorry, just got this” or “Gotta get back to work.”) Op-ed articles from the 2010s reveal a generation gap around technological interruptions: younger people find that responding to a text message in the company of others is reasonable, because you can integrate it into the pauses of the conversation, but that unplanned phone calls are a gross interruption because they demand your attention instantly, completely, and unpredictably. Older people are perfectly happy to interrupt or be interrupted by a voice call, because they’re unexpected and therefore urgent, but find the sight of someone texting an imposition, precisely because you could have put it off until after the conversation entirely.

  This shift in norms is responsible for finally popularizing videocalling. The technology for the videophone has been available since the 1960s—it’s just a telephone spliced with a television, after all. Pundits kept predicting it, but it never seemed to catch on. The problem with videocalling was that it faced an insurmountable social obstacle: with a robust norm of always answering a ringing phone and no efficient way to plan a phone call except via the same medium, the risk was too great of catching someone unclothed or with a messy house in the background. Picking up a videocall out of the blue was simply too awkward to contemplate. But since every videochat program includes a text messaging feature, you can plan a videochat before committing to one (“hey, you ready to skype?” “just give me 2 min”) and this awkwardness vanishes: you have the option to decline via text where no one can see you, or a minute to scramble into a decent-looking shirt. Paradoxically, having access to the lesser intrusiveness of chat conversations makes it easier to have higher-bandwidth conversations in video.

  Posts and the Third Place

  The lure of cyberspace to its early arrivals wasn’t just as an easier way of passing notes, avoiding telephone awkwardness, or sending interoffice memos. It was the promise that somewhere out in the world, you could find other people who matched your unique weirdnesses, or at least understood your niche passions. But to send someone a message, you need to find them first, and for that, you need some sort of shared space that several people can drop in on.

  The idea of a third place is often invoked to explain the appeal of Starbucks: the first place is home, the second place is work, but people also need a third place to socialize that’s neither home nor work, like a coffeeshop. What Ray Oldenburg, the sociologist who coined the term in a 1989 book called The Great Good Place, had in mind was something more specific than just any convenient spot where you might stop by for a cup of joe. Oldenburg’s third places are first of all social centers, distinguished by an emphasis on conversation and playfulness, regular attendees who set the tone for newcomers, the freedom to come and go as you please, a lack of formal membership requirements, and a warm, unpretentious feeling of home away from home. Examples include pubs, taverns, and bars, cafés and coffeeshops, barbershops, community centers, markets, malls, churches, libraries, parks, clubs and organizations, main streets, public squares, and neighborhood activities like block parties, town meetings, and bingo.

  When I think of where my own third places have been, I keep coming back to hallways. In high school, we’d sit in the halls with our backs against the lockers at lunchtime or recess, certain corners occupied by certain regulars. In dorms, you’d signal your willingness to join into impromptu social activity by whether you left your door open to the hall. At conferences, the talks are merely a pretext for assembling people with shared interests so that we can run into each other in the hallways. In the best third places, it takes me half an hour to travel the length of a single hallway because I run into seventeen people who I absolutely must talk with—and sometimes I even set out for such a walk with no particular destination, because I know I’ll run into someone enjoyable.

  When I’ve tried to articulate the appeal of Linguist Twitter to linguists who aren’t on it, I’ve talked in terms of hallways: You know how the best part of a conference is the hallway? Imagine if you could have that hallway available at any time of the day or night! But perhaps I should have talked in terms of third places. The parallel is compelling, and not just for my corner of the internet: the emphasis on conversation and wit describes the memes and hashtag wordplay games, like #RemoveALetterRuinABook, that regularly sweep the trending topics. Unlike an email inbox or a chat with a specific person, you can drop in on a social media feed at any time of the day or night and expect to see both regulars and newcomers. Trying to focus on work with social media available is, alas, like trying to work in a hallway of friends and acquaintances, but you can also credit social media’s serendipitous encounters with letting you hear about job opportunities and other useful morsels of news.

  When Facebook and Twitter started letting you post status updates, their appeal was explained in terms of ambient awareness of what your friends were doing, which could lead to spontaneous encounters or being able to pick up a conversation later as if no time had passed, without needing to catch up first. Facebook status updates in 2006 came with a few dropdown options meant to reflect typical college activities, like “is sleeping,” “is studying,” “is in class,” or “is at a party.” Even when you typed in your own status message, the “is” was obligatory and a period was automatically inserted at the end, clearly trying to push people in the direction of a particular genre of update. While early tweets didn’t have the same grammatical constraints, they still tended to be about the here and now, such as “walking on the sunny side of the street,” “digesting a burrito,” and “just setting up my twttr.”

  It’s true that people did use statuses for mundane daily updates, but ambient posts about what people had for lunch didn’t explain why Twitter was such an effective tool for coordination in times of natural disaster and political upheaval. Awareness that your friends were in the library or watching a movie didn’t explain why people spent an average of fifty minutes per day on Facebook in 2016, up from forty minutes per day in 2014. Moreover, as connectivity became ever easier and more mobile, it was no longer necessary to explain why you were away from the computer: you weren’t. Yet social media posts got more popular, rather than less, joined by mobile-first platforms like Instagram and Snapchat, which required that posts include an image or short video. Snapchat and later Instagram even brought us a new format for posts: the story which vanishes after twenty-four hours, a window into the fun, non-computery things you’ve been doing. A “normal” profile page gradually changed from being a list of static facts about you to a list of things you’ve posted recently. What does explain the appeal of posts in their various formats is thinking of them as a third place.

 
