Technically Wrong


by Sara Wachter-Boettcher


  In one instance, the team was brainstorming ideas for a 404 page. On the web, a 404 error means “page not found,” so a 404 page is where you end up when you click a broken link. These pages usually say something like, “The page you are looking for does not exist.” But at the time, the team was really focused on developing a funny, unique voice for MailChimp. So they decided to call it an “oops” moment and started brainstorming funny ways to communicate that idea. Pretty soon, someone had designed a page showing a pregnancy test with a positive sign. Everyone thought this was hilarious—right up until Kiefer Lee took it to her CEO. “Absolutely not,” he told her. Looking back, she’s glad he killed it. “We wanted to be funny and delight people. I think we were trying too hard to be entertaining.” 8

  Realizations like this made Kiefer Lee and her team turn a more critical eye to the way MailChimp was communicating—particularly around touchy subjects, like someone being banned from the service for sending spam, or a user’s credit card being declined. “We focus on clarity over cleverness and personality,” she told me. “We are not in an industry that is associated with crisis, but we don’t know what our readers and customers are going through. And our readers and customers are people. They could be in an emergency and they still have to use the internet.” 9

  Other tech companies haven’t caught up. In fact, in recent years, I’d argue, the trend has gotten worse, morphing into some truly awful design choices along the way.

  THE SHAME GAME

  You know when you try to shop online or read an article, but a little window pops up before you can even see what you’re looking at, trying to get you to sign up for the site’s email list? I’m sure you do, because you can barely use the internet without this happening. In tech, we call these kinds of pop-ups opt-ins, meaning you’re being asked to choose to receive marketing messages from a company. If you click the “no thanks” option, that’s an opt-out. But not only is the design intrusive and frustrating—Can I at least see your products before being badgered into daily emails?—there’s a new design trend that’s making them even worse: rather than tapping a button that says “no thanks,” sites are now making users click condescending, passive-aggressive statements to get the intrusive window to close:

  No thanks, I hate saving money.

  No thanks, this deal is just too good for me.

  I’m not interested in being awesome.

  I’d rather stay uninformed.

  Ew, right? Do you want to do business with a company that talks to you this way?

  I guess you could say these blamey, shamey messages are more “human” than a simple yes/no—but only if the human you’re imagining is the jerkiest jerk you ever dated, the kind of person who was happy to hurt your feelings and kill your self-esteem to get their way. That’s why I started calling these opt-in offers “marketing negging.”

  Negging, if you’re lucky enough to be unfamiliar, is a technique used by the “pickup artist” (PUA) community—a loose association of men who share tricks and manipulation tactics designed to help them pick up as many women as possible. The term comes from an abbreviation of “negative,” and the concept is simple: you walk up to a woman and use subtle digs and passive-aggressive faux compliments to “lower a girl’s social value in relation to yours,” as the popular PUA site Seduction Science puts it.10 The idea is to erode a woman’s self-esteem and make her feel like she can’t do any better. Maybe you tell her that her roots are showing. Maybe you tell her that she’s too pretty to wear such an unflattering outfit. Then, the theory goes, she’ll feel vulnerable, and be more likely to say yes to you. PUA websites (of which there are many, and which I cannot, in good conscience, suggest you visit) are littered with lines like these, and many of the men who use these forums obsessively test them out, reporting back on what works and what doesn’t.

  Negging is creepy as hell, treating women as objects to be collected at all costs. These shamey opt-out messages do pretty much the same thing to all of us online: they manipulate our emotions so that companies can collect our information, without actually doing any of the work of creating a relationship or building a loyal following. And, just like the men who frequent PUA forums and make spreadsheets of which pickup lines work most often, companies spend endless hours testing out different copy strings, and analyzing which ones are most effective.

  These sorts of jokingly nasty messages might seem like the polar opposite of the friendly, overly celebratory ones we’ve looked at so far in this chapter. But I would argue they’re not so different at all. What they all do is wrap tech’s real motives—gaining access to more information, making us more dependent on their services—in a cloak of cuteness that gently conditions us to go along with whatever’s been plunked down in front of us. There, there, dear. Don’t worry about what we’re doing with your account. Have a balloon. And if we’re not on board—if we don’t acquiesce to their negging or laugh at their jokes or reminisce when they want us to—then we’re just no fun.

  BLINDED BY DELIGHT

  It was the spring of 2012, and I was sitting at a glass conference table in a glassed-in conference room in a fancy building high above Manhattan’s West Side Highway, brainstorming ideas for redesigning a major bank’s credit card rewards program. The brief we were given at the start of the session: come up with fun, unexpected ways to curate and customize the offers that cardholders receive—you know, get double points for shopping at the Gap this month, or whatever.

  “What if we customized offers based on the weather!?” an account planner said.

  “Ooh, what about filtering offers based on your horoscope?” a designer suggested.

  “But no matter what, it needs to be delightful—more like Pinterest!” the creative director insisted.

  I looked around the room, bewildered. Why would anyone want their credit card offers to be dependent on the weather? What, precisely, would we do to make a 1-800-Flowers purchase particularly relevant to a Scorpio? And who would think that just because people might enjoy idly skimming haircut ideas or DIY craft projects on Pinterest, they also want to spend hours scrolling through endless “limited-time offers”?

  I didn’t say any of that, of course. I didn’t say much at all. Mostly, I wondered to myself, “How the hell did I end up here?”

  That part’s easy: a few months earlier, I’d quit my seventy-hour-a-week job at an agency to start my own consulting business. I was twenty-eight and scared as hell. I was also working on my first book (which I was certain I had no business writing) and planning a cross-country move (from a place I hated to a place I was sure I’d also dislike). My life felt like one big risky venture. So when an agency in New York contacted me, saying they needed a content strategist for a large project, I jumped at the opportunity—mostly because it meant not worrying about finding clients for a few months.

  So there I was, trying as best I could to advocate for the people we were supposed to be designing for: cardholders who wanted to understand how to get the most value out of their credit cards. But time and again, the conversation turned away from how to make the program useful, and toward that word I find so empty: “delight.”

  Delight is a concept that’s been tossed around endlessly in the tech industry these past few years, and I’ve always hated it. “Is there a formula for delight?” one article on a design website asks (um, no, there isn’t). “When a product is delightful, it just makes sense,” starts another (or, maybe more likely, in order for a product to ever be considered delightful, it first has to make sense). “Take your user experience to the next level by adding delight!” exhorts yet another (what, is delight an ice cream topping now?).

  And that’s the sort of design sensibility we’ve seen throughout this chapter: clever copy, peppy music, the insistence that your memories must be rehashed and shared. All of it is based on the belief that true delight can be reduced to the digital equivalent of whipped cream in a can: a layer of fluffy sweetness sprayed on top of the experience. But as we’ve seen over and over, when teams laser-focus on delight, they lose sight of all the ways that fun and quirky design can fail—creating experiences that are dissonant, painful, or inappropriate.

  Humans are notoriously bad at noticing one thing when we’ve been primed to look for something else instead. There’s even a term for it: “inattentional blindness.” Coined by research psychologists Arien Mack and Irvin Rock back in the 1990s,11 inattentional blindness is a phenomenon in which humans fail to perceive major things happening in front of their eyes when their attention has been focused on something else.

  The most famous demonstration of inattentional blindness in action is known as the “invisible gorilla” test, in which participants watch a video of two groups of basketball players, one wearing white shirts and the other wearing black shirts. Before watching, they’re asked to count how many times a player wearing a white shirt passes the ball. Halfway through the one-minute video, a person in a gorilla suit walks into the scene and beats their chest, staying on-screen for a total of nine seconds. Half the participants routinely fail to notice the gorilla.12

  This study has been replicated and tweaked lots of times, and the results are more or less the same: when people focus on one task, their attention narrows, dramatically decreasing the likelihood that they’ll notice other details. In fact, more recently, researchers tried a similar experiment with radiologists—a group highly trained to look closely at information and identify abnormalities. In this experiment, a gorilla the size of a matchbook was superimposed onto scans of lungs. The radiologists were then asked to look for signs of cancer on the scans. A full 83 percent of them failed to notice the gorilla.13

  Designers, like radiologists, pride themselves on attention to detail: they obsess over typographic elements like line kerning and spacing; they spend hours debating between nearly identical colors; they notice immediately when an object is a few pixels off center. Of course they do; those are the sorts of things they’ve been trained to notice. They’d be laughed out of design school and off the internet otherwise. But if they’ve never been asked to notice how their work might fail people—or been made rudely aware of the problem after it has—they’re just as blind as the rest of us. So when a design brief says to focus on new ways to delight and engage users, their brains turn immediately toward the positive: vacation photos flitting by to a jazzy beat, birthday balloons floating up a happy Twitter timeline. In this idealized universe, we all keep beep-beeping along, no neo-Nazis in sight.

  In Design for Real Life, Eric Meyer and I urged designers to combat this kind of blindness by building in steps that force them to question how and when their fun-and-friendly features could fail. After all, most designers aren’t out there aiming to harm their users, and taking a moment to identify stress cases and find fractures in their work would ferret out a lot of problems. But these sorts of gentle nudges aren’t enough in a tech culture where “user engagement” trumps all—where it’s more important to gather user data and inflate valuation numbers before an acquisition than it is to care for the actual people affected by a design choice.

  Take any one of the examples in this chapter, and just underneath its feel-good veneer you’ll find a business goal that might not make you smile. For example, Twitter’s birthday balloons are designed to encourage people to send good wishes to one another. They’re positioned as harmless fun: a little dose of delight that makes users feel more engaged with Twitter. But of course, Twitter doesn’t really care about celebrating your special day. It cares about gathering your birth date, so that your user profile is more valuable to advertisers. The fluttering balloons are an enticement, not a feature. Delight, in this case, is a distraction—a set of blinders that make it easy for designers to miss all the contexts in which birthday balloons are inappropriate, while conveniently glossing over the reason Twitter is gathering data in the first place.

  The same is true for all those Facebook features designed to make you relive your past. Facebook has an internal metric that it uses alongside the typical DAUs (daily active users) and MAUs (monthly active users). It calls this metric CAUs, for “cares about us.” CAUs gauge precisely what they sound like: how much users believe that Facebook cares about them. Tech-industry insider publication The Information reported in early 2016 that nudging CAUs upward had become an obsession for Facebook leadership.14

  But the metric doesn’t gauge whether Facebook actually cares about users. All that matters is whether you feel like it does. Warm-and-fuzzy features like On This Day, which sends reminders about past posts, and Moments, which creates those peppy video collages, are aimed squarely at making you feel like Facebook cares about you on a personal level. And as long as that’s happening on the surface, Facebook is free to keep dealing in user data in ever-more-worrisome ways, including not just tracking what you say and do on its site, but also buying up dossiers on users from third-party data brokers (something we’ll look at more closely in the next chapter).

  Or consider those ridiculous email sign-up forms with their icky opt-out messages. It would be almost refreshing if companies wanted to gather your email address only so that they could send you more spam. But there’s a more insidious goal here too: companies that are seeking to be acquired (which is what a good percentage of tech startups want) are valuated higher when they have larger subscriber lists.15 Personal data, in this case, is an asset in and of itself—even if the quality of the list is low. That’s why these companies are so shameless: They’re not really trying to build loyalty. All they want is data.

  FAKE FRIENDS

  The neo-Nazi Tumblr notification that Sally Rooney received struck a nerve: as I write this, her screenshot has been retweeted nearly seven thousand times, and “liked” more than twelve thousand times—a pretty big feat, considering that most tweets die out within a few minutes of being posted. It even caught the attention of Tumblr’s head writer, Tag Savage. “We talked about getting rid of it but it performs kinda great,” 16 he wrote on Twitter, as Rooney’s screenshot went viral.

  When Savage says the “beep beep!” message “performs,” he means that the notification gets a lot of people to open up Tumblr—a boon for a company invested in DAUs and MAUs. And for most tech companies, that’s all that matters. Questions like, “is it ethical?” or “is it appropriate?” simply aren’t part of the equation, because ROI always wins out.

  All these cutesy copy strings and celebratory features create a false intimacy between us and the products we use. We’re not actually friends with our digital products, no matter how great their personalities might seem at first. Real friends don’t create metrics to gauge whether people think they care. They don’t try to tell you jokes when you’re in the middle of a crisis. They don’t force you to relive trauma, or write off hate speech, or any of the things tech products routinely do in the name of engagement. They simply care. Tech companies, on the other hand, use “personality” to manipulate us—to keep us clicking and tapping and liking and reading and saving and faving, while, just outside the screen, they flout regulations, steal data, and keep oppressive systems intact. It’s time we see through them.

  Chapter 6

  Tracked, Tagged, and Targeted

  It was December 2016, and I was sitting in the back of a blindingly white room on Mulberry Street in Manhattan’s Nolita neighborhood, next door to a clothing boutique and a place selling French tartines and overpriced salads. It looked like an Apple Store crossed with an art gallery—just the kind of pricey pop-up shop you’d expect to see in this upscale downtown neighborhood two weeks before Christmas. Black-and-white photos lined the walls. Shiny tablets beckoned from gleaming-white tables. A security guard stood at the door. Perched next to me on a row of tall, white stools was a series of customers, coats and shopping bags in hand, waiting for assistance from one of the many staffers clad all in white.

  It looked like an Apple Store, until you saw what was happening on all those screens. On one tablet, a video told the story of data brokers: the companies that make their millions mashing up data about you from as many online and offline sources as possible, and selling it to countless companies. Another displayed data from a “predictive policing” software program designed for law enforcement. The software combines historical crime data with a host of other factors—the weather, locations of bars, and even analysis of social media posts—to determine where and when crimes might occur. Yet another demonstrated how much could be learned about a person from just their email’s metadata (which is what the NSA was collecting until 2011): subject line, sender, recipient, time stamp, and the like. The answer? Often, enough to pinpoint where a person was and what they were doing at any given time.

  As you’ve likely guessed, this wasn’t a store at all. This was the Glass Room: an immersive installation that encouraged visitors to “consider how you use technology and how those behind technology use you.” Curated by the nonprofit Tactical Technology Collective and funded by Mozilla, makers of the Firefox internet browser, the Glass Room painted a bleak picture of just how much personal information can be gleaned from our daily technology use—from the links we click to the posts we like to the real-life places we go while our phones are simply sitting in our pockets—and how that data gets transformed from individual strings into massive tomes.

 
