Work Won't Love You Back

by Sarah Jaffe


  By this time, computers and games were becoming more firmly entrenched as toys for boys (or men who’d never stopped being boys). Women’s participation in computer science programs fell from nearly 40 percent in the 1980s to below 20 percent at present, as personal computers, mostly good for gaming early on, were marketed to little boys, cementing further the idea that it was men who would be the new programmers. Pop culture picked up on this trend, making heroes of white male computer geeks. Anyone who didn’t have a personal computer fell behind when it came to computer skills, erecting a class barrier to go with the gender barrier. Schools tended to accept, and companies tended to hire, people who looked like their idea of a “computer person,” which was, according to science and technology researcher Janet Abbate, “probably a teenage boy that was in the computer club in high school.” The assumption remained that computers, like art, were something one had to have a natural talent for; women were good at community and caring for others, and men were good at things that required an isolated, antisocial genius. The split between the two kinds of laborers of love solidified, keeping them from seeing that they both had similar struggles over long hours, capricious management, and a lack of control over the products of their work. That these gender roles were socially created stereotypes, not innate characteristics, seems not to have occurred to any of these supposedly brilliant men.20

  The dot-com boom of the 1990s saw personal computers become ubiquitous, big profits reaped, and then the first big bust, as overvalued companies, inflated with venture capitalists’ cash, deflated or popped. The Clinton administration largely built on the privatization and deregulation of the Reagan-Bush years, but gave them a veneer of cool, and the dot-coms epitomized this trend. During this period, sociologist Andrew Ross was studying the workers of New York’s “Silicon Alley” to understand these new workplace trends, which he dubbed “no-collar.” In the brave New Economy, workers embraced a certain antiauthoritarian perspective, trading in the old status markers of power suits and briefcases for hoodies and T-shirts. The workers adopted the work styles of the bohemian artist, bringing their expectations of creative labor to their new jobs in tech. They also brought a willingness to work in lousier environments in return for deferred financial gain (stock options, in many cases) as long as the work itself was stimulating, creative, “work you just couldn’t help doing.” Ross dubbed this phenomenon the “industrialization of bohemia.”21

  These workplaces were designed to incorporate the “playbor” of techies, whose tendency to color outside the lines otherwise might have become more obvious resistance. Let the coder wear his “RESIST” button to the Pentagon, let the developers play games on their work machines, then they’ll be happier to do their work. These “digital artisans,” as Ross called them, were made to feel that they had a level of control over the machines. But unlike the original artisans, whose tools were theirs to control, the tech workers were still laboring for a big employer pocketing the profits. After all, the original Luddites didn’t break machines because they opposed technology, but because the technology was designed to deskill them and make them obsolete. The fun-loving tech workplace, already beginning to be stocked with foosball tables and other games to play, made the programmers feel secure that they were powerful and could never be replaced. Yet companies were already increasing their workplace surveillance, and in many cases already trying to figure out ways to break up tasks and cut into the creative freedom of the programmers.22

  These workspaces, researcher Julian Siravo pointed out, take their cues from the spaces that techies themselves created. “Hackerspaces” took inspiration from the 1960s and 1970s protest movements’ tendency to take over public or private buildings for their own use; the emerging computer culture adapted this practice from student radicals and autonomia and began to create its own spaces in the 1970s and 1980s. Groups like the Chaos Computer Club in Germany established regular in-person meetings, which were imitated elsewhere. The spaces continued to pop up all over the world: communal, nonhierarchical locations in which members do a variety of programming and physical construction. Before the Internet, hackerspaces were necessary to share information and skills; after the Internet, they became places in which members are, Siravo wrote, “questioning radically the ways in which we currently live, work and learn,” taking a William Morris–like interest in challenging the divisions in capitalist production. But that freedom is something different in a space that people have designed for themselves in which to explore and create; in trying to replicate those spaces in a for-profit company, the big tech corporations have co-opted this exuberance.23

  The boundaries between work and leisure thus blurred even more in the new tech companies, bringing more of the things workers might have done in their spare time into the workplace. The growth of the Internet helped blur these lines even for workers outside of the tech industry, who were now expected to check email at home, or who might play a game or write a personal blog on company time—and, particularly with the growth of social media, sometimes face workplace consequences for things they did in their free time and documented online.24

  The lines blurred in another way, too: users’ online behavior, from the items they searched for on Google to their interactions during online multiplayer video games, created value for the tech companies. “Users made Google a more intuitive product. Users made Google,” Joanne McNeil pointed out. But that didn’t mean users owned Google. How was their labor—the labor of producing data, of producing a “user experience” that necessitates other users to be meaningful—to be calculated?25

  The values of the early Internet—openness, sharing, collaboration—meant something different on a privatized Web where profit was the name of the game. As the cliché goes, “if you’re not paying for it, then you’re the product,” but users on today’s Internet are something more than just the product—they’re more like a self-checkout counter where the thing they’re scanning and paying for is themselves. The users are being sold to advertisers, but they are also providing the labor that makes these companies profitable—labor that is unpaid, and indeed invisible as labor. Facebook and Twitter would be worth nothing without the people who use them—and the fact that millions do is the reason why these platforms are hard to give up. Yet thinking of those users—ourselves—as workers would require us to understand the “social” part of social media as requiring valuable skills as well, something that tech companies resolutely refuse to do. And, of course, it’s in their interest not to—if they had to pay for the value we create for them, those tech billionaires wouldn’t be billionaires.26

  THE CREATIVE WORK OF THE TECHIES, THEIR MUCH-VAUNTED “INNOVATION,” is the thing that is celebrated in these flexible, toy-filled workplaces, but this emphasis belies the fact that most programming work is, frankly, boring. It’s grueling, repetitive, requiring focus and patience—and often plenty of cutting and pasting or working from pre-prepared kits. Yet the myth of the tech genius obscures much of this labor. Think of how many of Apple’s fantastic devices, for example, are attributed to the singular brilliance of Steve Jobs, who couldn’t write a line of code, rather than the legion of engineers who did the real work. These tech prodigies were justified by such hype in hiring little clones of themselves, in never questioning how it was that everyone who was a genius was also white and male, never asking why the number of women who left tech jobs was double the number of men.27

  The reality is that the work—like most creative work, ruthlessly romanticized—is a slog. A New York Times story on Amazon’s work culture featured employees who’d been told that when they “hit the wall,” the solution was to climb it. They spoke of emails arriving in the middle of the night, followed by angry text messages if they did not answer immediately. The staff faced an annual cull of those who purportedly couldn’t cut it. Employees “tried to reconcile the sometimes-punishing aspects of their workplace with what many called its thrilling power to create,” but the exhausting pace made them feel more like athletes than artists. Employees frequently cried at their desks, trapped in something bearing an uncanny resemblance to the ups and downs of an abusive relationship.28

  At Facebook, things were a little bit different—at least according to Kate Losse, who detailed her experience as one of the company’s early nontechnical employees in her memoir, The Boy Kings. But the sense of awe at the power in her hands was the same, at least before Losse’s eventual disillusionment and break with Facebook and its founder, Mark Zuckerberg. The work that Losse did—customer service work—was devalued from the very start by Zuckerberg, who fetishized hackers and Ivy Leaguers who he imagined were crafted in his own image. “Move fast and break things,” was his motto, and moving fast and breaking things were things that boys did. Losse nevertheless worked her way in, figuring, “You can’t run a successful company with boys alone.”29

  Losse befriended the “hacker boys,” including one particular teenager who was hired after he hacked Facebook itself. She joined them on trips to a Lake Tahoe house that Zuckerberg rented for his employees, as well as to Las Vegas and the Coachella festival. She even convinced Zuckerberg to splurge on a pool house where his employees could move in—the ultimate home office. When Zuckerberg offered to subsidize housing for anyone who moved within a mile of the office, Losse did that, too—even though, as a customer service worker, she at first was excluded from the perk. “It wasn’t enough to work [at Facebook], you had to devote as much of your life to it as possible,” she wrote. To that end, the engineers’ floor at Facebook HQ was littered with toys—puzzles, games, Legos, scooters. New toys showed up constantly to keep the boy kings amused while they worked late. “Looking like you are playing, even when you are working, was a key part of the aesthetic, a way for Facebook to differentiate itself from the companies it wants to divert young employees from and a way to make everything seem, always, like a game,” she wrote. But even at the many parties, the coders had their laptops along and managed to get work done.30

  In fact, they loved their work so much that they created new features and new projects without even being asked, and sometimes explicitly without permission. Facebook Video was one such project: it was done after-hours (if there were after-hours at Facebook) as an experiment—at least until Zuckerberg decided to publicly announce it, to much acclaim. At that point, the programmers who’d begun it as a lark worked to the point of collapse to make sure it would launch on time. “It was like my body wouldn’t ever work again,” one of them told Losse.31

  The coders who were breaking their bodies were at least lavished with perks and praise. Meanwhile, customer care was women’s work: low paid, undervalued, not really considered work at all. At Twitter, for example, complaints from users about relentless abuse on the platform have been met with a steadfast refusal to hire support staff. Startup founders, Losse wrote elsewhere, have often relied on friends or girlfriends to do any work that required emotional labor. Silicon Valley later outsourced it to other countries, such as the Philippines, or even to refugee camps in Gaza, where the disturbing work of purging social networks of violence, porn, and anything else that might prove offensive to users was done for a fraction of what US wages would be. One article estimated the number of such workers at over one hundred thousand. Astra Taylor called the process “fauxtomation,” whereby actual humans perform jobs that most people probably assume are done by algorithm. It is the secret of Silicon Valley, nodded to by Amazon with its Mechanical Turk service—the Mechanical Turk was a gadget created centuries before the computer to, purportedly, play chess. Inside the Turk was a human making the decisions. Now Amazon’s “Turkers,” many of them inside the United States, do repetitive “microtasks” for pennies, but the myth of the genius programmer helps to mystify the work still being done by human hands and human minds.32

  The Silicon Valley workplace, created in the image of the boy king, seemed almost designed to erase the caring labor discussed in earlier chapters. No family, no friends, and no responsibilities outside of the office; within the office, all their needs are catered to, and toys are provided to make them feel eternally nineteen. (Facebook and Apple even offer egg-freezing to their employees, offering up a tech fix to the problem of work versus family, at least for a while, so that women, too, can abide by the “no families outside the workplace” rule.) It’s no wonder that the apps designed by all these man-children have been, collectively, dubbed “the Internet of ‘Stuff Your Mom Won’t Do for You Anymore.’” Need laundry done, dinner delivered, your house cleaned? There’s an app for that, and the app’s founders have no doubt been breathlessly hailed as technical geniuses, even though their real innovation is finding new ways to skirt labor laws. The result has been the gig economy—a patchwork of short-term non-jobs performed by nonemployees who are barely getting by.33

  Whether they be app-distributed gigs or jobs in Amazon’s warehouses, or even programming jobs themselves, the tech industry’s solution for the continuing need for humans to do deeply un-fun work has been “gamification.” Gamification is almost the antithesis of “playbor”—a way to pretend that the same old backbreaking manual work is “fun,” a game you can win. To make the work of packing boxes at Prime speeds less like, well, hard work, Amazon has introduced video games to the distribution center floor. The games have titles like “PicksInSpace” and “Dragon Duel,” and the employees can play alone or against one another—the latter bit designed to up the competition factor and perhaps encourage faster picking. One gamification expert explained that the games might “give a bump to workers’ happiness,” but can also be used to ratchet up productivity goals: “It’s like boiling a frog. It may be imperceptible to the user.” Uber has used gamification as well; so have call centers. And it’s being applied both in learn-to-code contexts and in the actual workplaces of software developers. Turn work into a game! What could be more fun? The problem, as artist and author Molly Crabapple acidly predicted years ago, is that “the prize is what used to be called your salary.”34

  The gamifiers are on to something—people hate drudgery, and no one expects to enjoy packing boxes or lifting them for an eight- or ten-hour shift. But it’s not being plugged into a game that makes work enjoyable or not. It’s autonomy that people value, and that is precisely what is being pitched with all those toys on the Facebook shop floor. “We trust you to get your work done,” the toys and perks imply. “You can decide how and when you do it and how and when you have fun.” With the feeling of autonomy comes the feeling that long work hours are a choice; they become a status symbol rather than a sign of unfreedom. As Miya Tokumitsu wrote, in Do What You Love, “The promise of worker autonomy is embedded in the ‘you’ of DWYL.”35

  But surveillance is as rampant in the tech industry as it is elsewhere. As early as the 1990s, Andrew Ross found that tech companies routinely monitored their workers. It shouldn’t be a surprise that companies like Facebook, who make their profits off extracting data, might want to keep an eye on their employees, or that the fallen WeWork, a real estate company that leased coworking spaces yet sold itself to investors as the techiest of tech companies, harvested a wellspring of data from the people who worked—and might have lived—in its buildings. WeWork pitched itself as “creat[ing] a world where people work to make a life, not just a living,” selling a version of the dream tech-industry workplace to the masses of freelancers on their own in the neoliberal economy. And the more time those workers spend at the office, the more data that can be extracted. Sleep pods, rare whiskies, steak dinners, and all the toys are designed to enclose the worker in the workplace, just as the social networks enclose users—they offer free tools that the user then feels unable to give up.36

  The company provides everything, in other words, that the tech worker needs to reproduce himself (and the worker is always assumed to be a HIM-self), leaving him free to focus solely on work. In this way, it fills the role less of his mother than his wife. The tendency of companies like Facebook to hire those boy kings means that the company is often shepherding them from youth to adulthood, filling that gap, perhaps, between mother and marriage. As video-game programmer Karn Bianco told me, when it comes time for slightly older workers to consider having a family of their own, they must create distance from the company and its desire to be all things to them.

  And while, for now, programmers are lavished with benefits and treated as irreplaceable, the capitalists of tech are also betting that their status won’t last. The plethora of “learn-to-code” boot camps are designed not as altruistic ways to get the working class into high-demand jobs (even the ones that promise to teach girls to code to counteract decades of industry sexism), but to drive down the cost of labor. Programming might be destined not to be a prestige field for wizards and boy kings, but rather, as Clive Thompson of Wired wrote, “the next big blue-collar job.” Some of the boot camps are out-and-out scams, like one that promises to pay you to learn—and then takes a cut of your salary for the next two years. But all of them will have the effect of making coders more common, and thus making the work less rarefied—and less well remunerated.37

  Mark Zuckerberg also has a plan to bring in lots of short-term workers from overseas. His immigration nonprofit, FWD.us, was created to lobby for immigration reform. That sounded nice in the age of Trump, but Zuckerberg’s main concern was increasing the number of H-1B guestworker visas for skilled workers. H-1B workers are tethered to a particular job; if they quit or get fired, they have to leave the country, which makes them spectacularly compliant as well as cheaper to hire.38

  All of this means that tech workers might have more in common with the industrial workers of midcentury than they might think. Silicon Valley touts itself as the “New Economy,” but it still relies on products that have to be built somewhere, and the tactics of offering perks on the job don’t work quite as well on the workers who build them. Elon Musk promised free frozen yogurt and a roller coaster to disgruntled employees at his Fremont, California, Tesla car factory—but the workers were complaining of injuries on the job because of the pace of production, and they didn’t want frozen yogurt to soothe their pains. They wanted a union.39
