Hustle and Gig

by Alexandrea J. Ravenelle


  As the number of people seeking to supplement their salaries continues to increase, the sharing economy finds itself with a glut of potential workers. Basic economics teaches us that when supply increases while demand remains static, prices must fall. The terminology is different when applied to workers, but the concept remains the same: an excess of available workers allows companies to “churn and burn” without fear of losing their workforce. One can’t help but be reminded of the scene in The Jungle where injured workers are quickly replaced by those waiting outside the gate, clamoring for an opportunity to be hired next.

  Placing the sharing economy in the context of larger social changes demonstrates that it is simply the next step in the erosion of the employee-employer social contract. But there are solutions: programs and policies that can address the need for workplace flexibility while also protecting workers.

  PLACING THE SHARING ECONOMY IN CONTEXT

  During World War II, large employers began offering health insurance as a way around wartime wage controls. After the war ended, workers and companies, especially the Big Three automakers (General Motors, Ford, and Chrysler), began to address grievances that had arisen during the 1940s. The so-called Treaty of Detroit was a five-year contract that protected automakers from annual strikes in exchange for extensive health, unemployment, and retirement benefits.18 Workers also received annual cost-of-living adjustments to wages and increased vacation time. While this was originally an agreement with just three companies, similar contracts soon followed in the steel industry. By the early 1960s, more than half of the union contracts in the United States had copycat provisions calling for cost-of-living adjustments.19 With labor unions representing roughly a third of all workers, other workplaces soon followed suit in order to keep workers happy and reduce the likelihood of unionizing.20 Linking benefits to a specific workplace became de rigueur. As noted by the Century Foundation, “The Treaty of Detroit reflected two choices that shaped work over the next several decades: first, a recognition by business that the security and well-being of its workers was in its own interest; second, a decision by labor that it was better off obtaining benefits linked to a specific employer than waiting for government to act.”21

  While the relationship between workers and employers is often thought of in more adversarial terms today, especially in regard to the sharing economy, this was not always the case.22 Before World War II, employers often followed the scientific management model of Taylorism—which focused on increasing productivity by simplifying jobs into discrete tasks, measuring productivity, and linking pay to performance—in an effort to turn “workers into cogs in an industrial machine.”23 After the war, companies adopted the gentler human-relations perspective, a management philosophy that was propounded by Elton Mayo and other sociologists and industrial theorists and based on extensive studies at Western Electric.24 Sometimes called the “happy worker model,” the philosophy was simple: the best way to increase productivity—and discourage unionization—was to keep workers happy.25

  The focus on worker happiness took a backseat to the bottom line beginning in the 1980s. A surge in steel and automobile imports, along with the 1981–1982 recession (described as the worst recession since the Great Depression, until the Great Recession), led to the perception that companies needed to control costs down to the penny. The deregulation of trucking, airlines, and telecommunications allowed for a wave of start-ups, but it also put pressure on large, unionized companies. Finally, President Ronald Reagan’s firing of 11,500 striking air traffic controllers, and the disbanding of their union, paved the way for other companies to adopt similarly hard-nosed tactics. Although the Supreme Court had ruled in 1938 that companies could hire permanent replacements for striking workers, few had dared to do so before the 1980s. In short order, striking workers at the timber company Louisiana-Pacific, miners at Phelps Dodge, pilots at Eastern Airlines, and paper workers at International Paper found themselves replaced.26

  In the 1990s, white-collar workers also found themselves on the losing end of a changing workplace social contract. Job-cutting executives such as “Chainsaw Al” Dunlap, “Neutron Jack” Welch, and “Irv the Liquidator” Jacobs implemented mass layoffs in an effort to save billions of dollars.27 Workers found that layoffs—once limited to economic downturns and periods of financial duress—became routine, even when companies were thriving. In 1994, Procter & Gamble reported that profits rose more than 13 percent in its second quarter after a round of cost cutting that included eliminating thirteen thousand jobs and closing thirty plants worldwide.28 A New York Times analysis of Labor Department numbers found that more than forty-three million jobs were erased in the United States between 1979 and 1995, and that the job losses increasingly fell on “higher-paid, white collar workers, many at large corporations.” A poll conducted in conjunction with the paper’s coverage of the layoffs found that “nearly three-quarters of all households had a close encounter with layoffs” between 1980 and 1996, and that in a third of all households, a family member had lost a job.29

  In some cases, the lost jobs were replaced with automation as computers and software made certain jobs and procedures redundant. In other cases, work expectations were simply ratcheted upward as workers, anxious that they would lose their jobs in the next round of layoffs, pushed themselves to do more with less. The layoffs were also used to shed full-time employees, replacing them with outsourced services such as call centers, staffing companies, and perma-temps. The popular business titles of the age are instructive: The Overworked American, Mean Business, Lean and Mean, The White Collar Sweatshop, and The Disposable American.30

  This focus on production—and on the expendability of workers—echoed the early industrial age, when companies sought to wring every last minute of work out of their workers. Corporations were no longer using the Pinkerton National Detective Agency to brutalize or intimidate workers, but beginning in the 1990s, software made it possible to track workers’ keystrokes and to monitor exactly how long tasks took and whether workers took unauthorized breaks.31 And as computers made it easier to track workers, companies also had an easier time determining when—and if—they needed workers at all. With the rise of just-in-time scheduling, more and more workers became temps or independent contractors. By the end of 1998, temporary staffing had become a fifty-billion-dollar-a-year business in the United States, with one in five U.S. corporations reporting that temps made up at least 10 percent of their workforce. Temping, once a short-term solution for students on summer break, retirees, and those between jobs, became a long-term career option. At one point, Microsoft alone employed five thousand temps, including fifteen hundred who had worked for the company for a year or more.32

  While some temporary workers found themselves in long-term employment contracts, they were still outside the social safety net in many ways. Often hired through a staffing agency, temps did not qualify for the same workplace perks as their colleagues, including retirement contributions, paid time off, and raises and promotions. Even their interactions with peers were affected by their temporary status. In a 1998 interview in Fast Company, Jeff Kelly, publisher of Temp Slave, described temping as similar to “being an alien. There’s such insecurity. No benefits. You never know when your assignment is going to end. Your coworkers treat you as if you’re a threat to their livelihood. It’s abominable.”33

  As white-collar corporate America began to rely on temps, service and retail work increasingly became part-time work managed via just-in-time scheduling, further increasing the uncertainty faced by employees. During the Great Recession, the number of part-time workers who would have preferred full-time work skyrocketed from approximately 4 million to more than 9 million.34 By 2014 the number had dropped, but at 7.5 million it remained almost double what it was in 2007, before the start of the recession. In addition to working fewer hours than they would like, workers also found themselves with less control over their schedules as employers moved to computerized scheduling systems. Such systems allow employers to slot workers into shifts that correlate with times of expected demand, and they lead to shift changes with little notice. The Bureau of Labor Statistics notes that 47 percent of part-time hourly workers ages twenty-six to thirty-two receive a week or less of advance notice for their schedules.35 Sometimes workers arrived at work only to be told that the computer indicated sales were slow and they weren’t needed for their shift.36 Such constantly-in-flux schedules can undermine workers’ ability to arrange childcare, take college classes, manage a second job, or earn sufficient income.37

  Temporary workers, just-in-time employees, massive layoffs—the sharing economy is just the newest (technological) innovation in treating workers shabbily. It combines the no-obligations-attached workforce of temps with the convenience of app-based, on-demand scheduling. For corporations, the sharing economy offers the best of just-in-time scheduling, temp-agency outsourcing, and down-to-the-penny accounting. In the words of one CEO, “You can hire 10,000 people for 10 to 15 minutes. When they’re done, those 10,000 people just melt away.”38

  More and more of us may “melt away” in the future. In 2013, the global sharing economy market was valued at $26 billion, and some predict it will grow into a $110-billion revenue market in the coming years, making it larger than the U.S. chain-restaurant industry.39

  This “new” economic movement is part of a larger trend toward reshaping the employer-employee social contract and changing expectations of what employers offer their workers. While promising innovation, the sharing economy marks a return to an earlier industrial age in which workers were without a safety net.

  GOOD JOB, BAD JOB, OR NO JOB?

  In “Why Are There Still So Many Jobs? The History and Future of Workplace Automation,” David Autor notes that the rise of automatic teller machines did not decrease the number of bank tellers but allowed them to move into more advanced positions as salespeople who could introduce customers to a variety of new products. The gig economy has also allowed workers to move into sales, as they market and sell their labor on digital platforms.40 The selling of the self is a continuation of the personal ownership message that Jacob Hacker discusses in The Great Risk Shift, where economic risk shifts from businesses and government to the average American worker.

  In the new economy, workers are told to improve their market value through training and networking, and to market themselves as a brand and business, “the CEO of Me Inc. . . . willing to temporarily assist other, larger businesses.”41 As a result, “a central concept in this economized, individualized view of the world is ‘responsibilization.’”42 Although citizens demonstrate some autonomy in choosing where to work and live and how to spend their time, there are “no rights without responsibilities.”43

  This message of personal responsibility, part of a neoliberal ideology, gets a further boost from the entrepreneurial ethos of the gig economy platforms. The companies claim the sharing economy allows workers to “be your own boss.” Work is arranged according to your schedule (“work when you want”) and controlled by the worker (“find jobs you love” and “only you decide . . . where to drive” or “who to host”). Workers are told they are empowered to pick their payday (“push a button and get paid whenever you want”).

  This message ignores how constrained those choices actually are. Workers are turning to these gig jobs out of a dearth of other options: with stagnating salaries and high levels of income volatility, there are few paths to economic stability. Furthermore, the amount of money workers can make is directly affected by the number of fellow workers competing for a limited pool of gigs or potential rides. The platforms benefit from having a large, ready, willing, and able stable of potential workers on hand to meet the on-demand needs of potential clients. But workers make more money, and have more control over their own work, when there are fewer of them. Decreases in the payment rate, coupled with increased commissions to the platforms, further diminish the “control” workers can exert over their destiny even as they are urged to take greater responsibility for it.
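
  A back-of-the-envelope sketch makes the squeeze concrete. Everything in the snippet below is an illustrative assumption rather than data from any platform: the hourly_earnings function, the fixed pool of weekly customer demand, and the worker counts and commission rates are all hypothetical. It simply illustrates the point above: with demand fixed, each worker’s share of gross bookings shrinks as more workers sign on, and the platform’s commission cuts it further.

    # Hypothetical sketch: every figure here is an assumption, not platform data.
    def hourly_earnings(weekly_demand, num_workers, commission, hours_per_week):
        """Split fixed gross bookings evenly, then subtract the platform's cut."""
        gross_per_worker = weekly_demand / num_workers
        net_per_worker = gross_per_worker * (1 - commission)
        return net_per_worker / hours_per_week

    # Assume $600,000 per week of customer demand and 25 hours per worker.
    for workers, commission in [(500, 0.20), (1000, 0.20), (1000, 0.25)]:
        rate = hourly_earnings(600_000, workers, commission, 25)
        print(f"{workers} workers, {commission:.0%} commission -> ${rate:.2f}/hour")

    # 500 workers, 20% commission -> $38.40/hour
    # 1000 workers, 20% commission -> $19.20/hour
    # 1000 workers, 25% commission -> $18.00/hour

  Under these assumptions, doubling the worker pool halves each worker’s hourly take before the platform raises its commission at all.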

  While services market themselves as offering peer-to-peer opportunities, making it easier for a busy parent to hire someone to do the grocery shopping, or to hire—and track—an Uber driver picking a child up from soccer practice, the ease with which workers can be hired or discarded has not escaped the notice of business managers. Numerous TaskRabbits and Kitchensurfing workers I interviewed noted being hired by companies for everything from making dinners for corporate meetings to assembling Ikea furniture for start-ups. Uber and Airbnb are now being accepted as business travel expenses.44 In September 2017, Ikea announced that it was acquiring TaskRabbit.45 One of the ways I knew I had reached theoretical saturation with my research sample was that several TaskRabbits told me about the same task: stuffing bags for a Brooklyn coffee company that markets itself as offering fair-trade coffee. None of the workers I interviewed who did gig work for businesses questioned working for an established company, doing the type of manual or service labor that would normally fall under various workplace protections.

  In Good Jobs, Bad Jobs: The Rise of Polarized and Precarious Employment Systems in the United States, Arne Kalleberg notes that there have been seven major changes to job quality in the past few decades. These changes include increased polarity between good and bad jobs in addition to an overall increase in job precarity and insecurity. Many of the good jobs that have been lost have been replaced with jobs of lower quality and pay, and workplace flexibility policies continue to lag in implementation and usage.46 Kalleberg also notes the importance of human and social capital, writing, “Transformations in work have underscored the growing importance of skills for labor market success; workers with more human and social capital are better able to take advantage of opportunities created by the greater marketization of employment relations. While more-educated and higher-skilled workers may not necessarily have more job security with a particular employer, their more marketable skills enhance their labor market security, which, in turn, generally provides them with higher earnings, greater control over their jobs, higher intrinsic rewards, and better-quality jobs overall.”47

  In the sharing economy, it’s true that higher levels of skills and capital make a difference—Airbnb hosts and traditional Kitchensurfing chefs seem to be more likely to be Success Stories or Strivers than Strugglers and are more likely to identify as entrepreneurs. But the second half of Kalleberg’s claim doesn’t seem to apply here: in the sharing economy, higher levels of education don’t increase labor market security or lead to better quality jobs. If anything, the sharing economy appears to add to inequality by turning even low-prestige, low-education work—occupations of last resort—into part-time positions for the well-educated. As Juliet Schor has noted, the sharing economy makes it possible for college-educated workers to hire college-educated housecleaners.48 Marriage has long been identified as a so-called luxury good. As the wealthy and well-educated marry each other, those with lower incomes and education levels are seen as less desirable and are less likely to get married. Likewise, the sharing economy turns paid employment into a luxury good that is increasingly accessed by better educated, technologically adept workers with smartphones and dependable data networks.

  Arun Sundararajan, author of The Sharing Economy: The End of Employment and the Rise of Crowd-Based Capitalism, supports the sharing economy, arguing that it has generated positive spillover effects by putting underused assets to work and expanding economic opportunity. Yet, even Sundararajan has noted that sharing economy platforms blur the lines between the personal and the professional, and between employment and casual work, and may spell the end of traditional employment.49

  The end of “traditional employment,” with its mind-numbing nine-to-five grind, may not be a bad thing: while numerous companies offer flexible work arrangements, their usage is often stigmatized.50 Work by Pamela Stone demonstrates that the so-called opt-out of well-educated women with children is caused by a lack of flexible workplace policies: flexible hours are important to ensuring the workplace tenure of women with small children and others with child or family care responsibilities.51 An increase in freelance work, with its focus on deliverables as opposed to face time, and higher hourly wages could be a boon to workers seeking time and location flexibility.52 Or, the lack of workplace security and dependable income could make “taken-for-granted models for organizing one’s life” essentially unattainable.53

  FREELANCE WAGE OR GIG ECONOMY MIRAGE?

  Sharing economy services such as Uber and TaskRabbit argue that their workers also command premium incomes. In 2014, an Uber blog post describing drivers as “small business entrepreneurs” noted that “the median income on uberX is more than $90,000/year/driver in New York and more than $74,000/year/driver in San Francisco.”54 However, tweets from Josh Mohrer, the general manager of Uber in New York City, and statements from Lane Kasselman, Uber’s head of communications for the Americas, have noted that drivers earn an average of $25 to $25.79 an hour after Uber’s commission. As Alison Griswold points out, “Even at $25.79, $90,000 is a tough mark to hit. You’d need to work 70 hours a week for 50 weeks a year.”55
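
  Griswold’s arithmetic is straightforward to verify. A minimal check, using the $25.79 hourly figure and her stated assumption of a fifty-week year:

    # Checking the $90,000 claim against the hourly figure quoted above.
    HOURLY_RATE = 25.79      # Uber's stated average hourly earnings after commission
    TARGET_INCOME = 90_000   # the median driver income claimed in Uber's blog post
    WEEKS_PER_YEAR = 50      # Griswold's assumption: two weeks off

    hours_per_week = TARGET_INCOME / (HOURLY_RATE * WEEKS_PER_YEAR)
    print(f"{hours_per_week:.1f} hours per week")  # -> 69.8 hours per week

  Nearly seventy hours a week, every working week of the year, just to reach the advertised median.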

 
