Throwing Rocks at the Google Bus: How Growth Became the Enemy of Prosperity

by Douglas Rushkoff


  Today, it’s MIT’s Brynjolfsson and McAfee who appear to be leading the conversation about technology’s impact on the future of employment—what they call the “great decoupling.” Their extensive research shows, beyond reasonable doubt, that technological progress eliminates jobs and leaves average workers worse off than they were before. “It’s the great paradox of our era,” Brynjolfsson explains. “Productivity is at record levels, innovation has never been faster, and yet at the same time, we have a falling median income and we have fewer jobs. People are falling behind because technology is advancing so fast and our skills and organizations aren’t keeping up.”45

  However, in light of what we know about the purpose of the industrial economy, it’s hard to see this great decoupling as a mere unintended consequence of digital technology. It is not a paradox but the realization of the industrial drive to remove humans from the value equation. That’s the big news: the growth of an economy does not mean more jobs or prosperity for the people living in it. “I would like to be wrong,” a flummoxed McAfee explained to MIT Technology Review, “but when all these science-fiction technologies are deployed, what will we need all the people for?”46

  When technology increases productivity, a company has a new excuse to eliminate jobs and use the savings to reward its shareholders with dividends and stock buybacks. What would have been lost to wages is instead turned back into capital. So the middle class hollows out, and the only ones left making money are those depending on the passive returns from their investments.

  Digital technology merely accelerates this process to the point where we can all see it occurring. As Thomas Piketty’s historical evidence reveals, the ever-widening concentration of wealth is not self-correcting. Capital grows faster than the rest of the economy. Or, in even plainer language, those with money get richer simply because they have money. Everyone else—those who create value—gets relatively poorer. In spite of working more efficiently—or really because of it—workers get a smaller piece of the economic pie.

  This income disparity is not a fact of nature or an accident of capitalism, either, but part of its central code. Technology isn’t taking people’s jobs; rather, the industrial business plan is continuing to repress our ability to generate wealth and create value—this time, using digital technology. In other words, the values of the industrial economy are not succumbing to digital technology; digital technology is expressing the values of the industrial economy. The recent surge in productivity, according to Piketty, has taken this to a new level, so that the difference between capital and labor—profit and wages—is getting even bigger.47 Leading-edge digital businesses have ten times the revenue per employee of traditional businesses. Those who own the platforms, the algorithms, and the robots are the new landlords. Everybody else fights it out for the remaining jobs or tries to squeeze onto the profitable side of the inevitable power-law distribution of freelance creators.

  But the beauty of living in a digital age is that the codes by which we are living—not just the computer codes but all of our laws and operating systems—become more apparent and fungible. Like time-lapse film of a flower opening or the sun moving through the sky, the speed of digital processes helps us see cycles that may have been hidden from us before. The fact that these processes are themselves composed of code—of rules written by people—prepares us to intervene on our own behalf.

  THE UNEMPLOYMENT SOLUTION

  A good programmer always begins with the question What problem are we trying to solve? So let’s look at our situation from the digital perspective: Are we looking for new ways to grow the economy? Or are we trying to figure out how to get people jobs? Sure, it’s a better goal than abstract, senseless, environment-depleting growth. But is it the ultimate aim here? Is this the most foundational question we can ask?

  Perhaps so. Both the business and the technology press are filled with stories about how computers and robots change employment. In politics, almost any issue comes down to an argument to create jobs. War, immigration, housing, energy, budget, fiscal, and monetary policy debates all find their footing in employment for Americans: How do we get people back to work? How do we bring jobs back from overseas? How does the price of oil affect jobs? How do we raise the minimum income without its costing any jobs? How can we retrain our workforce for the jobs of tomorrow? It’s as if the highest moral good and core human need is jobs.

  I’m not so sure it should be. People want stuff. They want food, shelter, entertainment, medical care, a connection to others, and even a sense of purpose. But employment—a job one goes to, clocks in, does some work, clocks out, and returns home from—isn’t really high on the hierarchy of needs for most of us. Dare we admit it: who really wants a job? Yet we are convinced that unemployment is necessarily a bad thing. Free-market advocates use high unemployment figures as proof that Keynesian-style government spending doesn’t really move the needle. Leftists use the same figures to show that corporate capitalism has reached its endpoint: investors make money in the stock market while real people earn less income, if they can find jobs at all.

  The seemingly endless “jobless recovery” makes no sense at all, particularly at a time when many of us are working longer hours as overextended freelancers or the nominally unemployed than we did when we had real jobs. It’s hard to imagine how this all looks to young people just graduating college, who now chase unpaid internships with more energy than previous generations sought paying work.

  But what if joblessness were less of a bug than a feature of the new digital economy?

  We may, in fact, be reaching a stage of technological efficiency once imagined only by science-fiction writers and early cyberneticists: an era when robots really can till the fields, build our houses, pave our roads, and drive our cars. It’s an era that was supposed to be accompanied by more leisure time. After all, if robots are out there plowing the fields, shouldn’t the farmers get to lie back and enjoy some iced tea?

  Something is standing in the way of our claiming the prosperity we have created. The toll collector whose job is replaced by an RFID “E-ZPass” doesn’t reap the benefit of the new technology. When he can’t find a new job, we blame him for lacking the stamina and drive to retrain himself. But even if he could, digital solutions require, on average, less than one tenth the human employees of their mechanical-age predecessors. And what new skill should he go learn? Even the experts and educators have little idea what gainful employment will look like just five years from now.*

  In fact, jobs are a relatively new approach to work, historically speaking. Hourly-wage employment didn’t really appear until the late Middle Ages, with the rise of the chartered corporation.48 Craftspeople were no longer allowed to make and sell goods; they had to work for these protocorporations instead. So people who once worked for themselves now had to travel to the cities and find what became known as “jobs.” They no longer sold what they made; they sold their time—a form of indentured servitude previously known only to slaves. The invention of the mechanical clock coincided with this new understanding of labor as time and made the buying and selling of human hours standard and verifiable.

  The time-is-money ethic became so embedded in our culture that putting in one’s hours now feels like an essential part of life. What do you do? Yet jobs were not invented to give us stable identities. They were simply a part of the growth scheme: a way to monopolize the creative innovation and hard labor of the earlier free marketplace. Now that the labor is no longer needed—or is so easily accomplished by machines—must we still keep the jobs?

  Not to work feels unethical. Even our society’s favorite billionaires are “self-made,” which, in a reversal of aristocratic values, lends an air of respectability to their wealth that passive inheritors now lack. But if we can separate the notion of employment from that of making a valuable contribution to society, a whole lot of new possibilities open up for us.

  Our industrial capabilities have surpassed our requirements. We make more stuff than we can use, at least here in the developed world. Even middle-class Americans rent storage units for their extra stuff. Our banks are tearing down foreclosed homes in multiple U.S. states in order to prevent market values from declining.49 Our Department of Agriculture is storing, even burning, surplus crops to stabilize prices for industrial agriculture.* There is more than enough to go around.50 Why don’t we give those houses to the homeless, or that food to the hungry?

  Because they don’t have jobs. Letting them just have stuff does not contribute to the great growth imperative.

  Instead, we’re supposed to think of new, useless things for these folks to make, then market those things to the rest of us, so that we go buy them, dispose of them, and then create more landfill. All in the name of growth. It’s as if we expect consumers to fuel the production of unnecessary goods just so that people can put in more hours of work they’d rather not be doing. We’re not looking to create jobs because we need more things. We employ people because otherwise we have no way to justify letting them share in a bounty created without their labor.

  To most of us, this is just “the way things are,” and to question the arrangement goes against centuries of precedent. Fortunately, all the reasons against overturning the scheme are based solely on the growth requirements of the industrial economic operating system—not on reality. Alternatives to the dehumanization scheme and its impact on work in the twenty-first century and beyond require challenging the underlying assumptions of this system and drawing more-direct lines between what people need and what they can provide. Here are a few possibilities, presented less as fully fleshed-out policies ready to be implemented in one nation or another than as examples of the kinds of thinking we need to be able to do and the sacred truths we must be willing to reevaluate. Underlying them all is the implicit suggestion that our biggest challenge may be learning how to say “enough.”

  1. Work Less: Reduce the 40-Hour Workweek

  We generally start any conversation about employment with the holy 40-hour workweek and work back from there, retrofitting the rest of our business and economic metrics to this fixed value. It’s time we accept the truth: we have gotten so efficient at production that we don’t really need everyone employed 40 hours a week anymore. We have to remap our time and labor in a way that’s appropriate for a postindustrial society. This does not have to happen all at once, but we do have to develop a path toward less work.

  Early efforts have been very promising for business and people alike. Juliet Schor, a sociologist at Boston College, believes we must overcome our fear of appearing fanciful or naïve and get on with the business of reducing work hours.51 Her research shows that more working hours do not lead to a better economy, a better environment, or a better quality of life. Countries that have just begun instituting worktime reduction already have smaller carbon footprints than those that haven’t. Schor has also shown how spending fewer hours on the job frees people to pursue the sorts of things they already do for free and that ultimately contribute even more to the economy—from caring for the sick to teaching children. In the words of New Economics Foundation researcher Julia Slay, “What would the cost to your business be if your workers were never potty trained?”52 Such value is treated as subservient to the money economy, when it is simply labor unrecorded or, as Lanier would put it, off the books.

  Shortening the workweek has a profound effect on many interdependent systems. People have time to do things more slowly, such as walking to work, which uses less carbon. Shortening the workweek gives more people the opportunity to share available work, a form of engagement and participation that improves mental health and creates social bonds.53 It also reduces overtime and work overload, both of which are statistically linked to mental illness and cancer. Other studies show that working fewer days promotes more civic and community engagement. People’s perception of themselves as “citizens” and their time commitment to social issues54 both increase.

  Even in the United States, recent experiments in shortening the workweek have panned out better than expected. In 2008, Utah instituted a four-day working week for public employees by offering them the opportunity to shift from five 8-hour days to four 10-hour days. Fifty percent of the 18,000 people who participated reported that they were more productive, while a full 80 percent asked to maintain the new schedule after the experiment was over, citing benefits to their relationships, families, and general well-being. The reduction in overtime payments and absenteeism saved the state $4 million and reduced carbon emissions by 400,000 metric tons that year. And this was with no reduction in actual hours.55 In California, Amador County workers initially protested when their worktime was reduced 20 percent, from five days to four, in order to justify a 10 percent reduction in their pay. Two years later, when they were offered the option of going back to a 40-hour workweek, 79 percent voted to stay at the reduced hours and pay.56

  Just how strange would it be for successfully automating businesses to phase out work or at least wind down the hours? How about doing it without reducing employee participation in the profits? Not surprisingly, digital companies are some of the first to experiment with shorter weeks that don’t punish employees. Treehouse, an online education startup, adopted a four-day workweek and grows by an average of 120 percent a year.57 Productivity platform Basecamp has also instituted a four-day week because, as CEO Jason Fried explains, “when there’s less time to do work, you waste less time. . . . You tend to focus on what’s important.”58 The Basecamp platform has become an industry standard in the startup community, so maybe its approach to enterprise will spread as well.

  2. Rewrite the Employee-Company Contract: Share Productivity Gains

  More important even than the increased worker efficiency enjoyed by companies with shorter weeks is the improvement in the health, well-being, and satisfaction of the human beings these companies were built to serve. While passive investors should enjoy the benefits of increasing productivity, so, too, should those who invested sweat equity. Most companies still use increased productivity as an excuse to cut jobs and then pay the savings back to the shareholders as dividends or stock buybacks. It’s Corporatism 101, but ultimately a flawed, short-term approach—especially when productivity gains are spread across so many industries at once. Companies are amputating their human resources while also spoiling their own and everyone else’s customer base by taking away their jobs. And all the while, digital productivity gets blamed for the obsolete business model it’s accelerating.

  Firms willing to consider changing previously unmovable pieces of the puzzle, such as work hours, also gain a competitive advantage in attracting and retaining the best talent. (You want to work Tuesdays, Wednesdays, and alternate Thursdays? No problem!) Moreover, keeping a reserve of available hours positions a company to take advantage of sudden bursts in activity, or a rush of new contracts, without having to hire and train new employees (only to fire them a few months later). In digital parlance, this means the company is more “resilient.” It is a less brittle strategy in that it distributes the available work hours to many people instead of overemploying some and unemploying everyone else.

  If, thanks to a new technology, workers become much more productive, a company doesn’t have to fire a bunch of them and pass all the savings up to the shareholders. It can instead share the spoils with those workers or—if accelerated productivity outpaces demand—pay them the same salary to work fewer days.

  A reduction in workdays is just one of many possible ways to contend with a paucity of available jobs. LinkedIn founder Reid Hoffman envisions digital technologies (like his networking platform) enabling people to abandon the end-to-end employment solutions of yesteryear and adopt a more temporary, improvisational approach to their careers.59 Instead of seeking a job and then giving years to an employer in return for money, professionals will engage with companies for a specific purpose—more like a campaign. These “alliances” will last an average of eighteen months, during which a new product or division might be launched, a financial problem rectified, or a creative challenge solved. The project itself becomes part of the worker’s portfolio, and the worker is engaged less as an employee than as a partner in the project.

  The devil is always in the details: Isn’t this a recipe for exploitation? When everyone’s essentially a work for hire, what happens to the collective bargaining power once offered by labor unions? Would COBRA cover people’s health insurance between engagements that might be years apart? What about pensions? Again, imagination and flexibility are required. New forms of organized labor—like the Freelancers Union—will emerge, and older, preindustrial ones like guilds will likely be retrieved. These sorts of changes don’t happen overnight but incrementally and after much trial and error.

  The beauty of such possibilities, from the perspective of charting a twenty-first-century career, is that they offer a glimpse of an employment path structured around the needs of real people today rather than the priorities of thirteenth-century factory owners who have long since left this realm. In nearly all these strategies, the underlying shift is away from hours served and toward value created. It’s less symbolic and more real, less based in legacy systems and more grounded in current productivity. Instead of tying workers and our entire economy to the industrial-age machine, we reprogram our economy from the ground up.
