Films from the Future

by Andrew Maynard


  The Disposable Workforce

  The first job I found myself in as a newly minted Doctor of Philosophy was not in a university lab, but in a government research center. In September 1992, I joined the British Health and Safety Executive as a research scientist (later moving into a similar role with the US National Institute for Occupational Safety and Health), and for the next thirteen years, I became deeply engaged in workplace safety. I was a full-on bench scientist for many of these years, conducting and leading lab-based research on airborne dust exposure (which, trust me, is more interesting than it sounds). But I also worked closely with health and safety professionals, as well as manufacturers and workers, and this gave me a deep appreciation of the risks that many people face in the places where they work, even when those workplaces use and produce advanced technologies.

  It’s often assumed that technological innovation makes workplaces cleaner and safer places to be. This, sadly, is a myth, and it’s one that I suspect is propagated in part by images of pristine clean rooms and sleek automated production lines. In many cases, of course, new technologies have led to improved working conditions. Yet the reality is that manufacturing at scale is often dirty and dangerous, even if the technology being manufactured is not. And this is one area where Elysium does a surprisingly good job of reflecting the reality that, no matter how advanced our technologies are, there’ll still be someone slaving away somewhere in an unsafe workplace to make the products we use, if we’re not careful.

  Of course, we’ve known for thousands of years that working for a living can be bad for your health—especially if you mine materials out of the ground, grow produce, or manufacture materials and products. And partly because of this, there’s a long history of privileged groups using less privileged people to do their dirty work for them. It wasn’t the rich, ruling classes that got their hands dirty building the Egyptian Pyramids or the Roman plumbing systems, or who mined the coal that drove the Industrial Revolution. Rather, it was those who had little choice but to sacrifice their health and longevity in order to put food on the table for their families. It would be pleasant to think that we live in more enlightened times, where no one has to take unnecessary risks to earn a living wage. Sadly, this is not the case. Elysium may be implausibly futuristic in some respects, but it’s right on the nose with its message that, even in a technologically advanced future, there’ll still be dirty, dangerous jobs, and rich people who are more than willing to pay poorer people to do them.

  Thankfully, there have been substantial improvements in working conditions over the past 100 years or so—in some countries, at least. This has been spurred on by a growing realization of just how socially and economically harmful it can be to treat workers badly. But this is a surprisingly recent development in human history, and one where new technologies have not always been synonymous with better working conditions.

  In 1977, my grandfather died of pneumoconiosis after decades of working as a coal miner. Even though he’d long moved on from working down the pit, the coal dust he’d breathed day in and day out had done its damage, and the progressive and irreversible scarring that resulted from it eventually killed him.

  Coal miner’s pneumoconiosis, or “black lung,” is caused by the constant inhalation of fine, insoluble dust particles, and a gradual and progressive deterioration of the lungs as they become inflamed and scarred. It’s a disease that has most likely plagued coal miners for centuries. Yet it wasn’t until the early to mid-1900s, at the tail end of the Industrial Revolution, that it began to be recognized as a serious occupational disease.68 Despite massive advances in technological innovation over the previous century, uncertainty in the science behind black lung delayed action on this occupational killer. This was an uncertainty that suited the mine owners, and one that they seemed to be in no hurry to address. In the 1800s and early 1900s, coal was what fueled the Industrial Revolution, and mining corporations and manufacturers couldn’t afford to acknowledge they might have a problem.

  It wasn’t until the 1940s in the UK that substantial steps were taken to improve workplace conditions down mines, following a growing recognition of how serious a challenge lung disease was amongst miners. Even then, pneumoconiosis continued to be an issue. And in the 1990s, fifty years after those first substantive steps to improve working conditions, I became involved in a new wave of efforts to address occupational lung disease in coal mines.

  The mines I visited back then—all in the northeast of England—were dusty, but not oppressively so. Yet there was a palpable tension between trying to comply with exposure regulations and struggling to remain solvent. In 1991, similar tensions had led to a scandal in the US coal mining industry when it was discovered that dust was either being removed from samples designed to monitor exposures, or the samplers were intentionally being misused.69 The intent was to make it look as if dusty mines were complying with federal regulations when they weren’t, putting profits over the lives of those mining the coal. Over 800 mines were implicated in the tampering scam, and the proposed fines that resulted exceeded $6 million.

  Similar concerns prompted some of my work in British coal mines, and one of my last visits down an English pit was to ensure samples weren’t being messed with (thankfully, they weren’t). The sad reality, though, was that, in this industry, and despite massive strides in understanding how to use technology to protect worker health, it was all too easy to cut corners in order to increase production. And even more sadly, despite living in one of the most advanced technological ages in human history, coal miners’ pneumoconiosis is once again on the rise. In spite of all the technological breakthroughs we’re surrounded by, companies are still sending people to work in environments that could severely shorten their lives, while not taking the necessary steps to make them safer, so that others can live more comfortably.70

  Coal mining is, of course, just one example of a workplace where tradeoffs are made between safety and productivity. In the US alone, there are close to 5,000 workplace-related fatalities a year, and in excess of 140,000 cases of workplace illness.71 In 2014, Jukka Takala and his colleagues published estimates of the global burden of injury and illness at work. From their analysis, there were 2.3 million workplace-related deaths globally in 2012, with two million of these linked to occupational disease.72 These are high numbers, and certainly not what might be hoped for in a technologically advanced society. Yet while technological innovation has made some workplaces safer, it has also displaced people into potentially more harmful working conditions; and the harsh reality is that, for many people, a dangerous job is better than no job at all. This is perhaps seen most clearly in the displacement of manufacturing to countries where wages are lower, regulations are weaker, and working conditions are poorer than they are in more affluent economies—for instance, in the manufacturing of clothing and electronics. Here, rather than saving lives, innovation is leading to people being potentially put in harm’s way to satisfy a growing demand for the latest technologies.

  Even with new and emerging technologies—for instance, the production of new materials using nanotechnology, or the use of genetically modified microbes to mass-produce chemicals in vast bioreactors—there is relatively little correlation between the sophistication of the technology and the safety of the environment in which it’s used. On the contrary, the more powerful the technologies we produce, the more opportunities there are for them to harm the first tier of people who come into contact with them, which includes the people who manufacture them, and in turn use them in manufacturing. This has been seen in an intense global focus on the workplace health risks of producing and using engineered nanomaterials73 (a topic we’ll come back to in chapter ten and The Man in the White Suit), and a realization that one of the greatest threats to workplace safety is not a lack of technological innovation, but ignorance of what might go wrong with novel technologies.

  But even where there is not a lack of understanding, greed and human nature continue to jeopardize workers’ health. In the case of Elysium, this tradeoff between profit and people is painfully clear. Max’s occupational “accident” has all the hallmarks of occurring within a company that sees its workforce as disposable, despite the fact that they are producing high-tech goods. The additional irony here is that those “goods” are robots that are designed to further suppress the earth-bound population. In this future society, the polarization between rich and poor has become so extreme that the poor have precious few rights remaining as they serve the lifestyles of the rich.

  How likely is this? If we don’t take workplace health and safety seriously, and the broader issues of social justice that it’s a part of, I’m sad to say that it’s pretty likely. The good news is that an increasing number of companies recognize these dangers, and are diligently implementing policies that go beyond regulatory requirements in order to ensure a healthy workplace. And they do this with good reason: The economics of accident and disease prevention make good business sense, as do the economics of fostering a happy and thriving workforce. Emerging thinking around concepts like corporate social responsibility and responsible innovation help here; so does innovative corporate leadership that actively strives to reduce social inequity and serve the needs of those who work for them.74 But the fiscal temptation to use cheap labor is sometimes a tough one to resist, especially when some people are willing to work for less and cut corners to get ahead of their peers. This is where preventing a future disposable workforce becomes the responsibility of everyone, not just employers or regulators.

  This is something of a moot point in Elysium, though, as Max and his fellow workers don’t have much of a choice in where they work and what they are required to do to make ends meet. Despite living in a highly automated future, they have work, but it’s not necessarily the work they would choose, given the chance. For them, automation didn’t deprive them of a job, but it did deprive them of choice. How realistic a reflection this is of the real world is debatable—this is, after all, Hollywood. Yet in one form or another, new technologies that lead to further automation are a growing issue within today’s society.

  Living in an Automated Future

  In September 2017, the Pew Research Center released the results of a comprehensive survey of public attitudes in the US toward robots and automation.75 The results should be taken with a pinch of salt, as these were opinions rather than predictions, and they come with all the usual challenges associated with asking people to predict the future. Yet they’re quite revealing when it comes to what people think about automation. Some of the results aren’t too surprising. For instance, some people who responded were worried about the prospect of robots replacing them in the future, and respondents generally didn’t like the idea of computers deciding who to hire and who not to. Other results in the survey were more surprising. For example, 56 percent of participants would not want to ride in a driverless vehicle, and of these, safety concerns were uppermost in their reasoning. And this is despite safety being one of the big arguments made for getting rid of human drivers.76

  As part of the survey, participants were asked what they thought the impacts of robots and computers would be on inequality. This was specifically framed in terms of what the outcomes would be if automation replaced many of the jobs currently done by people. Perhaps not surprisingly, the majority of participants (76 percent) thought that increasing automation of jobs would increase inequality.

  How this stacks up to how things are actually likely to play out is complex. As Erik Brynjolfsson and Andrew McAfee point out in their 2016 best seller The Second Machine Age,77 automation is radically changing the way we live and the work we do. The question that is challenging experts like Brynjolfsson and McAfee, though, is whether this will lead to a net reduction in jobs, or simply a change in the types of jobs people do. And it’s not an easy one to answer.

  Looking back over the recent history of automation, there have been pivotal shifts in the types of jobs available to people. There have also been industries that have been largely stripped of human labor. In the 1800s this was at the root of the Luddite movement (something we’ll revisit in chapter nine), as textile artisans began to see their skills being replaced by machines and their livelihoods taken away. And since then, every wave of automation has led to further job losses.

  But, at the same time, new jobs have been created. When I was finishing high school, and going through the tedium of career advice, many of the jobs that people now do hadn’t even been invented. Web designer, app coder, Uber driver, cloud computing expert, YouTube creator, smart-city designer, microfinance manager, and so on—none of these appeared in the brochures I was encouraged to digest. There’s no question that, over the past few decades, the job market has radically changed. And this has been driven by technological innovation, and to a large extent by automation.78

  To some, this suggests that we are nowhere near the limit of our capacity to create new things that people can and will pay for, and all that automation does is create new opportunities for enterprising humans to make money. This is not a universally held view, and there are many economists who worry that emerging technologies will lead to a serious net reduction in jobs. Judging from the Pew survey, many others share these concerns, and while this is based on impressions and gut feeling rather than hard evidence, it’s probably justified in one respect: Increasing automation will replace many of the jobs people do today, and unless people have the capacity to develop new skills and switch jobs and career paths, this will lead to job losses. And this in turn leads us to the challenges of ensuring people have access to the educational resources they need as technological innovation continues to transform our world.

  Education is one of those issues that is both critical to social and economic growth, and at the same time deeply contentious. Everyone, it seems, has an opinion on what a “good education” is, and how we should be “educating” people. As a teacher, and someone who’s married to one, I find it hard to escape the deeply entrenched opinions and politics that surround education, and the sheer number of people who think they know what’s best, whether they know what they are talking about or not. And yet, despite all of the politicking, there is one cold, hard truth as we develop increasingly sophisticated technologies: If our educational thinking, approaches, and resources don’t keep up with the future we’re creating, people are going to suffer as a result.

  How to address this, of course, is challenging. But there are an increasing number of initiatives to address the emerging educational needs of the industrial and technological revolution we’re in. In my own institution at Arizona State University, for instance, there’s a growing recognition that bricks-and-mortar universities simply don’t have the capacity to serve the needs of a growing global population that’s hungry to develop the knowledge they need to thrive.79 In a future where unique skills are needed to ride the wave of radical technological change, we’re going to need equally radical innovation in how over seven billion people are going to acquire these skills. Online learning is beginning to fill some of the gaps here, but this is just a start. If we are going to avoid increasing automation and technological complexity marginalizing a growing number of people, we’re going to need to start thinking hard and fast about what we teach, how we teach, and who has access to it. More than this, we’re going to have to recalibrate our thinking on what we mean by “education” in the first place.

  In 2005, a new video-sharing platform was unleashed onto the world. Now, YouTube is the second-largest search engine globally, and the third most-visited site after Google and Facebook. It’s also where more and more people are turning to learn what they need in order to succeed. Over a billion hours of YouTube are watched every day, and while much of this is not educational content, a surprising amount of it is.

  As an educator, I must confess to being somewhat leery of YouTube, despite using the platform extensively myself.80 It remains a Wild West of educational content, where anyone can try to convince you of anything, whether it’s right or wrong. And yet, YouTube is increasingly where people go to learn,81 whether it’s how to tie a bowtie, put on makeup, plumb a sink, or ace an interview. This is a platform where people are sharing what they know with others, outside of the barriers, constraints, and politics of formal education. And it’s where users are learning how to learn at their own pace, and on their own terms. YouTube, and online video-sharing platforms more broadly, are a grassroots revolution in casual, user-directed learning, and one that I suspect is only going to increase in relevance as people discover they need new skills and new knowledge to succeed in what they are doing.

  Of course, YouTube videos are no substitute for a formal education. There is a depth and quality to learning from professionals within a structured environment that still has substantial value. And yet, there is a deep desire among many people to learn on their own terms, and to develop the knowledge and skills they need, when they need them, that isn’t being met by formal educators. And while educational establishments are trying to meet at least some of these needs with innovations like Massive Open Online Courses (or MOOCs) and “micro-credentials,” they are still barely connecting with what people are looking for.

  As YouTube and other video-sharing platforms democratize learning, how can we ensure that users have access to material that is useful to them, and that this material is trustworthy? The latter question in particular is a tough one, as pretty much anyone can upload their own content onto YouTube. Yet over the past several years, there’s been a trend toward trusted content creators providing high-quality educational material on the platform.

 
