The Age of Surveillance Capitalism


by Shoshana Zuboff


  Investors deem Google “harder to catch than ever” because it is unmatched in its combination of infrastructure scale and science. Google is known as a “full stack AI company” that uses its own data stores “to train its own algorithms running on its own chips deployed on its own cloud.” Its dominance is further strengthened by the fact that machine learning is only as intelligent as the amount of data it has to train on, and Google has the most data.26 By 2013, the company understood that its shift into the “neural networks” that define the current frontier of artificial intelligence would substantially increase computational demands and require a doubling of its data centers. As Urs Hölzle, Google’s senior vice president of technical infrastructure, put it, “The dirty secret behind [AI] is that they require an insane number of computations to just actually train the network.” If the company had tried to process the growing computational workload with traditional CPUs, he explained, “We would have had to double the entire footprint of Google—data centers and servers—just to do three minutes or two minutes of speech recognition per Android user per day.”27

  With data center construction as the company’s largest line item and power as its highest operating cost, Google invented its way through the infrastructure crisis. In 2016 it announced the development of a new chip for “deep learning inference” called the tensor processing unit (TPU). The TPU would dramatically expand Google’s machine intelligence capabilities, consume only a fraction of the power required by existing processors, and reduce both capital expenditure and the operational budget, all while learning more and faster.28

  Global revenue for AI products and services is expected to increase 56-fold, from $644 million in 2016 to $36 billion in 2025.29 The science required to exploit this vast opportunity and the material infrastructure that makes it possible have ignited an arms race among tech companies for the 10,000 or so professionals on the planet who know how to wield the technologies of machine intelligence to coax knowledge from an otherwise cacophonous data continent. Google/Alphabet is the most aggressive acquirer of AI technology and talent. In 2014–2016 it purchased nine AI companies, twice as many as its nearest rival, Apple.30

  The concentration of AI talent at Google reflects a larger trend. In 2017, US companies were estimated to have allocated more than $650 million to fuel the AI talent race, with more than 10,000 available positions at top employers across the country. The top five tech companies have the capital to crowd out competitors: startups, universities, municipalities, established corporations in other industries, and less wealthy countries.31 In Britain, university administrators are already talking about a “missing generation” of data scientists. The huge salaries of the tech firms have lured so many professionals that there is no one left to teach the next generation of students. As one scholar described it, “The real problem is these people are not dispersed through society. The intellect and expertise is concentrated in a small number of companies.”32

  On the strength of its lavish recruitment efforts, Google tripled its number of machine intelligence scientists in just the last few years and has become the top contributor to the most prestigious scientific journals—four to five times the world average in 2016. Under the regime of surveillance capitalism, the corporation’s scientists are not recruited to solve world hunger or eliminate carbon-based fuels. Instead, their genius is meant to storm the gates of human experience, transforming it into data and translating it into a new market colossus that creates wealth by predicting, influencing, and controlling human behavior.

  More than six hundred years ago, the printing press put the written word into the hands of ordinary people, rescuing the prayers, bypassing the priesthood, and delivering the opportunity for spiritual communion directly into the hands of the prayerful. We have come to take for granted that the internet enables an unparalleled diffusion of information, promising more knowledge for more people: a mighty democratizing force that exponentially realizes Gutenberg’s revolution in the lives of billions of individuals. But this grand achievement has blinded us to a different historical development, one that moves out of range and out of sight, designed to exclude, confuse, and obscure. In this hidden movement the competitive struggle over surveillance revenues reverts to the pre-Gutenberg order as the division of learning in society shades toward the pathological, captured by a narrow priesthood of privately employed computational specialists, their privately owned machines, and the economic interests for whose sake they learn.

  V. The Privatization of the Division of Learning in Society

  The division of learning in society has been hijacked by surveillance capitalism. In the absence of a robust double movement in which democratic institutions and civil society tether raw information capitalism to the people’s interests—however imperfectly—we are thrown back on the market form of the surveillance capitalist companies in this most decisive of contests over the division of learning in society. Experts in the disciplines associated with machine intelligence know this, but they have little grasp of its wider implications. As data scientist Pedro Domingos writes, “Whoever has the best algorithms and the most data wins.… Google, with its head start and larger market share, knows better what you want… whoever learns fastest wins.…” The New York Times reports that Google CEO Sundar Pichai now shares a floor with the company’s AI research lab, and notes this as a trend among many CEOs: a literal take on the concentration of power.33

  Just over thirty years ago, legal scholar Spiros Simitis published a seminal essay on the theme of privacy in an information society. Simitis grasped early on that the already visible trends in public and private “information processing” harbored threats to society that transcended narrow conceptions of privacy and data ownership: “Personal information is increasingly used to enforce standards of behavior. Information processing is developing, therefore, into an essential element of long-term strategies of manipulation intended to mold and adjust individual conduct.”34 Simitis argued that these trends were incompatible not only with privacy but with the very possibility of democracy, which depends upon a reservoir of individual capabilities associated with autonomous moral judgment and self-determination.

  Building on Simitis’s work, Berkeley’s Paul M. Schwartz warned in 1989 that computerization would transform the delicate balance of rights and obligations upon which privacy law depends: “Today the enormous amounts of personal data available in computers threaten the individual in a way that renders obsolete much of the previous legal protection.” Most important, Schwartz foresaw that the scale of the still-emerging crisis would impose risks that exceed the scope of privacy law: “The danger that the computer poses is to human autonomy. The more that is known about a person, the easier it is to control him. Insuring the liberty that nourishes democracy requires a structuring of societal use of information and even permitting some concealment of information.”35

  Both Simitis and Schwartz sensed the ascent of the division of learning as the axial principle of a new computational societal milieu, but they could not have anticipated the rise of surveillance capitalism and its consequences. Although the explosive growth of the information continent shifts a crucial axis of the social order from a twentieth-century division of labor to a twenty-first-century division of learning, it is surveillance capitalists who command the field and unilaterally lay claim to a disproportionate share of the decision rights that shape the division of learning in society.

  Surveillance capitalists’ acts of digital dispossession impose a new kind of control upon individuals, populations, and whole societies. Individual privacy is a casualty of this control, and its defense requires a reframing of privacy discourse, law, and judicial reasoning. The “invasion of privacy” is now a predictable dimension of social inequality, but it does not stand alone. It is the systematic result of a “pathological” division of learning in society in which surveillance capitalism knows, decides, and decides who decides. Demanding privacy from surveillance capitalists or lobbying for an end to commercial surveillance on the internet is like asking Henry Ford to make each Model T by hand or asking a giraffe to shorten its neck. Such demands are existential threats. They violate the basic mechanisms and laws of motion that produce this market leviathan’s concentrations of knowledge, power, and wealth.

  So here is what is at stake: surveillance capitalism is profoundly antidemocratic, but its remarkable power does not originate in the state, as has historically been the case. Its effects cannot be reduced to or explained by technology or the bad intentions of bad people; they are the consistent and predictable consequences of an internally consistent and successful logic of accumulation. Surveillance capitalism rose to dominance in the US under conditions of relative lawlessness. From there it spread to Europe, and it continues to make inroads in every region of the world. Surveillance capitalist firms, beginning with Google, dominate the accumulation and processing of information, especially information about human behavior. They know a great deal about us, but our access to their knowledge is sparse: hidden in the shadow text and read only by the new priests, their bosses, and their machines.

  This unprecedented concentration of knowledge produces an equally unprecedented concentration of power: asymmetries that must be understood as the unauthorized privatization of the division of learning in society. This means that powerful private interests are in control of the definitive principle of social ordering in our time, just as Durkheim warned of the subversion of the division of labor by the powerful forces of industrial capital a century ago. As things currently stand, it is the surveillance capitalist corporations that know. It is the market form that decides. It is the competitive struggle among surveillance capitalists that decides who decides.

  VI. The Power of the Unprecedented: A Review

  The titanic power struggles of the twentieth century were between industrial capital and labor, but the twenty-first century finds surveillance capital pitted against the entirety of our societies, right down to each individual member. The competition for surveillance revenues bears down on our bodies, our homes, and our cities in a battle for power and profit as violent as any the world has seen. Surveillance capitalism cannot be imagined as something “out there” in factories and offices. Its aims and effects are here… are us.

  Ours is not simply a case of being ambushed and outgunned. We were caught off guard because there was no way that we could have imagined these acts of invasion and dispossession, any more than the first unsuspecting Taíno cacique could have foreseen the rivers of blood that would flow from his inaugural gesture of hospitality toward the hairy, grunting, sweating men, the adelantados who appeared out of thin air waving the banner of the Spanish monarchs and their pope as they trudged across the beach. Why have we been slow to recognize the “original sin of simple robbery” at the heart of this new capitalism? Like the Taínos, we faced something altogether new to our story: the unprecedented. And, like them, we risk catastrophe when we assess new threats through the lens of old experience.

  On the “supply side,” surveillance capitalists deftly employed the entire arsenal of the declaration to assert their authority and legitimacy in a new and undefended digital world. They used declarations to take without asking. They camouflaged their purpose with illegible machine operations, moved at extreme velocities, sheltered secretive corporate practices, mastered rhetorical misdirection, taught helplessness, purposefully misappropriated cultural signs and symbols associated with the themes of the second modernity—empowerment, participation, voice, individualization, collaboration—and baldly appealed to the frustrations of second-modernity individuals thwarted in the collision between psychological yearning and institutional indifference.

  In this process the pioneer surveillance capitalists at Google and Facebook evaded the disciplines of corporate governance and rejected the disciplines of democracy, protecting their claims with financial influence and political relationships. Finally, they benefited from history, born in a time when regulation was equated with tyranny and the state of exception precipitated by the terrorist attacks of 9/11 produced surveillance exceptionalism, further enabling the new market to root and flourish. Surveillance capitalists’ purposeful strategies and accidental gifts produced a form that can romance and beguile but is also ruthlessly efficient at extinguishing space for democratic deliberation, social debate, individual self-determination, and the right to combat as it forecloses every path to exit.

  On the “demand side,” second-modernity populations starved for enabling resources were so enraptured by the plentiful bags of rice and powdered milk thrown from the back of the digital truck that little attention was paid to the drivers or their destination. We needed them; we even believed that we couldn’t live without them. But under scrutiny, those long-awaited delivery trucks look more like automated vehicles of invasion and conquest: more Mad Max than Red Cross, more Black Sails than Carnival Cruise. The wizards behind their steering wheels careen across every hill and hollow, learning how to scrape and stockpile our behavior over which they unabashedly assert their rights as conquerors’ plunder.

  In the absence of a clear-minded appreciation of this new logic of accumulation, every attempt at understanding, predicting, regulating, or prohibiting the activities of surveillance capitalists will fall short. The primary frameworks through which our societies have sought to assert control over surveillance capitalism’s audacity are those of “privacy rights” and “monopoly.” Neither the pursuit of privacy regulations nor the imposition of constraints on traditional monopoly practices has so far interrupted the key mechanisms of accumulation, from supply routes to behavioral futures markets. On the contrary, surveillance capitalists have extended and elaborated their extraction architectures across every human domain as they master the practical and political requirements of the dispossession cycle. This success now threatens the deepest principles of social order in an information civilization as surveillance capitalism takes unauthorized command over the division of learning in society.

  If there is to be a fight, let it be a fight over capitalism. Let it be an insistence that raw surveillance capitalism is as much a threat to society as it is to capitalism itself. This is not a technical undertaking, not a program for advanced encryption, improved data anonymity, or data ownership. Such strategies only acknowledge the inevitability of commercial surveillance. They leave us hiding in our own lives as we cede control to those who feast on our behavior for their own purposes. Surveillance capitalism depends on the social, and it is only in and through collective social action that the larger promise of an information capitalism aligned with a flourishing third modernity can be reclaimed.

  In Part I we have seen how Google built its extraction architecture in the online world. As competition for surveillance revenues intensified, a second economic imperative rose to prominence, driving an expansion of that architecture into another world, the one that we call “real.”

  Now the story of surveillance capitalism moves in this new direction. In Part II, I invite you to rekindle your sense of astonishment as we follow the trail of this second economic imperative defined by the prediction of human behavior. The prediction imperative enlarges the complexity of surplus operations as economies of scale are joined by economies of scope and economies of action. These new disciplines drive surveillance capitalism far into the intimate reaches of our daily lives and deep into our personalities and our emotions. Ultimately, they compel the development of highly inventive but resolutely secret new means to interrupt and modify our behavior for the sake of surveillance revenues. These operations challenge our elemental right to the future tense, which is the right to act free of the influence of illegitimate forces that operate outside our awareness to influence, modify, and condition our behavior. We grow numb to these incursions and the ways in which they deform our lives. We succumb to the drumbeat of inevitability, but nothing here is inevitable. Astonishment is lost but can be found again.

  PART II

  THE ADVANCE OF SURVEILLANCE CAPITALISM

  CHAPTER SEVEN

  THE REALITY BUSINESS

  Falling in love with Truth before he knew Her,

  He rode into imaginary lands,

  By solitude and fasting hoped to woo Her,

  And mocked at those who served Her with their hands.

  —W. H. AUDEN

  SONNETS FROM CHINA, VI

  I. The Prediction Imperative

  There could not have been a more fitting setting for Eric Schmidt to share his opinion on the future of the web than the World Economic Forum in Davos, Switzerland. In 2015, during a session at the winter playground for neoliberals—and increasingly surveillance capitalists—Schmidt was asked for his thoughts about the future of the internet. Sitting alongside his former Google colleagues Sheryl Sandberg and Marissa Mayer, he did not hesitate to share his belief that “The internet will disappear. There will be so many IP addresses… so many devices, sensors, things that you are wearing, things that you are interacting with, that you won’t even sense it. It will be part of your presence all the time. Imagine you walk into a room and the room is dynamic.”1 The audience gasped in astonishment, and shortly thereafter, headlines around the world exploded in shock at the former Google CEO’s pronouncement that the end of the internet was at hand.

 
