Despite its pervasiveness both in Silicon Valley and in the wider culture of data scientists and technology developers, inevitabilism is rarely discussed or critically evaluated. Paradiso’s conception of a “digital omniscience” is taken for granted, with little discussion of politics, power, markets, or governments. As in most accounts of the apparatus, questions of individual autonomy, moral reasoning, social norms and values, privacy, decision rights, politics, and law take the form of afterthoughts and genuflections that can be solved with the correct protocols or addressed with still more technology solutions. If information will stream “directly into our eyes and ears” and “the boundaries of the individual will be very blurry,” then who can access that information? What if I don’t want my life streaming through your senses? Who knows? Who decides? Who decides who decides? The answers to such questions are drowned in the thrum of all things continuously illuminated, registered, counted, controlled, and judged.
The best that Paradiso can offer up is a suggestion that “the law could give a person ownership or control of data generated in his or her vicinity; a person could then choose to encrypt or restrict those data from entering the network.”68 Paradiso imagines a society in which it falls to each individual to protect herself from the omniscient ubiquitous sensate computational systems of the new apparatus. Rather than paradise, it seems a recipe for a new breed of madness. Yet this is precisely the world that is now under construction around us, and this madness appears to be a happy feature of the plan.
Between 2012 and 2015, I interviewed 52 data scientists and specialists in the “internet of things.” They came from 19 different companies with a combined 586 years of experience in high-technology corporations and startups, primarily in Silicon Valley. I spoke with them about the prominence of inevitability rhetoric among the purveyors of the new apparatus, and I posed the same question to each one: why do so many people say that ubiquitous computing is inevitable? The agreement among their responses was striking. Although they did not have the language of surveillance capitalism, nearly every interviewee regarded inevitability rhetoric as a Trojan horse for powerful economic imperatives, and each one of them lamented the lack of any critical discussion of these assumptions.
As the marketing director of a Silicon Valley firm that sells software to link smart devices told me, “There’s all that dumb real estate out there and we’ve got to turn it into revenue. The ‘internet of things’ is all push, not pull. Most consumers do not feel a need for these devices. You can say ‘exponential’ and ‘inevitable’ as much as you want. The bottom line is that the Valley has decided that this has to be the next big thing so that firms here can grow.”
I spoke with a senior engineer from a large tech company that invests heavily in the “internet of things.” The response:
Imagine you have a hammer. That’s machine learning. It helped you climb a grueling mountain to reach the summit. That’s machine learning’s dominance of online data. On the mountaintop you find a vast pile of nails, cheaper than anything previously imaginable. That’s the new smart sensor tech. An unbroken vista of virgin board stretches before you as far as you can see. That’s the whole dumb world. Then you learn that any time you plant a nail in a board with your machine learning hammer, you can extract value from that formerly dumb plank. That’s data monetization. What do you do? You start hammering like crazy and you never stop, unless somebody makes you stop. But there is nobody up here to make us stop. This is why the “internet of everything” is inevitable.
A senior systems architect laid out the imperative in the clearest terms: “The IoT is inevitable like getting to the Pacific Ocean was inevitable. It’s manifest destiny. Ninety-eight percent of the things in the world are not connected. So we’re gonna connect them. It could be a moisture sensor that sits in the ground. It could be your liver. That’s your IoT. The next step is what we do with the data. We’ll visualize it, make sense of it, and monetize it. That’s our IoT.”
VIII. Men Made It
The relentless drumbeat of inevitabilist messages presents the new apparatus of ubiquity as the product of technological forces that operate beyond human agency and the choices of communities, an implacable movement that originates outside history and exerts a momentum that in some vague way drives toward the perfection of the species and the planet. The image of technology as an autonomous force with unavoidable actions and consequences has been employed across the centuries to erase the fingerprints of power and absolve it of responsibility. The monster did it, not Victor Frankenstein. However, the ankle bracelet does not monitor the prisoner; the criminal justice system does that.
Every doctrine of inevitability carries a weaponized virus of moral nihilism programmed to target human agency and delete resistance and creativity from the text of human possibility. Inevitability rhetoric is a cunning fraud designed to render us helpless and passive in the face of implacable forces that are and must always be indifferent to the merely human. This is the world of the robotized interface, where technologies work their will, resolutely protecting power from challenge.
No one has expressed this with more insight and economy than John Steinbeck in the opening chapters of his masterwork, The Grapes of Wrath, which describes the dustbowl farmers who are thrown out of their Oklahoma homes during the Great Depression and then head west to California. The families are forced off the land that they have tended for generations. They plaintively argue their case to the bank agents sent to impress upon them the facts of their helplessness. But the agents respond with “The bank is something else than men. It happens that every man in a bank hates what the bank does, and yet the bank does it. The bank is something more than men, I tell you. It’s the monster. Men made it, but they can’t control it.”69
This theme of supposed technological autonomy is a venerable one among technology scholars. Langdon Winner again proves to be a worthy guide when he reminds us that an unquestioning acceptance of technology has become a feature of modern life: “The changes and disruptions that an evolving technology repeatedly caused in modern life were accepted as given or inevitable simply because no one bothered to ask whether there were other possibilities.”70
Winner observes that we have allowed ourselves to become “committed” to a pattern of technological “drift,” defined as “accumulated unanticipated consequences.” We accept the idea that technology must not be impeded if society is to prosper, and in this way we surrender to technological determinism. Rational consideration of social values is considered “retrograde,” Winner writes, “not the ticket that scientific technology gives to civilization.… To this day, any suggestions that the forward flow of technological innovation be in any way limited… violate a fundamental taboo.… Instead we accept the change, later looking back upon what we have done to ourselves as a topic of curiosity.”71 To Winner’s “curiosity” I add another theme: remorse.
Surveillance capitalist leaders assume that we will succumb to the naturalistic fallacy as Steinbeck’s farmers were meant to do. Because Google is successful—because surveillance capitalism is successful—its rules must obviously be right and good. Like the bank agents, Google wants us to accept that its rules simply reflect the requirements of autonomous processes, something that people cannot control. However, our grasp of the inner logic of surveillance capitalism suggests otherwise. Men and women made it, and they can control it. They merely choose not to do so.
Inevitabilism enshrines the apparatus of ubiquity as progress but conceals the realpolitik of surveillance capitalism at work behind the scenes. We know that there can be alternative paths to a robust information capitalism that produces genuine solutions for a third modernity. We have seen that surveillance capitalism was discovered and honed in history, handcrafted by men and women to serve the interests of impatient capital. It is this same logic that now demands ubiquity, ready to colonize technical developments for the sake of its imperatives and growth. Inevitabilism operates in the service of these imperatives as it distracts attention from the ambitions of a rising economic order and the competitive anxieties that drive the surveillance project toward certainty, thus necessitating its ever-more-voracious claims on our behavior.
Inevitabilism precludes choice and voluntary participation. It leaves no room for human will as the author of the future. This raises questions: At what point does inevitabilism’s claim to ubiquitous extraction and execution shade into abuse? Will inevitabilism’s utopian declarations summon new forms of coercion designed to quiet restless populations unable to quell their hankering for a future of their choice?72
IX. To the Ground Campaign
Google’s declarations; surveillance capitalism’s dominance over the division of learning in society and its laws of motion; ubiquitous architectures of extraction and execution; MacKay’s penetration of inaccessible regions while observing unrestrained animals with methods that elude their awareness; the uncontract and its displacement of society; Paradiso’s ubiquitous sensate environment; dark data; the inevitabilism evangelists: there is one place where all these elements come together and transform a shared public space built for human engagement into a petri dish for the reality business of surveillance capitalism. That place is the city.
Cisco has 120 “smart cities” globally, some of which have embraced Cisco Kinetic, which, as Jahangir Mohammed, the company’s vice president and general manager of IoT, explains in a blog post, “is a cloud-based platform that helps customers extract, compute, and move data from connected things to IoT applications to deliver better outcomes.… Cisco Kinetic gets the right data to the right applications at the right time… while executing policies to enforce data ownership, privacy, security and even data sovereignty laws.”73 But, as is so often the case, the most audacious effort to transform the urban commons into the surveillance capitalist’s equivalent of Paradiso’s 250-acre marsh comes from Google, which has introduced and legitimated the concept of the “for-profit city.” Just as MacKay had counseled and Weiser proselytized, the computer would be operational everywhere and detectable nowhere, always beyond the edge of individual awareness.
In 2015, shortly after Google reorganized itself into a holding company called Alphabet, Sidewalk Labs became one of nine “confirmed companies” under the Alphabet corporate umbrella. Whether or not what even Sidewalk CEO Dan Doctoroff, a former private equity financier, CEO of Bloomberg, and deputy mayor of New York City in the Bloomberg administration, refers to as a “Google city” ultimately succeeds, the company has captured the public’s attention by recasting our central gathering place as a commercial operation in which once-public assets and functions are reborn as the cornered raw materials earmarked for a new marketplace. In this vision, MacKay’s and Paradiso’s conceptions come to fruition under the auspices of surveillance capitalism in a grand scheme of vertically integrated supply, production, and sales.
Sidewalk Labs’ first public undertaking was the installation of several hundred free internet-enabled kiosks in New York City, ostensibly to combat the problem of “digital inequality.” As we saw with Google Street View, the company can siphon a great deal of valuable information about people from a Wi-Fi network, even from those who never use the kiosks.74 Doctoroff has characterized Sidewalk Labs’ kiosks as “fountains of data” that will be equipped with environmental sensors and will also collect “other data, all of which can create very hyperlocal information about conditions in the city.”
In 2016 the US Department of Transportation (DOT) announced a partnership with Sidewalk Labs “to funnel transit data to city officials.” The DOT worked to draw cities into Google’s orbit with a competition for $40 million in grants. The winning city would work with Sidewalk Labs to integrate technology into municipal operations, but Sidewalk Labs was eager to work with all the finalists in order to develop its own traffic-management system, Flow.75 Flow relies on Google Maps, Street View vehicles, and machine intelligence to capture and analyze data from drivers and public spaces.76 These analyses produce prediction products described as “inferences about where people are coming from or going,” enabling administrators “to run virtual experiments” and improve traffic flow.77
Doctoroff postulates a city presided over by digital omniscience: “We’re taking everything from anonymized smartphone data from billions of miles, trips, sensor data, and bringing that into a platform.”78 Sidewalk refers to its high-tech services as “new superpowers to extend access and mobility.” Algorithms designed to maintain critical behaviors within a prescribed zone of action would manage these data flows: “In a world in which we can monitor things like noise or vibrations, why do we need to have these very prescriptive building codes?” As an alternative, Doctoroff suggests “performance-based zoning” administered by the ubiquitous apparatus through the medium of algorithms. These processes, like Varian’s vehicular monitoring systems, are indifferent to why you behave as you do, so long as they can monitor and control the behavior you produce. As Doctoroff explains it, “I don’t care what you put here as long as you don’t exceed performance standards like noise levels.…” This is preferable, he says, because it enhances “the free flow of property… that is a logical extension of… these technologies.”79 Why should citizens have any say over their communities, or over how luxury high-rises, hotels, or a residential building converted to commercial use might affect rents and local businesses in the long run, as long as an algorithm is satisfied with noise thresholds?
When Columbus, Ohio, was named winner of the DOT competition, it began a three-year demonstration project with Sidewalk, including a hundred kiosks and free access to the Flow software. Documents and correspondence from this collaboration eventually obtained by the Guardian describe innovations such as “dynamic parking,” “optimized parking enforcement,” and a “shared mobility marketplace” that reveal a more troubling pattern than the rhetoric suggests. Sidewalk’s data flows combine public and private assets for sale in dynamic, real-time virtual markets that extract maximum fees from citizens and leave municipal governments dependent upon Sidewalk’s proprietary information. For example, public and private parking spaces are combined in online markets and rented “on demand” as the cost of parking varies in real time, substantially increasing parking income. Optimized parking enforcement depends on Sidewalk’s algorithms “to calculate the most lucrative routes for parking cops,” earning cities millions of extra dollars that they desperately need but that arrive at the expense of their citizens.
Cities are required to invest substantial public monies in Sidewalk’s technology platform, including channeling municipal funds earmarked for low-cost public bus service into “mobility markets” that rely on private ride-sharing companies such as Uber. The company insists that cities “share public transport data with ride-sharing companies, allowing Uber to direct cars to overcrowded bus stops.” The Flow Transit system integrates information and payment for nearly every kind of transport into Google Maps, and cities are obligated to “upgrade” to Sidewalk’s mobile payment system “for all existing transit and parking services.” Just as it requires public-transit data, Sidewalk also insists that cities share all parking and ridership information with Sidewalk Labs in real time.80 When asked, Doctoroff has emphasized the novel blending of public functions and private gain, assuring his listeners on both counts that “our mission is to use technology to change cities… to bring technology to solve big urban problems.… We expect to make a lot of money from this.”81
In April 2016 a “curated group of leaders” in tech, media, and finance met at the Yale Club in Manhattan to hear Sidewalk CEO Dan Doctoroff’s talk: “Google City: How the Tech Juggernaut Is Reimagining Cities—Faster Than You Realize.”82 His remarks provide a candid assessment of the “Google city” as a market operation shaped by the prediction imperative. He could not have been more direct in articulating Sidewalk Labs’ approach as a translation of Google’s online world to the reality of city life:
In effect, what we’re doing is replicating the digital experience in physical space.… So ubiquitous connectivity; incredible computing power including artificial intelligence and machine learning; the ability to display data; sensing, including cameras and location data as well as other kinds of specialized sensors.… We fund it all… through a very novel advertising model.… We can actually then target ads to people in proximity, and then obviously over time track them through things like beacons and location services as well as their browsing activity.83
Later that year, Sidewalk announced collaborations with sixteen additional cities, noting that achieving scale would enable it to improve its Flow software products. Doctoroff referred to these collaborations as “inevitable.”84
The vast and varied ground campaign already underway turns the prediction imperative into concrete activity. In pursuit of economies of scope, a wave of novel machine processes are honed for extraction, rendering people and things as behavioral data. For the sake of economies of action, the apparatus learns to interrupt the flow of personal experience in order to influence, modify, and direct our behavior, guided by the plans and interests of self-authorizing commercial actors and the buzzing market cosmos in which they participate. In nearly every case the agents of institutionalization present their novel practices as if they are one thing, when they are, in fact, something altogether different. The realpolitik of commercial surveillance operations is concealed offstage while the chorus of actors singing and dancing under the spotlights holds our attention and sometimes even our enthusiasm. They sweat under the stage lights for the sake of one aim: that we fail to notice the answers or, better yet, forget to ask the questions: Who knows? Who decides? Who decides who decides?