The Age of Surveillance Capitalism
In light of these ambitions, it is not surprising that Doctoroff, like Page, prefers lawless space. Press reports confirmed that Alphabet/Google was actively considering a proposal for a new city and that more than a hundred urban planners, researchers, technologists, building experts, economists, and consultants were involved in the project.85 The Wall Street Journal reported that although it was unclear how the company would fund the tens of billions of dollars necessary for such a large-scale undertaking, “One key element is that Sidewalk would be seeking autonomy from many city regulations, so it could build without constraints.…”86
In October 2017, Doctoroff appeared with Alphabet Executive Chairman Eric Schmidt and Canadian Prime Minister Justin Trudeau to reveal that Toronto would be the site of Sidewalk’s planned development. Its intent is to develop the right mix of technology that it can then license to cities around the world. “The genesis of the thinking for Sidewalk Labs came from Google’s founders getting excited thinking of ‘all the things you could do if someone would just give us a city and put us in charge,’” Toronto’s Globe and Mail reported Schmidt as saying, noting that “he joked he knew there were good reasons that doesn’t happen.” Then, just as quickly, the paper related Schmidt’s reaction when he first learned that Sidewalk, and by extension Alphabet, had secured this opportunity in Toronto: “Oh my God! We’ve been selected. Now, it’s our turn.”87
CHAPTER EIGHT
RENDITION: FROM EXPERIENCE TO DATA
You take a picture of ’em, they’ll kill you.
They think you’re takin’ somethin’ away from ’em.
That you only got so much… stuff!… and if other
People are takin’ it all, then there ain’t none left for yourself.
—ROBERT GARLAND, THE ELECTRIC HORSEMAN
To photograph is to appropriate the thing photographed.
It means putting oneself into a certain relation to the world
that feels like knowledge—and, therefore, like power.
—SUSAN SONTAG, ON PHOTOGRAPHY
I. Terms of Sur-Render
We worry about companies that amass our personal data, and we wonder why they should profit. “Who owns the data?” we ask. But every discussion of data protection or data ownership omits the most important question of all: why is our experience rendered as behavioral data in the first place? It has been far too easy to overlook this important step in the chain of events that produces behavioral surplus. This chapter and the next draw our attention to the gap between experience and data, as well as to the specific operations that target this gap on a mission to transform the one into the other. I call these operations rendition. We have seen that the dispossession of human experience is the original sin of surveillance capitalism, but this dispossession is not mere abstraction. Rendition describes the concrete operational practices through which dispossession is accomplished, as human experience is claimed as raw material for datafication and all that follows, from manufacturing to sales. A focus on these intermediate practices illustrates that the apparatus of ubiquity is not a passive one-way mirror. Rather, it actively creates its own stores of knowledge through rendition.
The noun rendition derives from the verb render, a most unusual word whose double meanings describe a two-sided equation that perfectly captures what happens in the gap between human experience and behavioral data. On one side of the equation, the verb describes a process in which something is formed out of something else that is originally given. It designates the causal action of turning one thing into another, such as rendering oil from fat (extraction) or rendering an English text from the original Latin (translation). These meanings have also found their way into the vocabulary of digital technology. For example, a “rendering engine” converts the coded content of an HTML page for display and printing.
On the other side of the equation, render also describes the way in which the thing that is changed gives itself over to this process: it sur-renders. The verb rendre first appears in tenth-century French, meaning “to give back, present, yield,” as in “rendering an account” or “the tree renders its fruit.” By the fourteenth century, the word also incorporated the idea of handing over, delivering, or acknowledging dependency or obligation, as in “Render unto Caesar.” These meanings are active today when we say “rendering a verdict,” “rendering service,” or “rendering property.”
Surveillance capitalism must work both sides of the equation. On one side, its technologies are designed to render our experience into data, as in rendering oil from fat. This typically occurs outside of our awareness, let alone our consent. On the other side of the equation, every time we encounter a digital interface we make our experience available to “datafication,” thus “rendering unto surveillance capitalism” its continuous tithe of raw-material supplies.
This two-sided equation is a novel arrangement. As we saw in Chapter 1, the Aware Home project developed at Georgia Tech just a year before the invention of surveillance capitalism employed different practices that embodied very different assumptions: (1) that it must be the individual alone who decides what experience is rendered as data, (2) that the purpose of the data is to enrich the individual’s life, and (3) that the individual is the sole arbiter of how the data are shared or put to use. Nearly two decades later, the Aware Home is barely more than an archeological fragment reminding us of the road not taken toward an empowering digital future and a more-just division of learning in society. Down that road, it is the individual who knows, decides, and decides who decides: an end in herself, not a means to others’ ends. The lesson of the Aware Home is that there can be rendition without surveillance capitalism. However, the lesson of this chapter and the next is that there can be no surveillance capitalism without rendition.
Nothing is exempt, as products and services from every sector join devices like the Nest thermostat in the competition for surveillance revenues. For example, in July 2017 iRobot’s autonomous vacuum cleaner, Roomba, made headlines when the company’s CEO, Colin Angle, told Reuters about its data-based business strategy for the smart home, starting with a new revenue stream derived from selling floor plans of customers’ homes scraped from the machine’s new mapping capabilities. Angle indicated that iRobot could reach a deal to sell its maps to Google, Amazon, or Apple within the next two years. In preparation for this entry into surveillance competition, a camera, new sensors, and software had already been added to Roomba’s premier line, enabling new functions, including the ability to build a map while tracking its own location. The market had rewarded iRobot’s growth vision, sending the company’s stock price to $102 in June 2017 from just $35 a year earlier, translating into a market capitalization of $2.5 billion on revenues of $660 million.1
Privacy experts raised alarms, knowing that such data streams have virtually no legal or security protection. But Angle assured the public that iRobot would not sell data without its customers’ permission and expressed confidence that “most would give their consent in order to access the smart home functions.”2 Why was Angle so confident?
According to the company’s privacy policy, it is true that owners of the Roomba can control or stop the collection of usage data by “disconnecting your WiFi or Bluetooth from the app, for example, by changing your WiFi password.” However, as Angle told tech site Mashable in July 2017, even when customers do not opt in to the mapping service, the Roomba captures mapping and usage data, but only usage data “is sent to the cloud so it can be shown on your mobile device.”3 What Angle neglected to say was that a customer who refuses to share his or her home’s interior mapping data with iRobot also loses most of the smart functionality of the “autonomous” vacuum cleaner, including the ability to use one’s phone to start or pause a cleaning, schedule cleanings, review “Clean Map reports,” receive automatic software updates, or start a “SPOT Clean to focus on a particularly dirty area.”4
Angle’s confidence-enhancing strategy goes to the heart of the larger rendition project, for which surveillance capitalist purveyors of “smart” home products have developed a singular approach. On the one hand, they stress that customers can opt in to data sharing. On the other hand, customers who refuse to opt in face limited product functionality and data security. In these Requirimiento-style relationships, instead of the adelantados’ message, “Bend the knee or we destroy you,” the message here is “Bend the knee or we degrade your purchase.”
Under this new regime, something as simple as buying a mattress now requires careful legal scrutiny of the “abusive contracts” that nearly everyone ignores. Consider the Sleep Number bed, with its “smart bed technology and sleep tracking.”5 The company’s website features a beautiful couple snuggled in bed happily glued to their smartphones as they delight in data from their SleepIQ app. The bed’s base and mattress are “customizable” with features that raise or lower the angle of the bed and sensors that soften or firm up the mattress. Other sensors measure heart rate, breathing, and movement: “Every morning you’ll get your SleepIQ® score, representing your individual quality and length of sleep… your restful sleep, restless sleep and time out of bed… and what adjustments you can make.” The company suggests that you connect your sleep app to your fitness tracker and your thermostat in order to see how your workout or bedroom temperature affects your sleep.
A dense, twelve-page privacy policy accompanies the bed. Customers are advised that providing information is an affirmation of consent to use that information in line with the policy, which employs the usual onerous terms: third-party sharing, Google analytics, targeted advertising, and much more. In addition, if customers create a user profile to maximize the effectiveness of the app, the company also collects “biometric and sleep-related data about how You, a Child, and any person that uses the Bed slept, such as that person’s movement, positions, respiration, and heart rate while sleeping.” It also collects all the audio signals in your bedroom. As with most such policies, customers are advised that the company can “share” or “exploit” personal information even “after You deactivate or cancel the Services and/or your Sleep Number account or User Profiles(s).” Customers are warned that no data transmission or storage “can be guaranteed to be 100% secure” and that it does not honor “Do Not Track” notifications. Finally, on page 8 of the document, the policy addresses a customer’s choices regarding the use of personal information: “Whether you submit Information to Us is entirely up to You. If you decide to not submit Information, We may not be able to provide certain features, products, and/or services to you.”6
This same coercive Requirimiento twist can be found in the lengthy, dense legal compacts associated with Alphabet-owned Nest thermostats. The terms-of-service and end-user licensing agreements reveal oppressive privacy and security consequences in which sensitive information is shared with other devices, unnamed personnel, and third parties for the purposes of analysis and ultimately for trading in behavioral futures markets, action that ricochets back to the owner in the form of targeted ads and messages designed to push more products and services. Despite this, courts have generally upheld companies’ claims that they bear no liability without a clear demonstration of economic harm to the consumer.
Nest takes little responsibility for the security of that information and none for how other companies will put it to use. In fact, University of London legal scholars Guido Noto La Diega and Ian Walden, who analyzed these documents, reckon that were one to enter into the Nest ecosystem of connected devices and apps, each with their own equally burdensome terms, the purchase of a single home thermostat entails the need to review nearly a thousand “contracts.”7
This absurdity is compounded by the fact that virtually no one reads even one such “contract.” A valuable empirical study of 543 participants familiar with surveillance and privacy law issues found that when asked to join a new online service, 74 percent opted for the “quick join” procedure, bypassing the terms-of-service agreement and the privacy policy. Among those who did scroll through the abusive contracts, most went directly to the “accept” button. The researchers calculated that the documents required at least forty-five minutes for adequate comprehension, but for those who looked at the agreements, the median time they spent was fourteen seconds.8
Should the customer refuse to agree to Nest’s stipulations, the terms of service indicate that the functionality and security of the thermostat itself will be deeply compromised, no longer supported by the necessary updates meant to ensure its reliability and safety. The consequences can range from frozen pipes to failed smoke alarms to an easily hacked internal home system. In short, the effectiveness and safety of the product are brazenly held hostage to its owners’ submission to rendition as conquest, by and for others’ interests.
One can easily choose not to purchase a Roomba, a Sleep Number bed, or a Nest thermostat, but each of these is merely emblematic of the immense project of rendition as the first and vital step in the construction of the apparatus of ubiquity. Thousands of “internet of things” objects are becoming available. As La Diega and Walden conclude, in this new product regime the simple product functions that we seek are now hopelessly enmeshed in a tangled mixture of software, services, and networks.9
The very idea of a functional, effective, affordable product or service as a sufficient basis for economic exchange is dying. Where you might least expect it, products of every sort are remade by the new economic requirements of connection and rendition. Each is reimagined as a gateway to the new apparatus, praised for being “smart” while traditional alternatives are reviled for remaining “dumb.” It is important to acknowledge that in this context, “smart” is a euphemism for rendition: intelligence that is designed to render some tiny corner of lived experience as behavioral data. Each smart object is a kind of marionette; for all its “smartness,” it remains a hapless puppet dancing to the puppet master’s hidden economic imperatives. Products, services, and applications march to the drumbeat of inevitabilism toward the promise of surveillance revenues hacked from the still-wild spaces that we call “my reality,” “my home,” “my life,” and “my body.” Every smart product repeats our essential questions: What does a smart product know, and whom does it tell? Who knows? Who decides? Who decides who decides?
Examples of products determined to render, monitor, record, and communicate behavioral data proliferate, from smart vodka bottles to internet-enabled rectal thermometers, and quite literally everything in between.10 The business developer for a spirits company thus cites his plan for a “connected bottle”: “The more we learn about consumers and their behaviors, the better services we can connect to them.”11 Many brands are determined “to give packaging a speaking role in an increasingly interactive marketplace.” Global spirits distributor Diageo promises “smart sensor-equipped bottles” that can track purchases and sales data, and, most importantly, “communicate with consumers’ devices and switch gears—recipes versus sales promos—once the bottle is opened.” A producer of bar equipment states the case plainly enough: “It really is all about… allowing these [bar] owners to see stuff that they couldn’t see before and maximize their profits.”12
Today our homes are in surveillance capitalism’s crosshairs, as competitors chased a $14.7 billion market for smart-home devices in 2017, up from $6.8 billion just a year earlier and expected to reach more than $101 billion by 2021.13 You may have already encountered some of the early absurdities: smart toothbrushes, smart lightbulbs, smart coffee mugs, smart ovens, smart juicers, and smart utensils said to improve your digestion. Others are often more grim: a home security camera with facial recognition; an alarm system that monitors unusual vibrations before a break-in occurs; indoor GPS locators; sensors that attach to any object to analyze movement, temperature, and other variables; every kind of connected appliance; cyborg cockroaches designed to detect sound. Even the baby’s nursery is reconceived as a source of fresh behavioral surplus.14
An appreciation of the surveillance logic of accumulation that drives this action suggests that this network of things is already evolving into a network of coercion, in which mundane functions are ransomed for behavioral surplus.15 A December 2013 letter from Google’s finance director to the US Securities and Exchange Commission’s Division of Corporate Finance provides a vivid glimpse of these facts. The letter was composed in response to an SEC query on the segmentation of Google’s revenues between its desktop and mobile platforms.16 Google answered by stating that users would be “viewing our ads on an increasingly wide diversity of devices in the future” and that its advertising systems were therefore moving toward “device agnostic” design that made segmentation irrelevant and impractical. “A few years from now,” the letter stated, “we and other companies could be serving ads and other content on refrigerators, car dashboards, thermostats, glasses, and watches, to name just a few possibilities.”
Here is at least one endgame: the “smart home” and its “internet of things” are the canvas upon which the new markets in future behavior inscribe their presence and assert their demands in our most-intimate spaces. Key to the story is that all of this action is prosecuted in support of a larger market process that bets on the future of our behavior and over which we have no knowledge or control. Each node in the network—the vacuum cleaner, the mattress, the thermostat—must play its part, beginning with the frictionless rendition of behavior, as the whole team of seething insistent “smart” things joins the migration to surveillance revenues. As we are shorn of alternatives, we are forced to purchase products that we can never own while our payments fund our own surveillance and coercion. Adding insult to injury, data rendered by this wave of things are notoriously insecure and easily subject to breaches. Moreover, manufacturers have no legal responsibility to notify device owners when data are stolen or hacked.