what empty spaces it fills (as Scott’s “weapons of the weak” fill the space between consent and insurrection). It requires us to discuss what obfuscation accomplishes that other services and systems don’t accomplish, and what
it costs in difficulty, wasted data, and wasted time. In the context of data protection via optimal technology, business best practice, or legislation and governmental intervention, what makes obfuscation necessary? In view of the
costs obfuscation can impose, why should one turn to it? Describing these
costs, and making our argument in light of them, will clarify obfuscation
in general before we frame it in terms of the ethical and political concerns (in chapter 4) and then in terms of designing for specific goals and outcomes
(in chapter 5).
We have already addressed one of the alternatives from which obfusca-
tion must distinguish itself: individuals’ opting out of any platform, service, or interaction that would misuse their data. This is a solution that seems to be free of moral compromise—they disagree and therefore decline, causing no
trouble. Though such opting out may be possible for a very narrow range of users and uses, it isn’t a practical or reasonable choice for all. Martyrdom is rarely a productive choice in a political calculus; as straightforward as the rational-actor binary of opting in or out may be, a choice between
acceptance and dropping off the edge of the (networked) earth isn’t really a choice at all. We often end up in compromised situations, trying to make the best decision from a narrow menu of options that are problematic to various degrees and in various ways. The user who makes consistently perfect choices about data security and privacy is, like the perfectly rational economic agent,
more likely to be found in theory than in practice, and in practice such a person would be a strange balance between a technologist of great sophistication and a Luddite refusenik.
What about relying on businesses to adopt best practices for their customers?
Of course, the users are not the only part of the data-acquisition equation. The companies involved could resolve many of the concerns users have, rendering obfuscation moot. A well-designed opt-out policy could offer fine-tuned
control of the processes of aggregation and analysis, allowing you to make choices that lie between the extremes of refusal and compliance. It would
enable one to receive certain benefits in return for a degree of use, and it would specify that data could be gathered or deployed only in certain contexts, only for certain purposes, and for only a set period of time. That might offer genuine options for users to evaluate. However, private-sector efforts of this kind are hampered by the fact that companies, for good reasons and bad, are the major strategic beneficiaries of data mining. The present-day consumer economy runs on data—surveys, conversion analysis, customer-retention
analysis, demography, targeted advertising, and data collected at the point of sale that feed back through the entire supply chain, from the just-in-time production facility to the trend-spotting system.20 Whether the particular company in question is in the business of gathering, bundling, and selling individual data (as DoubleClick and Acxiom are), whether it has used data generated and provided by its customers to improve its operations (as Amazon and Wal-Mart
have), whether it is based on user-data-driven advertising revenue (as Google is), or whether it subcontracts the analysis of consumer data for purposes of spotting credit, insurance, or rental risks, it isn’t in a company’s interest to support general restraints on access to this information.21
Owing to the competitive disadvantage associated with general restraints
on access to information, any individual company risks losing the returns on data about customers, clients, consumers, even patients. Web publishers—
particularly those who must answer to shareholders—are terrified to leave
the value that can be derived from personal information “on the table,” unexploited. Further, the liquidity and portability of data render any piecemeal strategy of relinquishment highly problematic, because material of little consequence when in the hands of one company can result in a serious breach of privacy when in the hands of another company that has access to a richer
or better-managed database. For companies in the information services industry, or companies utilizing data to promote their competitive edge, consumers’ chagrin and occasional fines and slaps on the wrist are a small
enough cost of doing business, and such companies fight fiercely to retain access to the “standing reserve” of personal data.22
What about relying on government to enact and enforce better laws?
Isn’t government supposed to be the venue where interests are balanced and values and political principles protected? This raises another question against which obfuscation must justify itself: Why are businesses having to invent data-collection and data-management practices on their own? Surely such
practices should be defined and enforced by governments.
Indeed, regulation and law have historically been central bulwarks of personal privacy, from the Fourth Amendment of the U.S. Constitution to the
European Union’s data-protection requirements and directives. Our laws probably will be the eventual site of the conversation in which we answer, as a society, hard questions about the harvesting and stockpiling of personal information. But they operate slowly, and whatever momentum propels agents of
government and law in the direction of protecting privacy in the public interest is amply counterbalanced by the opposing forces of corporations and other institutional actors, including government itself.
In the world after Snowden, it has become clear that, for many national-security, espionage, and law-enforcement organizations, having a population already predisposed to disclose to companies huge volumes of information
about themselves that can either be subpoenaed or covertly exploited is all to the good.23 Poorly designed and managed social platforms create an efficiently self-spying population, doing their own wiretapping gratis with photos
uploaded with their EXIF metadata intact and with detailed social chit-chat waiting to be subjected to data-mining algorithms.
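How little effort this exploitation takes is easy to demonstrate. The short sketch below (ours, for illustration; it assumes the Python Pillow imaging library, and “photo.jpg” is a placeholder path) prints whatever EXIF tags a camera embedded in a photo, including the GPS block that many phones fill in by default:

```python
# Illustrative sketch: reading the EXIF metadata that a platform, or anyone
# holding the file, could harvest from an unmodified photo upload.
# Assumes the Pillow library (pip install Pillow); "photo.jpg" is a placeholder.
from PIL import Image
from PIL.ExifTags import TAGS, GPSTAGS

image = Image.open("photo.jpg")
exif = image.getexif()

# Top-level tags: camera make and model, timestamps, software, and so on.
for tag_id, value in exif.items():
    print(TAGS.get(tag_id, tag_id), value)

# GPS data lives in a nested directory; 0x8825 is the standard GPS IFD pointer.
for tag_id, value in exif.get_ifd(0x8825).items():
    print(GPSTAGS.get(tag_id, tag_id), value)
```

If the photo came from a location-aware phone at default settings, the second loop yields latitude and longitude, and often altitude and a timestamp, with no analysis more sophisticated than reading what the uploader freely provided.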
Particularly in the United States, people will have to ask careful and
demanding questions about any governmental project to reform data-collection rules and practices. Enormous quantities of personal data are
already in circulation. Ever-increasing amounts of freely provided personal data are packaged and sold, while the patient and uncertain work of legislation and judicial decision unfolds slowly, with some forward steps and some backward steps. The rate of progress doesn’t inspire great optimism. This brings us
back to the question with which we began: Since technologies have generated the context and the parameters of many of these problems, why can’t superior technologies solve them?
What about relying on superior technological solutions?
Powerful, thoughtful, well-designed systems have been produced to preserve and enhance privacy, be it in data mining, surfing or searching the Web, or transmitting confidential information. Yet the situation remains imperfect.
Producing tools for detecting data provenance, properly anonymizing datasets, generating contextual awareness, and providing secure, confidential
communication poses serious technical challenges. Potential systems like
these also face resistance from well-heeled business interests and governmental organizations that would rather we used inferior, badly implemented, and poorly adopted (and adapted) systems.24 Furthermore, no matter how
convincing the technical developments and standards, adoption by societal
actors whose organizations and institutions mediate many flows of data is
fraught with politics. Even on the individual scale, difficulties persist, as Arvind Narayanan notes in his study of the use of “Pragmatic Crypto” (as distinct from
“Cypherpunk Crypto,” a techno-determinist project to wholly reshape society through encryption)—adoption is fraught with complex engineering and
usability issues for the developers.25 None of these problems diminish the accomplishments or the utility of privacy technologies, from Tor to Off-the-Record (OTR) messaging to email encryption toolkits such as GNU Privacy
Guard (GPG). Yet the combination of technical accomplishments, law and regulation, industry best practice, and user choice leaves great, neglected, unprotected empty spaces, like a Venn diagram in negative, in which obfuscation comes into its own.
As we will discuss later in more practical detail, obfuscation is, in part, a troublemaking strategy. Although privacy is served by the constraints of law and regulation, disclosure limits imposed by organizational best practices, protective technological affordances provided by conscientious developers, and the exercise of abstinence or opting out, the areas of vulnerability remain vast. Obfuscation promises an additional layer of cover for these. Obfuscation obscures by making noise and muddying the waters; it can be used for data
disobedience under difficult circumstances and as a digital weapon for the informationally weak.
4 IS OBFUSCATION JUSTIFIED?
Be fire with fire; Threaten the threatener and outface the brow.
Shakespeare, King John, 1595
After a lecture on TrackMeNot,1 a member of the audience rose to say that she was deeply troubled by the valorization of deceit and dishonesty. To her it didn’t seem right to submit search queries that were not of true interest. The question of deception has not been the sole source of opposition to obfuscation; other sources of opposition include wastefulness, free riding, database pollution, and violation of terms of service.
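(For readers who have not encountered it: TrackMeNot hides a user’s genuine searches by automatically issuing plausible decoy queries alongside them. The sketch below is our minimal illustration of that decoy-query idea, not the extension’s actual code; the real tool seeds its queries from RSS feeds and mimics browsing behavior, and the seed terms, search URL, and timing here are placeholders.)

```python
# A minimal sketch of the decoy-query idea behind TrackMeNot; not the
# extension's actual implementation. Seed terms, URL, and timing are
# illustrative placeholders.
import random
import time
import urllib.parse

SEED_TERMS = ["weather radar", "used bicycles", "pasta recipes",
              "tax forms", "movie times", "hiking trails"]

def decoy_query() -> str:
    """Compose a plausible-looking query from one or two seed terms."""
    terms = random.sample(SEED_TERMS, k=random.choice([1, 2]))
    return " ".join(terms)

def run(batches: int = 5) -> None:
    for _ in range(batches):
        url = ("https://search.example.com/search?q="
               + urllib.parse.quote(decoy_query()))
        print("would send:", url)  # a real client would issue the HTTP request
        time.sleep(random.uniform(1.0, 10.0))  # random intervals resist filtering

if __name__ == "__main__":
    run()
```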
Challenges such as that made by the woman at the lecture were worrisome to us: ours was supposed to be the moral high ground, with TrackMeNot defending individuals against illegitimate and exploitative information practices. But such challenges could not be summarily brushed aside. Because
obfuscating tactics are often fundamentally adversarial, involving dissimulation and misdirection, the appropriation of resources for unintended or undesired uses must be explained and justified. In an article titled “A Tack in the Shoe,” Gary Marx writes: “Criteria are needed which would permit us to speak of ‘good’ and ‘bad,’ or appropriate and inappropriate efforts to neutralize the collection of personal data.”2 To use obfuscation because it works, or even because it is the only approach that works, isn’t enough. Obfuscation, if used, must be defensible on ethical grounds, and must be compatible with the political values of the society in which one lives.
TrackMeNot exposed many of the ethical issues that can confront not only
developers of obfuscating systems but also users, and as a consequence
revealed a need to distinguish uses that are morally defensible from uses that are not. Intuition places the Craigslist robber, with his unwilling identically dressed confederates, among the latter, and the Allies’ radar chaff among the former, but why? What makes them different? And how might we adapt the
answer to more ambiguous cases? Mere approval or disapproval isn’t sufficient if we are to defend the legitimacy of a particular system; instead, we must provide systematic reasons why that system avoids moral and political hazards.
This chapter prepares designers or users of obfuscation to meet a range
of challenges they are likely to confront. Some of the challenges are ethical,
claiming that obfuscation causes harm or violates ethical rights beyond general harms. Other challenges are political, suggesting that obfuscation abridges political rights and values, that it is unfair or unjust, that it redistributes power in illegitimate ways, or that it is generally at odds with the political values of surrounding societies or communities.
4.1 Ethics of obfuscation
Dishonesty
It is nearly impossible to avoid charges of dishonesty when the aim of
obfuscation is to mislead and misdirect. Linking obfuscation to the ethics of lying leads to a vast landscape of philosophical thought that, though beyond the scope of our book, contributes important insights to our more limited
purpose.
The classic Kantian position on lying, which holds that it is absolutely
wrong and which famously prescribes truth even in reply to a murderer
seeking to locate an innocent victim, would condemn any use of obfuscation.
Other defenses of lying have been based on more varied and more contingent ethical positions. Generally, the literature on lying has two strands, one concerned with defining lying and the other with its ethics—whether it is always wrong, whether it is ever right, and whether, even if wrong, it ever can be excused. In practice these two strands are interdependent, because a hard line on the wrongness of lying is softened by a narrow definition. Thomas Aquinas, for example, allowed prudent dissimulation to pass the ethical test not because lying is sometimes morally acceptable but because dissimulation sometimes
falls outside of the definition.3 Our guess is that few people are as resolutely committed to truth-telling as Kant and Aquinas, and that most would condone lying with appropriate justification, such as preventing egregious harm, acting under duress, keeping a promise, or achieving other important ends.4
In many of the cases we have discussed in this book, obfuscation presents a means of resisting coercion, exploitation, or threat—ends that might generally legitimize acts of dishonesty. We might say, therefore, that whether obfuscation, like lying, is morally defensible depends on the legitimacy of its ends: radar chaff protecting Allied bombers passes the test, but disseminating malware, robbing a bank, or fixing an election does not, even though we might admire or chuckle at the ingenuity of those who do such things. We do not
want to overstate the conclusion and say that legitimate ends alone justify obfuscation, insofar as it is a form of dishonesty; we want to say only that legitimate ends are a necessary condition for ethical obfuscation.
Even when someone chooses obfuscation to achieve praiseworthy ends, he or she will need to defend this choice against further challenges. After we have explored some of the other ethical charges aimed against obfuscation, we will return to the question of sufficiency in order to explain what is still missing from an ethical assessment beyond laudable or even simply acceptable ends.
Waste
Critics may say that an obfuscation system is wasteful if it draws on any
important resources to generate noise. In the case of TrackMeNot, for example, some complained about its wasteful use of search engines’ servers, its burden on network bandwidth, and even its unnecessary draw on electricity. Similarly, CacheCloak5 could be faulted for wasting network and mobile-app resources, many noise-generating social-network tools for drawing excessively on Facebook’s services, and Uber for squandering the effort of drivers responding to spurious calls. In defense of one’s preferred obfuscation system, one should immediately recognize a hidden agenda in any such accusations, for the notion of waste is thoroughly normative. It presumes standards of acceptable, desirable, or legitimate use, consumption, exploitation, or employment of the
resources in question. Only a strong societal consensus around these standards elevates such charges above mere personal opinion, and only a sound
foundation in factual knowledge lends credibility to the suggestion that any particular obfuscation system wastes resources.
When standards are not settled, however, there is greater uncertainty
over the line between use and waste. We might all agree that carelessly
leaving a tap running is a waste of water, but residents of Los Angeles disagree with residents of Seattle over whether daily watering to maintain verdant lawns in a desert climate is wasteful. To defend TrackMeNot against charges of wastefulness, we can point out that its network usage is minimal compared with usage generated by image, audio, and video files, rich information flows on social networks, and Internet-based communications services. Yet noting huge differences in scale between the traffic generated by TrackMeNot search terms and that needed to maintain (say) Bitcoin or World of Warcraft doesn’t
address the complaint fully. After all, the cumulative flow of a dripping faucet may be far less than the amount of water a daily shower requires, but the
former may still be judged wasteful because it is unnecessary.
Whether one considers the noise produced by systems such as CacheCloak and TrackMeNot wasteful depends not only on the volume of the noise
but also on one’s values. A defender points out that protecting privacy by preventing profiling on the basis of search queries is worth the bandwidth—
certainly more worthwhile than a good number of the videos clogging bandwidth en route from servers to households. Some critics remain doubtful,
though their doubts are less about wasteful usage of common resources than about waste of private ones, such as the server space belonging to providers of search engines and mobile apps. Here too, both quantity and legitimacy
matter. In cases where noise overloads an adversary’s system or, in more
extreme cases, even consumes all available resources, it becomes a denial-of-service attack, and the bar of justification is very high. Unless you can convincingly demonstrate that your target is engaged in oppressive, domineering, or clearly unfair practices, a debilitating obfuscation attack is difficult to justify.
In the case where an obfuscating system merely uses but does not debili-