Obfuscation


by Finn Brunton and Helen Nissenbaum

tate a privately owned resource, what counts as legitimate may not be obvious.

Take the case of Web searching. Manually submitted queries, no matter how frivolous the purpose, seem not to provoke complaints of waste. No one argues that “ninja turtle” or “fantasy football” is more wasteful of Google’s server resources than, say, “symptoms of Ebola,” although some critics have said that the automated search queries submitted by TrackMeNot are wasteful. We can think of no other reason for such criticism than that TrackMeNot’s queries run counter to Google’s interests, desires, or preferences and that these, according to critics, trump users’ interests, desires, or preferences for privacy-seeking obfuscation. Such is the rhetorical struggle between those who defend obfuscation as a means of protecting its users against illegitimate information capture and the targets of obfuscation who label such actions wasteful. The winner of this debate captures the ethical high ground and transforms a private dispute over conflicting vested interests into a matter of public morality. But it is important to see, in this instance, that when defenders of search resources vilify obfuscation as “waste,” they beg the very question that we, collectively, have not yet properly addressed. In the name of privacy protection, query obfuscation utilizes private resources without owners’ authorization, but whether we deem this wasteful or legitimate, prohibited or allowed, is a political question about the exercise of power and privilege, and a question to which we will return later in the chapter.
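To make the disputed activity concrete, here is a minimal sketch, in Python, of the general technique at issue: decoy searches drawn from innocuous seed phrases and issued at irregular intervals alongside a user’s genuine queries. It is not TrackMeNot’s actual code (TrackMeNot is a browser extension that assembles its decoy vocabulary dynamically); the seed terms, the submit callback, and the timing parameters are illustrative assumptions only.

```python
import random
import time

# Illustrative seed phrases only; any innocuous vocabulary would serve the sketch.
SEED_TERMS = [
    "ninja turtle", "fantasy football", "banana bread recipe",
    "weather tomorrow", "movie showtimes", "how to tie a tie",
]


def decoy_queries(rng):
    """Yield an endless stream of plausible-looking cover queries."""
    while True:
        if rng.random() < 0.3:
            # Occasionally combine two terms so the decoys are not a fixed list.
            yield " ".join(rng.sample(SEED_TERMS, 2))
        else:
            yield rng.choice(SEED_TERMS)


def run_obfuscation(submit, mean_gap_s=30.0, count=5, seed=None):
    """Issue `count` decoy queries at irregular intervals via `submit`.

    `submit` stands in for whatever actually sends a search on the user's
    behalf; it is a placeholder parameter, not a real search-engine API.
    """
    rng = random.Random(seed)
    stream = decoy_queries(rng)
    for _ in range(count):
        submit(next(stream))
        # Jittered gaps make the noise harder to filter out as machine-like.
        time.sleep(rng.expovariate(1.0 / mean_gap_s))


if __name__ == "__main__":
    # Demo: print the decoys instead of sending them anywhere.
    run_obfuscation(lambda q: print("decoy query:", q), mean_gap_s=0.01, seed=1)
```

Each decoy such a generator produces costs a search engine no more than a “frivolous” manual query; the dispute is over whose interests the traffic serves, not over its cost.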

  Free riding

Depending on the design of one’s preferred obfuscation system, one may be accused of free riding—that is, of taking advantage of other people’s willingness to submit to the collection, aggregation, and analysis of data, or of using services provided by data collectors while denying them profit from one’s personal information. In the first instance, the adversary will go after the less costly target—people who don’t obfuscate—just as predators, according to the adage, go after the slower prey. In the second instance, if you use services offered by targets such as Facebook and Foursquare in ways that diverge from the terms of service, you are violating an implied contract and are free riding not only on people whose behaviors comply with the terms of service but also on investments made by the providers of the services. This applies, for instance, to users of ad-blocking browser plug-ins, who can enjoy a quieter, faster-loading, ad-free Web experience while having access to content underwritten by users who haven’t installed ad blockers. Or so the critics suggest.

Cast as free riders, obfuscators appear to be sneaks more than rebels; after all, when you aspire to the moral high ground, do you want instead to be someone who games the system by exploiting the ignorance and foolishness of others? These charges must be taken seriously, but in our view whether they stick depends on answers to two questions: Is your obfuscation system (either one you have created or one you are using) freely available to others? And are people who aren’t obfuscating left no worse off as a result of your use of that system? When the answers to both of those questions are Yes, as holds for many of the systems we have discussed, we see no exploitation, no moral wrong. When the answer to either question is No, the situation is complex and requires further probing. Secretive obfuscation may be excusable if it leaves non-obfuscators no worse off; obfuscation that disadvantages non-obfuscators may be justified if it is widely and freely available to all. Though further justification is needed in both scenarios, the case that poses the most difficult questions is closed, secretive obfuscation that results in disadvantage to non-obfuscators.


These difficult questions plunge us into philosophical debates about moral responsibility. Even in the worst case, you might redirect blame to the targets of your obfuscating system: the data gatherers. You may ask “Who is taking advantage of whom?” Returning to the metaphor of predator and prey, you can argue “Don’t blame me for being fleet-footed; it is the predator, after all, who is responsible for the demise of its victims.” Though you expose your slower compatriots to higher odds of capture, surely blame accrues primarily to the predator. This leaves a stalemate of mutual recrimination, the data collector accusing the obfuscator of free riding on services and the obfuscator accusing the data collector of free riding on personal information.

In the dominant economy of the Internet, individual users enjoy free services, which are sustained by the value extracted from information about those users by ad networks and by other third-party data aggregators. Unlike traditional commercial market-based exchanges, where a price is explicitly paid for goods or services, the economy into which the Internet has settled is based on the capture of information by indirect, subtle, and often well-hidden means. The informational price—effectively a blank check—is anything but free, according to experts whose commentaries have inspired our own thoughts on this matter.6 When relinquishment of personal information with no reasonable account of its use is a necessary condition for receiving a service, when it is disproportionate to need (as in over-collection), and when it is inappropriate (as when it violates contextual expectations), such a price is exploitative and the practice is oppressive. Furthermore, when traditional institutional protections aren’t effective in addressing practices such as these, the obfuscator who has been accused of free riding may justly challenge the presumptive entitlements of the entrenched system, in which naive users succumb to rhetorical trickery that engages them in terms of exchange they have had little hand in setting.7 Each party has an interest in setting terms for the exchange of valuable resources, but which interests are favored must be fairly settled or, says the obfuscator, this is a claim that doesn’t warrant respect.

This argument doesn’t make all information obfuscation legitimate and defensible against the charge of free riding; it does so only when other moral requirements are met and the question of free riding hinges on who is entitled to surplus value generated by the interactions of individual users with service providers collecting information on them. In other words, after you have satisfied yourself that your system meets other ethical criteria, such as worthy ends, questions that remain about conflicting interests and desires or about fair distribution of benefits and entitlements enter the realms of economic and political analysis, taken up below.

  Pollution, subversion, and system damage

The charge of data pollution is as vexing as it is unavoidable. Obfuscation, defined as the insertion of noise, invites a parallel to pollution—making something impure or unclean. Someone who taints water, soil, or air with toxic chemicals, particulates, or waste can be roundly criticized because environmental integrity is highly valued not only as an ideal but also as a practical goal. However, critics drawing on the normative clout of environmental pollution aren’t coolly observing that obfuscation clutters a data repository; they are alleging that it contaminates a data environment whose integrity is prized.

  There is, however, a difference. In most present-day societies, the value of the natural environment is presumed and an action that has been shown to pollute it is considered reprehensible. But unless one can make an explicit case that a data assemblage is worthy of protection, a claim for its integrity begs the question.

Even environmental integrity isn’t absolutely valued and has been traded off against other values, such as security, commerce, and property rights. Analogously, in order for a charge of data pollution to stick, a data assemblage must be shown to hold greater value than whatever the obfuscator aims to protect. Simply revealing negative consequences for a database is, once again, to beg the ethical question. It comes down to this: Data pollution is unethical only when the integrity of the data flow or data set in question is ethically required. Moreover, whether the integrity of the data outweighs other values and interests at stake must be explicitly settled. When what is in question is whether the interests of a data collector are negatively affected by obfuscation, ethical questions can be settled only by establishing that these interests are of general value and that they override the interests of the obfuscator.

When there are no clear moral grounds favoring the respective, conflicting interests (or preferences) of a data collector and an obfuscator, a political resolution, or perhaps a market-based resolution, may be the best one can hope for.


  If there is genuine public interest in the integrity of particular data flows or data sets, and if obfuscation negatively affects the system as a whole, the burden shifts to the obfuscator to justify his or her actions. For example, one may justly challenge the obfuscator who diminishes the integrity of a population health database when so doing reduces the potential public benefits it can provide. But even in a case such as this, we should assess whether the price an individual pays for the benefit of others or in the public interest is fair. If individuals are coerced to contribute, it should be with assurances that how the information will be used, where it will travel, and how it will be secured will, at the very least, be in line with familiar principles of fair information practice. In other words, the ethical argument hinges on two considerations: whether the data in question are of genuine public and common interest and how much individuals are asked to sacrifice on behalf of such interests.

Keeping both of these considerations in sight means recognizing that the integrity of a data assemblage—even one deemed valuable—is not absolute, and that data controllers have the burden of defending the public importance of the assemblage (and associated practices) as well as the legitimacy of any burdens it might impose on individual data subjects.

In the discussion thus far, we have not differentiated among the three terms “pollution,” “subversion,” and “system damage.” You might want to consider which of the three is relevant when striving to ensure an ethically defensible system. Obfuscating systems that pollute or subvert only the obfuscators’ data trail pose fewer ethical challenges than those that also affect other data subjects, and even fewer than those that interfere with a system’s general functioning, as in a denial of service. A careful assessment would involve asking questions similar to those we have discussed above—questions concerning respective harms, entitlements, societal welfare, and proportionality—about data collection as well as about data obfuscation in relation to legitimate ends.

  4.2 From ethics to politics

  Ends and means

Since obfuscation almost always involves dissemblance, unauthorized uses of system resources, or impairment of functionality, appreciating obfuscation’s intended ends, aims, purposes, or goals is crucial to evaluating its moral standing. Although some ends might seem unequivocally good and others unequivocally bad, a vast middle ground exists that encompasses merely unproblematic ends (e.g., foiling supermarket surveillance) and ends that are somewhat controversial (e.g., enabling peer-to-peer file sharing). In these zones of ethical ambiguity or flexibility, politics and policy come into play.

Ends, however, are only part of the picture—necessary but not sufficient conditions. Ethical theory and common sense demand that means, too, be defensible, and, as the saying warns, ends may not justify all means. Whether means are acceptable may rest on numerous ethical factors but, just as often, may depend on the interaction of ends with various contingent and contextual factors, whose consideration resides in the zone of the political.

Recognizing that certain disputes over ethical issues are best resolved politically doesn’t necessarily remove them from ethical consideration entirely when one takes a view, such as Isaiah Berlin’s, of political philosophy as moral inquiry, “applied to groups and nations, and indeed, mankind as a whole.”8 In some instances, disagreements over the ethics of obfuscation that reduce to disagreements over clashing ends and values may yet be amenable to purely ethical resolutions, such as the resolution Kant seems to have found when he prioritized truth over preventing murder. But disagreements over ends may not always be accessible to purely ethical reasoning. In these cases, resolution becomes a matter for social policy because how these disagreements are settled affects the constitution or shape of the society in which they are embedded. Ethical questions such as those requiring societal resolution have inspired political philosophers through the ages—from Plato to Hobbes and Rousseau to the present—who have sought to compare and evaluate political systems, to identify political properties and modes of decision making that characterize good societies, and to articulate political principles of justice, fairness, and decency. When we conclude that ethical questions must be answered politically, because they are about the distribution of power, authority, and goods in society, we still have ethics on our minds. We do not mean any society; we mean societies opposed to tyranny and striving to be good, just, and decent in the ways that great philosophers, critical thinkers, and political leaders have idealized in word and action. With this in mind, let us revisit the issues of dishonesty (dissimulation), waste, free riding, pollution, and system damage arising in the context of obfuscation.


As we worked through the issue of waste, we imagined clashes of opponents parrying back and forth, one accusing the other of wasteful activity and the other insisting that the activity in question constituted a legitimate use. This was the case when critics accused TrackMeNot users of wasting bandwidth with searches that were of no genuine interest and TrackMeNot users responded that they weren’t wasting bandwidth but rather were using it to promote legitimate privacy claims. Similarly, one who is accused of polluting a dataset or impairing a system’s data-mining capacity counters that the purpose of the dataset or data mining is not one that warrants societal protection, or at least not one that should trump the obfuscator’s evasion of surveillance.

Generally, asserting that data obfuscation impairs and damages a database or compromises a system, or that it overuses or wastes a common resource, doesn’t entitle one to call the obfuscation unethical unless one can clearly explain how the data store or system in question furthers societal goals more important than contrary goals the obfuscator seeks to promote. Rarely are these conflicting ends explicitly or systematically addressed in ways that call on data collectors to justify the value of their activity. To understand the criterion of ends, you would ask about the purposes or values served by data collection—database or information flow—and the same for the obfuscating activities. Further, you would ask how these ends feature within broader political commitments of the collective—society, nation, etc. Thus far, we seem to give great leeway to the Transportation Security Administration’s pursuit and assembly of personal information profiles insofar as its purposes are to provide security for travelers. Accordingly, we might be less tolerant of individuals who obfuscate in this context even for the purposes of protecting privacy, the point being that ends should make a difference in our reactions both to the ethics of data collection and to obfuscation.

But means matter, too. Even good ends may not justify all means. In law and policy, we are often asked to consider proportionality—for example, demanding that the punishment should fit the crime. Although an obfuscator must be challenged to justify means that are disruptive, even damaging, surely it is fair also to challenge the target. You may decide to install TrackMeNot not because you object to the basic practice of logging search queries but because you object to unacceptable extremes such as holding data with too much detail, for too long, without appropriate limits on use. Keeping data in order to improve search functions, even to match contextual ads to queries, may seem acceptable, but isn’t it grossly disproportionate to a search engine’s core function to hold data indefinitely in order to refine behavioral advertising and to match search histories with other online activity so as to profile people too personally, too precisely, too intimately? Such questions are relevant to all the extreme forms of information surveillance, with online surveillance a particular case in which ubiquitous tracking of online behavior seems wildly disproportionate as a means, insofar as it serves only the parochial ends of commercial advertising, even if this tracking slightly improves the efficacy of the ads. But the obfuscator, too, must answer the challenges of proportionality, and in quite concrete terms. Thus, we may agree that the ends of TrackMeNot are legitimate, but still want to regulate the volume of noise—say, to foil profiling but not to disable a search engine entirely with denial-of-service attacks. Drawing an exact line between proportional and disproportional is never easy, but the intuition that there is a line, even if it must be drawn case by case, is robust and deep.
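Proportionality can also be read as a design parameter: an obfuscation tool can bound how much noise it emits. The sketch below, a hypothetical token-bucket budget in Python, is not part of TrackMeNot or any other named system; the NoiseBudget class, its allow_decoy method, and the particular rate and burst values are placeholder assumptions, meant only to show how a cap might be made explicit, generous enough to blur a profile yet nowhere near denial-of-service volumes.

```python
import time


class NoiseBudget:
    """Token-bucket cap on decoy traffic.

    The cap encodes a proportionality judgment: enough noise to blur a
    profile, far below anything resembling denial-of-service volumes.
    The numbers are placeholders, not recommendations.
    """

    def __init__(self, rate_per_min=2.0, burst=5):
        self.rate_per_s = rate_per_min / 60.0   # steady-state decoy rate
        self.capacity = float(burst)            # short-term allowance
        self.tokens = float(burst)
        self.last = time.monotonic()

    def allow_decoy(self):
        """Return True if one more decoy query fits within the budget."""
        now = time.monotonic()
        self.tokens = min(self.capacity,
                          self.tokens + (now - self.last) * self.rate_per_s)
        self.last = now
        if self.tokens >= 1.0:
            self.tokens -= 1.0
            return True
        return False


if __name__ == "__main__":
    budget = NoiseBudget(rate_per_min=2.0, burst=5)
    sent = sum(budget.allow_decoy() for _ in range(100))
    print(f"{sent} of 100 candidate decoys sent; the rest dropped as disproportionate")
```

Where exactly such a cap belongs is precisely the case-by-case line-drawing described above; the sketch only makes explicit that some line is being drawn.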

Proportionality suggests normative standards for particular pairs of means and ends and pairs of actions and reactions, but means may also be measured by comparative standards, such as whether their cost is lower than that of alternatives. Utilitarian thinking is a case in point, demanding not only that the happiness yielded by actions or social policies under consideration should be greater than the unhappiness, or that the benefits should exceed the costs, but also that the actions or policies should yield the optimal proportion among available alternatives. Where obfuscation involves pulling the wool

 
