
Obfuscation


by Finn Brunton


opinion” that formerly had figured in calculations of trust and risk. In the place of “personal acquaintance and community opinion,” they relied on credit bureaus to collect data that could be used to make informed decisions as to whether individuals would receive loans, insurance, leases, and other risky things. By the late 1920s, credit bureaus’ reports and analyses constituted a private surveillance system on a scale that dwarfed any domestic project conducted by the U.S. government. Several major consequences followed from this, among them the coercion of character assessment built into one’s “financial identity” and the rise of targeted marketing as new uses for the accumulated data were invented. One consequence is particularly relevant to our argument here. That consequence, which really comes into play with the rise of digital databases and tools, is that credit reporting decreases risk, yes, but under some circumstances it also exports risk. (These consequences are in the domain of Anthony Giddens’s “manufactured risks”: dangers produced by the process of modernization, rather than mitigated by it, and, in turn, requiring new systems of mitigation.15)

In the process of decreasing risk for a lender, an insurance company, or a business opening a line of credit for a customer, risks are increased for the individual. One risk is that of identity theft: you have to trust a department store’s subcontractor, whoever that is, to follow immaculate security practices. Another is the risk of violations of context, such as the store’s selling data to shady data brokers, sharing data with partners, letting data be acquired with the rest of a company, or letting data be gathered indiscriminately by government in the course of some larger data-collection project. This may be a fair trade, but it is important to remember that risk doesn’t disappear with data collection—new forms of risk are created and externalized by those who hold the data. Those risks will be borne by you, and by others whom your data can be used to better analyze and understand. On a larger scale, the surveillance and data-collection projects our governments launch in the name of security are always about protection from one class of risks against which the state must defend, but they produce another class of risks whose danger citizens take on: the risk that dissent will be stifled, the risk that legitimate opposition will be crushed, or just the risk that accidents will happen and innocent people will be detained, tracked, exposed, and punished. These are cases in which increasing the volume and the detail of information collected reduces risk for some while increasing it for others—an experience of information asymmetry that we encounter every day and that we believe certain forms of obfuscation can help to correct.

“They” (or a range of “they”s) know much about us, and we know little about them or about what they can do. Situations so asymmetrical in knowledge, power, and risk make effective responses difficult to plan, much less carry out. These are not the asymmetries of the priest or the busybody in a small town where people know one another’s business and some people know more than others. What we describe here is different because of the convergence of asymmetries: those who know about us have power over us. They can deny us employment, deprive us of credit, restrict our movements, refuse us shelter, membership, or education, and limit our access to the good life.

  3.3 The fantasy of opting out

Of course, we still choose to participate in these asymmetrical relationships, don’t we? For most of these forms of data collection, some of the fault must lie with the individuals who use services or engage with institutions that offer unfavorable terms of service and are known to misbehave. Isn’t putting all the blame on government institutions and private services unfair, when they are trying to maintain security and capture some of the valuable data produced by their users? Doesn’t this subject the users to classic moral hazard, making service providers take on the burden of risk and responsibility for choices that users make? Can’t we users just opt out of systems with which we disagree?

  WHY IS OBFUSCATION NECESSARY?


To see to what degree simply opting out is increasingly unreasonable, consider a day in the life of a fairly ordinary person in a large city in a stable, democratically governed country. She is not in prison or institutionalized, nor is she a dissident or an enemy of the state, yet she lives in a condition of permanent and total surveillance unprecedented in its precision and intimacy. As soon as she leaves her apartment, she is on camera: while in the hallway and the elevator of her building, when using the ATM outside her bank (which produces a close-up image time-stamped with her withdrawal record), while passing shops and waiting at crosswalks, while in the subway station and on the train, while in the lobby, the elevator, and her cubicle in her workplace—and all that before lunch. A montage of very nearly every move of her life in the city outside her apartment could be assembled, and each step accounted for—particularly if she chooses to don her fitness-tracking device. But that montage would hardly be necessary: her mobile phone, in the course of its ordinary operation of seeking base stations and antennas to keep her connected as she walks, provides a constant log of her position and movements. Any time she spends in “dead zones” without phone reception can also be accounted for: her subway pass logs her entry into the subway, and her radio-frequency identification badge produces a record of her entry into the building in which she works. (If she drives a car, her electronic toll-collection pass serves a similar purpose, as does automatic license-plate imaging.) If her apartment is part of a smart-grid program, spikes in her electricity usage can reveal exactly when she is up and around, turning on lights and ventilation fans and using the microwave oven and the coffee maker.

Before we return to the question of opting out, consider how thoroughly the systems mentioned in the preceding paragraph are embedded in our hypothetical ordinary person’s everyday life, far more invasively than mere logs of her daily comings and goings. Someone observing her could assemble in forensic detail her social and familial connections, her struggles and interests, and her beliefs and commitments. From Amazon purchases and Kindle highlights, from purchase records linked with her loyalty cards at the drugstore and the supermarket, from Gmail metadata and chat logs, from search-history and checkout records from the public library, from Netflix-streamed movies, and from activity on Facebook and Twitter, dating sites, and other social networks, a very specific and personal narrative is clear. The mobile device in her pocket, the fitness-tracking device around her wrist, and the Event Data Recorder installed in her car follow her when she is on the move. When even some of the data are pooled and correlated with data produced by others like her, powerful demographic inferences and predictions can be made. We know our subject with a thoroughness that would be the envy of any secret-police agent of a few decades ago—and with relatively little effort, as our subject spies on herself for us.

If the apparatus of total surveillance that we have described here were deliberate, centralized, and explicit, a Big Brother machine toggling between cameras, it would demand revolt, and we could conceive of a life outside the totalitarian microscope. But if we are nearly as observed and documented as any person in history, our situation is a prison that, although it has no walls, bars, or wardens, is difficult to escape.

Which brings us back to the problem of “opting out.” For all the dramatic language about prisons and panopticons, the sorts of data collection we describe here—the kinds to which obfuscation is a response—are, in democratic countries, still theoretically voluntary. But the costs of refusal are high and getting higher: a life lived in ramifying social isolation, using any pay phones you can find (there are half as many in New York City as there were just five years ago) or mobile “burners,” able to accept only very particular forms of employment, living far from centers of business and commerce, without access to many forms of credit, insurance, or other significant financial instruments, not to mention the minor inconveniences and disadvantages—long waits at road toll cash lines, higher prices at grocery stores, inferior seating on airline flights—for which disclosure is the unspecified price.16 It isn’t possible for everyone to live on principle; as a practical matter, many of us must make compromises in asymmetrical relationships, without the control or consent for which we might wish. In those situations—everyday twenty-first-century life—there are still ways to carve out spaces of resistance, counterargument, and autonomy. They are weapons of the weak.

  3.4 Weapons of the weak: what obfuscation can do

The political scientist James C. Scott went to “Sedaka,” a pseudonymized village in Malaysia, to answer a question that has engaged historians, anthropologists, and activists of all stripes: How do people who lack the commonly recognized means of political recourse—votes, money, violence—engage in resistance?17 Peasants, sharecroppers, and corvée laborers have their work captured and surplus extracted from it, whether as grain, cash, various forms of debt, or time in uncompensated occupations. Only rarely can the peasants risk a confrontation with the forces that take advantage of them. They have fewer resources on which to draw, in order to make dramatic and historically memorable stands against injustice, than skilled industrial workers in urban centers have. Scott was interested in an empirical question: What do peasants, in the face of obviously unjust actions, do? The answer was a list of ordinary, everyday, eminently practical ways of taking action and talking back, which Scott gathered under the heading “weapons of the weak.” These join the rich and varied accounts of resisting and keeping some measure of autonomy in the balance between consent and outright refusal—most notably, in regard to surveillance, in the work of Gary Marx.18

It is obvious, but still worth saying, that we do not intend a one-to-one comparison between the people chronicled by Scott and, generally, the users of obfuscation. Nor do we see obfuscation as having precisely the same set of limitations and properties as Scott’s concept. For purposes of this book, we are inspired by fundamental themes in Scott’s idea: we can better understand acts of obfuscation within a context of unavoidable relationships between people and institutions with large informational and power asymmetries. To begin, we observe the necessarily small and additive nature of many of these “weapons”—obfuscation and the ones Scott observes—reflecting their role in an ongoing and open-ended set of social and political arrangements, rather than a world-overturning revolution. Instead of a mass invasion of inequitably distributed land, the approach is to squat or poach. Pilfering and thumb-on-the-scale fraud (the phenomenon large retailers euphemistically call “merchandise shrinkage”) are fractional versions of the project of the reallocation of needful things. The response to orders is not some cinematic refusal, but foot dragging, slowdowns, feigned ignorance, deliberate stupidity, and the pretense of compliance. Finally, and most important for our purposes, rather than overt backtalk or heroic here-we-stand speeches there is the evasive muttering, gossip, and slander of what Scott terms the hidden transcript.19

It is likely that every reader of this book has turned away from a superior (occupational, filial, legal, religious, or otherwise) and subvocally muttered dissent. Perhaps the dissent takes place wholly in the mind; perhaps one dares a barely audible murmur, meant for oneself alone; perhaps it is shared in privacy among subordinate groups. (As Scott points out, powerful groups also have hidden transcripts—ways of accumulating and maintaining power that can’t be generally discussed or disclosed.) Dissent in a workplace may take the form of gossip, jokes, anecdotes, or stories that make it possible to criticize the order of power without speaking outright. Dissent creates a space in which the dignity and relative autonomy of the speaker can exist, even as it accomplishes other things. An assertion is made, however covertly, that one is not what one may publicly appear to be.

With that outline in place, we will lay out a few quick distinctions. No reasonable analogy can be made between one of the peasants Scott studied and an obfuscator who is installing a browser extension or running a Tor relay; the breadth of resources available to one and the other—the structures and infrastructures—and the mechanisms of coercion and control they face do not allow for simple comparisons. As our summary here suggests, though, part of what Scott accomplishes is broadening the spectrum of responses to oppression and coercion that we take into account. It’s not just armed uprising or nothing at all, and no one is merely passive. There are very different degrees of access to the power, wealth, status, and other components of autonomy and redress, but we push back when and where we can. Taking up this thread, we can look to one of the perennial questions about digital privacy: Why don’t people use powerful, verifiably reliable, openly audited, robust protection systems, such as end-to-end public-key encryption of their messages—“strong” cryptography? Why not use the optimal system?

We do not want to argue that they shouldn’t. Quite the opposite! There are, however, times, circumstances, populations, and events in which the strong system, the optimal system, isn’t possible, accessible, desirable, or some combination of the three. Situations arise in which we are obligated to be visible, in which we need to be visible, or want to be visible (whether to friends or compatriots, or as an act of public protest or presence) and still we want to muddy our tracks as best we can. Sometimes we don’t have a choice about having our data collected, so we may as well (if we feel strongly about it) put a little sand in the gears. When doing work for government or when developing software, we may have to gather data to provide service, but still seek to do right by our users and to protect their interests from future groups who don’t share our good intentions. In those moments, under those constraints, we often are stuck with weaker systems, or strong systems with a few weak components, and are, ourselves, “weak.”


We want to follow Scott but take his work in a slightly different direction as we broaden the spectrum of responses to situations involving data surveillance and obfuscation. There is real utility in an obfuscation approach, whether that utility lies in bolstering an existing strong privacy system, in covering up some specific action, in making things marginally harder for an adversary, or even in the “mere gesture” of registering our discontent and refusal. An obfuscation approach offers expressive and functional—though sometimes fragile—methods of protest and evasion that are accessible to a range of actors but are particularly important for actors who lack access to other methods or wish to complement them. Thus we apply the concept of “weapons of the weak.”

Before we turn, in the next section, to the kinds of situations in which obfuscation may be useful, one more bit of explanation is necessary to avoid confusion: “Strong” forces can, and do, use obfuscation techniques. Consider some of the examples cited in the book thus far: corporate over-disclosure of documents in legal cases, anticompetitive tricks by companies, the manufacturing of evidence, and some military camouflage technologies. The weak need to be invisible, to escape notice, but being invisible can also be advantageous to the strong. Our argument is one of relative utility. Let’s put this bluntly: If you have access to wealth, the law, social sanction, and force, if you have the whole vocabulary of strong systems at your disposal, on the advantageous side of the asymmetry of power, and can retain top lawyers and hire sharp programmers, why bother with obfuscation? If you have diplomatic pouches and NSA-secured phone lines, you need not waste your time shuffling SIM cards and making up identities. Obfuscation does sometimes come in handy for powerful actors with strong systems for privacy already in place, and we discuss that aspect accordingly, but it is a tool more readily adopted by those stuck with a weak system.

  3.5 Distinguishing obfuscation from strong privacy systems

So far, we have contended that there are times when optimal, “strong” security and privacy practices aren’t practical or available for individuals and groups. This is not an argument against other systems and practices; it is merely an acknowledgment that there are circumstances in which obfuscation may provide an appropriate alternative or could be added to an existing technology or approach. Obfuscation can serve a function akin to the hidden transcript, concealing dissent and covert speech and providing an opportunity to assert one’s sense of autonomy—an act of refusal concealed within a gesture of assent—or can provide more straightforward tools for protest or obscurity. There are situations in which many people may periodically find themselves obligated to give things up, with uncertain consequences and without a clear mechanism for reasserting control—moments when obfuscation can play a role, providing not a comprehensive military-grade data-control solution (though it may be usefully combined with such a solution) but an intuitive approach to throwing up a bit of smoke.
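The “bit of smoke” can be made concrete with a toy sketch in the spirit of decoy-query tools such as TrackMeNot: mix one genuine search query into a batch of plausible decoys, so that an observer logging the stream cannot tell which query mattered. Everything below—the function names and the decoy vocabulary—is an illustrative assumption for this sketch, not a description of any real tool’s implementation.

```python
import random

# Illustrative decoy vocabulary (an assumption for this sketch); a real
# decoy-query tool would draw terms from evolving public sources so that
# the decoys remain plausible over time.
DECOY_TERMS = [
    "weather radar", "pasta recipes", "used bicycles",
    "local news", "movie showtimes", "hiking trails",
]

def obfuscated_batch(real_query, n_decoys=3, rng=random):
    """Hide one genuine query among n_decoys decoys.

    The real query is shuffled into a random position in the batch, so an
    observer who logs every query sees the signal buried in noise.
    """
    batch = rng.sample(DECOY_TERMS, n_decoys)
    batch.append(real_query)
    rng.shuffle(batch)
    return batch
```

Note that noise of this kind does not make the real query unreadable, only ambiguous; that ambiguity, rather than cryptographic secrecy, is what aligns obfuscation with the “weapons of the weak” framing rather than with strong systems.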

  Explaining what obfuscation is requires us to clarify what it is not and

 
