process to help their clients stay lost. Obviously many of the techniques and methods they employ have nothing to do with obfuscation, but rather are
merely evasive or concealing—for instance, creating a corporation that can lease your new apartment and pay your bills so that your name will not be
connected with those common and publicly searchable activities. However, in response to the proliferation of social networking and online presence, disappearance specialists advocate a strategy of disinformation, a variety of obfuscation. “Bogus individuals,” to quote the disappearance consultant Frank
Ahearn, can be produced in number and detail that will “bury” pre-existing personal information that might crop up in a list of Web search results.26 This entails creating a few dozen fictitious people with the same name and the
same basic characteristics, some of them with personal websites, some with accounts on social networks, and all of them intermittently active. For clients fleeing stalkers or abusive spouses, Ahearn recommends simultaneously producing numerous false leads that an investigator would be likely to follow—
for example, a credit check for a lease on an apartment in one city (a lease that was never actually signed) and applications for utilities, employment addresses and phone numbers scattered across the country or the world, and a checking account, holding a fixed sum, with a debit card given to someone traveling to pay for expenses incurred in remote locations. Strategies suggested by disappearance specialists are based on known details about the
adversary: the goal is not to make someone “vanish completely,” but to put one far enough out of sight for practical purposes and thus to use up the seeker’s budget and resources.
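A disinformation service of this kind is easy to picture in outline: produce dozens of records that share the client’s name but differ in plausible, searchable details, each ready to be seeded into a personal site or a social-network account. The sketch below is purely illustrative; the fields and values are our assumptions, not Ahearn’s working method.

    # Illustrative sketch of mass-producing "bogus individuals" that
    # share a client's name. All field values are invented assumptions.
    import random

    CITIES = ["Portland", "Austin", "Tampa", "Des Moines", "Sacramento"]
    JOBS = ["teacher", "sales manager", "nurse", "contractor", "accountant"]
    HOBBIES = ["golf", "gardening", "home brewing", "genealogy", "cycling"]

    def make_decoys(name, count=24):
        """Return `count` decoy profiles sharing `name` but differing
        in plausible, searchable details."""
        decoys = []
        for _ in range(count):
            decoys.append({
                "name": name,
                "city": random.choice(CITIES),
                "occupation": random.choice(JOBS),
                "hobby": random.choice(HOBBIES),
                "has_website": random.random() < 0.3,   # some get personal sites
                "active_on_social": random.random() < 0.7,
            })
        return decoys

    for decoy in make_decoys("John Smith")[:3]:
        print(decoy)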
2.12 Apple’s “cloning service” patent: polluting electronic profiling
In 2012, as part of a larger portfolio purchase from Novell, Apple acquired U.S.
Patent 8,205,265, “Techniques to Pollute Electronic Profiling.”27 An approach to managing data surveillance without sacrificing services, it parallels several systems of technological obfuscation we have described already. This “cloning service” would automate and augment the process of producing misleading
personal information, targeting online data collectors rather than private investigators.
A “cloning service” observes an individual’s activities and assembles a
plausible picture of his or her rhythms and interests. At the user’s request, it will spin off a cloned identity that can use the identifiers provided to authenticate (to social networks, if not to more demanding observers) that it represents a real person. These identifiers might include small amounts of actual confidential data (a few details of a life, such as hair color or marital status) mixed in with a considerable amount of deliberately inaccurate information. Starting from its initial data set, the cloned identity acquires an email address from which it will send and receive messages, a phone number (there are many
online calling services that make phone numbers available for a small fee), and voicemail service. It may have an independent source of funds (perhaps a gift card or a debit card connected with a fixed account that gets refilled from time to time) that enables it to make small transactions. It may even have a mailing address or an Amazon locker—two more signals that suggest personhood. To these signals may be added some interests formally specified by the user and fleshed out with existing data made accessible by the scraping of social-network sites and by similar means. If a user setting up a clone were to select from drop-down menus that the clone is American and is interested in photography and camping, the system would figure out that the clone should be interested in the work of Ansel Adams. It can conduct searches (in the
manner of TrackMeNot), follow links, browse pages, and even make purchases and establish accounts with services (e.g., subscribing to a mailing list devoted to deals on wilderness excursions, or following National Geographic’s Twitter account). These interests may draw on the user’s actual interests, as inferred from things such as the user’s browsing history, but may begin to diverge from those interests in a gradual, incremental way. (One could also salt the profile of one’s clone with demographically appropriate activities, automatically chosen, building on the basics of one’s actual data by selecting
interests and behaviors so typical that they even out the telling idiosyncrasies of selfhood.)
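The patent leaves the mechanics open, but the interest-expansion step it describes could look something like the following sketch, in which seed interests chosen from drop-down menus are widened through a static association map and then allowed to drift gradually away from the user’s real profile. The map, function names, and drift rule are our own illustrative assumptions, not language from U.S. Patent 8,205,265.

    # Speculative sketch of the patent's interest expansion: seed
    # interests are widened via a fixed association map, then drift.
    import random

    ASSOCIATIONS = {
        "photography": ["Ansel Adams", "darkroom supplies", "camera lenses"],
        "camping": ["national parks", "trail maps", "hiking boots"],
        "cooking": ["knife skills", "farmers markets", "food blogs"],
    }

    def expand_interests(seeds):
        """Flesh out user-selected seed interests with associated topics."""
        profile = set(seeds)
        for seed in seeds:
            profile.update(ASSOCIATIONS.get(seed, []))
        return profile

    def drift(profile, pool, step=1):
        """Let the clone diverge gradually: add a few topics the user
        never actually expressed interest in."""
        return profile | set(random.sample(pool, step))

    clone = expand_interests(["photography", "camping"])
    clone = drift(clone, ["birdwatching", "numismatics", "chess"])
    print(sorted(clone))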
After performing some straightforward analysis, a clone can also take on a person’s rhythms and habits. If you are someone who is generally offline on weekends, evenings, and holidays, your clone will do likewise. It won’t run continuously, and you can call it off if you are about to catch a flight, so an adversary will not be able to infer easily which activities are not yours. The clones will resume when you do. (For an explanation of why we now are
talking about multiple clones, see below.) Of course, you can also select classes of activities in which your clones will not engage, lest the actors feigning to be you pirate some media content, begin to search for instructions on how to manufacture bombs, or look at pornography, unless they must do
so to maintain plausibility—making all one’s clones clean-living, serious-
minded network users interested only in history, charitable giving, and recipes might raise suspicions. (The reason we have switched from talking about a
singular clone to speaking about multiple clones is that once one clone is up and running there will be many others. Indeed, imagine a Borgesian joke in which sufficiently sophisticated clones, having learned from your history, demography, and habits, create clones of their own—copies of copies.) It is in your interest to expand this population of possible selves, leading lives that could be yours, day after day. This fulfills the fundamental goal outlined by the patent: your clones don’t dodge or refuse data gathering, but in complying they pollute the data collected and reduce the value of profiles created from those data.
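The rhythm-matching described above can be sketched in the same spirit: learn the hours in which the user is ordinarily active, and let clones act only within those windows, with a manual suspension switch for situations such as catching a flight. Again, every detail here is an illustrative assumption rather than the patent’s actual design.

    # Minimal sketch, under our own assumptions, of rhythm-matching:
    # clones act only in hours where the real user is usually active.
    from collections import Counter
    from datetime import datetime

    def active_hours(timestamps, threshold=5):
        """Hours of the day in which the user's own activity was
        observed at least `threshold` times."""
        counts = Counter(ts.hour for ts in timestamps)
        return {hour for hour, n in counts.items() if n >= threshold}

    def clone_may_act(now, hours, suspended=False):
        """Clones run only when the real user plausibly would."""
        return (not suspended) and now.hour in hours

    history = [datetime(2015, 3, d, h) for d in range(1, 8) for h in (9, 13, 20)]
    hours = active_hours(history)
    print(clone_may_act(datetime(2015, 3, 9, 13), hours))                  # True
    print(clone_may_act(datetime(2015, 3, 9, 3), hours))                   # False
    print(clone_may_act(datetime(2015, 3, 9, 13), hours, suspended=True))  # False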
2.13 Vortex: cookie obfuscation as game and marketplace
Vortex—a proof-of-concept game (of sorts) developed by Rachel Law, an
artist, designer, and programmer28—serves two functions simultaneously: to educate players about how online filtering systems affect their experience of the Internet and to confuse and misdirect targeted advertising based on
browser cookies and other identifying systems. It functions as a game, serving to occupy and delight—an excellent venue for engaging users with a subject as seemingly dry and abstract as cookie-based targeted advertising. It is, in other words, a massively multi-player game of managing and exchanging personal data. The primary activities are “mining” cookies from websites and
swapping them with other players. In one state of play, the game looks like a
few color-coded buttons in the bookmarks bar of your browser that allow you to accumulate and swap between cookies (effectively taking on different
identities); in another state of play, it looks like a landscape that represents a site as a quasi-planet that can be mined for cookies. (The landscape representation is loosely inspired by the popular exploration and building game Minecraft.)
Vortex ingeniously provides an entertaining and friendly way to display,
manage, and share cookies. As you generate cookies, collect cookies, and
swap cookies with other players, you can switch from one cookie to another with a click, thereby effectively disguising yourself and experiencing a different Web, a different set of filters, a different online self. This makes targeted advertising into a kind of choice: you can toggle over to cookies that present you as having a different gender, a different ethnicity, a different profession, and a different set of interests, and you can turn the ads and “personalized”
details into mere background noise rather than distracting and manipulative components that peg you as some marketer’s model of your identity. You can experience the Web as many different people, and you can make any record of yourself into a deniable portrait that doesn’t have much to do with you in particular. In a trusted circle of friends, you can share account cookies that will enable you to purchase things that are embargoed in your location—for
example, video streams that are available only to viewers in a certain country.
Hopping from self to self, and thereby ruining the process of compiling
demographic dossiers, Vortex players would turn online identity into a field of options akin to the inventory screens of an online role-playing game. Instead of hiding, or giving up on the benefits that cookies and personalization can provide, Vortex allows users to deploy a crowd of identities while one’s own identity is offered to a mob of others.
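In outline, the identity toggle at the heart of Vortex might be reconstructed as follows: mined cookie jars are stored under named identities, and switching identities swaps the cookie set the browser presents. This is our speculative sketch of the mechanic, not Rachel Law’s actual code.

    # Illustrative reconstruction of a Vortex-style identity toggle.
    class CookieVault:
        def __init__(self):
            self._jars = {}        # identity name -> {cookie name: value}
            self.active = None

        def mine(self, identity, cookies):
            """Store a cookie jar 'mined' from a site under an identity."""
            self._jars.setdefault(identity, {}).update(cookies)

        def switch(self, identity):
            """Toggle to another identity; the browser now presents its jar."""
            self.active = identity
            return self._jars[identity]

        def trade(self, identity, other_vault, as_name):
            """Hand a jar to another player: Vortex's exchange mechanic."""
            other_vault.mine(as_name, self._jars.pop(identity))

    vault = CookieVault()
    vault.mine("retiree-in-florida", {"ad_id": "x91", "locale": "en-US"})
    vault.mine("teen-gamer", {"ad_id": "b07", "locale": "en-GB"})
    print(vault.switch("teen-gamer"))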
2.14 “Bayesian flooding” and “unselling” the value of online identity
In 2012, Kevin Ludlow, a developer and an entrepreneur, addressed a familiar obfuscation problem: What is the best way to hide data from Facebook?29 The short answer is that there is no good way to remove data, and wholesale withdrawal from social networks isn’t a realistic possibility for many users.
Ludlow’s answer is by now a familiar one.
“Rather than trying to hide information from Facebook,” Ludlow wrote, “it
may be possible simply to overwhelm it with too much information.” Ludlow’s
experiment (which he called “Bayesian flooding,” after a form of statistical analysis) entailed entering hundreds of life events into his Facebook Timeline over the course of months—events that added up to a life worthy of a
three-volume novel. He got married and divorced, fought cancer (twice), broke numerous bones, fathered children, lived all over the world, explored a dozen religions, and fought for a slew of foreign militaries. Ludlow didn’t expect anyone to fall for these stories; rather, he aimed to produce a less targeted personal experience of Facebook through the inaccurate guesses to which the advertising now responds, and as an act of protest against the manipulation and “coercive psychological tricks” embedded both in the advertising itself and in the site mechanisms that provoke or sway users to enter more information than they may intend to enter. In fact, the sheer implausibility of Ludlow’s Timeline life as a globe-trotting, caddish mystic-mercenary with incredibly bad luck acts as a kind of filter: no human reader, and certainly no friend or acquaintance of Ludlow’s, would assume that all of it was true, but the analysis that drives the advertising has no way of making such distinctions.
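Ludlow entered his fictitious events by hand, but the flooding idea is easy to automate. The sketch below generates a stream of noisy Timeline events from templates; the templates, places, and counts are invented for illustration and are not Ludlow’s data.

    # Illustrative automation of "Bayesian flooding": generate many
    # implausible life events to drown out the real signal.
    import random

    EVENTS = [
        "Got married in {place}",
        "Got divorced in {place}",
        "Broke a bone in {place}",
        "Converted to a new religion in {place}",
        "Joined a foreign military in {place}",
        "Beat cancer in {place}",
    ]
    PLACES = ["Reykjavik", "Manila", "Lagos", "La Paz", "Tbilisi", "Oslo"]

    def flood(n=300, start_year=1990, end_year=2012):
        """Generate n noisy Timeline events spread over past years."""
        timeline = []
        for _ in range(n):
            event = random.choice(EVENTS).format(place=random.choice(PLACES))
            timeline.append((random.randint(start_year, end_year), event))
        return sorted(timeline)

    for year, event in flood(5):
        print(year, event)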
Ludlow hypothesizes that, if his approach were to be adopted more
widely, it wouldn’t be difficult to identify wild geographic, professional, or demographic outliers—people whose Timelines were much too crowded with
incidents—and then wash their results out of a larger analysis. The particular understanding of victory that Ludlow envisions, which we discuss in the typology of goals presented in the second part of this book, is a limited one. His Bayesian flooding isn’t meant to counteract and corrupt the vast scope of data
collection and analysis; rather, its purpose is to keep data about oneself both within the system and inaccessible. Max Cho describes a less extreme version:
“The trick is to populate your Facebook with just enough lies as to destroy the value and compromise Facebook’s ability to sell you”30—that is, to make your online activity harder to commoditize, as an act of conviction and protest.
2.15 FaceCloak: concealing the work of concealment
FaceCloak offers a different approach to limiting Facebook’s access to personal information. When you create a Facebook profile and fill in your personal information, including where you live, where you went to school, your likes and dislikes, and so on, FaceCloak allows you to choose whether to display this information openly or to keep it private.31 If you choose to display the information openly, it is passed to Facebook’s servers. If you choose to keep it
private, FaceCloak sends it to encrypted storage on a separate server, where it may be decrypted for and displayed only to friends you have authorized when they browse your Facebook page using the FaceCloak plug-in. Facebook never gains access to it.
What is salient about FaceCloak for present purposes is that it obfuscates its method by generating fake information for Facebook’s required profile
fields, concealing from Facebook and from unauthorized viewers the fact that the real data are stored elsewhere. As FaceCloak passes your real data to the private server, FaceCloak fabricates for Facebook a plausible non-person of a certain gender, with a name and an age, bearing no relation to the real facts about you. Under the cover of the plausible non-person, you can forge genuine connections with your friends while presenting obfuscated data for others.
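The division of labor can be made concrete in a simplified sketch: the real field value is encrypted and stored with a third party (here, a dictionary standing in for FaceCloak’s server), while a fabricated placeholder of the right shape is sent to Facebook. This illustrates the idea rather than reproducing the plug-in’s actual code, and it assumes the third-party cryptography package for encryption.

    # Simplified FaceCloak-style split: fake data to Facebook, real
    # data encrypted elsewhere. Requires: pip install cryptography
    import random
    from cryptography.fernet import Fernet

    FAKE_NAMES = ["Alex Morgan", "Sam Reilly", "Jordan Blake"]

    key = Fernet.generate_key()          # shared out of band with friends
    cipher = Fernet(key)
    third_party_store = {}               # stands in for FaceCloak's server

    def cloak_field(user_id, field, real_value):
        """Store the encrypted real value; return a plausible fake for
        Facebook's required profile field."""
        third_party_store[(user_id, field)] = cipher.encrypt(real_value.encode())
        return random.choice(FAKE_NAMES) if field == "name" else "Undisclosed"

    def uncloak_field(user_id, field):
        """What an authorized friend's plug-in would decrypt and display."""
        return cipher.decrypt(third_party_store[(user_id, field)]).decode()

    sent_to_facebook = cloak_field("u42", "name", "Jane Q. Real")
    print(sent_to_facebook)              # fake, plausible
    print(uncloak_field("u42", "name"))  # Jane Q. Real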
2.16 Obfuscated likefarming: concealing indications of manipulation
Likefarming is now a well-understood strategy for generating the illusion of popularity on Facebook: employees, generally in the developing world, will
“like” a particular brand or product for a fee (the going rate is a few U.S. dollars for a thousand likes).32 A number of benefits accrue to heavily liked items—
among other things, Facebook’s algorithms will circulate pages that show
evidence of popularity, thereby giving them additional momentum.
Likefarming is easy to spot, particularly for systems as sophisticated as
Facebook’s. It is performed in narrowly focused bursts of activity devoted to liking one thing or one family of things, from accounts that do little else. To appear more natural, they employ an obfuscating strategy of liking a spread of pages—generally pages recently added to the feed of Page Suggestions,
which Facebook promotes according to its model of the user’s interests.33 The paid work of systematically liking one page can be hidden within scattered likes, appearing to come from a person with oddly singular yet characterless interests. Likefarming reveals the diversity of motives for obfuscation—not, in this instance, resistance to political domination, but simply provision of a service for a fee.
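The camouflage just described reduces to a simple mixing step: bury the paid like in a scatter of likes drawn from the account’s Page Suggestions, in random order, so the history no longer looks like a single-purpose burst. The sketch below is our illustration; the page names and the ratio of cover likes to paid likes are assumptions.

    # Illustrative sketch of obfuscated likefarming: one paid like
    # hidden among likes on Facebook-suggested pages.
    import random

    def camouflage_likes(target_page, suggested_pages, cover=8):
        """Mix the paid like in with `cover` likes on suggested pages,
        in random order."""
        likes = random.sample(suggested_pages, cover) + [target_page]
        random.shuffle(likes)
        return likes

    suggestions = ["Local Diner", "Cat Memes Daily", "DIY Crafts",
                   "Travel Deals", "Retro Gaming", "Gardening Tips",
                   "Morning Yoga", "Budget Recipes", "Vintage Cars"]
    print(camouflage_likes("Brand X", suggestions))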
2.17 URME surveillance: “identity prosthetics” expressing protest
The artist Leo Selvaggio wanted to engage with the video surveillance of public space and the implications of facial-recognition software.34 After considering
the usual range of responses (wearing a mask, destroying cameras, ironic attention-drawing in the manner of the Surveillance Camera Players), Selvaggio hit on a particularly obfuscating response with a protester’s edge: he produced and distributed masks of his face that were accurate enough so that
other people wearing them would be tagged as him by Facebook’s facial-
recognition software.
Selvaggio’s description of the project offers a capsule summary of
obfuscation: “[R]ather than try to hide or obscure one’s face from the camera, these devices allow you to present a different, alternative identity to the camera, my own.”
2.18 Manufacturing conflicting evidence: confounding investigation
The Art of Political Murder: Who Killed the Bishop? —Francisco Goldman’s account of the investigation into the death of Bishop Juan José Gerardi
Conedera—reveals the use of obfuscation to muddy the waters of evidence
collection.35 Bishop Gerardi, who played an enormously important part in
defending human rights during Guatemala’s civil war of 1960–1996, was murdered in 1998.
As Goldman documented the long and dangerous process of bringing at
least a few of those responsible within the Guatemalan military to justice for this murder, he observed that those threatened by the investigation didn’t merely plant evidence to conceal their role. Framing someone else would be an obvious tactic, and the planted evidence would be assumed to be false.
Rather, they produced too much conflicting evidence, too many witnesses and testimonials, too many possible stories. The goal was not to construct an airtight lie, but rather to multiply the possible hypotheses so prolifically that observers would despair of ever arriving at the truth. The circumstances of the bishop’s murder produced what Goldman terms an “endlessly exploitable situation,” full of leads that led nowhere and mountains of seized evidence, each factual element calling the others into question. “So much could be made and so much would be made to seem to connect,” Goldman writes, his italics emphasizing the power of the ambiguity.36
The thugs in the Guatemalan military and intelligence services had plenty
of ways to manage the situation: access to internal political power, to money, and, of course, to violence and the threat of violence. In view of how opaque
the situation remains, we do not want to speculate about exact decisions, but the fundamental goal seems reasonably clear. The most immediately
significant adversaries—investigators, judges, journalists—could be killed, menaced, bought, or otherwise influenced. The obfuscating evidence and
other materials were addressed to the larger community of observers, a proliferation of false leads throwing enough time-wasting doubt over every aspect of the investigation that it could call the ongoing work, and any conclusions, into question.
II Understanding Obfuscation
3 WHY IS OBFUSCATION NECESSARY?
Where does a wise man hide a leaf? In the forest. But what does he do if there is no forest? … He grows a forest to hide it in.
G. K. Chesterton, “The Sign of the Broken Sword”
3.1 Obfuscation in brief
Privacy is a complex and even contradictory concept, a word of such broad
meanings that in some cases it can become misleading, or almost meaningless. It is expressed in law and policy, in technology, philosophy, and in everyday conversation. It encompasses a space that runs from a dashboard on a
website—your privacy settings, managed through drop-down menus and