OBFUSCATION
A USER'S GUIDE FOR PRIVACY AND PROTEST
Finn Brunton and Helen Nissenbaum
The MIT Press
Cambridge, Massachusetts
London, England
© 2015 Finn Brunton and Helen Nissenbaum
All rights reserved. No part of this book may be reproduced in any form by any electronic or mechanical means (including photocopying, recording, or information storage and retrieval) without permission in writing from the publisher.
MIT Press books may be purchased at special quantity discounts for business or sales promotional use. For information, email [email protected].
Set in PF Din Text Cond Pro by Toppan Best-set Premedia Limited. Printed and bound in the United States of America.
Library of Congress Cataloging-in-Publication Data are available.
ISBN: 978-0-262-02973-5
10 9 8 7 6 5 4 3 2 1
CONTENTS
Acknowledgments
Introduction

I AN OBFUSCATION VOCABULARY

1 Core Cases
1.1 Chaff: defeating military radar
1.2 Twitter bots: filling a channel with noise
1.3 CacheCloak: location services without location tracking
1.4 TrackMeNot: blending genuine and artificial search queries
1.5 Uploads to leak sites: burying significant files
1.6 False tells: making patterns to trick a trained observer
1.7 Group identity: many people under one name
1.8 Identical confederates and objects: many people in one outfit
1.9 Excessive documentation: making analysis inefficient
1.10 Shuffling SIM cards: rendering mobile targeting uncertain
1.11 Tor relays: requests on behalf of others that conceal personal traffic
1.12 Babble tapes: hiding speech in speech
1.13 Operation Vula: obfuscation in the struggle against Apartheid

2 Other Examples
2.1 Orb-weaving spiders: obfuscating animals
2.2 False orders: using obfuscation to attack rival businesses
2.3 French decoy radar emplacements: defeating radar detectors
2.4 AdNauseam: clicking all the ads
2.5 Quote stuffing: confusing algorithmic trading strategies
2.6 Swapping loyalty cards to interfere with analysis of shopping patterns
2.7 BitTorrent Hydra: using fake requests to deter collection of addresses
2.8 Deliberately vague language: obfuscating agency
2.9 Obfuscation of anonymous text: stopping stylometric analysis
2.10 Code obfuscation: baffling humans but not machines
2.11 Personal disinformation: strategies for individual disappearance
2.12 Apple’s “cloning service” patent: polluting electronic profiling
2.13 Vortex: cookie obfuscation as game and marketplace
2.14 “Bayesian flooding” and “unselling” the value of online identity
2.15 FaceCloak: concealing the work of concealment
2.16 Obfuscated likefarming: concealing indications of manipulation
2.17 URME surveillance: “identity prosthetics” expressing protest
2.18 Manufacturing conflicting evidence: confounding investigation

II UNDERSTANDING OBFUSCATION

3 Why Is Obfuscation Necessary?
3.1 Obfuscation in brief
3.2 Understanding information asymmetry: knowledge and power
3.3 The fantasy of opting out
3.4 Weapons of the weak: what obfuscation can do
3.5 Distinguishing obfuscation from strong privacy systems

4 Is Obfuscation Justified?
4.1 Ethics of obfuscation
4.2 From ethics to politics

5 Will Obfuscation Work?
5.1 Obfuscation is about goals
5.2 I want to use obfuscation …
… to buy some time
… to provide cover
… for deniability
… to prevent individual exposure
… to interfere with profiling
… to express protest
5.3 Is my obfuscation project …
… individual, or collective?
… known, or unknown?
… selective, or general?
… short-term, or long-term?

Epilogue
Notes
Bibliography
Index
ACKNOWLEDGMENTS
This book began with technology—the TrackMeNot project—and we owe
our deepest thanks to Daniel Howe, who got it on its feet, and Vincent
Toubiana, who joined the effort, expanded its scope, and continues tirelessly to support it and its adopters. Feedback and comments from the community of users and from the privacy community at large, and a joint technical paper with Lakshminarayanan Subramanian (Vincent Toubiana, Lakshminarayanan
Subramanian, and Helen Nissenbaum, “TrackMeNot: Enhancing the Privacy of
Web Search”) have opened our eyes to its potential and limitations. More
recently, creating and launching a second system, AdNauseam, with Daniel
Howe, in collaboration with the designer Mushon Zer-Aviv, further expanded our perspective on obfuscation and on the need for a deeper, systematic
appreciation of what it offers as a method and a strategy.
As we began to work on a general understanding of obfuscation, we were
able to explore many of the concepts in a paper published in First Monday and a chapter in Privacy, Due Process and the Computational Turn, which benefited enormously from review and editorial feedback in those venues.
Obfuscation became a book with the encouragement and thorough advice of the reviewers and of Marguerite Avery, Gita Manaktala, Susan Buckley,
Katie Helke, and Paul Bethge at the MIT Press. Our thanks to all of them. Emily Goldsher-Diamond did meticulous work as a research assistant, as well as
organizing many other aspects of this project. Work on this book through all its drafts was supported by grants from the National Science Foundation (ITR-0331542: Sensitive Information in a Wired World), from EAGER (CNS-1355398: Values in Design for Future Internet Architecture—Next Phase), from the Air Force Office of Scientific Research (MURI-ONR BAA 07-036: Collaborative
Policies and Assured Information Sharing), and from the Intel Science and
Technology Center for Social Computing. Support from these grants provided time, technology, and a collegial context for pursuing this project and bringing it to fruition.
Two major events helped shape and refine our thinking about obfuscation.
One was the Symposium on Obfuscation (February 15, 2014), jointly organized by New York University’s Department of Media, Culture, and Communication
and the Information Law Institute and co-sponsored by the Intel Science and Technology Center for Social Computing. For making this event possible, we would like to thank Nicole Arzt, Emily Goldsher-Diamond, Dove Helena
Pedlosky, Melissa Lucas-Ludwig, Erica Robles-Anderson, and Jamie Schuler—
and, above all, Seda Gürses, who organized, structured, and shaped so much of the day. Every single speaker had a direct effect on our manuscript. The other event was the ongoing conversation of the Privacy Research Group at
NYU, at whose weekly seminars we presented several stages of this material.
The book would not have this final form without the PRG discussions; our fond thanks to everyone involved.
Other opportunities to present and test aspects of this work have been
enormously productive, and our ideas have been greatly improved by the
responses of supporters, critics, believers, and skeptics. These opportunities have included a joint MIT Center for Civic Media and Comparative Media Studies Colloquium; The New School for Social Research 2014 Graduate Conference;
New Media Salon, Tel Aviv; Communications and Journalism Departmental
Seminar, Hebrew University of Jerusalem; IBM R&D Labs, Haifa; Eyebeam Art
+ Technology Center; HotPETS 2013; Computers, Privacy and Data Protection, Brussels; and the Surveillance Studies Conference, Queen's University.
We are deeply grateful for friends and colleagues with whom we could
discuss obfuscation as it developed, and who offered feedback, criticism,
encouragement, and ideas. In particular we would like to thank Julia Angwin, Solon Barocas, danah boyd, Claudia Diaz, Cynthia Dwork, Cathy Dwyer,
Tarleton Gillespie, Mireille Hildebrandt, Ari Juels, Nick Montfort, Deirdre Mulligan, Arvind Narayanan, Martijn van Otterloo, Ira Rubinstein, Ian Spiro, Luke Stark, Katherine Strandburg, Matthew Tierney, Joe Turow, Janet Vertesi, Tal Zarsky, Malte Ziewitz, and Ethan Zuckerman.
Finally, this book would not have been possible without the support of our professional home base, the Department of Media, Culture, and Communication at New York University. Thanks to you all!
INTRODUCTION
We mean to start a revolution with this book. But not a big revolution—at
least, not at first. Our revolution does not rely on sweeping reforms, on a comprehensive Year Zero reinvention of society, or on the seamless and perfectly uniform adoption of a new technology. It is built on preexisting components—
what a philosopher would call tools ready-to-hand, what an engineer would
call commodity hardware—that are available in everyday life, in movies, in software, in murder mysteries, and even in the animal kingdom. Although its lexicon of methods can be, and has been, taken up by tyrants, authoritarians, and secret police, our revolution is especially suited for use by the small players, the humble, the stuck, those not in a position to decline or opt out or exert control over our data emanations. The focus of our limited revolution is on mitigating and defeating present-day digital surveillance. We will add
concepts and techniques to the existing and expanding toolkit for evasion, noncompliance, outright refusal, deliberate sabotage, and use according to our terms of service. Depending on the adversary, the goals, and the resources, we provide methods for disappearance, for time-wasting and analysis-frustrating, for prankish disobedience, for collective protest, for acts of individual redress both great and small. We draw an outline around a whole
domain of both established and emerging instances that share a common
approach we can generalize and build into policies, software, and action. This outline is the banner under which our big little revolution rides, and the space it defines is called obfuscation.
In a sentence: Obfuscation is the deliberate addition of ambiguous,
confusing, or misleading information to interfere with surveillance and data collection. It’s a simple thing with many different, complex applications and uses. If you are a software developer or designer, obfuscation you build into your software can keep user data safe—even from yourself, or from whoever
acquires your startup—while you provide social networking, geolocation, or other services requiring collection and use of personal information. Obfuscation also offers ways for government agencies to accomplish many of the
goals of data collection while minimizing the potential misuses. And if you are a person or a group wanting to live in the modern world without being a subject of pervasive digital surveillance (and an object of subsequent analysis),
obfuscation is a lexicon of ways to put some sand in the gears, to buy time, and to hide in the crowd of signals. This book provides a starting point.
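The core move this definition describes can be made concrete in a few lines of code. What follows is a purely illustrative sketch of our own (it is not from any tool discussed in this book, and real systems such as TrackMeNot are far more sophisticated, drawing decoys from realistic, evolving sources): genuine signals are interleaved with plausible decoys so that an observer cannot tell which is which.

```python
import random

# Hypothetical decoy pool for illustration only; a real obfuscation
# tool would draw decoys from realistic, changing sources.
DECOY_QUERIES = ["weather tomorrow", "pasta recipes", "used bikes",
                 "movie showtimes", "jazz history"]

def obfuscate(real_queries, noise_ratio=3, rng=random):
    """Interleave each genuine query with `noise_ratio` decoys,
    shuffled so that position reveals nothing about which is real."""
    stream = list(real_queries)
    stream += [rng.choice(DECOY_QUERIES)
               for _ in range(noise_ratio * len(real_queries))]
    rng.shuffle(stream)
    return stream

stream = obfuscate(["flights to havana"])
print(len(stream))  # 4: one genuine query hidden among three decoys
```

The genuine query is still present, and still answered; what the sketch buys is ambiguity about which of the four queries reflects the user's actual interest, which is the obfuscator's wager in miniature.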
Our project has tracked interesting similarities across very different domains in which those who are obliged to be visible, readable, or audible have
responded by burying salient signals in clouds and layers of misleading
signals. Fascinated by the diverse contexts in which actors reach for a strategy of obfuscation, we have presented, in chapters 1 and 2, dozens of detailed instances that share this general, common thread. Those two chapters, which make up part I of the book, provide a guide to the diverse forms and formats that obfuscation has taken and demonstrate how these instances are crafted and implemented to suit their respective goals and adversaries. Whether on a social network, at a poker table, or in the skies during the Second World War, and whether confronting an adversary in the form of a facial-recognition
system, the Apartheid government of 1980s South Africa, or an opponent
across the table, properly deployed obfuscation can aid in the protection of privacy and in the defeat of data collection, observation, and analysis. The sheer range of situations and uses discussed in chapters 1 and 2 is an inspiration and a spur: What kind of work can obfuscation do for you?
The cases presented in chapter 1 are organized into a narrative that introduces fundamental questions about obfuscation and describes important
approaches to it that are then explored and debated in part II of the book. In chapter 2, shorter cases illustrate the range and the variety of obfuscation applications while also reinforcing underlying concepts.
Chapters 3–5 enrich the reader’s understanding of obfuscation by considering why obfuscation has a role to play in various forms of privacy work; the ethical, social, and political problems raised by using obfuscatory tactics; and ways of assessing whether obfuscation works, or can work, in particular scenarios. Assessing whether an obfuscation approach works entails understanding what makes obfuscation distinct from other tools and understanding its particular weaknesses and strengths. The titles of chapters 3–5 are framed as questions.
The first question, asked in chapter 3, is “Why is obfuscation necessary?”
In answering that question, we explain how the challenges of present-day
digital privacy can be met by obfuscation’s utility. We point out how obfuscation may serve to counteract information asymmetry, which occurs when data
about us are collected in circumstances we may not understand, for purposes we may not understand, and are used in ways we may not understand. Our
data will be shared, bought, sold, managed, analyzed, and applied, all of which will have consequences for our lives. Will you get a loan, or an apartment, for which you applied? How much of an insurance risk or a credit risk are you?
What guides the advertising you receive? How do so many companies and
services know that you’re pregnant, or struggling with an addiction, or planning to change jobs? Why do different cohorts, different populations, and different neighborhoods receive different allocations of resources? Are you going to be, as the sinister phrase of our current moment of data-driven anti-terrorism has it, “on a list”? Even innocuous or seemingly benign work in this domain has consequences worth considering. Obfuscation has a role to play, not as a replacement for governance, business conduct, or technological interventions, or as a one-size-fits-all solution (again, it’s a deliberately small, distributed revolution), but as a tool that fits into the larger network of privacy practices. In particular, it’s a tool particularly well suited to the category of people without access to other modes of recourse, whether at a particular
moment or in general—people who, as it happens, may be unable to deploy
optimally configured privacy-protection tools because they are on the weak side of a particular information-power relationship.
Similarly, context shapes the ethical and political questions around
obfuscation. Obfuscation’s use in multiple domains, from social policy to social networks to personal activity, raises serious concerns. In chapter 4, we ask “Is obfuscation justified?” Aren’t we encouraging people to lie, to be willfully inaccurate, or to “pollute” with potentially dangerous noise databases that have commercial and civic applications? Aren’t obfuscators who use commercial
services free riding on the good will of honest users who are paying for
targeted advertising (and the services) by making data about themselves
available? And if these practices become widespread, aren’t we going to be collectively wasting processing power and bandwidth? In chapter 4 we address these challenges and describe the moral and political calculus according to which particular instances of obfuscation may be evaluated and found to be acceptable or unacceptable.
What obfuscation can and can’t accomplish is the focus of chapter 5. In
comparison with cryptography, obfuscation may seem contingent, even
shaky. With cryptography, precise degrees of security against brute-force
attacks can be calculated with reference to such factors as key length, processing power, and time. With obfuscation such precision is rarely possible, because its strength as a practical tool depends on what users want to accomplish and on what specific barriers they may face in respective circumstances of use. Yet complexity does not mean chaos, and success still rests on careful attention to systematic interdependencies. In chapter 5 we identify six common goals for an obfuscation project and relate them to design dimensions. The goals include buying some time, providing cover, deniability, evading observation, interfering with profiling, and expressing protest. The aspects of design we identify include whether an obfuscation project is individual or collective, whether it is known or unknown, whether it is selective or general, and
whether it is short-term or long-term. For some goals, for instance, obfuscation may not succeed if the adversary knows that it is being employed; for other goals—such as collective protest or interference with probable cause and production of plausible deniability—it is better if the adversary knows that the data have been poisoned. All of this, of course, depends on what resources are available to the adversary—that is, how much time, energy, attention, and money the adversary is willing to spend on identifying and weeding out obfuscating information. The logic of these relationships holds promise because it suggests that we can learn from reasoning about specific cases how to