Obfuscation


by Finn Brunton


Initially culled from RSS feeds, these terms evolve so that different users develop different seed lists. The precision of the imitation is continually refined by repopulating the seed list with new terms generated from returns to search queries. TrackMeNot submits queries in a manner that tries to mimic real users’ search behaviors. For example, a user who has searched for “good wi-fi cafe chelsea” may also have searched for “savannah kennels,” “freshly pressed juice miami,” “asian property firm,” “exercise delays dementia,” and “telescoping halogen light.” The activities of individuals are masked by those of many ghosts, making the pattern harder to discern, so that it becomes much more difficult to say of any query that it was a product of human intention rather than an automatic output of TrackMeNot. In this way, TrackMeNot extends the role of obfuscation, in some situations, to include plausible deniability.
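The mechanism described above — drawing decoys from a seed list that drifts as it is repopulated with terms harvested from results — can be sketched in a few lines. This is a minimal illustration of the idea, not TrackMeNot’s actual implementation; the class name, phrase-harvesting rule, and fixed list size are all assumptions made for the example.

```python
import random

class DecoyQueryGenerator:
    """Sketch of a TrackMeNot-style decoy generator (illustrative only).

    Keeps a per-user seed list of query phrases. Each decoy is drawn at
    random, and short phrases harvested from (hypothetical) search-result
    text repopulate the list, so every user's list drifts differently.
    """

    def __init__(self, seed_queries, rng=None):
        self.seed_queries = list(seed_queries)
        self.rng = rng or random.Random()

    def next_decoy(self):
        # Pick one decoy query; a real client would also randomize the
        # timing of submissions to mimic human search behavior.
        return self.rng.choice(self.seed_queries)

    def refresh(self, result_text, phrase_len=3):
        # Harvest a random short phrase from result text and rotate it
        # into the seed list, keeping the list size bounded.
        words = [w for w in result_text.split() if w.isalpha()]
        if len(words) >= phrase_len:
            start = self.rng.randrange(len(words) - phrase_len + 1)
            phrase = " ".join(words[start:start + phrase_len]).lower()
            self.seed_queries.pop(0)          # drop the oldest term
            self.seed_queries.append(phrase)  # add the harvested term

gen = DecoyQueryGenerator(
    ["savannah kennels", "freshly pressed juice miami", "asian property firm"]
)
decoy = gen.next_decoy()
gen.refresh("Exercise may delay dementia onset according to new research")
```

Because each user’s list evolves from different results, two users running the same code soon emit different decoy streams — the property the text attributes to the seed-list design.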

  1.5 Uploads to leak sites: burying significant files

WikiLeaks used a variety of systems for securing the identities of both visitors and contributors. However, there was a telltale sign that could undercut the safety of the site: uploads of files. If snoops could monitor the traffic on WikiLeaks, they could identify acts of submitting material to WikiLeaks’ secure server. Especially if they could make informed guesses as to the compressed sizes of various collections of subsequently released data, they could retroactively draw inferences as to what was transmitted, when it was transmitted, and (in view of failures in other areas of technical and operations security) by whom it was transmitted. Faced with this very particular kind of challenge, WikiLeaks developed a script to produce false signals. It launched in the browsers of visitors, generating activity that looked like uploads to the secure server.19 A snoop would therefore see an enormous mob of apparent leakers (the vast majority of whom were, in actuality, merely reading or looking through documents already made available), a few of whom might really be leakers. The script didn’t seek to provide particular data to interfere with data mining or with advertising; it simply sought to imitate and conceal the movements of some of the site’s users.

Even encrypted and compressed data contain pertinent metadata, however, and the proposal for OpenLeaks—an ultimately unsuccessful variant on WikiLeaks, developed by some of the disaffected participants in the original WikiLeaks system—includes a further refinement.20 After a statistical analysis of the WikiLeaks submissions, OpenLeaks developed a model of fake uploads that would keep to the same ratios of sizes of files typically appearing in the upload traffic of a leak site. Most of the files ranged in size from 1.5 to 2 megabytes, though a few outliers exceeded 700 megabytes. If an adversary can monitor upload traffic, form can be as telling as content, and as useful in sorting real signals from fake ones. As this example suggests, obfuscation mechanisms can gain a great deal from figuring out all the parameters that can be manipulated—and from figuring out what the adversary is looking for, so as to give the adversary a manufactured version of it.
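The refinement described above — decoy uploads whose size distribution matches real leak traffic — can be sketched as resampling from an empirical size distribution. This is an assumed, simplified stand-in for whatever model OpenLeaks actually proposed; the function name, jitter parameter, and toy data are inventions for the example.

```python
import random

def fake_upload_sizes(observed_sizes, n, jitter=0.1, rng=None):
    """Draw n decoy upload sizes (bytes) mirroring observed traffic.

    Resamples from the empirical distribution of observed sizes and adds
    multiplicative jitter, so the decoys keep the same ratio of small
    files to large outliers without duplicating any real size exactly.
    """
    rng = rng or random.Random()
    decoys = []
    for _ in range(n):
        base = rng.choice(observed_sizes)          # empirical resample
        factor = 1.0 + rng.uniform(-jitter, jitter)  # avoid exact copies
        decoys.append(max(1, int(base * factor)))
    return decoys

MB = 1024 * 1024
# Toy data echoing the text: mostly 1.5–2 MB files, a few 700+ MB outliers.
observed = [int(1.7 * MB)] * 95 + [int(750 * MB)] * 5
decoys = fake_upload_sizes(observed, 1000, rng=random.Random(0))
```

The point of matching the distribution, rather than emitting uniform noise, is exactly the one the text makes: an adversary who can see only sizes and timing can still separate real from fake if the fakes have the wrong “shape.”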

  1.6 False tells: making patterns to trick a trained observer

  Consider how the same basic pattern of obfuscation can be called to service in a context lighter than concealing the work of whistleblowers: poker.

Much of the pleasure and much of the challenge of poker lies in learning to infer from expressions, gestures, and body language whether someone is bluffing (that is, pretending to hold a hand weaker than the one he or she actually holds) in hopes of drawing a call. Central to the work of studying one’s opponents is the “tell”—some unconscious habit or tic that an opponent displays in response to a strong or a weak hand, such as sweating, glancing worriedly, or leaning forward. Tells are so important in the informational economy of poker that players sometimes use false tells—that is, they create mannerisms that may appear to be parts of a larger pattern.21 In common poker strategy, the use of a false tell is best reserved for a crucial moment in a tournament, lest the other players figure out that it is inaccurate and use it against you in turn. A patient analysis of multiple games could separate the true tells from the false ones, but in the time-bound context of a high-stakes game the moment of falsehood can be highly effective. Similar techniques are used in many sports that involve visible communication. One example is signaling in baseball—as a coach explained to a newspaper reporter, “Sometimes you’re giving a sign, but it doesn’t even mean anything.”22

  1.7 Group identity: many people under one name

One of the simplest and most memorable examples of obfuscation, and one that introduces the work of the group in obfuscation, is the scene in the film Spartacus in which the rebel slaves are asked by Roman soldiers to identify their leader, whom the soldiers intend to crucify.23 As Spartacus (played by Kirk Douglas) is about to speak, one by one the others around him say “I am Spartacus!” until the entire crowd is claiming that identity.


  Many people assuming the same identity for group protection (for example, Captain Swing in the English agricultural uprisings of 1830, the ubiquitous “Jacques” adopted by the radicals in Dickens’s A Tale of Two Cities, or the Guy Fawkes mask in the graphic novel V for Vendetta, now associated with the hacktivist group known as Anonymous) is, at this point, almost a cliché.24

Marco Deseriis has studied the use of “improper names” and collective identities in the effacement of individual responsibility and the proliferation of action.25 Some forms of obfuscation can be conducted solo; others rely on groups, teams, communities, and confederates.

  1.8 Identical confederates and objects: many people in one outfit

There are many examples of obfuscation by members of a group working in concert to produce genuine but misleading signals within which the genuine, salient signal is concealed. One memorable example from popular culture is the scene in the 1999 remake of the film The Thomas Crown Affair in which the protagonist, wearing a distinctive Magritte-inspired outfit, is suddenly in a carefully orchestrated mass of other men, dressed in the same outfit, circulating through the museum and exchanging their identical briefcases.26 The bank-robbery scheme in the 2006 film Inside Man hinges on the robbers’ all wearing painters’ overalls, gloves, and masks and dressing their hostages the same way.27 Finally, consider the quick thinking of Roger Thornhill, the protagonist of Alfred Hitchcock’s 1959 film North By Northwest, who, in order to evade the police when his train arrives in Chicago, bribes a redcap (a baggage handler) to lend him his distinctive uniform, knowing that the crowd of redcaps at the station will give the police too much of something specific to look for.28

Identical objects as modes of obfuscation are common enough and sufficiently understood to recur in imagination and in fact. The ancilia of ancient Rome exemplify this. A shield (ancile) fell from the sky—so the legend goes—during the reign of Numa Pompilius, Rome’s second king (715–673 BCE), and was interpreted as a sign of divine favor, a sacred relic whose ownership would guarantee Rome’s continued imperium.29 It was hung in the Temple of Mars along with eleven exact duplicates, so would-be thieves wouldn’t know which one to take. The six plaster busts of Napoleon from which the Sherlock Holmes story gets its title offer another example. The villain sticks a black pearl into the wet plaster of an object that not only has five duplicates but also is one of a larger class of objects (cheap white busts of Napoleon) that are ubiquitous enough to be invisible.30

A real-world instance is provided by the so-called Craigslist robber. At 11 a.m. on Tuesday, September 30, 2008, a man dressed as an exterminator (in a blue shirt, goggles, and a dust mask), and carrying a spray pump, approached an armored car parked outside a bank in Monroe, Washington, incapacitated the guard with pepper spray, and made off with the money.31 When the police arrived, they found thirteen men in the area wearing blue shirts, goggles, and dust masks—a uniform they were wearing on the instructions of a Craigslist ad that promised a good wage for maintenance work, which was to start at 11:15 a.m. at the bank’s address. It would have taken only a few minutes to determine that none of the day laborers was the robber, but a few minutes was all the time the robber needed.

Then there is the powerful story, often retold though factually inaccurate, of the king of Denmark and a great number of Danish gentiles wearing the Yellow Star so that the occupying Germans couldn’t distinguish and deport Danish Jews. Although the Danes courageously protected their Jewish population in other ways, the Yellow Star wasn’t used by the Nazis in occupied Denmark, for fear of arousing more anti-German feeling. However, “there were documented cases of non-Jews wearing yellow stars to protest Nazi anti-Semitism in Belgium, France, the Netherlands, Poland, and even Germany itself.”32 This legend offers a perfect example of cooperative obfuscation: gentiles wearing the Yellow Star as an act of protest, providing a population into which individual Jews could blend.33

1.9 Excessive documentation: making analysis inefficient

Continuing our look at obfuscation that operates by adding in genuine but misleading signals, let us now consider the overproduction of documents as a form of obfuscation, as in the case of over-disclosure of material in a lawsuit. This was the strategy of Augustin Lejeune, chief of the General Police Bureau in the Committee of Public Safety, a major instrument in the Terror phase of the French Revolution. Lejeune and his clerks produced the reports that laid the groundwork for arrests, internments, and executions. Later, in an effort to excuse his role in the Terror, Lejeune argued that the exacting, overwhelmingly detailed quality of the reports from his office had been deliberate: he had instructed his clerks to overproduce material, and to report “the most minor details,” in order to slow the production of intelligence for the Committee without the appearance of rebellion. It is doubtful that Lejeune’s claims are entirely accurate (the numbers he cites for the production of reports aren’t reliable), but, as Ben Kafka points out, he had come up with a bureaucratic strategy for creating slowdowns through oversupply: “He seems to have recognized, if only belatedly, that the proliferation of documents and details presented opportunities for resistance, as well as for compliance.”34 In situations where one can’t say No, there are opportunities for a chorus of unhelpful Yeses—for example, don’t send a folder in response to a request; send a pallet of boxes of folders containing potentially relevant papers.

  1.10 Shuffling SIM cards: rendering mobile targeting uncertain

As recent reporting and some of Edward Snowden’s disclosures have revealed, analysts working for the National Security Agency use a combination of signals-intelligence sources—particularly cell-phone metadata and data from geolocation systems—to identify and track targets for elimination.35 The metadata (showing what numbers were called and when they were called) produce a model of a social network that makes it possible to identify particular phone numbers as belonging to persons of interest; the geolocative properties of mobile phones mean that these numbers can be situated, with varying degrees of accuracy, in particular places, which can then be targeted by drones. In other words, this system can proceed from identification to location to assassination without ever having a face-to-face visual identification of a person. The closest a drone operator may come to setting eyes on someone may be the exterior of a building, or a silhouette getting into a car. In view of the spotty records of the NSA’s cell-phone-metadata program and the drone strikes, there are, of course, grave concerns about accuracy. Whether one is concerned about threats to national security remaining safe and active, about the lives of innocent people taken unjustly, or about both, it is easy to see the potential flaws in this approach.

Let us flip the situation, however, and consider it more abstractly from the perspective of the targets. Most of the NSA’s targets are obligated to always have, either with or near them, a tracking device (only the very highest-level figures in terrorist organizations are able to be free of signals-generating technology), as are virtually all the people with whom they are in contact. The calls and conversations that sustain their organizations also provide the means of their identification; the structure that makes their work possible also traps them. Rather than trying to coordinate anti-aircraft guns to find a target somewhere in the sky, the adversary has complete air superiority, able to deliver a missile to a car, a street corner, or a house. However, the adversary also has a closely related set of systemic limitations. This system, remarkable as it is in scope and capabilities, ultimately relies on SIM (subscriber identity module) cards and on physical possession of mobile phones—a kind of narrow bandwidth that can be exploited. A former drone operator for the Joint Special Operations Command has reported that targets therefore take measures to mix and confuse genuine signals. Some individuals have many SIM cards associated with their identity in circulation, and the cards are randomly redistributed. One approach is to hold meetings at which all the attendees put their SIM cards into a bag, then pull cards from the bag at random, so that who is actually connected to each device will not be clear. (This is a time-bound approach: if metadata analysis is sufficiently sophisticated, an analyst should eventually be able to sort the individuals again on the basis of past calling patterns, but irregular re-shuffling renders that more difficult.) Re-shuffling may also happen unintentionally as targets who aren’t aware that they are being tracked sell their phones or lend them to friends or relatives. The end result is a system with enormous technical precision and a very uncertain rate of actual success, whether measured in terms of dangerous individuals eliminated or in terms of innocent noncombatants killed by mistake. Even when fairly exact location tracking and social-graph analysis can’t be avoided, using obfuscation to mingle and mix genuine signals, rather than generating false signals, can offer a measure of defense and control.
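The bag-of-SIM-cards procedure described above is, formally, just a random permutation of the card-to-person assignment. A minimal sketch of that permutation (the function name and toy identifiers are inventions for the example):

```python
import random

def shuffle_sims(holders, rng=None):
    """Model of the SIM-shuffling countermeasure described in the text.

    Everyone at a meeting drops their SIM card into a bag; cards are then
    drawn back out in random order. The mapping from card (the signal an
    analyst actually tracks) to person is re-randomized, while every card
    stays in circulation.
    """
    rng = rng or random.Random()
    people = list(holders)                    # who attended the meeting
    cards = [holders[p] for p in people]      # cards going into the bag
    rng.shuffle(cards)                        # the bag is shaken
    return dict(zip(people, cards))           # each person draws one card

before = {"A": "sim-1", "B": "sim-2", "C": "sim-3", "D": "sim-4"}
after = shuffle_sims(before, rng=random.Random(7))
```

Note the time-bound quality the text flags: the set of cards is unchanged, only the assignment moves, so an analyst who accumulates enough post-shuffle calling patterns can eventually re-link cards to people — which is why the shuffling must be repeated irregularly.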

1.11 Tor relays: requests on behalf of others that conceal personal traffic

Tor is a system designed to facilitate anonymous use of the Internet through a combination of encryption and passing the message through many different independent “nodes.” In a hybrid strategy of obfuscation, Tor can be used in combination with other, more powerful mechanisms for concealing data. Such a strategy achieves obfuscation partially through the mixing and interleaving of genuine (encrypted) activity. Imagine a message passed surreptitiously through a huge crowd to you. The message is a question without any identifying information; as far as you know, it was written by the last person to hold it, the person who handed it to you. The reply you write and pass back vanishes into the crowd, following an unpredictable path. Somewhere in that crowd, the writer receives his answer. Neither you nor anyone else knows exactly who the writer was.

If you request a Web page while working through Tor, your request will not come from your IP address; it will come from an “exit node” (analogous to the last person who hands the message to its addressee) on the Tor system, along with the requests of many other Tor users. Data enter the Tor system and pass into a labyrinth of relays—that is, computers on the Tor network (analogous to people in the crowd) that offer some of their bandwidth for the purpose of handling Tor traffic from others, agreeing to pass messages sight unseen. The more relays there are, the faster the system is as a whole. If you are already using Tor to protect your Internet traffic, you can turn your computer into a relay for the collective greater good. Both the Tor network and the obfuscation of individuals on the network improve as more people make use of the network.

Obfuscation, Tor’s designers point out, augments its considerable protective power. In return for running a Tor relay, “you do get better anonymity against some attacks. The simplest example is an attacker who owns a small number of Tor relays. He will see a connection from you, but he won’t be able to know whether the connection originated at your computer or was relayed from somebody else.”36 If someone has agents in the crowd—that is, if someone is running Tor relays for surveillance purposes—the agents can’t read a message they pass, but they can notice who passed it to them. If you are on Tor and not running a relay, they know that you wrote the message you gave to them. But if you are letting your computer operate as a relay, the message may be yours or may be just one among many that you are passing on for other people. Did that message start with you, or not? The information is now ambiguous, and messages you have written are safe in a flock of other messages you pass along. This is, in short, a significantly more sophisticated and efficient way to render particular data transactions ambiguous and to thwart traffic analysis by making use of the volume of the traffic. It doesn’t merely mix genuine signals (as shaking up SIM cards in a bag does, with all the consequent problems of coordination); it gets each message to its destination. However, each message can serve to make the sources of other messages uncertain.
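The anonymity benefit quoted above can be made concrete with a toy calculation. Under the deliberately naive assumption that an observer at the next hop treats every message leaving your node as equally likely to be yours (real traffic analysis is far more sophisticated), the chance that any given message originated with you falls as your node relays more traffic for others:

```python
def origin_probability(own_messages, relayed_messages):
    """Toy model of the anonymity gained by running a relay.

    An adversarial next hop sees a stream of messages arriving from your
    node but cannot read them or tell which ones you wrote. Assuming a
    uniform prior over the messages, the probability that a given message
    originated at your node is own / (own + relayed).
    """
    total = own_messages + relayed_messages
    if total == 0:
        return 0.0
    return own_messages / total

# Not running a relay: every message leaving you is certainly yours.
p_no_relay = origin_probability(own_messages=5, relayed_messages=0)

# Running a relay that also forwards 95 messages for other users:
p_relay = origin_probability(own_messages=5, relayed_messages=95)
```

This is the flock effect in miniature: the observer’s certainty drops from 1.0 to 0.05, not because any signal is false, but because genuine messages from others are interleaved with yours.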
