by Finn Brunton
1.12 Babble tapes: hiding speech in speech
An old cliché about mobsters under threat from the FBI involved a lot of
talking in bathrooms: the splash and hiss of water and the hum of the ventilation fan, so the story went, made conversations hard to hear if the house
was bugged or if someone in the room was wearing a wire. There are now
refined (and much more effective) techniques for defeating audio surveillance that draw more directly on obfuscation. One of these is the use of so-called babble tapes.37 Paradoxically, babble tapes have been used less by mobsters than by attorneys concerned that eavesdropping may violate attorney-client privilege.
A babble tape is a digital file meant to be played in the background during conversations. The file is complex. Forty voice tracks run simultaneously
(thirty-two in English, eight in other languages), and each track is compressed in frequency and time to produce additional “voices” that fill the entire frequency spectrum. There are also various non-human mechanical noises, and
a periodic supersonic burst (inaudible to adult listeners) engineered specifically to interfere with the automatic gain-control system by which an eavesdropping device configures itself to best pick up an audio signal. Most pertinent for present purposes, the voices on a babble tape used by an attorney include
those of the client and the attorney themselves. The dense mélange of voices increases the difficulty of discerning any single voice.
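The numbers given above suggest why the mixture works. The Python sketch below is a toy, with sine tones standing in for recorded voices and an arbitrary 440 Hz "conversation" as the target to be masked; only the track count of forty and the rough telephone voice band are taken from the description above. It mixes forty synthetic tracks and shows that the target accounts for only a small fraction of the energy an eavesdropper would record:

```python
import math
import random

SAMPLE_RATE = 8000  # Hz; telephone-quality audio for illustration
DURATION = 0.5      # seconds

def synth_voice(freq, n_samples, rate=SAMPLE_RATE):
    """A crude stand-in for one 'voice': a sine tone at a speech-band frequency."""
    return [math.sin(2 * math.pi * freq * t / rate) for t in range(n_samples)]

def babble(n_tracks, n_samples, seed=0):
    """Mix n_tracks synthetic voices spread across the speech spectrum."""
    rng = random.Random(seed)
    mix = [0.0] * n_samples
    for _ in range(n_tracks):
        freq = rng.uniform(300, 3400)  # rough telephone voice band
        track = synth_voice(freq, n_samples)
        mix = [m + s for m, s in zip(mix, track)]
    return mix

n = int(SAMPLE_RATE * DURATION)
target = synth_voice(440, n)   # the conversation to be masked
noise = babble(40, n)          # forty overlapping "voices"
recording = [t + b for t, b in zip(target, noise)]

# Power of the masking babble versus the single target voice: the target is
# a small fraction of the total energy the eavesdropper picks up.
p_target = sum(x * x for x in target) / n
p_babble = sum(x * x for x in noise) / n
print(p_babble / p_target)
```

Real babble tapes add the further refinements described above (time and frequency compression, mechanical noise, the supersonic bursts); the sketch captures only the core move of drowning one voice in many.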
1.13 Operation Vula: obfuscation in the struggle against apartheid
We close this chapter with a detailed narrative example of obfuscation
employed in a complex context by a group seeking to get Nelson Mandela
released from prison in South Africa during the struggle against apartheid.
Called Operation Vula (short for Vul’indlela, meaning Opening the Road), it was devised by leaders of the African National Congress within South Africa who were in contact with Mandela and were coordinating their efforts with those of ANC agents, sympathizers, and generals around the world.
The last project of this scale that the ANC had conducted had resulted in
the catastrophe of the early 1960s in which Mandela and virtually all of the ANC’s top leaders had been arrested and the Liliesleaf Farm documents had
been captured and had been used against them in court. This meant that Operation Vula had to be run with absolutely airtight security and privacy practices.
Indeed, when the full scope of the operation was revealed in the 1990s, it came
as a surprise not just to the South African government and to international intelligence services but also to many prominent leadership figures within the ANC. People purportedly receiving kidney transplants or recovering from
motorcycle accidents had actually gone deep underground with new identities and then had returned to South Africa, “opening the road” for Mandela’s
release. Given the surveillance inside and outside South Africa, the possible compromise of pre-existing ANC communications channels, and the interest
of spies and law-enforcement groups around the world, Operation Vula had to have secure ways of sharing and coordinating information.
The extraordinary tale of Operation Vula has been told by one of its chief architects, Tim Jenkin, in the pages of the ANC’s journal Mayibuye.38 It represents a superb example of operations security, tradecraft, and managing a secure network.
Understanding when and how obfuscation came to be employed in Oper-
ation Vula requires understanding some of the challenges its architects faced.
Using fixed phone lines within South Africa, each linked to an address and a name, wasn’t an option. The slightest compromise might lead to wiretaps and to what we would now call metadata analysis, and thus a picture of the activist network could be put together from domestic and overseas phone logs. The
Vula agents had various coding systems, each of them hampered by the difficulty and tedium of doing the coding by hand. There was always the temptation to fall back on “speaking in whispers over phones again,” especially when
crises happened and things began moving fast. The operation had to be seamlessly coordinated between South Africa (primarily Durban and Johannesburg) and Lusaka, London, Amsterdam, and other locations around the world as
agents circulated. Postal service was slow and vulnerable, encrypting was
enormously time consuming and often prone to sloppiness, use of home
phones was forbidden, and coordinating between multiple time zones around
the world seemed impossible.
Jenkin was aware of the possibilities of using personal computers to
make encryption faster and more efficient. Based in London after his escape from Pretoria Central Prison, he spent the mid 1980s working on the communications system needed for Operation Vula, which ultimately evolved into a remarkable network. Encryption happened on a personal computer, and the
ciphered message was then expressed as a rapid series of tones recorded
onto a portable cassette player. An agent would go to a public pay phone and
dial a London number, which would be picked up by an answering machine that Jenkin had modified to record for up to five minutes. The agent would play the cassette into the mouthpiece of the phone. The tones, recorded on the cassette’s other side, could be played through an acoustic modem into the
computer and then decrypted. (There was also an “outgoing” answering
machine. Remote agents could call from a pay phone, record the tones for
their messages, and decrypt them anywhere they had access to a computer
that could run the ciphering systems Jenkin had devised.)
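The tone channel Jenkin describes can be sketched in miniature. The Python below is an illustrative assumption, not his actual system: it stands in for the cipher with an already-encrypted byte string, assigns sixteen made-up frequencies (600 to 2,100 Hz) to the sixteen nibble values, and recovers them with the Goertzel algorithm, a standard way to measure signal power at a single tone frequency:

```python
import math

RATE = 8000          # samples per second
SYMBOL = 0.05        # seconds per tone
# Sixteen illustrative frequencies, one per nibble value (hypothetical
# numbers, not the actual modem scheme used by Operation Vula).
FREQS = [600 + 100 * i for i in range(16)]

def encode(data: bytes):
    """Turn each byte into two tones (high nibble, then low nibble)."""
    samples = []
    for byte in data:
        for nibble in (byte >> 4, byte & 0x0F):
            f = FREQS[nibble]
            n = int(RATE * SYMBOL)
            samples.extend(math.sin(2 * math.pi * f * t / RATE) for t in range(n))
    return samples

def goertzel_power(chunk, freq):
    """Signal power at one candidate frequency (Goertzel algorithm)."""
    k = 2 * math.cos(2 * math.pi * freq / RATE)
    s_prev = s_prev2 = 0.0
    for x in chunk:
        s = x + k * s_prev - s_prev2
        s_prev2, s_prev = s_prev, s
    return s_prev2 ** 2 + s_prev ** 2 - k * s_prev * s_prev2

def decode(samples):
    """Recover bytes by picking the strongest candidate tone per symbol."""
    n = int(RATE * SYMBOL)
    nibbles = []
    for i in range(0, len(samples), n):
        chunk = samples[i:i + n]
        nibbles.append(max(range(16), key=lambda j: goertzel_power(chunk, FREQS[j])))
    return bytes((nibbles[i] << 4) | nibbles[i + 1] for i in range(0, len(nibbles), 2))

ciphertext = b"VULA"   # stands in for an already-encrypted message
audio = encode(ciphertext)
assert decode(audio) == ciphertext
```

The real system also needed the error-handling codes mentioned below, since tones played from a noisy phone booth over an international line arrive far less cleanly than these synthetic samples.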
This was already an enormously impressive network—not least because
large parts of its digital side (including a way of implementing error-handling codes to deal with the noise of playing back messages over international
phone lines from noisy booths) had to be invented from scratch. However, as Operation Vula continued to grow and the network of operatives to expand, the sheer quantity of traffic threatened to overwhelm the network. Operatives
were preparing South Africa for action, and that work didn’t leave a lot of time for finding pay phones that accepted credit cards (the sound of coins dropping could interfere with the signal) and standing around with tape players. Jenkin and his collaborators would stay up late, changing tapes in the machines as the messages poured in. The time had come to switch to encrypted email,
but the whole system had been developed to avoid the use of known, owned
telephone lines within South Africa.
Operation Vula needed to be able to send encrypted messages to and
from computers in South Africa, in Lusaka, and in London without arousing
suspicion. During the 1980s, while the network we have described was taking shape, the larger milieu of international business was producing exactly the kind of background against which this subterfuge could hide itself. The question was, as Jenkin put it, “Did the enemy have the capacity to determine which of the thousands of messages leaving the country every day was a ‘suspicious’
one?” The activists needed a typical user of encrypted email—one without
clear political affiliation—to find out if their encrypted messages could escape notice in the overall tide of mail. They needed, Jenkin later recalled, to “find someone who would normally use a computer for communicating abroad and
get that person to handle the communications.”
They had an agent who could try this system out before they switched
their communications over to the new approach: a native South African who
was about to return to his homeland after working abroad for many years as a
programmer for British telecommunications companies. Their agent would behave just as a typical citizen sending a lot of email messages every day would, using a commercial email provider rather than a custom server and
relying on the fact that many businesses used encryption in their communications. “This was a most normal thing for a person in his position to do,” Jenkin recalled. The system worked: the agent’s messages blended in with the ordinary traffic, providing a platform for openly secret communications that could be expanded rapidly.
Posing as computer consultants, Tim Jenkin and Ronnie Press (another
important member of the ANC Technical Committee) were able to keep abreast of new devices and storage technologies, and to arrange for their purchase and delivery where they were needed. Using a combination of commercial
email providers and bulletin-board services run off personal and pocket computers, they were able to circulate messages within South Africa and around the world, and also to prepare fully formatted ANC literature for distribution.
(The system even carried messages from Mandela, smuggled out by his
lawyer in secret compartments in books and typed into the system.) The ordinary activity of ordinary users with bland business addresses became a
high-value informational channel, moving huge volumes of encrypted data
from London to Lusaka and then into South Africa and between Vula cells in that country. The success of this system was due in part to historical
circumstance—personal computers and email (including encrypted email)
had become common enough to avoid provoking suspicion, but not so common
as to inspire the construction of new, more comprehensive digital surveillance systems such as governments have today.
The Vula network, in its ultimate stage, wasn’t naive about the security of digital messages; it kept everything protected by a sophisticated encryption system full of inventive details, and it encouraged its users to change their encryption keys and to practice good operations security. Within that context, however, it offers an excellent example of the role obfuscation can play in building a secure and secret communications system. It illustrates the benefits of finding the right existing situation and blending into it, lost in the hubbub of ordinary commerce, hidden by the crowd.
2 OTHER EXAMPLES
2.1 Orb-weaving spiders: obfuscating animals
Some animals (and some plants too) have ways to conceal themselves or engage in visual trickery. Insects mimic the appearance of leaves or twigs, rabbits have countershading (white bellies) to eliminate the cues of shape that enable a hawk to see and strike easily, and spots on butterflies' wings mimic the eyes of predatory animals.
A quintessential obfuscator in the animal world is Cyclosa mulmeinensis, an orb-weaving spider.1 This spider faces a particular problem for which
obfuscation is a sound solution: its web must be somewhat exposed in order to catch prey, but that exposure makes the spider much more vulnerable to attack by wasps. The spider's solution is to make stand-ins for itself out of the remains of its prey, leaf litter, and spider silk, with (from the perspective of a wasp) the same size, color, and reflectivity as the spider itself, and to position these decoys around the web. This decreases the odds of a wasp strike hitting home and gives Cyclosa mulmeinensis time to scuttle out of harm's way.
2.2 False orders: using obfuscation to attack rival businesses
The obfuscation goal of making a channel noisier can be employed not only to conceal significant traffic, but also to raise the costs of organization through that channel—and so raise the cost of doing business. The taxi-replacement company Uber provides an example of this approach in practice.
The market for businesses that provide something akin to taxis and car
services is growing fast, and competition for both customers and drivers is fierce. Uber has offered bonuses to recruit drivers from competing services, and rewards merely for visiting the company’s headquarters. In New York,
Uber pursued a particularly aggressive strategy against its competitor Gett, using obfuscation to recruit Gett’s drivers.2 Over the course of a few days, several Uber employees would order rides from Gett, then would cancel those orders shortly before the Gett drivers arrived. This flood of fruitless orders kept the Gett drivers in motion, not earning fees, and unable to fulfill many legitimate requests. Shortly after receiving a fruitless order, or several of them, a Gett driver would receive a text message from Uber offering him
money to switch jobs. Real requests for rides were effectively obfuscated by Uber’s fake requests, which reduced the value of a job with Gett. (Lyft, a ride-
sharing company, has alleged that Uber has made similar obfuscation attacks on its drivers.)
2.3 French decoy radar emplacements: defeating radar detectors
Obfuscation plays a part in the French government’s strategy against radar detectors.3 These fairly common appliances warn drivers when police are
using speed-detecting radar nearby. Some radar detectors can indicate the
position of a radar gun relative to a user’s vehicle, and thus are even more effective in helping drivers to avoid speeding tickets.
In theory, tickets are a disincentive to excessively fast and dangerous
driving; in practice, they serve as a revenue source for local police departments and governments. For both reasons, police are highly motivated to
defeat radar detectors.
The option of regulating or even banning radar detectors is unrealistic in view of the fact that 6 million French drivers are estimated to own them.
Turning that many ordinary citizens into criminals seems impolitic. Without the power to stop surveillance of radar guns, the French government has taken to obfuscation to render such surveillance less useful in high-traffic zones by deploying arrays of devices that trigger radar detectors’ warning signals
without actually measuring speed. These devices mirror the chaff strategy in that the warning chirps multiply and multiply again. One of them may, indeed, indicate actual speed-detecting radar, but which one? The meaningful signal is drowned in a mass of other plausible signals. Either drivers risk getting
speeding tickets or they slow down in response to the deluge of radar pings.
And the civic goal is accomplished. No matter how one feels about traffic cops or speeding drivers, the case holds interest as a way obfuscation serves to promote an end not by destroying one’s adversaries’ devices outright but by rendering them functionally irrelevant.
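A back-of-the-envelope simulation makes the drivers' predicament concrete. In the Python sketch below, the ratio of nineteen decoys per real radar is an invented figure for illustration; a detector reports indistinguishable pings, and a driver who gambles on having spotted the real radar guesses correctly only about one time in twenty:

```python
import random

def detector_pings(n_decoys, seed=None):
    """One real speed-radar ping hidden among n_decoys decoy emitters,
    all indistinguishable to the driver's radar detector."""
    rng = random.Random(seed)
    pings = ["decoy"] * n_decoys + ["radar"]
    rng.shuffle(pings)
    return pings

def guess_success_rate(n_decoys, trials=10_000, seed=0):
    """How often a driver who bets on one ping being the real radar is right."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(trials):
        pings = detector_pings(n_decoys, seed=rng.random())
        guess = rng.randrange(len(pings))
        hits += pings[guess] == "radar"
    return hits / trials

# With 19 decoys per real radar, picking the meaningful signal succeeds
# about 5 percent of the time; the rational response is to slow down.
print(guess_success_rate(19))
```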
2.4 AdNauseam: clicking all the ads
In a strategy resembling that of the French radar-gun decoys, AdNauseam, a browser plug-in, resists online surveillance for purposes of behavioral advertising by clicking all the banner ads on all the Web pages visited by its users.
In conjunction with Ad Block Plus, AdNauseam functions in the background,
quietly clicking all blocked ads while recording, for the user’s interest, details about ads that have been served and blocked.
The idea for AdNauseam emerged out of a sense of helplessness: it isn’t possible to stop ubiquitous tracking by ad networks, or to comprehend the
intricate institutional and technical complexities constituting its socio-technical backend. These include Web cookies and beacons, browser fingerprinting
(which uses combinations and configurations of the visitor’s technology to identify their activities), ad networks, and analytics companies. Efforts to find some middle ground through a Do Not Track technical standard have been
frustrated by powerful actors in the political economy of targeted advertising.
In this climate of no compromise, AdNauseam was born. Its design was
inspired by a slender insight into the prevailing business model, which charges prospective advertisers a premium for delivering viewers with proven interest in their products. What more telling evidence is there of interest than clicks on particular ads? Clicks also sometimes constitute the basis of payment to an ad network and to the ad-hosting website. Clicks on ads, in combination with
other data streams, build up the profiles of tracked users. Like the French radar decoy systems, AdNauseam isn’t aiming to destroy the ability to track clicks; instead it functions by diminishing the value of those clicks by obfuscating the real clicks with clicks that it generates automatically.
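The effect on a tracking profile can be sketched. The Python below is a simplified model, not AdNauseam's actual code: the ad categories and the single genuine interest ("travel") are invented, and "clicking" is reduced to appending to a list. It shows how an automated click on every served ad flattens the profile a tracker can infer:

```python
import random
from collections import Counter

ADS = ["cars", "shoes", "travel", "finance", "games"]

def tracked_profile(clicks):
    """What an ad network infers: relative click share per category."""
    counts = Counter(clicks)
    total = sum(counts.values())
    return {cat: counts[cat] / total for cat in ADS}

# A user genuinely interested only in "travel" clicks that ad; the
# obfuscating plug-in silently clicks every ad it blocks.
rng = random.Random(0)
served = [rng.choice(ADS) for _ in range(1000)]
real_clicks = [ad for ad in served if ad == "travel"]
obfuscated_clicks = served  # every served ad gets an automated click

print(tracked_profile(real_clicks))        # all weight on "travel"
print(tracked_profile(obfuscated_clicks))  # roughly uniform: the signal is gone
```

The genuine clicks are still present in the obfuscated record; they are simply worth much less, because nothing distinguishes them from the automated ones.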
2.5 Quote stuffing: confusing algorithmic trading strategies
The term “quote stuffing” has been applied to bursts of anomalous activity on stock exchanges that appear to be misleading trading data generated to gain advantage over competitors on the exchange. In the rarefied field of high-frequency trading (HFT), algorithms perform large volumes of trades far faster than humans could, taking advantage of minute spans of time and differences in price that wouldn’t draw the attention of human traders. Timing has always been critical to trading, but in HFT thousandths of a second separate profit and loss, and complex strategies have emerged to accelerate one’s trades and retard those of one’s competitors. Analysts of market behavior began to notice unusual patterns of HFT activity during the summer of 2010: bursts of quote requests for a particular stock, sometimes thousands of them in a
second. Such activity seemed to have no economic rationale, but one of the most interesting and plausible theories is that these bursts are an obfuscation tactic. One observer explains the phenomenon this way: “If you could generate a large number of quotes that your competitors have to process, but you
can ignore since you generated them, you gain valuable processing time.”4
Unimportant information, in the form of quotes, is used to crowd the field of salient activity so that the generators of the unimportant information can accurately assess what is happening while making it more difficult and time consuming for their competitors to do so. They create a cloud that only they can see through. None of the patterns in that information would fool or even distract an analyst over a longer period of time—it would be obvious that they were artificial and insignificant. But in the sub-split-second world of HFT, the time it takes merely to observe and process activity makes all the difference.
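That asymmetry, quotes the stuffer can discard for free while rivals must evaluate every one, can be sketched directly. In the Python below, the quote format and the burst size of 5,000 are invented for illustration:

```python
def process_feed(quotes, own_ids=frozenset()):
    """Return (quotes actually evaluated, work units spent).
    Evaluating a quote costs 1 unit; recognizing and skipping your own
    quotes is effectively free."""
    evaluated, work = [], 0
    for qid, symbol, price in quotes:
        if qid in own_ids:
            continue            # the stuffer discards its own noise for free
        work += 1               # everyone else must evaluate it
        evaluated.append((qid, symbol, price))
    return evaluated, work

# One quote that matters, buried in a burst of 5,000 stuffed quotes
# for the same stock.
stuffed_ids = frozenset(range(5000))
burst = [(qid, "ACME", 10.00) for qid in stuffed_ids]
burst.append((99999, "ACME", 10.05))   # the quote that actually matters

_, stuffer_work = process_feed(burst, own_ids=stuffed_ids)
_, rival_work = process_feed(burst)
print(stuffer_work, rival_work)   # 1 vs 5001
```

In the sub-split-second regime described above, that factor of several thousand in processing burden is exactly the "valuable processing time" the quoted observer describes.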