* * *
The Domicilio apartment complex, a set of six-story high-rise buildings surrounding an internal courtyard with a swimming pool, sits adjacent to a local baseball field and soccer pitch. The Santa Clara Caltrain station, where thousands of local commuters board northbound trains daily for Palo Alto, San Mateo, and San Francisco, is just a few blocks away. The Hacker had told the apartment complex’s manager that he was Steven Travis Brawner, software engineer: a profile that fit right in with many other tenants in the area.
The Hacker began breathing more heavily. He may have thought about heading towards the train station, which would take him out of town, or perhaps towards the San Jose International Airport, just three miles away. The Hacker couldn’t be sure if there were cops following him, or if he was just being paranoid. But as soon as he saw the marked Santa Clara Police Department cars, he knew the truth, and he started running.
But the Hacker didn’t get far. He was quickly surrounded, arrested, and searched. The police found the key to the Hacker’s apartment. Once they had a warrant to search his apartment, they began to tear it apart. Inside, authorities found a folding chair and a folding table that served as a desk. There was no other furniture; his bed was a cot. Law enforcement also found his Verizon Wireless mobile Internet AirCard, and false driver’s licenses with the names “Steven Travis Brawner,” “Patrick Stout,” and more.
A 2010 FBI press release later stated that the agency also “seized a laptop and multiple hard drives, $116,340 in cash, over $208,000 in gold coins, approximately $10,000 in silver coins, false identification documents, false identification manufacturing equipment, and surveillance equipment.”
Investigators identified the Hacker, via his fingerprints, as Daniel Rigmaiden, previously convicted of state-level misdemeanors.
According to an Internal Revenue Service (IRS) special agent’s search warrant, Rigmaiden’s computer also included “email regarding leaving the United States for the country of Dominica…[and] documents regarding obtaining citizenship in other countries; emails regarding paying off Dominican officials to get Dominican birth certificates and passports; and a Belize residency guide.”
Rigmaiden’s case dates back several years. In 2007 and early 2008, the IRS identified a bank account at Compass Bank in Phoenix that was receiving fraudulent tax refunds under the name Carter Tax & Accounting, LLC. Authorities identified Carter as a participant in the suspected scheme. In early 2008, undercover operatives identified another man they dubbed the Hacker, as well as another as-yet-unnamed co-conspirator, both of whom ranked above Carter. Those law enforcement agents, with the assistance of a bank, then opened an account for the Hacker, who unknowingly deposited some fraudulently obtained tax refunds electronically into it. That way, investigators could monitor his banking activities more easily.
In April 2008, the second co-conspirator was arrested in Utah; that case remains under seal. This suspect and the Hacker were deemed to be above Carter in the tax-fraud ring. From April to August 2008, federal investigators tracked the Hacker via his Arizona bank account and via packages sent to FedEx Kinko’s locations across Northern California. Rigmaiden’s indictment was initially sealed, pending cooperation with a federal investigation. But from the start, Rigmaiden declined to cooperate and moved to represent himself (after firing three attorneys), and the case was subsequently unsealed.
“Representing myself was the only way I was able to force the case to proceed,” Rigmaiden later recalled.
As the case moved through the legal system over the following years, it became increasingly clear how the FBI had been able to locate Rigmaiden and pinpoint his exact physical location. In a February 2011 afternoon court hearing, during a back-and-forth between Rigmaiden and Frederick A. Battista, an assistant US attorney, the suspect told the judge:
One key point that Mr. Battista leaves out is that they’ve actually already identified what the device [to determine my location] is. Postal Inspector Wilson identified it as a StingRay and he indicated FBI used the device to locate the air card [the mobile Internet device], and I suspect those so-called generic terms he’s talking about they redacted out of all these documents, I’m willing to bet that most of these terms actually read “StingRay.”
And I know from research that has been done by the defense is [sic] that the StingRay’s made by Harris Wireless Products Group and it’s a trademark term. There’s only one StingRay in existence, it’s not generic, and that’s the device.
So for them to sit there and say they didn’t know that, that’s hard to believe. And I don’t think that it’s really sensitive. Especially when I have pictures of a StingRay here, and other Harris products that they manufacture. There’s a StingRay, StingRay 2, the KingFish and the AmberJack. And these are pretty much the devices the government uses—the FBI used—to locate cell phones.
While StingRay is a trademark, as Rigmaiden told the judge, stingray has since become so ubiquitous in law enforcement and national security circles that it often serves as the catch-all generic term, like Kleenex or Xerox.
A stingray acts as a fake cell tower and forces cell phones and other mobile devices using a cell network (like Rigmaiden’s AirCard, which provided his laptop with Internet access) to communicate with it rather than with a bona fide mobile network. Stingrays are big boxes—roughly the size of a laser printer—like something out of a 1950s-era switchboard, with all kinds of knobs and dials and readouts. Stingrays can easily be hidden inside a police surveillance van or another nearby location. But like everything else in the tech world, they’re getting cheaper, smaller, and better all the time.
The Harris Corporation, a longstanding American military contractor, won’t say exactly how its stingrays work, or exactly who it’s selling to, but it’s safe to say that it’s selling to lots of federal agencies and, by extension, local law enforcement. A 2008 price list shows that its StingRays, KingFish, and related devices sell for tens to hundreds of thousands of dollars.
The company’s 2017 annual financial report filed with the Securities and Exchange Commission shows that in recent years Harris has increased its sales of surveillance equipment and related tactical radio systems. It works not only with the US military and law enforcement, but also with Canada, Australia, Poland, and Brazil, among other countries. The company earned more than $1.8 billion in profit from fiscal year 2013 through 2017.
* * *
All of our cell phones rely on a network of towers and antennas that relay our signal back to the network and then connect us to the person that we’re communicating with. As we move across a city, mobile networks seamlessly hand off our call from one tower to the next, usually providing an uninterrupted call. But in order for the system to work, the mobile phone provider needs to know where the phone actually is so that it can direct a signal to it.
It does so by sending a short message to the phone nearly constantly; in industry terminology this is known as a ping. The message is basically asking the phone: “Are you there?” And your phone responds: “Yes, I’m here.” (Think of it as roughly the mobile phone version of the children’s swimming pool game Marco Polo.) If your phone cannot receive a ping, it cannot receive service. The bottom line is, if your phone can receive service, then the mobile provider (and possibly the cops, too) knows where you are.
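To make the ping-and-response idea concrete, here is a minimal Python sketch of the exchange. It is purely illustrative: the class names, the example IMSI, and the tower label are invented, and real paging involves location areas, paging channels, and timers that are not modeled here.

# Toy model of the "ping" described above: the network periodically asks a
# registered phone "are you there?", and whichever tower relays the answer
# tells the network roughly where that phone is.

class Phone:
    def __init__(self, imsi, nearest_tower):
        self.imsi = imsi
        self.nearest_tower = nearest_tower   # changes as the owner moves around

    def respond_to_ping(self):
        # "Yes, I'm here," answered through whatever tower is currently strongest
        return {"imsi": self.imsi, "tower": self.nearest_tower}

class Network:
    def __init__(self):
        self.last_seen = {}   # IMSI -> tower that relayed the most recent reply

    def ping(self, phone):
        reply = phone.respond_to_ping()
        self.last_seen[reply["imsi"]] = reply["tower"]

network = Network()
phone = Phone(imsi="310410123456789", nearest_tower="tower_17")
network.ping(phone)
print(network.last_seen)   # {'310410123456789': 'tower_17'}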
In short, our mobile phones can really easily be turned into tracking devices. Today, stingrays cost thousands of dollars, and aren’t typically deployed everywhere as a matter of course. But it wouldn’t be hard to do so, particularly as the technology gets more efficient.
This tracking technology is even more invasive than law enforcement presenting a d-order to a mobile phone provider. Rather than have the government provide a court order for a company to hand over data, the stingray simply eliminates the middleman. The government, armed with its own stingray, can simply pluck the phone’s location (and possibly the contents of calls, text messages, or any other unencrypted data being transmitted at the time, depending on the configuration) directly out of the air.
Stingrays come in two basic types: passive and active. A passive stingray fools all phones in a certain area into thinking it is the nearest or strongest cell tower. It simply waits for a phone to connect, at which point the phone will automatically transmit its unique International Mobile Subscriber Identity (IMSI) number. For this reason, stingrays are often also called IMSI catchers. The IMSI number is tied to a service account, and can be a strong identifier for a person. (Another number, known as the International Mobile Equipment Identity [IMEI] number, analogous to a unique serial number, identifies the particular mobile phone that a person is using, irrespective of the carrier or customer.) Once the IMSI has been collected, the stingray tells the phone to disconnect and use another tower. This entire process takes mere moments and nearly always happens without the phone’s owner realizing it.
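The passive workflow described above can be summarized in a short, purely conceptual Python sketch. None of this is working radio code; the function names and the placeholder IMSI are invented for illustration, and the over-the-air details are omitted entirely.

# Conceptual outline of a passive IMSI catcher: advertise as the strongest
# tower, wait for a phone to attach and volunteer its IMSI, log it, then
# push the phone back toward a legitimate tower.

captured_imsis = []

def advertise_as_strongest_tower():
    """Pretend to be the nearest/strongest cell tower (radio layer omitted)."""
    pass

def wait_for_attach():
    """Block until a nearby phone attaches; it transmits its IMSI as it does."""
    return {"imsi": "310410123456789"}   # placeholder for an over-the-air value

def release_to_real_network(attached_phone):
    """Tell the phone to disconnect and reselect a legitimate tower."""
    pass

advertise_as_strongest_tower()
attached_phone = wait_for_attach()
captured_imsis.append(attached_phone["imsi"])   # the identifier the operator wanted
release_to_real_network(attached_phone)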
The technique that most stingrays use was first patented back in 1993. IMSIs reside directly on a phone’s SIM card, the small digital chip found in GSM phones around the world. (In the United States, that’s all AT&T or T-Mobile users. Sprint or Verizon users also have IMSI numbers, but they work slightly differently.)
In normal use, when a mobile phone is turned on, the phone broadcasts its IMSI to the closest cell phone tower. Every phone will always default to the strongest (usually nearest) mobile phone tower, and stingrays take advantage of an inherent feature of the relationship between phones and towers, even bona fide ones: phones are designed to accept all commands issued by the tower they connect to. That also means a stingray can tell the phone not to communicate in an encrypted manner; no advanced software or hardware is required. A spoofed tower can simply tell the phone to transmit unencrypted data, and the phone will do so with no problem, and without alerting the user in any way. In fact, it’s extremely difficult to know if a stingray is being used. (In 2011, German mobile security researcher Karsten Nohl released Catcher Catcher, a piece of software that analyzes network traffic to determine if a stingray is in use.)
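As an illustration of that obedience, here is a small, hypothetical Python sketch of the phone’s side of the exchange. The class and field names are invented; in real GSM the relevant message is the cipher mode command, and A5/1 and A5/0 are actual cipher designations (A5/0 meaning no encryption at all).

# Sketch of the point above: the handset applies whatever cipher mode the
# tower commands, including "no encryption," without warning its owner.

from dataclasses import dataclass

@dataclass
class CipherModeCommand:
    algorithm: str   # e.g. "A5/1" (encrypted) or "A5/0" (no encryption)

class Handset:
    def __init__(self):
        self.cipher = None

    def on_cipher_mode_command(self, cmd: CipherModeCommand):
        # The phone simply complies; no user-visible warning is raised.
        self.cipher = cmd.algorithm

phone = Handset()
# A legitimate tower would normally request A5/1; a spoofed tower can ask
# for A5/0 and the phone will then transmit in the clear.
phone.on_cipher_mode_command(CipherModeCommand(algorithm="A5/0"))
print(phone.cipher)   # "A5/0": traffic now goes out unencrypted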
With a basic passive stingray, law enforcement doesn’t know that a given IMSI number is linked with a particular person. That link can be established later by compelling the mobile provider, with a subpoena or other legal procedure, to connect a given IMSI number with a subscriber.
But it’s a useful tool for establishing which IMSI numbers were at a particular location at a specific moment in time. Those IMSI numbers could then also be kept and scanned for again at a later date. For example, does this IMSI number show up at a train station, airport, international border, or another location where a positive identification document has to be presented with a physical person? In other words, can a given IMSI number be positively linked with a particular person, even without contacting the mobile carrier? Stingrays could easily filter out any non-target IMSI numbers—and only alert the stingray operator if a certain IMSI number is observed.
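The filtering idea is simple enough to sketch in a few lines of Python. This is only a notional example: the watch-listed IMSI and the observed values are made-up placeholders, not real subscriber identifiers.

# Log every IMSI the device sees, but only alert the operator when a
# watch-listed IMSI appears at this location.

WATCHLIST = {"310410123456789"}          # target IMSI(s) of interest
observed_log = []                        # incidental collection still happens

def on_imsi_observed(imsi: str) -> None:
    observed_log.append(imsi)
    if imsi in WATCHLIST:
        print(f"ALERT: target IMSI {imsi} observed at this location")

for imsi in ("310260987654321", "310410123456789", "311480555000111"):
    on_imsi_observed(imsi)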
While it does take time for a stingray to force a particular mobile phone to switch off of a real network and onto the fake one, there are methods that can speed up this process. One is frequently rotating the Location Area Code (LAC) that the stingray transmits. Ordinary towers broadcast an LAC to announce which part of the network they belong to, and when a phone detects a new LAC, it re-registers with the network so that calls can still reach it without interruption. In this case, each rotation gives the target phone another trap to fall into. Another technique, commonly used since 2008, is to block all 3G (third-generation) signals, which are much better encrypted than the earlier 2G protocol, by using radio frequency noise. (3G is designed to drop down to 2G when a 3G signal isn’t available. While that may be a good feature for increasing the reliability of mobile phone service, it’s also highly insecure.) This forces the target phone onto 2G, making it much easier to track.
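A conceptual Python sketch of those two speed-up tricks is below. It is illustrative only: no radio hardware is modeled, the LAC values are arbitrary examples, and the function names are invented.

# Rotate the advertised Location Area Code so nearby phones keep
# re-registering, and jam 3G so handsets fall back to weakly protected 2G.

import itertools
import time

LAC_POOL = [0x1001, 0x1002, 0x1003]      # arbitrary example LAC values

def jam_3g_band() -> None:
    """Raise the noise floor on 3G frequencies so phones fall back to 2G."""
    print("jamming 3G: handsets drop to 2G")

def broadcast_lac(lac: int) -> None:
    """Advertise a (new) LAC, prompting phones to perform a location update."""
    print(f"advertising LAC {lac:#06x}")

jam_3g_band()
for lac in itertools.islice(itertools.cycle(LAC_POOL), 6):
    broadcast_lac(lac)   # each change is another chance for the target to attach
    time.sleep(0.1)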
According to testimony in the Rigmaiden case by Assistant US Attorney Battista, authorities were able to locate their suspect through the Internet Protocol (IP) address that he was using to file the fraudulent tax returns. IP addresses are unique strings of numbers that identify any device on the Internet at any given moment—but they can be obscured. Under normal use, it can be fairly easy, in collaboration with the Internet service provider (ISP), to connect a given IP address to a user account. The FBI was able to determine that the IP address their suspect was using was connected to a Verizon Wireless account.
“They had identified one of the IP addresses I was using, or a couple I had been assigned by Verizon over a short period of time,” Rigmaiden said later.
“I was always using proxies, but at some point one of the proxies forwarded my real IP address, or my proxy software otherwise failed and my computer connected directly to the e-file website. I’m not really sure what happened. I was using thousands of proxies in an automated fashion over many months. But it wasn’t as simple as, ‘oh, let’s just look up his IP we have logged.’ Law enforcement sent out hundreds of subpoenas to ISPs for physical addresses tied to IP addresses I was using. The Verizon account was just one out of many they had identified. It took months for them to do more cross-referencing before making an educated guess that the Verizon account was tied to my actual IP address.”
When the FBI contacted Verizon, the company provided records (likely including the IMSI number connected to the historical IP addresses) showing that the suspect’s AirCard was transmitting through certain cell towers in a certain part of Santa Clara, California. Likely by using a stingray, the FBI was then able to determine that a device matching that IMSI number was transmitting “from an area the size of approximately three to four apartment units within an apartment complex,” according to Battista’s testimony from February 2011. (However, according to Rigmaiden, “this claim was debunked and the judge agreed with me. The government was forced to concede, but never factually admit, that the StingRay identified my exact apartment. With this concession, they also had to concede that the Stingray device had conducted a Fourth Amendment search and seizure.”) In essence, like in Kyllo, the stingray penetrated the four walls of his apartment and determined Rigmaiden’s precise location.
Battista continued:
So in a sense, what had happened was by using the false identity to obtain the apartment, in a sense unknowingly Mr. Rigmaiden, we believe, in a sense painted a bull’s eye on the door to that particular unit. Because we were looking for a person—the air card had been obtained using a false identity. The apartment had been obtained using a false identity. So it was—we were able then to, using these different classic investigative techniques, zero in on this one particular apartment.
But part of the problem is that stingrays aren’t targeted devices: they don’t go after just one device at a time. In the Rigmaiden case, while the FBI probably knew which IMSI number they were interested in, they were also capturing data (other IMSI numbers) from any other mobile device that happened to be in the area at the time. And that’s one of the biggest problems: like many other dragnet data-collection programs, stingrays incidentally collect data on a huge number of people just to go after one or a few targeted suspects.
While passive stingrays can simply track and log IMSI numbers, active stingrays are far more sophisticated.
“Some make requests to phones, route traffic through the device, break the [encrypted] A5/1 cipher stream, and can listen to the calls, and ones with text message [capabilities], there are ones that swap words out for messages being sent,” Eric King, then of Privacy International, an advocacy group based in London, told me in a 2015 interview.
King has been at the forefront of trying to understand where, when, and how stingrays are being used. He’s repeatedly gone to law enforcement and military technology industry conventions around the world (which are normally closed to outsiders), and has come away with documents showing the rapid and unchecked growth of these types of technologies. In December 2011, Privacy International, WikiLeaks, and several media organizations worldwide partnered to publish The Spy Files.
King added that he’s taken photographs of stingrays “that are not much bigger than a BlackBerry.” In 2013, Ars Technica reported on possible stingrays being deployed in the Moscow Metro system, and worse still, a body-worn stingray vest.
“The unit is optimized for short-range covert operation, designed to allow users to get close to Target(s) to maximize the chances of only catching the Target(s’) identities and minimal unwanted collateral,” boasts one of the marketing pamphlets from a company called Cobham. “The solution can be used as a standalone device or integrated into wider data-gathering and geo-tracking systems.”
In a dense urban area, stingrays are sometimes only effective within a radius of about 50 meters, whereas in a rural area they can be effective up to five kilometers, given that there is less physical interference due to the lack of buildings and other structures. Stingrays can also be reprogrammed to work for nearly any carrier in nearly any country.
As stingrays become cheaper and smaller, they will undoubtedly be used not just by large organizations like the FBI, but also by small-town police forces, or worse, organized crime. In 2010, Kristin Paget (then known as Chris Paget) demonstrated to an audience at the hacker conference DEF CON how she built a rudimentary but functional stingray for about $1,500 in parts. In that talk, she controlled (“pwned”) around 30 audience members’ phones with her homebrew device.