We can pass legislation to keep the government from gathering information about us with certain methods, but as this surveillance infrastructure spreads and technology (particularly image recognition) improves, law enforcement won’t need to use the most provocative, constitutionally questionable methods to get a credible picture of your present and future activities. When agents began putting GPS trackers on the undercarriage of cars to track the movement of suspects, they were sued and lost. Just a couple of years later the loss proved irrelevant. It turns out that tollbooth, streetlight, and security cameras all working together can track license plates across a city nearly as well as a GPS chip can broadcast a suspect’s location.
While our risk of falling victim to violent crime in an American city is diminishing, the risk of being caught in the gears of an ever more powerful law and order apparatus is growing. This is not a trade-off many are willing to make. It’s natural to dread a future in which our civilian power over law enforcement is diminished simply because we can’t see what they can, because we hold only fragments of information while they have complete recall of our digital trail. To be an uninformed populace is to be a disarmed one. When our local cops or our national security personnel are not only better armed but also exponentially more intelligent than we are, the chances of abuse of power increase and the challenge of reforming the system grows. That’s either a cause for concern or not, depending on your relationship with law enforcement.
These fears reflect reality, but not completely. In truth, some of us are much better informed than others. And the primary driver of the interconnected physical world is not government but garage entrepreneurs. The bigger threat to our privacy is not Big Brother; it’s us.
The Return of Gordon Jones
Remember Guardian Watch from chapter 1? It was an Internet of Things service that allowed anyone with a video phone to stream live footage of a disaster to law enforcement, first responders, and the public. Not long after developing Guardian Watch, creator Gordon Jones realized that for the service to really flourish, to save someone’s life in an emergency or, especially, to stanch a disaster affecting an entire city, it had to already be on phones, a lot of them. This presented the classic social start-up catch-22: the density problem. In order for Guardian Watch to become the next Foursquare of disaster response, it had to already be the Foursquare of disaster response; it had to have coverage, lots of users able to supply enough information and content to keep the app relevant.
Problem: the utility of Jones’s creation during an emergency is obvious to anyone who has seen the demo, but nobody joins a social network while literally running for her life. Jones realized that network growth would depend largely on people adopting the service for reasons other than disaster preparedness. He rebranded the service (at the time called 911 Observer) as an enhanced neighborhood-watch network system: immediate help in emergencies—both real and imagined.
His first customer was the Richmond County Sheriff’s Department in Augusta, Georgia. In effect, Guardian Watch allows the department to crowd-source some of the more difficult aspects of evidence gathering.
This idea is not without precedent. One of former Texas governor Rick Perry’s more creative legislative accomplishments was a program to digitally crowd-source border enforcement. The Texas Virtual Border Watch initiative enabled busybody constituents to monitor stretches of fence on the Texas-Mexico border from the comfort of their duct-taped La-Z-Boys, via live feed. The program was touted as a potential boon to taxpayers. The public was going to do for free what would otherwise cost millions in pay for extra border guards.
The program failed for reasons having nothing to do with privacy and everything to do with why border patrolling is a hard job even on a good day. Watching a fence all day is boring. The site shut down in 2009 when, after an initial spike, traffic plummeted.
Guardian Watch directs users’ attention to more interesting, curated content, including but not limited to evidence of crime. Members can post pictures and videos from their phones into files such as “assault,” “burglary,” “domestic abuse,” even “suspicious behavior.” Much of the content uploaded thus far is of dubious value to law enforcement. One picture, marked “sexual,” appears to show a nude couple enjoying coitus in a park . . . or a beached whale in a pasture . . . or a dinosaur. Many of the videos in the “suspicious behavior” file appear to show pant legs and shoes, all clearly shot by accident.
But these are the early days for the network, which Jones has marketed very selectively and which boasted thirty-nine hundred users as of summer 2012. If Guardian Watch can attract a following and funding, and can scale up to meet demand, if all the ducks and planets align themselves to favor him, his start-up or one like it could revolutionize not only emergency response but also law enforcement. To understand this potential, simply imagine a future in which geo-tagged pictures and video—images captured in the moment and digitally attached to a location, time, and person—take the place of unreliable witness testimony.
In addition to the clear privacy issues associated with this practice, there are questions of fecundity. The majority of content on social networking sites is personal and benign in nature, the daily annals of parenting and partying (sometimes both at once). A tweet or post about a suspicious person in your neighborhood is buried among a lot of other noise that is of no relevance to law enforcement. The same problem hobbles most crime surveillance programs in urban areas. The United Kingdom has been experimenting for years with a camera program very similar to Texas Virtual Border Watch, but staffed by professionals who are paid to watch footage. As with the Texas program, the vast majority of the footage is noise. The cost of sifting through it is high, though algorithms can automate some of the work. Guardian Watch represents a clear innovation in the way it enlists human beings to manually select the evidence that’s relevant to them.
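To make concrete what “automate some of the work” might look like, here is a minimal sketch in Python of a triage pass over crowd-sourced reports. The categories, weights, and report fields are invented for illustration and have nothing to do with Guardian Watch’s actual code; the point is only that a simple scoring rule can push the pant-leg videos to the bottom of the pile before a human ever looks.

```python
# Hypothetical triage of crowd-sourced reports: score each post by how
# actionable it looks, so humans review the likeliest evidence first.
from dataclasses import dataclass

# Rough, invented weights for how often each category yields usable evidence.
CATEGORY_WEIGHT = {"assault": 1.0, "burglary": 0.9, "domestic abuse": 0.9,
                   "suspicious behavior": 0.3, "sexual": 0.2}

@dataclass
class Report:
    category: str
    duration_seconds: int   # very short clips are often accidental
    has_location: bool      # geo-tagged posts are far more useful

def triage_score(report: Report) -> float:
    score = CATEGORY_WEIGHT.get(report.category, 0.1)
    if report.duration_seconds < 3:
        score *= 0.2        # likely a pocket video of pant legs and shoes
    if report.has_location:
        score += 0.5
    return score

reports = [
    Report("suspicious behavior", 2, False),
    Report("burglary", 45, True),
]
for r in sorted(reports, key=triage_score, reverse=True):
    print(round(triage_score(r), 2), r.category)
```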
Almost all the posted pictures pass through an intermediary before the content goes public. Jones wants to get rid of this step; a picture of a suspicious person in your neighborhood is really only valuable in real time. He also wants to further enhance the system with facial recognition capability, enabling it to tag people who show up in posted photos and videos automatically. That’s fine if you trust your neighbors. But a vigilante could use real-time video of, say, a lone teen wandering nearby to quickly assemble a posse . . . or a lynch mob.
I pointed this out to Jones in the context of the Trayvon Martin case. Do we really want the George Zimmermans of the world to be more capable than they are now? He argued that the Martin case is a perfect example of the necessity of his system. Had a neighborhood resident been able to use Guardian Watch, Trayvon’s father, Tracy, would have received a text or video about a suspicious person in his neighborhood in real time and seen that it was Trayvon. He could have sent out a text alert to the group before George Zimmerman drew his gun.
I asked Jones about the prospect for more subtle forms of misuse. What’s to stop someone from publicly tagging his neighbor as a domestic abuser or a terrorist? He answered that I have the power to post any number of unflattering things about my neighbors to any number of social networks and officially accuse anyone of domestic abuse with a five-minute phone call. But there’s something of a social cost to posting that sort of content on Facebook. No similar social cost exists for posting the same material on Guardian Watch. It’s the very purpose of the site.
The hope is that the citizen-policing system will self-police according to the same rules it uses to police others. Just as there exists a log of every Facebook tag and every domestic-disturbance call to the police, so every post on Guardian Watch creates data not just about the subject but also about the poster. Members who abuse the system lose influence.
At least that’s the hope.
Naturally, anyone with a smartphone can already stream videos of people in her neighborhood to a Google+ circle, publicly through the Google+ Hangouts service, or to a specified group on Facebook. Guardian Watch has just taken the extra step of “friending” law enforcement and like-minded people on the user’s behalf. Whether you see Jones’s little start-up as a great way to improve public safety without increasing police budgets or as a lot of white people taking pictures of nonwhite people to make them nervous, something like Guardian Watch would exist with or without Gordon Jones.
Here’s one of the more interesting examples: a Russian online newspaper called the Village created an app provocatively named Parking Douche. It allows anyone with a smartphone to publicly shame a bad parker to that parker’s neighbors, coworkers, and anyone else in the area. To use it, you just take a picture of a car that’s parked in a way that annoys you. The app then creates a banner ad with a picture of the car, its license plate, and the name of the street. When people nearby (as determined by IP address) attempt to access online news through the Village’s Web site, they see the ad. It gets in their way. “Thanks to the IP address, only douches in your area will be highlighted, so all the offenders will be exposed to their colleagues’ friends and neighbors,” says the narrator in the demo video.19
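For readers curious how the “nearby, as determined by IP address” step might work, here is a minimal sketch, assuming a lookup table that maps IP prefixes to approximate coordinates. The table, helper names, and radius are all invented; a real service would rely on a commercial geolocation database rather than a hand-built dictionary.

```python
# Hypothetical sketch of IP-based ad targeting, in the spirit of the
# Parking Douche demo: show the banner only to visitors near the offense.
import math

# Invented IP-prefix -> approximate coordinates table (stand-in for a GeoIP
# database).
IP_PREFIX_LOCATIONS = {
    "94.25.": (55.76, 37.62),   # central Moscow (approximate)
    "188.32.": (55.80, 37.70),
}

def locate(ip: str):
    for prefix, coords in IP_PREFIX_LOCATIONS.items():
        if ip.startswith(prefix):
            return coords
    return None

def km_between(a, b):
    # Rough spherical distance; adequate for "is this person nearby?"
    lat1, lon1, lat2, lon2 = map(math.radians, (*a, *b))
    d = math.acos(min(1.0, math.sin(lat1) * math.sin(lat2) +
                      math.cos(lat1) * math.cos(lat2) * math.cos(lon2 - lon1)))
    return 6371 * d

def should_show_banner(visitor_ip: str, offense_coords, radius_km=5.0) -> bool:
    visitor = locate(visitor_ip)
    return visitor is not None and km_between(visitor, offense_coords) <= radius_km

print(should_show_banner("94.25.12.34", (55.75, 37.61)))  # True: nearby
print(should_show_banner("8.8.8.8", (55.75, 37.61)))      # False: unknown or far away
```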
What can we do to protect our privacy in a world where its value is falling faster than that of last year’s cell phone? One creative if tongue-in-cheek proposal comes from British artist Mark Shepard, whose Sentient City Survival Kit includes such items as a CCD-Me-Not umbrella studded with 256 infrared light-emitting diodes (LEDs) to scramble the night vision of closed-circuit camera systems. My favorite item in the kit is the Under(a)ware, a set of undergarments that can detect RFID tags and vibrate to alert the wearer. “In the near future sentient shopping center, item level tagging and discrete data sniffing will become both pervasive corporate culture and a common criminal pastime,” states a computerized voice on the demo video.
Unless our legal system becomes more transparent, accountable, and accessible, we’ll never feel certain that the people looking out for us won’t abuse their power to persecute people who may technically be criminals but pose no real threat, such as pot smokers, prostitutes, and those who commit an act of trespass as part of a protest. How will we respond to this? Yes, we could put RFID tag readers in our underpants. Alternatively, we could decide to use surveillance and data to actually make the world safer and not abuse it. When you adopt the assumption that that’s possible, opportunities open up.
If the bad news is that the cops are going to have a better window into your career as a lawbreaker, the good news is that in the naked future you’re more than just a suspect on your way to your next crime; you’re a set of probabilities, potential costs, and potential benefits. The challenge for all of us now is to make the price of overzealous or discriminatory policing both high and conspicuous. The benefits of good policing must be more readily apparent as well. The social and public costs of pestering and prosecuting people for petty crimes should be visible to citizens, lawmakers, and police all at once. Before that happens we may have to settle for those costs becoming more transparent to law enforcement itself, so that at least some departments or agencies will use them as part of their decision making. The same sort of technology that took away your privacy is beginning to provide just that opportunity.
The Microsoft Windows for Predictive Policing
The year is 2002. The Palo Alto–based online payment outfit PayPal has a big problem: Russian mobsters are defrauding the company to the tune of $3 to $4 million a month. PayPal’s cofounder, Peter Thiel, and his coworker Joe Lonsdale realize they have to develop a system to better track the money moving through PayPal. Simply flagging individual transactions and users isn’t enough. The infiltrators adapt far too quickly, setting up new locations and user identities before the old fake profiles grow cold. Thiel and Lonsdale know that if they can better chart how the money gets into the system and how it leaves, where it’s spent and what it buys, then they can see whom it touches along the way.
They develop a new program to solve the problem. The Russian operation is broken. The fraud stops. It’s at this point that they realize they’ve invented something they can spin out. They secure funding from the Central Intelligence Agency’s investment arm, In-Q-Tel, and create Palantir Technologies, named for the seeing stones in The Lord of the Rings.
When I arrive at Palantir headquarters on a bright August day in 2012, I find halls filled with young programmers and graffiti-style murals. The only feature that distinguishes Palantir as a military and police service provider is the big-shouldered law enforcement types in the front lobby. I meet Courtney Bowman, a company spokesman and privacy expert. Bowman’s got deep statistical-modeling experience as well as a background in philosophy. In his work at Palantir the second credential is just as useful as the first.
Palantir today is a platform to coordinate and organize files for different law enforcement purposes and agencies. It’s like an operating system for classified or important information. The company itself doesn’t do any evidence collection, snooping, or investigating. It simply offers software solutions to connect, centralize, and especially visualize information that may be distributed across a wide range of databases and players.
If you’re a local law enforcement professional and you have a query about a suspect in a shooting, you can go to the Palantir interface on your desktop and find relevant records including arrests as well as suspect affiliations and even recent purchases. But the system doesn’t give everyone the same access, as different clearance levels can exist across departments and agencies. If, however, you want to know where a particular record came from and who last updated it, the system can tell you that. If you want to see how two individuals may be connected to a single incident, crime, or transaction, the system can draw a map between the points of evidence.
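Palantir does not publish its internals, so what follows is only a hedged sketch of the behavior described above, written in Python under the assumption that investigative data is stored as records that know their own provenance and list the people they mention. Every record, name, and function here is invented; the sketch shows how provenance lookups and a map between two individuals could work in principle, not how Palantir actually does it.

```python
# Hypothetical sketch of linked investigative records: every record keeps its
# provenance (source system, last editor), and connections between people are
# found by walking shared records. Not Palantir's actual data model or API.
from collections import deque

records = {
    "arrest-301":   {"people": {"A. Doe", "B. Roe"}, "source": "county RMS",
                     "last_updated_by": "det_ortiz"},
    "purchase-552": {"people": {"B. Roe"}, "source": "subpoenaed receipts",
                     "last_updated_by": "analyst_kim"},
}

def provenance(record_id: str) -> str:
    r = records[record_id]
    return f"{record_id}: from {r['source']}, last updated by {r['last_updated_by']}"

def connection_path(person_a: str, person_b: str):
    """Breadth-first search over records that mention each person."""
    queue, seen = deque([(person_a, [])]), {person_a}
    while queue:
        person, path = queue.popleft()
        if person == person_b:
            return path
        for rid, r in records.items():
            if person in r["people"]:
                for other in r["people"] - seen:
                    seen.add(other)
                    queue.append((other, path + [rid]))
    return None

print(provenance("arrest-301"))
print(connection_path("A. Doe", "B. Roe"))  # ['arrest-301']
```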
“What Palantir can do,” says Bowman, “is take those model outcomes or those hot-spot views, and, also, known information from criminal history records, from records management systems, from arrest records, and a multitude of other data sources that police legitimately have access to, and tie those all together into a picture of how the crimes, involving specific suspects or specific behavioral patterns, might play out.”
For instance, say you’re watching two suspects in a network. Person A is connected to person B through several affiliations. Person A makes a particular type of purchase, say, buying twelve rolls of toilet paper, before robbing a bank. The next day person B goes to a convenience store and buys twelve rolls of toilet paper. It’s reasonable to infer that he might be preparing to rob a bank. It’s not enough to make an arrest, but it does suggest an emerging pattern.
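A minimal sketch of that inference, assuming an analyst already has a list of affiliations and a feed of tagged purchases: the names, items, and thresholds are invented to illustrate the idea, not to describe Palantir’s software. The key point is that the flag fires only when the purchase matches a known precursor and the buyer is linked to the original suspect.

```python
# Hypothetical sketch of the "person B repeats person A's precursor purchase"
# inference: flag an associate who makes the same purchase that preceded a
# known crime. Suggestive only -- nowhere near probable cause.
affiliations = {("Person A", "Person B"), ("Person A", "Person C")}

def associated(x: str, y: str) -> bool:
    return (x, y) in affiliations or (y, x) in affiliations

# (person, item) purchases observed before a known incident, and afterward.
precursor_purchases = {("Person A", "12 rolls of toilet paper")}
new_purchases = [("Person B", "12 rolls of toilet paper"),
                 ("Person D", "12 rolls of toilet paper")]

def flag_emerging_patterns():
    alerts = []
    for buyer, item in new_purchases:
        for prior_buyer, prior_item in precursor_purchases:
            if item == prior_item and associated(buyer, prior_buyer):
                alerts.append(f"{buyer} repeated {prior_buyer}'s precursor purchase: {item}")
    return alerts

print(flag_emerging_patterns())  # only Person B is flagged; Person D has no link
```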
The practice of connection tracking, even when all that’s being observed is correlation, is extremely fruitful in intelligence. In 2003, after months of trying to get information on Saddam Hussein’s whereabouts from Hussein’s senior officers and inner circle, the U.S. military used a social network mapping tool called i2 to chart the connections between his chauffeurs. This led them eventually to the farmhouse in Tikrit where Hussein was captured.20
Tracing the social network of a dictator during war is rather less controversial than analyzing the connections of millions of Americans. Yet this is what the U.S. government under the Obama administration has begun to do. The obscure National Counterterrorism Center (NCTC) routinely keeps personal transaction information, flight information, and other types of data on Americans who have neither been convicted of a crime nor are under suspicion of one. It does so for as long as five years, on the vague rationale that the data may be useful in some sort of investigation one day, even if it isn’t relevant to any operation at the time of collection.
The subjects of this transaction surveillance are people who have found their way into the Terrorist Identities Datamart Environment (TIDE), an enormous database of known terrorists, suspected terrorists, and people who are loosely associated with suspected terrorists in some way (beekeepers, elementary school teachers, et cetera)—more than five hundred thousand links in all. The government has also given itself license to share the data across departments and even with other governments, despite the Privacy Act of 1974, which prohibits this sort of sharing.21
If the legal, technical, and public relations costs of expanding surveillance remain as low as they are now, it’s easy to imagine law enforcement considering a much broader array of connections and transactions worthy of monitoring.
But not all connection tracking in law enforcement is this creepy or controversial. For instance, say you’re an investigator, and a roughneck from a particular gang, let’s call him John Jet, is murdered. You know that the hit happened on a Friday night on disputed turf. Because this is gang related, you’re not just looking to solve the crime; you also need to make an inference about which members of the rival gang (the Sharks) have the highest probability of becoming victims of a retaliatory strike. Figuring this out may require sharing information across departments and even making inferences on the basis of correlations. But in this situation you can deploy an antigang unit to a particular location at a particular time and do so without worrying about trampling on anyone’s civil liberties.
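One way to picture that inference is a simple scoring pass over the rival gang’s roster. The signals, names, and weights below are entirely invented; the sketch only shows how correlations such as ties to the victim’s crew and time spent on the disputed turf could be combined into a deployment priority.

```python
# Hypothetical sketch of ranking rival-gang members by retaliation risk after
# a killing: combine invented signals into a single score and deploy the
# antigang unit around the highest-scoring people first.
sharks = {
    "Shark 1": {"ties_to_victims_crew": 3, "hours_on_disputed_turf": 10, "prior_incidents": 2},
    "Shark 2": {"ties_to_victims_crew": 0, "hours_on_disputed_turf": 1,  "prior_incidents": 0},
    "Shark 3": {"ties_to_victims_crew": 1, "hours_on_disputed_turf": 6,  "prior_incidents": 4},
}

def retaliation_risk(profile: dict) -> float:
    # Invented weights: ties to the victim's crew matter most.
    return (3.0 * profile["ties_to_victims_crew"]
            + 0.5 * profile["hours_on_disputed_turf"]
            + 1.0 * profile["prior_incidents"])

ranked = sorted(sharks, key=lambda name: retaliation_risk(sharks[name]), reverse=True)
for name in ranked:
    print(name, retaliation_risk(sharks[name]))
```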
While the NCTC may not need to consider itself accountable to the public, Palantir does. The public, in this sense, is sort of like a beta tester, playing a feedback role that is helping Palantir improve its system and make it more valuable. As Bowman explains, “The government will make claims about collection of suspicious activity reports as being critical because you never know when that information is going to be useful. The privacy advocate will come back and say, ‘Well, show me when this [piece of personal information on a subject] is actually useful. Give me hard metrics of why it’s justifiable to hold on to this information.’ If we can use the platform to demonstrate cases where this is useful, we can start to bridge the gap between these two communities and explain why this is valuable information.”