Habeas Data: Privacy vs. the Rise of Surveillance Tech


by Cyrus Farivar


  In April 2014, Vigilant commissioned a poll, which concluded, according to a press release, that the “majority of Californians agree that license plate reader (LPR) technology helps law enforcement solve crimes and any restrictions on who can photograph license plates would be unacceptable.”

  However, under scrutiny, the poll was revealed not to be particularly scientific. Among other flaws, it did not explain to respondents what an LPR was, or even confirm whether they had a basic familiarity with the technology.

  Some of the questions were oddly worded: “Do you agree or disagree that license plates reveal nothing about me. People who see my license plate cannot determine my name or where I live.”

  While 24 percent of respondents said they strongly agree, the question refers to “people who see my license plate,” not machines. The questions also do not make clear that LPR data, when gathered by a private firm like Vigilant, is then routinely handed to the cops, who can definitely determine your name and where you live.

  The following month, May 2014, Senator Hill withdrew the bill.

  Later that year, however, Vigilant Solutions encountered some small-time competition: a new piece of open-source software, known as OpenALPR, that allows anyone to roll their own LPR.
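
  OpenALPR ships a command-line tool and language bindings, so “rolling your own” plate reader can amount to a few lines of code. Below is a minimal sketch using the project’s Python bindings; the country code, config path, and image filename are illustrative placeholders, and the library plus its runtime data must already be installed.

    import sys
    from openalpr import Alpr  # OpenALPR's Python bindings

    # Paths below are placeholders; point them at your own installation.
    alpr = Alpr("us", "/etc/openalpr/openalpr.conf", "/usr/share/openalpr/runtime_data")
    if not alpr.is_loaded():
        sys.exit("Error loading OpenALPR")

    alpr.set_top_n(5)  # keep the five most likely readings per plate

    results = alpr.recognize_file("driveway_camera_frame.jpg")
    for plate in results["results"]:
        # Each detection carries a best-guess plate string and a confidence score.
        print(plate["plate"], round(plate["confidence"], 1))

    alpr.unload()  # release the native resources when finished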

  It seems like only a matter of time before this technology becomes so common that anyone can set up a camera at their home, or point it at their local police station.

  “I would love to have something to rein in the use of private LPRs, I don’t know how likely that is given our country, it seems that with our current administration and Congress, that would be unlikely,” Katz-Lacabe said.

  “The idea that Vigilant Solutions has this database of 3.2 billion records and there is no regulation about it, that just bothers me,” he continued. “It enables law enforcement to do something that they wouldn’t be able to do themselves. How can government make decisions on behalf of its citizens when they can be so readily bypassed?”

  * * *

  However, LPRs—despite the fact that most Americans wouldn’t know what one looks like even if it was staring them in the face—are practically old news. Even more advanced technologies, including body-worn cameras and large-scale aerial surveillance, take wanton data capture to an entirely new level. All of these technologies, under the Knotts precedent, are predicated on the legal theory that there is no reasonable expectation of privacy in public. These technologies are converging, and there seems to be little to stop their inevitable ubiquity.

  Body cameras are rapidly becoming the norm across America, in cities big and small. In May 2014, in Katz-Lacabe’s hometown, the San Leandro City Council formally approved the purchase of body-worn cameras for the San Leandro Police Department. They cost the city more than $440,000 over five years for the cameras and data storage. That wasn’t too much for the city to bear: the annual police budget for 2015–2016 was over $30.9 million.

  While it couldn’t have known it at the time, San Leandro was about to join hundreds, if not thousands of American law enforcement agencies that would quickly be pushed into acquiring cameras and putting them on their officers.

  San Leandro, a small working-class city (population 85,000) immediately southeast of Oakland, is one of the Bay Area cities that for decades had restrictive covenants—clauses in deeds or leases that restrict what owners can do with their property, often used to bar African-Americans from owning or leasing property in a given area. In 1960, the city was nearly entirely white, while Oakland, by comparison, had a large African-American population. By 2010, Asian-Americans comprised roughly one-third of San Leandro’s population.

  Less than three months after the San Leandro City Council approved the purchase of body cameras, in August 2014, Michael Brown, an unarmed black 18-year-old in Ferguson, Missouri, was shot and killed by a local cop. Like most law enforcement agencies nationwide at the time, the Ferguson Police Department lacked body cameras. On December 1, 2014, the White House announced a three-year, $263 million grant program for local law enforcement body cameras. Many agencies, like the San Leandro Police Department (SLPD), were already in the process of obtaining body cameras, but Ferguson helped accelerate deployment in many cities.

  “There’s been a lot of talk about body cameras as a silver bullet or a solution,” said President Barack Obama in March 2015. “I think the task force concluded that there is a role for technology to play in building additional trust and accountability, but it’s not a panacea. It has to be embedded in a broader change in culture and a legal framework that ensures that people’s privacy is respected and that not only police officers but the community themselves feel comfortable with how technologies are being used.”

  As body cameras have become more and more commonplace, it’s become clear that many police reform activists and lots of rank-and-file officers want cameras, albeit for different reasons. The activists want them because they don’t trust the police and want a way to keep tabs on them. Cops, by contrast, often feel that they are being wrongly accused by overzealous citizens who have been fed a steady diet of police brutality videos and may use lethal force to express that anger.

  Cities like body cameras because they can help exonerate officers accused of misconduct and protect against lawsuits, which can be costly to defend. After all, even quiet San Leandro has discreetly settled numerous claims in recent years rather than let a civil rights case go to trial.

  In 2013, the city paid out $80,000 to settle claims brought by the family of a man who was high on methamphetamine and ended up dying after a struggle with police. In an even stranger incident, San Leandro settled with two men for $45,000 after the pair was accused of engaging in lewd conduct at a local marina restroom. In 2007, San Leandro paid a $395,000 settlement to the family of Jose Maravilla Perez, Jr., who died at the hands of the SLPD after he was repeatedly shocked with a Taser.

  The list goes on with more recent and higher-profile examples from other cities: In 2015, Baltimore, Maryland, settled with the estate of Freddie Gray for $6.4 million. In 2017, St. Anthony, Minnesota, settled with the estate of Philando Castile for $3 million. Also in 2017, Ferguson, Missouri, settled with the estate of Michael Brown for $1 million. In short, cities nationwide are pushing for body cameras to make sure that both citizen and officer are on their best behavior.

  But as with LPRs, data-retention policies for body-worn cameras are all over the map. In Pueblo, Colorado, some camera footage is kept indefinitely. In Minnesota, footage of traffic stops is kept for a year. In Orlando, Florida, non-evidentiary body-camera videos are kept for 90 days, while in Oakland, California, those same videos are kept for two years.

  Law enforcement surveillance technology, like consumer technology, is getting faster, cheaper, and smaller all the time. Even now, a Silicon Valley startup, Visual Labs, is selling body-camera software that runs on existing Android phones—eliminating the need for another dedicated piece of hardware on an officer’s body.

  The police department of Dos Palos, a small central California town (population 5,000) in Merced County, is one of a handful of law enforcement agencies testing out this system, and it is paying considerably less than it would with one of Visual Labs’ larger competitors, like VIEVU or Axon (the company formerly known as Taser).

  * * *

  For now, facial recognition works a lot better in theory than it does in practice, but it’s rapidly improving. While the idea of scanning someone’s face to establish their identity has been floating around science fiction for years, it has now quietly become the norm.

  The technology works, in essence, as a more sophisticated version of LPRs: rather than analyzing letters and numbers, the software quickly measures various physical characteristics, ranging from the distance between the pupils to the shape of one’s jaw, among many others. Like LPR technology, facial recognition is most useful when there’s a “hot list” of faces to compare an unknown face against. To the government’s advantage, such lists already exist: the driver’s license photos held by every state’s Department of Motor Vehicles, and the Department of State’s passport photo database.
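
  Stripped to its essentials, that matching step is a nearest-neighbor search: reduce each face to a vector of numbers (hand-measured features like pupil distance, or a learned embedding) and compare the unknown face’s vector against every vector on the hot list. The Python sketch below illustrates the comparison with cosine similarity; the random vectors, hypothetical names, and the 0.75 threshold are stand-ins, and a real system would rely on a trained model to produce the features.

    import numpy as np

    def best_hot_list_match(probe, hot_list, threshold=0.75):
        """Return the hot-list identity most similar to the probe face,
        or None if nothing clears the (arbitrary) similarity threshold."""
        def cosine(a, b):
            return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

        best_name, best_sim = None, -1.0
        for name, vec in hot_list.items():
            sim = cosine(probe, vec)
            if sim > best_sim:
                best_name, best_sim = name, sim
        return (best_name, best_sim) if best_sim >= threshold else (None, best_sim)

    # Toy data: random vectors stand in for real face measurements or embeddings.
    rng = np.random.default_rng(0)
    hot_list = {"warrant_subject_1": rng.normal(size=128),
                "warrant_subject_2": rng.normal(size=128)}
    probe = hot_list["warrant_subject_2"] + rng.normal(scale=0.1, size=128)  # a noisy capture
    print(best_hot_list_match(probe, hot_list))

  The comparison itself is trivial; what makes the technology contentious is the quality of the features and the sheer size of the lists being searched.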

  However, facial recognition doesn’t work nearly as well when the person in the captured image is African-American, photographed at a distance, turned at an angle, or wearing a hoodie or a head covering of some type. Even the 2013 Boston bombing suspects were not caught via facial recognition—it is far more effective when the photo is taken under controlled circumstances (like a mug shot) and compared to a database with similar-quality photos. (Algorithms to mitigate these hurdles are rapidly improving, though, as 2015 research by Facebook shows.)

  In October 2016, Georgetown researchers released a massive report titled “The Perpetual Lineup,” which found that half of all adults in the United States are already in a facial-recognition database. The report also found that in many cases, steps are not being taken to create meaningful policies and oversight for use of the technology. To take one example, there’s nothing stopping law enforcement from capturing faces at large gatherings like sporting events (as happened at the Super Bowl as early as 2001) and political protests.

  Georgetown professor Alvaro Bedoya, top FBI official Kimberly Del Greco, and a number of other activists and government officials were invited to speak before a House committee hearing in March 2017. Bedoya and others warned that facial-recognition technology had potential racial bias built in: such systems, they found, often return more false positives for African-Americans than for other groups. The technology could also potentially be used to suppress political speech.

  “I think there’s been an aggressive development of technology come to the forefront,” Representative Stephen Lynch (D-Massachusetts) said during the hearing. “When you think about how this could change who we are as a nation, it’s very, very troubling. This nation was founded on protest and is continually shaped by protest. It disturbs me greatly whether it was the death of Freddie Gray and those protests, or the women’s protest recently that was all over the country, it disturbs me greatly that we’re out there taking in this information.”

  Lawmakers acknowledged that as a society we all want criminal suspects to be investigated, apprehended, and brought to justice as appropriate. However, the idea of pervasive image capture by street cops as a way to routinely and automatically identify individuals is just as invasive as demanding that everyone show their papers at a moment’s notice.

  For now, the most prominent uses of facial recognition are relatively low-level: a man who fled federal custody 25 years ago was arrested in Nevada in July 2017 after that state’s facial-recognition system found that he had been issued a driver’s license under a different name.

  In June 2017, a Jacksonville, Florida, man—who told undercover policemen that his name was “Midnight” during a $50 crack cocaine buy—was sentenced to eight years in prison after he was identified via facial recognition. An Indiana man suspected of child molestation in 1999 was arrested in Oregon in January 2017 thanks to a facial-recognition search. A year earlier, in January 2016, New York State authorities said they had arrested 100 people for identity fraud after an upgrade to the automated scanning of their Department of Motor Vehicles records. As with LPRs, it’s likely that the hit rate remains quite low.

  While facial recognition is currently in use by local and federal law enforcement, it is only a matter of time before this technology is fully integrated into body-worn cameras, making it yet another tool in an officer’s toolkit. In fact, both Axon executives and at least some police officers relish the notion.

  In July 2016, Lieutenant Dan Zehnder of the Las Vegas Police Department, who runs the agency’s body-camera program, imagined a near-future tool in which the camera of an officer patrolling the Las Vegas Strip routinely captures every face and sends the scans off-site for analysis.

  “And there is real-time analysis, and then in my earpiece there is, ‘Hey, that guy you just passed 20 feet ago has an outstanding warrant.’ Wow,” he told Bloomberg Businessweek.

  No court anywhere has addressed the question of whether facial recognition should be treated like a pair of binoculars (no court order or warrant required), or as an invasive law enforcement technique, analogous to a wiretap, which requires a super-warrant standard.

  * * *

  For now, LPRs, body-worn cameras, and facial-recognition technology are largely limited to terrestrial applications. They routinely capture particular types of data in a limited environment: the field of view of the camera. But what happens when that field of view is expanded to 10,000 feet in the air?

  Persistent Surveillance Systems (PSS) does exactly that. For years, the company has been operating a TiVo-in-the-sky setup: a plane equipped with specialized cameras designed to capture all movement down below over long periods of time. It’s designed to help law enforcement track suspect vehicles and people, placing them at particular crime scenes. (Harris Corporation, the maker of the stingray, which will be covered later in the book, makes a similar product, known as the CorvusEye.)

  The Dayton, Ohio, company is run by Ross McNutt, a former Air Force officer who wants domestic policing to be more like military intelligence. He’s been working on this technology for over a decade, back when it first began as a military research project at the Air Force Institute of Technology.

  “We have hundreds of politicians that say crime is their number one issue, and no it’s not,” McNutt told Ars Technica in July 2014. “If it was true, they wouldn’t stand for 36,000 crimes a year (per city), worth $1 billion a year. It shows up in lower housing prices and it shows in people not wanting to move there. If you could get rid of the crime stigma you would see house prices rise and businesses move there. I am frustrated that politicians don’t have the leadership to do it.”

  For years, PSS has attempted to get cities to sign on as permanent customers, but it’s run into roadblocks. In 2011, the company ran a trial in Compton, California, for nine days, but kept it quiet from local politicians, community members, and reporters. That trial was not revealed publicly until the Center for Investigative Reporting did a story on it in April 2014.

  “The system was kind of kept confidential from everybody in the public,” Los Angeles Sheriff’s Department sergeant Doug Iketani said. “A lot of people do have a problem with the eye in the sky, the Big Brother, so in order to mitigate any of those kinds of complaints, we basically kept it pretty hush-hush.”

  In 2013, even Dayton, Ohio—the hometown of PSS—declined to sign a deal with the company, after pushback from local activists and an expense the city could not afford.

  In the summer of 2016, Bloomberg Businessweek published a blockbuster story about how the Baltimore Police Department had hired PSS in early 2016 for a test that lasted several weeks. Funds came not from local taxpayers, but rather from Texas philanthropists who funneled money through a Baltimore police charity, where it was not subject to normal oversight rules. McNutt and his colleagues set up shop in a small office in a downtown parking garage, with a simple but vague sign outside: “Community Support Program.”

  After the Bloomberg Businessweek story came out in August 2016, McNutt made an appearance at a press conference with Baltimore police officials, defending the program.

  “We believe we contribute significantly to the safety and support of the citizens in Baltimore,” he said. “We do have the legal analysis that covers the program. We are no different than any other law enforcement program. There are four Supreme Court precedents.”

  But again, as with the previous cases discussed, including Knotts and Smith, there remains a huge gulf between the facts of those cases and the vast implications that they have created.

  McNutt was likely referring to the 1989 Supreme Court decision in Florida v. Riley, which found that no warrant is needed if police conduct aerial surveillance. However, that case only involved one helicopter flying over one person’s alleged marijuana grow
in a greenhouse that had some roofing panels removed. In that case, the police helicopter pilot could simply see what was inside with his own two eyes—he didn’t even use an infrared imaging device or anything more advanced.

  In June 2017, the Miami-Dade Police Department floated yet another PSS trial in South Florida—but after concerns raised by the American Civil Liberties Union, among others, the plan was scrapped within two weeks.

  * * *

  Just like Smith, which inadvertently paved the way for phone records to be indiscriminately captured on a much larger scale by the National Security Agency, the combination of Knotts and Riley is creating a legal theory that, for now, allows everything to be watched from the sky.

  Many of these technologies, including LPRs, body-worn cameras, wide-area aerial surveillance—not to mention others, like drones, cell-site simulators, and more—would have been covered as part of a proposed California law that Katz-Lacabe helped push.

  The bill, known as Senate Bill 21, aimed to bring the use of these technologies into the light of day. It declared:

  (a) While law enforcement agencies increasingly rely on surveillance technologies because those technologies may enhance community safety and aid in the investigation of crimes, those technologies are often used without any written rules or civilian oversight, and the ability of surveillance technology to enhance public safety should be balanced with reasonable safeguards for residents’ civil liberties and privacy.

  (b) Promoting a safer community through the use of surveillance technology while preserving the protection of civil liberties and privacy are not mutually exclusive goals, and policymakers should be empowered to make informed decisions about what kind of surveillance technologies should be used in their community.

 
