San Francisco made history in 2019 when its Board of Supervisors voted to ban city agencies including the police department from using face recognition. About two dozen other US cities have since followed suit. But on Tuesday San Francisco voters appeared to turn against the idea of restricting police technology, backing a ballot proposition that will make it easier for city police to deploy drones and other surveillance tools.
Proposition E passed with 60 percent of the vote and was backed by San Francisco Mayor London Breed. It gives the San Francisco Police Department new freedom to install public security cameras and deploy drones without oversight from the city’s Police Commission or Board of Supervisors. It also loosens the requirement that the SFPD obtain clearance from the Board of Supervisors before adopting new surveillance technology, allowing the department to deploy a new tool first and seek approval at any point within its first year of use.
Matt Cagle, a senior staff attorney with the ACLU of Northern California, says those changes leave the existing ban on face recognition in place but loosen other important protections. “We’re concerned that Proposition E will result in people in San Francisco being subject to unproven and dangerous technology,” he says. “This is a cynical attempt by powerful interests to exploit fears about crime and shift more power to the police.”
Mayor Breed and other backers positioned the measure as an answer to concerns about crime in San Francisco. Crime figures have broadly declined, but fentanyl has recently driven an increase in overdose deaths, and downtown commercial neighborhoods are still struggling with pandemic-driven office and retail vacancies. The proposition was also supported by groups associated with the tech industry, including the campaign group GrowSF, which did not respond to a request for comment.
“By supporting the work of our police officers, expanding our use of technology and getting officers out from behind their desks and onto our streets, we will continue in our mission to make San Francisco a safer city,” Mayor Breed said in a statement on the proposition passing. She noted that 2023 saw the lowest crime rates in a decade in the city—except for a pandemic blip in 2020—with rates of property crime and violent crime continuing to decline further in 2024.
Proposition E also gives police more freedom to pursue suspects in car chases and reduces paperwork obligations, including when officers resort to use of force.
Caitlin Seeley George, managing director and campaign director for Fight for the Future, a nonprofit that has long campaigned against the use of face recognition, calls the proposition “a blow to the hard-fought reforms that San Francisco has championed in recent years to rein in surveillance.”
“By expanding police use of surveillance technology, while simultaneously reducing oversight and transparency, it undermines people’s rights and will create scenarios where people are at greater risk of harm,” George says.
Although Cagle of the ACLU shares her concern that San Francisco residents will be less safe, he says the city should retain its reputation for having catalyzed a US-wide pushback against surveillance. San Francisco’s 2019 face recognition ban was followed by around two dozen other cities, many of which also added new oversight mechanisms for police surveillance.
“What San Francisco started by passing that ban and oversight legislation is so much bigger than the city,” Cagle says. “It normalized rejecting the idea that surveillance systems will be rolled out simply because they exist.”
The San Francisco mayor’s office hasn’t said which types of drones, surveillance cameras, or body-worn cameras police might use under the new rules. Anshel Sag, a principal analyst at Moor Insights & Strategy, a tech research firm, notes that almost all newer drones on the market have forms of face recognition technology built in. Some of Insta360’s action cameras include this, he says, as do drones made by DJI, the world’s largest commercial drone maker. “DJI’s cameras use it to track a person and stabilize the video capture,” he says.
In some cases, the customer may be able to toggle off tracking options. And, Sag adds, the video-capture technology may be coarser and not specifically track a face. But this isn’t always clear to users of the technology, he says, “because the object-tracking algorithms operate like a black box.”
Saira Hussain, a senior staff attorney for the Electronic Frontier Foundation, notes that San Francisco’s previous ban on face recognition allows the police department to possess devices with the technology built in if it’s a manufacturer-installed capability. (San Francisco’s Board of Supervisors had to update the law to make iPhones, which use face recognition technology to unlock, legal.) The law stipulates that such devices must not be acquired for the purpose of using the technology in policing functions.
Of greater concern to the EFF, Hussain says, is that Proposition E allows a degree of secrecy around surveillance technologies trialed by the SFPD, which can now go undisclosed for as long as a year. “It’s about making sure the police stick to the contours of the law.”
Additional reporting by Amanda Hoover
Source: Wired