San Francisco (CNN Business) San Francisco, long one of the most tech-friendly and tech-savvy cities in the world, is now the first in the United States to prohibit its government from using facial-recognition technology.
The ban is part of a broader anti-surveillance ordinance that the city's Board of Supervisors approved on Tuesday. The ordinance, which outlaws the use of facial-recognition technology by police and other government departments, could also spur other local governments to take similar action. Eight of the board's 11 supervisors voted in favor of it; one voted against it, and two who support it were absent.
Facial-recognition systems are increasingly used everywhere from police departments to rock concerts to homes, stores and schools. They are designed to identify specific people from live video feeds, recorded video footage or still photos, often by comparing their features with a set of faces (such as mugshots).
San Francisco's new rule, which is set to go into effect in a month, forbids the use of facial-recognition technology by the city's 53 departments, including the San Francisco Police Department, which doesn't currently use such technology but did test it between 2013 and 2017. However, the ordinance carves out an exception for federally controlled facilities at San Francisco International Airport and the Port of San Francisco. It doesn't prevent businesses or residents from using facial recognition or surveillance technology in general, such as on their own security cameras. Nor does it limit police from, say, using footage from a person's Nest camera to assist in a criminal case.
"We all support good policing but none of us want to live in a police state," San Francisco Supervisor Aaron Peskin, who introduced the bill earlier this year, told CNN Business ahead of the vote.
The ordinance adds more fuel to the fire around facial-recognition technology. Even as the technology grows in popularity, it has come under increasing scrutiny amid mounting concerns about how it is deployed, how accurate it is, and even where the faces used to train the systems come from.
In San Francisco, Peskin is concerned that the technology is "so fundamentally invasive" that it shouldn't be used.
"I think San Francisco has a responsibility to speak up on things that are affecting the entire globe, that are happening in our front yard," he said.
Facial recognition has improved dramatically in recent years due to the popularity of a powerful form of machine learning called deep learning. In a typical system, facial features are analyzed and then compared with labeled faces in a database.
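For readers curious what that comparison step looks like in practice, here is a minimal, hypothetical sketch in Python. It assumes a deep-learning model (not shown) has already turned each face image into a numeric "embedding" vector; the vectors and names below are random placeholders for illustration, not output from any real face-recognition system.

```python
# Illustrative sketch of the matching step: an unknown face's embedding is
# compared against a database of labeled embeddings, and the closest match
# above a confidence threshold is reported. All embeddings here are random
# placeholders standing in for the output of a deep-learning model.
import numpy as np

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Similarity between two embedding vectors (1.0 = identical direction)."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

rng = np.random.default_rng(0)

# Hypothetical database of labeled face embeddings (e.g. built from mugshots).
database = {name: rng.normal(size=128) for name in ["person_a", "person_b", "person_c"]}

# Embedding of a face captured from a video frame (placeholder here).
probe = rng.normal(size=128)

# Find the closest labeled face in the database.
best_name, best_score = max(
    ((name, cosine_similarity(probe, emb)) for name, emb in database.items()),
    key=lambda pair: pair[1],
)

# Real systems tune a threshold to trade off false matches against misses.
THRESHOLD = 0.5
print(best_name if best_score >= THRESHOLD else "no match", round(best_score, 3))
```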
Yet AI researchers and civil rights groups such as the American Civil Liberties Union are particularly concerned about accuracy and bias in facial-recognition systems. There are concerns that the systems are less accurate at correctly recognizing women and people of color than at recognizing white men. One reason for this is that the datasets used to train the software may be disproportionately male and white.
The ACLU is one of many civil-rights groups supporting the ordinance. Matt Cagle, a technology and civil liberties attorney at the ACLU of Northern California, said the raft of issues posed by facial-recognition systems means the city's legislation would prevent harm to community members. He also expects that the rule will prompt other cities to follow suit.
"With this vote, San Francisco has declared that face surveillance technology is incompatible with a healthy democracy and that residents deserve a voice in decisions about high-tech surveillance," he said in a statement Tuesday afternoon. "We applaud the city for listening to the community, and leading the way forward with this crucial legislation."
There are currently no federal laws addressing how artificial-intelligence technology in general, or facial-recognition systems specifically, can be used, though a Senate bill introduced in March would force companies to get consent from consumers before collecting and sharing identifying data.
A few states and local governments have made their own efforts: Illinois, for example, has a law that requires companies to get consent from customers before collecting biometric information. California's state Senate is currently considering a bill that would ban police in the state from using biometric technology, such as facial recognition, with body-camera footage.
In the Bay Area alone, Berkeley, Oakland, Palo Alto and Santa Clara County (of which Palo Alto is a part) have passed their own surveillance-technology laws. Oakland is also currently considering whether to ban the use of facial-recognition technology.
Under the new San Francisco law, any city department that wants to use surveillance technology or services (the police department buying new license-plate readers, for example) must first get approval from the Board of Supervisors. That process includes submitting information about the technology and how it will be used, and presenting it at a public hearing. Any city department that already uses surveillance tech will need to tell the board how it is being used.
The ordinance also states that the city will need to report to the Board of Supervisors each year on whether surveillance equipment and services are being used in the ways for which they were approved, and include details like what data was kept, shared or erased.
In a statement, the San Francisco Police Department said it welcomes moves to protect civil liberties and civil rights "while balancing the needs that protect the residents, visitors and businesses of San Francisco."
Even before the vote took place, the police department said that, in keeping with the rule, it is auditing the technologies it uses and its related policies.
Some residents have been vocally opposed to the surveillance ordinance. Frank Noto, president of Stop Crime SF, a group focused on crime prevention, said prior to the vote that his organization recognizes the privacy and civil-liberties concerns that may have prompted the ordinance, but sees it as flawed legislation, largely because it requires the police department to get city approval for surveillance technology it already uses.
After the ordinance passed, Stop Crime SF vice president Joel Engardio said that overall the legislation is "necessary and helpful" though it "could have been better."
And while Stop Crime SF sees the faults in existing facial-recognition technology, it is also wary of prohibiting the technology's use entirely. The group believes a moratorium would be a better option, leaving room to use the technology once it improves.
"When responsibly used, it could be a good public safety tool," Engardio said.