City Of San Francisco Bans Use Of Facial Recognition Tech By Government Agencies
from the FaceBlock dept
San Francisco is getting out ahead of the tech curve. Instead of waiting until after law enforcement had already deployed a suite of surveillance tools, city legislators have passed a ban on the use of facial recognition tech by government agencies.
San Francisco, long at the heart of the technology revolution, took a stand against potential abuse on Tuesday by banning the use of facial recognition software by the police and other agencies.
The action, which came in an 8-to-1 vote by the Board of Supervisors, makes San Francisco the first major American city to block a tool that many police forces are turning to in the search for both small-time criminal suspects and perpetrators of mass carnage.
This move is being applauded by privacy and rights activists. Obviously, it has its critics as well. But the argument made here is pretty much a non-starter.
“It is ridiculous to deny the value of this technology in securing airports and border installations,” said Jonathan Turley, a constitutional law expert at George Washington University. “It is hard to deny that there is a public safety value to this technology.”
The ban doesn't affect federal agencies, so the borders and airports will be just as protected as they ever were by software known mostly for its false positive rates. And in San Francisco, the use of facial recognition tech by law enforcement is -- and apparently will remain -- theoretical. To date, no local law enforcement agency has deployed facial recognition tech in San Francisco.
But the ban [PDF] also prevents local cops from pulling info from outside databases compiled using facial recognition tech, which means there won't be any backdoor searches for faces. This whole package of preemptive measures isn't popular with local cops… or at least not with their union representation.
[T]he San Francisco Police Officers Association, an officers’ union, said the ban would hinder their members’ efforts to investigate crime.
“Although we understand that it’s not a 100 percent accurate technology yet, it’s still evolving,” said Tony Montoya, the president of the association. “I think it has been successful in at least providing leads to criminal investigators.”
Upon information but mostly belief, the SFPOA touts the success of a tool it's never used and backs it with facts not in evidence. Undeniably, facial recognition software has been used to capture criminals. But there's little suggesting it's a regular occurrence, much less one that offsets citizens' long-held beliefs that their commutes in public should not become the tech equivalent of being tailed by police officers at all times.
Bells are easier to ring than un-ring. The city's decision to ring the ban bell before the deployment of facial recognition tech by local agencies shifts the burden onto law enforcement to show it can be trusted with the surveillance tech it already has before it starts asking for this ban to be rolled back. Government agencies know better than most how much of an uphill battle repeals are. The SFPD lost before it even knew it was playing.
Thank you for reading this Techdirt post. With so many things competing for everyone’s attention these days, we really appreciate you giving us your time. We work hard every day to put quality content out there for our community.
Techdirt is one of the few remaining truly independent media outlets. We do not have a giant corporation behind us, and we rely heavily on our community to support us, in an age when advertisers are increasingly uninterested in sponsoring small, independent sites — especially a site like ours that is unwilling to pull punches in its reporting and analysis.
While other websites have resorted to paywalls, registration requirements, and increasingly annoying/intrusive advertising, we have always kept Techdirt open and available to anyone. But in order to continue doing so, we need your support. We offer a variety of ways for our readers to support us, from direct donations to special subscriptions and cool merchandise — and every little bit helps. Thank you.
–The Techdirt Team
Filed Under: facial recognition, law enforcement, san francisco
Reader Comments
“Although we understand that it’s not a 100 percent accurate technology yet..."
Your understanding leaves something to be desired. LMFTFY:
Although it’s not a 100 percent inaccurate technology, it is damn close to being 100 percent inaccurate across the board...
Security: Detection vs Investigation
Many security-type devices, such as cameras, are useful mostly for finding perpetrators after the fact rather than for prevention. Locks, gates, walls, etc. might be useful in preventing crime, at least until ways can be found to breach them.
So where is the public safety value in facial recognition? Searching for perpetrators after the fact? For the investigative value? That doesn't make me feel safer. A 2% or 3% chance of finding that face, versus the many other investigative options that take actual work, is still an after-the-fact event. If they know the face of a potential perpetrator in advance, then detain them, that is, if you have the goods on them. If not, then why are you looking for them? Partial goods might be a reason, but then there are all the innocent people being 'recognized' whose privacy is then violated.
I deny any value in this technology, and will continue to do so even when the technology becomes better. Even if it were 100% accurate, it still tends to be after the fact, and it violates the privacy of every person who isn't a criminal. Security, at least in the sense of terrorism, for example, would be better achieved through prevention, not investigation. There is a whole lot of surveillance going on, and when they find actual links to something nefarious, they should act, not preemptively, but as law enforcement agents should: make a case and bring it and the supposed perpetrators to a court.
And it doesn't take facial recognition software to ID a person entering the country at a border; it takes well-trained and currently informed agents checking passports.
Known, Knowns*
“It is ridiculous to deny the value of this technology in securing airports and border installations,” said Jonathan Turley, a constitutional law expert at George Washington University. “It is hard to deny that there is a public safety value to this technology.”
It is ridiculous to make specious claims about the value of this technology when it has a known false positive rate of 94%.
Mayhap one day in the future this technology will perform as advertised but until that time:
Caveat emptor, dear frogs.
*
There are known knowns. These are things we know that we know. There are known unknowns. That is to say, there are things that we know we don't know. But there are also unknown unknowns. There are things we don't know we don't know. ~ Donald Rumsfeld, War Criminal, Idiot
https://www.brainyquote.com/quotes/donald_rumsfeld_148142
Re: Known, Knowns*
Yes, it's those unknown unknowns that idiots who are absolutely certain fail to account for.
Then there are those who are sort of certain, but fail to account for known unknowns. They are knowingly obtuse about those.
And then the cream of the crop: those who are also absolutely certain and cannot account for known knowns because they know better, but don't actually know anything, as their ideology prevents them from knowing. We have a few around here.
Re: Security: Detection vs Investigation
"So where is the public safety value in facial recognition? Searching for perpetrators after the fact?"
Yes. Airports, border crossings, those kinds of places make sense - the "faces" of thousands of criminals are already in the databases.
Everyday "street" use? It'll be abused just like ALPR's are - they're mainly used to catch such dangerous criminals as parking ticket scofflaws.
Of course, all this only applies if the recognition system is at least reasonably accurate - say 30%.
Since current tech is about 95% inaccurate, perhaps the best use of it would be to round up the 5% or so that it doesn't have a false match for...
Re: Re: Known, Knowns*
It's a good thing you know that I know that you know that I know that you know what you're talking about...
/s
Re: Re: Re: Known, Knowns*
...is unknown.
The mistakes are ridiculously overblown to suit the agenda of those who have something to hide.
China uses the technology excellently, with a database of over 1.6 billion faces in it.
Re: Re: Re: Re: Known, Knowns*
Knowingly knowing that unknown unknowns are unknowable makes knowing that unknowable unknowns knowable.
Re:
Can you point to any evidence of their accuracy rate? No? Well then, how is it excellent?
Credible citations necessary.
Re:
"China uses the technology excellently, with a database of over 1.6 billion faces in it."
How well does it work during high pollution alerts?
Re: Re:
Can you point to any evidence of their accuracy rate?
I would say that the investments in this technology and general scientific knowledge should be sufficient to shift the burden of proof. The stories about the mistakes have been a bit sensationalist.
Two in five ironworkers on NYC skyscrapers died on the job. Early trains used to derail off bridges or burn up. What we do know is that the facial composition is extremely unique and that capturing it is possible. We also know that it will become more accurate over time.
Money talks, so if you're right the money in this will dry up.
Re: Re:
*"China uses the technology excellently, with a database of over 1.6 billion faces in it."
How well does it work during high pollution alerts?*
Considering that we haven't even perfected autocorrect, we should certainly proceed with caution, but the underlying science is sound.
I would like to point out that the City of SF banning this is, from a practical perspective, meaningless.
They can't stop the Federal government from doing it inside the city nor can they stop other state agencies from doing it.
So the local cops might not do it - instead of spending money on getting the tech themselves they'll partner up with a state or federal agency for an 'information sharing' agreement.
Its a similar situation with bodycams - local policies might require them but they only apply to local LEO. And LEO's working with state or federal task forces will be ordered to leave their bodycam behind.
Or Stingrays.
Or parallel construction
Or Civil Asset Forfeiture.
Re: Re: Re:
Let's for argument's sake say they make it 100%, confirmably reliable. So what? Finding the one-in-10,000 person they are looking for does not negate the 9,999 people whose privacy was violated (or whatever the numbers really are).
Then one question is: are those 9,999 privacy violations worth the one catch? Another question is: are there other methods that don't violate the privacy of multitudes of non-suspect persons? And the fact is that the answer to that second question is yes (but it takes more work), and the answer to the first question is no, mostly in light of the answer to the second question (but they don't care).
Re: Re: Re:
I would say that the investments in this technology and general scientific knowledge should be sufficient to shift the burden of proof.
So no evidence then, just a failed attempt to shift the burden of proof. Could have saved yourself some typing if you'd just led with that.
Re: Re: Re:
You avoided the question.
It does not work at all during high pollution days because the camera cannot see more than a few feet. This is what prompted China to take a look at its pollution: not the health problems but the spying problems.
Re:
Are you happy now?
'Tell ya what, point the cameras at the officers first...'
“It is ridiculous to deny the value of this technology in securing airports and border installations,” said Jonathan Turley, a constitutional law expert at George Washington University. “It is hard to deny that there is a public safety value to this technology.”
“Although we understand that it’s not a 100 percent accurate technology yet, it’s still evolving,” said Tony Montoya, the president of the association. “I think it has been successful in at least providing leads to criminal investigators.”
Other things that present obstacles to 'public safety' and 'gaining leads' include such dastardly things as privacy in your own home, being able to talk or otherwise communicate without being recorded, being able to travel without having your every move tracked...
Plenty of things like 'privacy' and 'the law' make finding and catching criminals harder, that doesn't mean they just get thrown under the bus simply because it makes the jobs of those with badges harder.
Re:
You seem very keen on having a global social points system - oh, I'm sorry, a global popularity-decides-if-you-live-today system.
Re: Re:
You seem very keen on having a global social points system - oh, I'm sorry, a global popularity-decides-if-you-live-today system.
You mean like GOOGLE?
Re: Re: Re: Re:
I would say that the investments in this technology and general scientific knowledge should be sufficient to shift the burden of proof.
So no evidence then, just a failed attempt to shift the burden of proof. Could have saved yourself some typing if you'd just led with that.
Evidence is evidence. Investment of billions of dollars is evidence. Scientific theory and knowledge is evidence in that the technology is certainly possible. What we don't know is how far along we are, as with many new technologies.
Re: Re: Re: Re:
Privacy is as dead as copyright protection.
Re: Re: Re: Re: Re: Known, Knowns*
He who knows, and knows that he knows, shall lead.
He who knows not, and knows that he knows not, shall follow.
He who knows not, and knows not that he knows not, is a fool whose nose is tied up in a knot.
Re: Re: Re: Re: Re:
Evidence is evidence.
Indeed it is, and you still have yet to present any, as neither how much money has been dumped into an idea/technology, nor what it might be possible to do in the future with it answers the question of what it is capable of, and used for, now.
So it is still copacetic to move your bowels on the streets of San Francisco.
Re: Re: Security: Detection vs Investigation
That is a horribly inaccurate system that leads to innocent people being hassled. In reality it would become an excuse for the police to stop whoever they want and justify the stop with the system.
Re: Re: Re: Security: Detection vs Investigation
At 30%, used in airports and border crossings, it's wildly accurate compared to some rentacop's "feeling" that someone is a bad guy.
They already have the "excuses" they need, and they're ludicrous. At a 1:3 chance of a match, that's 2:3 people that would not be hassled.
And still nowhere near close enough for evidentiary use. I wouldn't consider even 98% efficient good enough for that.
Re: Re: Re: Re: Re: Re: Known, Knowns*
Don't ask me, I don't know - Ozzy Osbourne.
It is creepy. I recently went to my local PD to fill out a firearms application. I was greeted by name by someone I never met before.. I was like, how do you know who I am? She pointed to her computer and showed me that I was identified as soon as I walked in. Weird..
Re: Re: Re: Re: Security: Detection vs Investigation
"At 30%, used in airports and border crossings, it's wildly accurate compared to some rentacop's "feeling" that someone is a bad guy."
Not really, no... "30% accuracy" means that 70% of the subjects scanned will be wrongly shown to be criminals. Even the dullest rent-a-cop won't bring himself to peg 700 out of 1000 visitors at an airport as being worthy of a strip-down search.
"At a 1:3 chance of a match, that's 2:3 people that would not be hassled."
Not how the math works.
Assume that you've got some algorithm running tabs on 100 million Americans.
Assume the algorithm is 99% accurate.
Assume that out of 100 million, 10000 are criminals.
The algorithm will pick up 9900 of the actual criminals (99%).
It will also pick out 1 million innocent citizens and brand them as criminal (1%).
This is called the paradox of the false positive and is the main reason why automated bad-guy-identification is a terrible idea.
With a 30% accuracy 1/3 of the bad guys will be identified as bad guys. 1/3 of everybody else will also be identified as a bad guy.
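The commenter's 99%-accuracy arithmetic can be checked in a few lines. This is a minimal sketch using the hypothetical numbers from the comment (100 million people, 10,000 criminals, a 99%-accurate classifier), not real-world figures:

```python
# False-positive paradox, using the commenter's hypothetical numbers.
population = 100_000_000
criminals = 10_000
innocents = population - criminals
accuracy = 0.99  # assume true-positive rate and true-negative rate are both 99%

true_positives = round(criminals * accuracy)          # criminals correctly flagged
false_positives = round(innocents * (1 - accuracy))   # innocents wrongly flagged

print(true_positives)   # 9900 actual criminals caught
print(false_positives)  # 999900 innocents branded criminal (~1 million)

# Of everyone the system flags, almost nobody is actually a criminal:
precision = true_positives / (true_positives + false_positives)
print(round(precision, 4))  # roughly 0.0098, i.e. under 1% of flagged people are guilty
```

The mismatch comes from the base rate: even a tiny error rate applied to 99.99 million innocent people swamps the correct hits among 10,000 criminals.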
Re: Re: Re: Re: Re: Security: Detection vs Investigation
[amended]
"With a 30% accuracy 1/3 of the bad guys will be identified as bad guys. 2/3 of everybody else will also be identified as a bad guy."