Not Ready For Prime Time: UK Law Enforcement Facial Recognition Software Producing Tons Of False Positives [Updated]
from the Citizen-Suspect dept
Law enforcement agencies have embraced facial recognition. And contractors have returned the embrace, offering up a variety of "solutions" that are long on promise, but short on accuracy. That hasn't stopped the mutual attraction, as government agencies are apparently willing to sacrifice people's lives and freedom during these extended beta tests.
The latest example of widespread failure comes from the UK, where the government's embrace of surveillance equipment far exceeds that of the United States. Matt Burgess of Wired obtained documents detailing the South Wales Police's deployment of automated facial recognition software. What's shown in the FOI docs should worry everyone who isn't part of UK law enforcement. (It should worry law enforcement as well, but strangely does not seem to bother them.)
During the UEFA Champions League Final week in Wales last June, when the facial recognition cameras were used for the first time, there were 2,470 alerts of possible matches from the automated system. Of these, 2,297 turned out to be false positives and 173 were correctly identified – 92 per cent of matches were incorrect.
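For anyone who wants to sanity-check those figures, here's a minimal sketch using only the numbers quoted above (the raw arithmetic actually works out to roughly 93 percent of alerts being wrong; the quoted 92 appears to be charitable rounding):

```python
# Verify the Champions League final figures quoted above.
alerts = 2470            # "possible matches" flagged by the system
false_positives = 2297   # alerts that turned out to be wrong
true_positives = 173     # alerts that were correctly identified

assert false_positives + true_positives == alerts

print(f"Share of alerts that were wrong: {false_positives / alerts:.1%}")  # ~93.0%
print(f"Share of alerts that were right: {true_positives / alerts:.1%}")  # ~7.0%
```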
That's the gaudiest number returned in response to the records request. But the other numbers -- even though they come from smaller sample sets -- are just as terrible. The following table comes from the South Wales Police FOI response [PDF]:
[Table: per-deployment breakdown of alerts, true positives, and false positives from the FOI response]
In all but three cases, false positives outnumbered positive hits. (And in one of those cases, it was a 0-0 tie.) The police blame the 2,300 false positives on garbage intake.
A spokesperson for the force blamed the low quality of images in its database and the fact that it was the first time the system had been used.
The company behind the tech insists this is an end user problem.
The company behind the facial recognition system, NEC, told ZDNet last year that large watchlists lead to a high number of false positives.
And it illustrates this with a highly-questionable analogy.
"We don't notice it, we don't see millions of people in one shot ... but how many times have people walked down the street following somebody that they thought was somebody they knew, only to find it isn't that person?" NEC Europe head of Global Face Recognition Solutions Chris de Silva told ZDNet in October.
I think most people who see someone they think they know might wave or say "Hi," but only the weirdest will follow them around attempting to determine if they are who they think they are. Even if everyone's a proto-stalker like NEC's front man seems to think, the worst that could happen is an awkward (and short) conversation. The worst case scenario for false positives triggered by law enforcement software is some time in jail and an arrest record. De Silva's analogy doesn't even begin to capture the personal stakes for wrongly identified citizens.
If large watchlists are the problem, UK law enforcement is actively seeking to make it worse. Wired reports the South Wales Police are looking forward to adding the Police National Database (19 million images) to their watchlist, along with others, like driver's license databases.
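To see why list size matters so much, here's a rough back-of-the-envelope model -- emphatically not NEC's actual algorithm -- assuming every scanned face is compared against every watchlist entry, and each comparison carries a small, independent false-match probability (the rate below is a made-up illustration):

```python
# Rough model, not NEC's algorithm: expected false alerts when each scanned
# face is compared against every watchlist entry, with each comparison
# carrying an independent false-match probability f. Numbers are illustrative.
def expected_false_alerts(faces_scanned: int, watchlist_size: int, f: float) -> float:
    # Chance that at least one watchlist entry falsely matches a given face.
    p_any_false_match = 1 - (1 - f) ** watchlist_size
    return faces_scanned * p_any_false_match

f = 1e-6  # hypothetical 1-in-a-million false match per comparison
for size in (500, 19_000_000):
    alerts = expected_false_alerts(100_000, size, f)
    print(f"watchlist of {size:>10,}: ~{alerts:,.0f} false alerts per 100,000 faces")
# A 500-entry list produces ~50 false alerts; a 19-million-image database
# produces ~100,000 -- virtually every face scanned triggers an alert.
```

Under assumptions like these, bolting 19 million images onto the watchlist doesn't shrink the false positive problem. It guarantees it.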
No matter what the real issue is here, the South Wales Police believe there are no adverse effects to rolling out facial recognition tech that's wrong far more often than it's right. The force says it has yet to make a false arrest based on a bogus hit, but its privacy assessment shows it's not all that concerned about the people swept up by poorly-performing software.
South Wales Police, in its privacy assessment of the technology, says it is a "significant advantage" that no "co-operation" is required from a person.
Sure, it's an "advantage," but one that solely serves law enforcement. It allows them to gather garbage images and run them against watchlists while hoping the false hits won't result in the violation of an innocent person's rights. But that's all they have: hope. The tech isn't ready for deployment. But it has been deployed and UK citizens are the beta testing group.
So, it will come as an unpleasant non-surprise that Axon (the body camera company formerly known as Taser) is looking to add facial recognition tech to cameras officers are supposed to deploy only in certain circumstances. This addition would repurpose them into always-on surveillance devices, gathering up faces with the same efficiency as automated license plate readers gather plates. False positives will continue to be a problem, and deployment will scale far faster than tech advancements.
UPDATE: Axon apparently takes issue with the final paragraph of this post. It has demanded a correction to remove an unspecified "error" and to smooth the corners off some "bold claims." Here's Axon's full statement:
At this point in time, we are not working on facial recognition technology to be deployed on body cameras. While we do see the value in this future capability, we also appreciate the concerns around privacy rights and the risks associated with misidentification of individuals. Accordingly, we have chosen to first form an AI Ethics Board to help ensure we balance both the risks and the benefits of deploying this technology. At Axon we are committed to ensuring that the technology we develop makes the world a better, and a safer place.
If there's anything to be disputed in the last paragraph of the post, it might be "looking to add facial recognition tech to its cameras." But more than one source (including the one linked in the paragraph) makes the same claim about Axon looking at the possibility of adding this tech to its body camera line, so while Axon may not be currently working on it, it appears to be something it is considering. The addition of an ethics board is certainly the right way to approach this issue and its privacy concerns, but Axon's statement does not actually dispute the assertions I made in the post.
As for the rest of the paragraph, I will clarify that I did not mean Axon specifically will push for body cameras to become the facial equivalent of ALPRs. Axon likely won't. But police departments will. If the tech is present, it will be used. And history shows the tech will be deployed aggressively under minimal oversight, with apologies and policies appearing only after some damage has been done. To be sure, accuracy will improve as time goes on. But as the UK law enforcement efforts show, deployment will far outpace tech advancements, increasing the probability of wrongful arrests and detentions.
Filed Under: facial recognition, false positives, law enforcement, south wales, uk
Reader Comments
The false positives in this blatant non-analogy aren't harming anything except the police budget, and in that case the technology is still saving more time and effort (and your taxes!) than would be spent without it.
If you want to shit on facial recognition, shit on the aspect that deserves it, namely the destruction of privacy and abuses it enables. False positives are pretty irrelevant.
Re:
You're assuming no false negatives, and that's probably not a valid assumption. The false negative rate seems unknowable. If police assume the murder-needle is not in the big pile but it actually is, that's gotta be a bad thing.
Sure, and there's no reason taxpayers should be concerned about that, right?
That depends highly on what the police do when the system indicates a match. I assume that at a minimum they are detaining people to determine their identity, and that alone rises to a level above "irrelevant".
Re: Re:
Being the UK, the first thing that they will likely do is have a close look at the indicated subject and the photos that they have of them. They will then look for an opportunity to approach and identify the person, maintaining formal politeness while doing so.
Re:
You have no idea if the system is giving you a true negative or a false negative. All you have done here is check out all the positives to determine if they are true or false.
In your analogy, all that has been done here is to scan the haystack, check out a couple of thousand pieces of hay, and assume that what is left of the haystack does not contain the needle.
Re:
It could be none, it could be 20,000.
If you found 3 positives and 10 false positives out of 20,000 needles present, are you content that it's a job well done? Or was it a seriously flawed POS?
We'll never know how well the software truly works, but relying on the facial recognition software to do the job, on the assumption that it's better than not using the software, could be a major mistake.
Re:
But if you want to shit on the rights of others, then you are certainly within your rights to do so - right? Damn the torpedoes - full speed ahead.
Re:
This could be true, if the false positive is discovered before the arrest. What if it's discovered afterward?
How did they determine the true accuracy?
In other words, did they check that every single scan of every person was returning the correct result in order to determine the true accuracy of the system? I am going to go out on a limb here and say, did they bollocks.
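To put that in terms of the numbers we do have (a minimal sketch using the Champions League figures; the key quantity is the one nobody measured):

```python
# The only numbers the force could verify are the flagged alerts.
true_positives = 173
false_positives = 2297
false_negatives = None  # wanted people who walked past unflagged - never counted

precision = true_positives / (true_positives + false_positives)
print(f"Precision: {precision:.1%}")  # ~7% of alerts were real matches

# recall = true_positives / (true_positives + false_negatives)
# ...cannot be computed: nobody checked the faces the system waved through,
# so the "true accuracy" of the system is indeed unknowable from these data.
```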
Re: How did they determine the true accuracy?
Re: Re: How did they determine the true accuracy?
So - how often? Possibly, in the future, your vehicle may be disabled until you update your facial recognition profile ... daily.
Re: Re: Re: How did they determine the true accuracy?
Fingerprints also change over time, and that's something that government agencies apparently don't know about, or care about.
Fingerprints are not just used for finding 'bad' people; they're the badge of personal identity the government has on file, one that's considered permanent (but really isn't, at least not in the eyes of a computer).
Applying for US citizenship is one such example. If your current fingerprints don't precisely computer-match those on file that were taken by the INS many years or decades ago, then you are essentially a non-person in the eyes of the law and are not eligible to apply for citizenship. Proving that you are indeed still the same person can be an expensive and lengthy uphill legal battle.
So "what's the big deal?" you might ask. Even for people who never plan to ever vote in an election, citizenship becomes an extremely important issue in a person's old age, because US citizens can inherit a deceased spouse's property tax-free, while non-citizen permanent residents ("green card" holders) are hit with a whopping 40% federal tax on all assets upon the death of a spouse.
But back to the subject of automatic facial recognition issues: it's almost a certainty that it will end up being abused in some way, or the system trusted more than it ever should be, such as with innocent people having to suffer because "the computer says so."
'More justifications for searches? Where do we sign?!'
(It should worry law enforcement as well, but strangely does not seem to bother them.)
If the UK police are even remotely similar to the US police then there's nothing 'strange' about it. More false positives means more chances to search people and possibly find something incriminating they can use to boost the 'look at all the criminals we're finding!' numbers.
That this would have massive negative impacts on the privacy of everyone searched is a sacrifice they are valiantly willing to have the public pay.
Re: 'More justifications for searches? Where do we sign?!'
If the UK police are even remotely similar to the US police
Fortunately they aren't, at least for now.
For one thing they don't have the option of shooting first and asking questions afterwards.
Traditionally there have been some pretty good senior police officers in the UK.
e.g. John Alderson: https://www.theguardian.com/uk/2011/oct/11/john-alderson
Sadly, they remain quite rare.
Re: Re: 'More justifications for searches? Where do we sign?!'
http://www.gayinthe80s.com/2014/05/1986-politics-manchesters-chief-constable-james-anderton/
What with his notorious frothing hatred of "licentious dancing", and anyone in general just out enjoying themselves, oh, and the £32k of public money (a fortune in the '80s) he lavished on himself by putting in luxurious thick pile shag in his Old Trafford office... this dolt was universally loathed.
Re: Re: 'More justifications for searches? Where do we sign?!'
Remember the de Menezes murder by the police?
Misidentification led to multiple close-range head shots and a fatality at the hands of the Met (London police).
https://en.m.wikipedia.org/wiki/Death_of_Jean_Charles_de_Menezes
As is usual for these cases (innocent people executed by UK police), nobody in the police chain of command was jailed.
Reid Technique going high tech
Likewise with AFR claims, we should be naturally skeptical in case there is some degree of smoke and mirrors going on and they're trying to fudge the test results or game the system in some way.
Also, in a stadium full of people getting scanned, there would likely be hundreds, and possibly even thousands of people who are wanted by the police for some reason, mostly on minor offenses like failing to appear in court or pay a fine. Apparently most of them got missed and walked out of the stadium undetected.
Just wait until the New South Wales Police get this technology
All it takes is a match. 🔥
This post and the next one
"significant advantage" that no "co-operation" is required from a person.
Ah - but of course GDPR (see next post) is supposedly all about guaranteeing co-operation...
Except, of course, law enforcement is exempt...
Quelle Surprise!
Ridiculous Article
Re: Ridiculous Article
"using Facial Recognition."
That doesn't work. Derp.
Facial recognition isn't ready for prime time. Neither is law enforcement. Or rather, neither is humanity.
Surprising optimism by Techdirt author
Based on other recent reporting, including on Techdirt, of officers using grossly excessive force with no penalty, I'd think this sentence should be:
-- The worst case scenario for false positives triggered by law enforcement software is some unnecessary fatalities when the police shoot an innocent individual misidentified by the software as a match to an "armed and dangerous" fugitive cop-killer. If American cops were involved, figure some bystanders will get caught in the cross-fire too.
At least they will have their body camera turned on for once, likely with some request logs server-side (because there is no way the matching will happen on the devices themselves) to correlate with suspiciously missing footage.