from the Citizen-Suspect dept
Law enforcement agencies have embraced facial recognition. And contractors have returned the embrace, offering up a variety of "solutions" that are long on promise but short on accuracy. That hasn't stopped the mutual attraction, as government agencies are apparently willing to sacrifice people's lives and freedom during these extended beta tests.
The latest example of widespread failure comes from the UK, where the government's embrace of surveillance equipment far exceeds that of the United States. Matt Burgess of Wired obtained documents detailing the South Wales Police's deployment of automated facial recognition software. What's shown in the FOI docs should worry everyone who isn't part of UK law enforcement. (It should worry law enforcement as well, but strangely does not seem to bother them.)
During the UEFA Champions League Final week in Wales last June, when the facial recognition cameras were used for the first time, there were 2,470 alerts of possible matches from the automated system. Of these, 2,297 turned out to be false positives and 173 were correctly identified – 92 per cent of matches were incorrect.
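For the curious, the arithmetic behind that percentage is simple enough to check (a trivial Python sketch using the reported figures; the variable names are mine):

```python
# False positive rate for the Champions League Final deployment,
# using the figures from the South Wales Police FOI response.
alerts = 2470           # total possible-match alerts
false_positives = 2297  # alerts that flagged the wrong person
correct = alerts - false_positives  # 173 genuine identifications

print(f"False positive rate: {false_positives / alerts:.2%}")  # 93.00%
print(f"Precision: {correct / alerts:.2%}")                    # 7.00%
```

In other words, when the system said "that's our guy," it was right about seven times out of a hundred.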
That's the gaudiest number returned in response to the records request. But the other numbers -- even though they come from smaller sample sets -- are just as terrible. The following table comes from the South Wales Police FOI response [PDF]:
In all but three deployments, false positives outnumbered correct hits. (And in one of those cases, it was a 0-0 tie.) The police blame the nearly 2,300 false positives on garbage intake.
A spokesperson for the force blamed the low quality of images in its database and the fact that it was the first time the system had been used.
The company behind the tech insists this is an end-user problem.
The company behind the facial recognition system, NEC, told ZDNet last year that large watchlists lead to a high number of false positives.
And it illustrates this with a highly questionable analogy.
"We don't notice it, we don't see millions of people in one shot ... but how many times have people walked down the street following somebody that they thought was somebody they knew, only to find it isn't that person?" NEC Europe head of Global Face Recognition Solutions Chris de Silva told ZDNet in October.
I think most people who see someone they think they know might wave or say "Hi," but only the weirdest among us will trail them down the street trying to confirm the identification. Even if everyone's a proto-stalker, as NEC's front man seems to think, the worst that could happen is an awkward (and short) conversation. The worst-case scenario for a false positive triggered by law enforcement software is time in jail and an arrest record. De Silva's analogy doesn't even begin to capture the stakes for citizens who are wrongly identified.
If large watchlists are the problem, UK law enforcement is actively seeking to make it worse. Wired reports the South Wales Police are looking forward to adding the Police National Database (19 million images) to their watchlist, along with other sources like driver's license databases.
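NEC's underlying claim about watchlist size is at least mathematically plausible: if every scanned face is compared against every watchlist entry, the expected number of false alerts grows with the size of the list. Here's a rough sketch of that effect (the per-comparison false match rate below is a hypothetical illustration, not a measured figure for NEC's system):

```python
# Rough illustration: bigger watchlists mean more false alerts,
# assuming each scanned face is independently compared against
# every watchlist entry. The false match rate is hypothetical.

def expected_false_alerts(faces_scanned, watchlist_size, false_match_rate):
    # Probability that a single scanned face wrongly matches
    # at least one watchlist entry.
    p_alert = 1 - (1 - false_match_rate) ** watchlist_size
    return faces_scanned * p_alert

fmr = 1e-6  # hypothetical 1-in-a-million false match rate per comparison
for size in (500, 19_000_000):  # small local list vs. Police National Database
    print(size, round(expected_false_alerts(100_000, size, fmr)))
# 500 entries: ~50 false alerts per 100,000 faces scanned.
# 19 million entries: ~100,000 -- nearly every face scanned gets flagged.
```

Which is exactly why bolting a 19-million-image database onto a system already producing sky-high false positive rates is a questionable plan.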
No matter what the real issue is here, the South Wales Police believe there are no adverse effects to rolling out facial recognition tech that's wrong far more often than it's right. The force states it has yet to make a false arrest based on bogus hits, but its privacy assessment shows it's not all that concerned about the people swept up by poorly performing software.
South Wales Police, in its privacy assessment of the technology, says it is a "significant advantage" that no "co-operation" is required from a person.
Sure, it's an "advantage," but one that solely serves law enforcement. It allows agencies to gather garbage images and run them against watchlists while hoping the false hits won't result in the violation of an innocent person's rights. But that's all they have: hope. The tech isn't ready for deployment. But it has been deployed, and UK citizens are the beta testing group.
So, it will come as an unpleasant non-surprise that Axon (Taser's body cam spinoff) is looking to add facial recognition tech to cameras officers are supposed to deploy only in certain circumstances. This addition would repurpose them into always-on surveillance devices, gathering up faces as efficiently as automated license plate readers gather plates. False positives will continue to be a problem, and deployment will scale far faster than tech advancements.
UPDATE: Axon apparently takes issue with the final paragraph of this post. It has demanded a correction to remove an unspecified "error" and to smooth the corners off some "bold claims." Here's Axon's full statement:
At this point in time, we are not working on facial recognition technology to be deployed on body cameras. While we do see the value in this future capability, we also appreciate the concerns around privacy rights and the risks associated with misidentification of individuals. Accordingly, we have chosen to first form an AI Ethics Board to help ensure we balance both the risks and the benefits of deploying this technology. At Axon we are committed to ensuring that the technology we develop makes the world a better, and a safer place.
If there's anything to be disputed in the last paragraph of the post, it might be "looking to add facial recognition tech to its cameras." But more than one source (including the one linked in that paragraph) makes the same claim about Axon exploring the possibility of adding this tech to its body camera line, so while Axon may not currently be working on it, it appears to be something the company is considering. The addition of an ethics board is certainly the right way to approach the issue and its privacy implications, but Axon's statement does not actually dispute the assertions I made in the post.
As for the rest of the paragraph, I will clarify that I did not mean Axon specifically will push for body cameras to become the facial equivalent of ALPRs. Axon likely won't. But police departments will. If the tech is present, it will be used. And history shows the tech will be deployed aggressively under minimal oversight, with apologies and policies appearing only after some damage has been done. To be sure, accuracy will improve as time goes on. But as the UK law enforcement efforts show, deployment will far outpace tech advancements, increasing the probability of wrongful arrests and detentions.
Filed Under: facial recognition, false positives, law enforcement, south wales, uk