New Data On London Metro Police Facial Recognition Tech Shows It's Still Wrong 96 Percent Of The Time
from the targeting-violent-criminals-with-a-four-percent-success-rate dept
Is this good news or bad news? It's tough to say. The London Metro Police are proud of their many cameras and their cameras' many features, but there doesn't appear to be any improvement in the facial recognition tech it's deploying.
Three Freedom of Information requests sent to the Metro Police last year returned documents showing its tech was reporting nothing but false positives. The first response reported a 98% failure rate. A follow-up request generated an admission of a 100% failure rate by the Metro's tech. Now another set of FOI requests has gathered more data from the Metro Police and it appears past reports of consistent failure were pretty indicative of future results.
Facial recognition technology used by London’s Metropolitan Police incorrectly identified members of the public in 96 per cent of matches made between 2016 and 2018.
Biometric photos of members of the public were wrongly identified as potential criminals during eight incidents across the two-year period, Freedom of Information (FoI) requests have revealed.
This may be a small sample size, but it was enough to subject a 14-year-old student to a police stop after the facial recognition software mistook him for a criminal.
The Metro Police are continuing to use the tech despite its relative uselessness. The Met does claim its deployments over the last couple of years have led to eight arrests, but it needs far more than that to offset the system's apparent desire to see the innocent punished.
As the Metro Police continues beta testing its tech on the general public, it's continuing to amass a collection of non-criminal faces in its facial recognition database. This has drawn attention from members of Parliament, who have called the practice "unacceptable." There has been some improvement in one area since the last time the Metro Police were queried about its facial recognition tech. It used to hold onto all images for a year. Now, it only holds watchlist images for 30 days and deletes all non-hit images immediately.
Unfortunately, this spectacular run of failure hasn't moved Parliament to, you know, discourage use of the tech. And it appears those who publicly refuse the privilege of being misidentified as a criminal will have their complaints addressed by being turned into criminals.
In one incident, a 14 year-old black child in school uniform was stopped and fingerprinted by police after being misidentified by the technology, while a man was fined for objecting to his face being scanned on a separate occasion.
Problem solved. The system is only interested in criminals and only criminals would object to having their faces scanned by the Metro's faulty tech. Self-fulfilling prophecies are just another undocumented feature.
Filed Under: facial recognition, false positives, london metro, london metro police, the tube
Reader Comments
Metro Police?
Re: Metro Police?
Wow! So close to being right then! I'll bet the police are rubbing their hands together now that they can bring in definite suspects on fictitious crimes! The UK Conservative Government must be over the moon. Bet they have a celebration party at the weekend!!
We just need to put some faces of police into their database of photos. Actually, the Prime Minister might be a better idea. ;)
And here I was thinking accosting what appears to be random people was a hallmark of petty thugs. Silly me.
The software works great. It is just all the people there are masters of disguise.
Re:
"The software works great. It is just all the people there are masters of disguise."
And for some reason a number of completely innocent people are similarly masters of disguise and have the habit of masquerading as criminals, apparently.
Some people have odd hobbies. What can you do?
/s just in case that wasn't blindingly obvious
Re:
But what good would that do when the system is wrong 96% of the time?
The purpose isn't "accurate identification." It's to build up such a bulk of incriminating data that refuting it is pointless.
Think of the Stasi in the DDR. They knew that the majority of the information they had was either grossly inaccurate or outright faked, but it didn't matter; seeing the boxes full of documents, even people who knew better thought "where there's smoke there's fire." Which was the purpose of building those files in the first place.
Why always so negative?
Instead of claiming that it is wrong 96% of the time, why not be positive and say that it has a 96% success rate at identifying the wrong person?
SEE? that sounds so much better!
Do the math
96% failure rate + 8 arrests means 192 innocent people were hassled by the police for no other reason than that some machine claimed "they match the description of someone we're looking for".
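To spell out the arithmetic in the comment above, here is a minimal sketch. It assumes, as the commenter does, that the Met's eight arrests account for all of the roughly 4 percent of matches that were correct; the figures are a back-of-envelope illustration, not numbers taken from the FoI responses themselves.

```python
# Back-of-envelope estimate, assuming the eight arrests represent every
# correct match and the reported 96% figure is the share of false matches.
failure_rate = 0.96      # reported share of incorrect matches
correct_matches = 8      # arrests the Met attributes to the system

total_alerts = correct_matches / (1 - failure_rate)  # 8 / 0.04 = 200 alerts
false_alerts = total_alerts - correct_matches        # 200 - 8 = 192 misidentifications

print(f"Estimated total alerts: {total_alerts:.0f}")
print(f"Estimated false alerts: {false_alerts:.0f}")
```

Under that assumption, the estimate lands on the same 192 misidentified people the commenter cites.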
Boondoggle at Piccadilly Circus
New Data On London Metro Police Facial Recognition Tech Shows It's Still Wrong 96 Percent Of The Time
And It's Still Funded 100 Percent Of The Time.
Fined?
Very uneasy about this: "A man was fined for objecting to his face being scanned on a separate occasion". Don't see how he can be fined for that. Any more details on that case available? What actual crime did he allegedly commit to be fined and did it go through a court?