London Metropolitan Police Deploy Facial Recognition Tech Sporting A 100% Failure Rate
from the top.-tech. dept
Facial recognition tech isn't working quite as well as the agencies deploying it have hoped, but failure after failure hasn't stopped them from rolling out the tech just the same. I guess the only way to improve this "product" is to keep testing it on live subjects in the hope that someday it will actually deliver on advertised accuracy.
The DHS is shoving it into airports -- putting both international and domestic travelers at risk of being deemed terrorists by tech that just isn't quite there yet. In the UK -- the Land of Cameras -- facial recognition tech is simply seen as the logical next step in the nation's sprawling web o' surveillance. And Amazon is hoping US law enforcement wants to make facial rec tech as big a market for it as cloud services and online sales.
Thanks to its pervasiveness across the pond, the UK is where we're getting most of our data on the tech's successes. Well... we haven't seen many successes. But we are getting the data. And the data indicates a growing threat -- not to the UK public from terrorists or criminals, but to the UK public from its own government.
London cops have been slammed for using unmarked vans to test controversial and inaccurate automated facial recognition technology on Christmas shoppers.
The Metropolitan Police are deploying the tech today and tomorrow in three of the UK capital's tourist hotspots: Soho, Piccadilly Circus, and Leicester Square.
The tech is basically a police force on steroids -- capable of demanding ID from thousands of people per minute. Big Brother Watch says the Metro tech can scan 300 faces per second, running them against hot lists of criminal suspects. The difference is no one's approaching citizens to demand they identify themselves. The software does all the legwork and citizens have only one way to opt out: stay home.
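Under the hood, systems like this typically reduce every detected face to a numeric embedding and compare it against a watchlist of precomputed suspect embeddings, flagging anything that clears a similarity threshold. Here's a minimal sketch of that matching loop -- the names, the threshold, and the cosine-similarity metric are illustrative assumptions, not details of the Met's actual system:

```python
# Hypothetical watchlist-matching loop, sketched for illustration only.
# Real deployments differ; the threshold and metric here are assumed.
import numpy as np

SIMILARITY_THRESHOLD = 0.8  # assumed operating point; tuning this trades
                            # false positives against false negatives

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Cosine similarity between two face-embedding vectors."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def match_against_hotlist(face: np.ndarray, hotlist: dict) -> str | None:
    """Return the hot-list ID whose embedding best matches this face,
    if any score clears the threshold; otherwise None."""
    best_id, best_score = None, SIMILARITY_THRESHOLD
    for suspect_id, suspect_embedding in hotlist.items():
        score = cosine_similarity(face, suspect_embedding)
        if score > best_score:
            best_id, best_score = suspect_id, score
    return best_id
```

Note what's absent from the sketch: any human judgment. At 300 faces per second, every threshold crossing is an accusation made by arithmetic.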
Given these results, staying home might just be the best bet.
In May, a Freedom of Information request from Big Brother Watch showed the Met's facial recog had a 98 per cent false positive rate.
The group has now said that a subsequent request found that 100 per cent of the so-called matches since May have been incorrect.
A recent report from Cardiff University questioned the technology's abilities in low light and crowds – which doesn't bode well for a trial in some of the busiest streets in London just days before the winter solstice.
The tech isn't cheap, but even if it were, it still wouldn't be providing any return on investment. To be fair, the software isn't misidentifying people hundreds of times a second. In the great majority of scans, nothing is returned at all. The public records response shows the Metro Police racked up five false positives during their June 28th deployment. This led to one stop of a misidentified individual.
But even if the number of failures is small compared to the number of faces scanned, the problem is far from minimal. A number of unknowns make this tech a questionable solution for its stated purpose. We have no idea how many hot list criminals were scanned and not matched. We don't know how many scans the police performed in total. We don't know how many of these scans are retained and what the government does with all this biometric data it's collecting. About all we can tell is the deployment led to zero arrests and one stop instigated by a false positive. That may be OK for a test run (it isn't) but it doesn't bode well for the full-scale deployment the Met Police have planned.
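For the arithmetic-minded, the "failure rate" everyone keeps quoting is just precision measured over the alerts the system actually raised, not over every face scanned. Using the June 28th figures reported above (five false positives, zero correct matches), a quick back-of-the-envelope:

```python
# Precision arithmetic behind the "100% failure" headline, using the
# figures from the public records response described above.
true_positives = 0    # correct matches: none reported
false_positives = 5   # misidentifications during the June 28th deployment

total_alerts = true_positives + false_positives
false_discovery_rate = false_positives / total_alerts

print(f"Alerts raised: {total_alerts}, correct: {true_positives}")
print(f"Share of alerts that were wrong: {false_discovery_rate:.0%}")  # 100%
```

Five alerts, zero correct. However many thousands of scans silently returned nothing, every time the system actually pointed at someone, it pointed at the wrong person.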
The public doesn't get to opt out of this pervasive scanning. Worse, it doesn't even get to opt in. There's no public discussion period for cop tech even though, in the case of mass scanning systems, the public is by far the largest stakeholder. Instead, the public is left to fend for itself as law enforcement agencies deploy additional surveillance methods -- not against targeted suspects, but against the populace as a whole. This makes the number of failures unacceptable, even if the number is a very small percentage of the whole.
Filed Under: facial recognition, london, metropolitan police
Reader Comments
This is great!
Thanks Met Police, for the free London trip. I hear it's quite dystopic this time of year!
While I...
"We have no idea how many hot list criminals were scanned and not matched."
is a problem. The headlines say 100% *failure* rate, but if there were NO "hot list criminals" in the "sample group", the software was 100% *effective*.
Putting ten or twenty test subjects in the crowd deliberately would give a more accurate reading. But then it would be a test rather than a deployment. Can't run tests after the sucker ... er... "client" has already bought the package, after all...
Re: While I...
Headlines of 100% failure rate, but if there were NO "hot list criminals" in the "sample group", the software is 100% effective
The 100% failure rate was the rate at which the software incorrectly identified people as a criminal, not the failure to identify any actual criminals (though that also happened). Thus, we are at 0% effective.
Re: While I...
The 100% failure rate in the headline was, in the article, shown to reference the 100% false positive rate.
The line you quote references false negatives.
If the software had no criminals but flagged multiple people as criminals, it most certainly had a 100% failure rate - every flag was a failure.
There is, in fact, no way to read this data as anything but 100% failure. Best case, no criminals were present and all that was available to flag were innocents. That's 100% failure by finding criminals where none existed. Certainly not 100% effective.
The more criminals actually present and in the database, the worse it looks, because then it identified innocents as criminals, and failed to find the actual criminals.
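The disagreement in this thread is really a base-rate problem. Even granting the software generous accuracy -- the numbers below are invented for illustration, not the Met's -- nearly every alert will be a false alarm when genuine hot-list suspects are rare in the crowd:

```python
# Illustrative base-rate arithmetic with invented numbers -- not the
# Met's figures -- showing why rare suspects mean mostly-false alerts.
faces_scanned = 100_000        # a busy shopping afternoon (assumed)
suspects_in_crowd = 2          # actual hot-list members present (assumed)
false_positive_rate = 0.0001   # 1 wrong flag per 10,000 innocents (assumed)
true_positive_rate = 0.90      # chance a present suspect gets flagged (assumed)

false_alarms = (faces_scanned - suspects_in_crowd) * false_positive_rate
true_hits = suspects_in_crowd * true_positive_rate
share_wrong = false_alarms / (false_alarms + true_hits)

print(f"Expected false alarms: {false_alarms:.1f}")          # ~10.0
print(f"Expected true hits:    {true_hits:.1f}")             # ~1.8
print(f"Share of alerts that are wrong: {share_wrong:.0%}")  # ~85%
```

And in the degenerate case described above -- no suspects present at all -- every alert is wrong by definition.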
Erm, I'd love to hear your logic on this one. Isn't the Fourth the "right of the people to be secure in their persons, houses, papers, and effects"? How does that apply to recording a public street?
"if there were NO "hot list criminals" in the "sample group", the software is 100% *effective*."
This, however, is definitely true. But, there's likely to have been some testing with other subjects before these public ones and they probably can't legally target people without their consent (then said consent might lead to accusations of pre-programmed bias if the software is successful).
Re:
I own several rental buildings. I've got cameras on all of them, but was very careful to make sure none of them are catching views of the neighboring properties.
If a stoplight-mounted camera has apartment or house windows in its field of view, how can it not be a 4th violation? It's effectively surveilling that window.
Re: Re:
I understand privacy concerns and would certainly hope that those setting the cameras up were as mindful as you. But, I think it's really stretching the point to say that any camera surveillance in the public street is a violation.
Re: Re: Re:
I view them like the automated license plate readers.
Prior to the readers, an officer had to have a REASON to call in a plate. They didn't do it gratuitously, as it would annoy the hell out of the clerk running the search.
And they got ONLY the registered owner and information on the vehicle itself.
With the ALPR, they get not just that info, but it's linked in dozens of databases, giving all kinds of information "at the push of a button" that the officer does not need to know for any reason.
And the ALPR treats everyone as guilty. It scans EVERY plate that goes past it.
The same for the street cameras, especially where they infringe on private property. Would you put up with your neighbor setting up a camera to watch your teenaged daughters in your pool? Of course not. So why are the cops "special" in this matter?
I give all my tenants access to the camera footage, with real-time available for the one covering the front door.
Ever try to get footage from a traffic camera?
Re: Re: Re: Re:
Of course you wouldn't want someone filming your daughters. Too bad there's nothing you could legally do to make them stop. The cops aren't special because what you're implying with this example is untrue.
Re: Re: Re: Re: Re:
Of course it will be abused, anyone who thinks otherwise is being a bit silly.
Re: Re: Re: Re: Re:
"Of course you wouldn't want someone filming your daughters. Too bad there's nothing you could legally do to make them stop. "
What.
Seriously, look at... most examples of western law on this. If you manage to film someone in such a way that it intrudes upon or violates the personal integrity of others, then that is, by definition, breaking the law.
Hence why placing a camera is usually circumscribed with legal concerns - you WILL be held responsible for what it captures.
And as someone pointed out in this thread already, police appear to be getting away with this since there is literally no oversight.
Re: Re: Re: Re: Re: Re:
Are they "getting away with it"? Or, is it just that there's different standards applied to someone deliberately targeting your property than applied to someone happening to catch a small part of your property when looking at the public street? If the scenario he fears is even happening at all.
Re: Re: Re: Re:
So, the same as anyone in the street can see, or sat in an opposing property. Do you consider a stakeout to be a violation as well, if they can see your house as well as the intended target?
"And the ALPR treats everyone as guilty. It scans EVERY plate that goes past it."
Every plate that's out being driven on public roads, sure. You appear to be having a problem with the automation, not the camera, which is a different issue.
"So why are the cops "special" in this matter?"
Because they are in theory putting them up in order to monitor and protect the public streets, which is a big part of their job.
You may disagree with how they operate in reality, and how necessary the tools actually are. But, it's not exactly hard to see why a camera that may see the corner of your property that's visible from the public street while monitoring that street is different from one set up specifically to view your property.
Re: Re: Re: Re: Re:
If a traffic cam is watching a private property, the police have access to the footage captured without a Warrant.
THAT is why it's a 4th violation (in the US).
It's also not unknown for traffic cameras to get "accidentally" moved so they're watching a window or driveway instead of the intersection.
Re: Re: Re: Re: Re: Re:
In the UK?
Re: Re: Re: Re: Re: Re:
Not if they're doing it from a public location.
Re: Re: Re: Re: Re: Re:
For all nearby properties that might be in shot, or just the one they intend to film? If the latter, they don't need a warrant just because your property is next to the public road they're filming.
Re: Re:
Intruding upon someone's privacy is not the same thing as someone opening their private areas up for full display to the public. The only possible issue for the government is that it's not allowed to keep those images for very long, except for any it has probable cause to connect to some actual crime.
Re: Re: Re:
For example, one could park their vehicle on the public road in a spot where parking is allowed and point several cameras at one particular private property. Use of binoculars will help in the test scenario.
Then simply wait for the police to show up ... or worse - the vigilantes.
Re:
I'd assume the manufacturer did *some* sampling using its own employees.
But for the cops not to do the same makes it very difficult to claim the software match was sufficient cause for a stop. The days of people believing computers don't make mistakes are long gone.
Now, if along with their "hot list" of criminals, they scanned EVERY cop into the system, AND the system matched them frequently (and correctly), they'd have grounds to have such stops accepted as evidence (in the US, not up on Brit law).
And how could any cop decline to be scanned in? After all, if you've got nothing to hide, you've got nothing to worry about, right?
Re: Re: Re:
I'm not saying to add the cops to a test pool - I'm saying ALL cops should be REQUIRED to be in the database, and matches to them flagged.
Think what the bodycams were supposed to do.
If they're doing facial recognition on crowds of innocents, how can they justify NOT having the cops in that database?
Cops are a necessary evil. Early in the dim mists of time, society as a whole realized it was marginally better to have them on the inside urinating out than outside urinating in.
But they need watching, and what better tools to watch them than the ones they deploy against every non-cop?
Re: Re: Re: Re:
"I'm saying ALL cops should be REQUIRED to be in the database, and matches to them flagged."
They may well be - AFTER the tests are concluded. If you're calling for them to be in the tests themselves, you're asking for potentially biased tests - the results of which will be used to justify full rollout of this technology, both in the UK and US. I'm sure that's not what you mean to be asking for, but you are.
Re: Re:
The issues are: A) people don't understand what they are telling the computer to do; B) we are bad at telling a computer how to do a bunch of things that we do at an instinctual level; C) software engineers and police don't think the same way; and D) the computers at issue are dealing with maybes but are being programmed with yes-or-no responses.
I see it all the time in my office. People THINK the computer is just doing "whatever it wants". But they don't understand what the computer is doing or how it is doing it, so when it acts differently than they expect, it's 'going crazy'.
Re: Re: Re:
As a software engineer, I commend you on your understanding of computers that many outside of the engineering fields don't get. Computers do what they are told, very quickly and very efficiently. We usually run into trouble with "software" and "data" because the humans involved don't fully understand what they've told the computer to do or don't understand how their code will act/react on data coming in from the real world.
"Works in the lab/on my computer" is often taken to mean that things will work the same in the real world... and they never do.
Re: Re: Re:
That depends on how you define making a mistake. And it gets much fuzzier when you throw in machine learning (I'm not sure if this system uses that). Even without it, if the system is supposed to identify people in a database, and it has false positives, I'd say the system (meaning the software and hardware) made a mistake. Did the machine correctly follow the instruction sets that were fed to the processor? Yes, but we can look at computer systems at a higher level than that, and analyze whether they are fulfilling their function correctly. In this case, this system was not.
who makes balaclavas?
Re:
Aston Martins sure are nice.
Re:
"is there actually ANYTHING that the UK can do right"
Fudge, marmalade, various curds. They used to be very good at dry British humor as well, but they've been slipping in that regard now that the Flying Circus isn't active any more.
https://www.theverge.com/2018/1/12/16882408/google-racist-gorillas-photo-recognition-algorithm-ai
Several Points
So the police can accidentally video your living room window - but they should not keep that data (or any data) forever, or even for months.
The same applies to license plate data - perhaps the police compile lists of license plates. But they should not have a collection going back years, showing every movement your car has performed over that time - i.e. your complete movements for the past year. Maybe with a warrant they can collect ongoing data from these devices for specific individuals; the rest should be deleted in a reasonable time (a week? Two weeks?)
Building a similar inaccurate database of "facially identified" people with a flawed program is ripe for abuse. "Evidence" will incriminate perfectly innocent people. "Your face was identified walking toward the crime scene. We have video of you in your living room 3 weeks before, wearing the same shirt the perp did. Our license plate reader saw you drive by 4 blocks from the crime scene an hour before. Please come with us."
Of course, trying to test facial recognition in a location that contains probably one of the largest collections of different faces - major international tourist destinations - is sure to catch the largest possible incidence of doppelgangers. More interesting would be to see how many of these false-positive faces were of other ethnic extractions. There are already articles suggesting the tech fails excessively for Chinese and black faces.
Re: Several Points
I know the footage from the two local "traffic" cameras near my home is archived... forever.
And while the local cops only get real-time, the State cops can access the entire archive.
Zoom - Enhance - Rotate ... exact match again!
Re: Re: Re:
It's really kind of a fascinating middle-step between The Dark Knight and Westworld (Jonathan Nolan wrote or co-wrote all three).
Re: Re: Re:
Guess I missed them.
...you missed The Dark Knight?
At a certain point, "deliberately avoided" is the correct term to use.
Stretch goals are important!
btw - my all-time fav Dilbert cartoon is the one about safety accident goals and working harder to reach that number since there weren't enough safety accidents the previous year.
Testing testing testing
I would have chosen a more challenging yet controlled environment: the foyers of a number of the larger police stations. That would have provided a significant number of hits for felons and cops alike.
'You first'
Ignoring for a moment privacy implications, want to make sure that the tech is tested and accurate before it's aimed at the public? Aim the cameras at the entrance to police stations and government buildings first, with the public given the same access to that data as the police get to the data from cameras aimed at the public.
I suspect that accuracy (or lack thereof) would suddenly become a very important selling point, practically overnight.
Dogberry tech
"The tech is basically a police force on steroids -- capable of demanding ID from thousands of people per minute... The difference is no one's approaching citizens to demand they identify themselves."
To be fair, there is another difference: the tech does not detain those it cannot identify -- at least, not yet. Shakespeare himself made fun of watchmen who behave like that. ("Why, then, take no note of him, but let him go; and presently call the rest of the watch together and thank God you are rid of a knave.")