Yet Another Bad Idea: Dropping Facial Recognition Software Into Police Body Cameras
from the Citizen-Rolodex dept
The FBI (and other US government agencies) are already moving forward with facial recognition technology which, if everything goes to plan, will allow law enforcement to scan people's faces the way it currently scans license plates. So far, consultation with the meddling public has been kept to a minimum, as have any government efforts to address civil liberties concerns.
Just because the public's been kept out of the loop (except for, you know, their faces and other personal information) doesn't mean members of the public aren't working hard to ensure police officers can start running faces like plates, even when there's no legitimate law enforcement reason for doing so.
Digital Barriers, a somewhat ironically-named tech company, is pushing its latest law enforcement offering -- one that supposedly provides real-time face scanning.
The software can pick out and identify hundreds of individual faces at a time, instantly checking them against registered databases or registering unique individuals in seconds.
Demonstrating the software at the Forensics Europe Expo 2017, vice president of Digital Barriers Manuel Magalhaes said the company was introducing the technology to UK forces.
He said: “For the first time they (law enforcement) can use any surveillance asset including a body worn camera or a smartphone and for the first time they can do real time facial recognition without having the need to control the subject or the environment.
“In real time you can spot check persons of interest on their own or in a crowd.”
But why would you? Just because it can be done doesn't mean it should be done. This will basically allow officers to run records checks on everyone who passes in front of their body-worn cameras. There is nothing in the law that allows officers to run checks on everyone they pass. They can't even stop and/or frisk every member of the public just because they're out in public. Expectations of privacy are lowered on public streets, but that doesn't make it reasonable to subject every passerby to a records check. And that's without even factoring in the false positive problem. Our own FBI seems to feel a 15% bogus return rate is perfectly acceptable.
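To put that failure rate in rough numbers, here's a quick back-of-the-envelope sketch (in Python; the crowd sizes are purely illustrative assumptions, not figures from the FBI or Digital Barriers):

# Expected false matches when every face passing a camera is checked
# against a watchlist. The 15% rate is the FBI figure cited above;
# the crowd sizes are illustrative assumptions, not real deployments.
def expected_false_matches(faces_scanned, false_positive_rate=0.15):
    return faces_scanned * false_positive_rate

for crowd in (1_000, 10_000, 67_000):  # a busy street, a station, a stadium
    print(f"{crowd:>6} faces scanned -> ~{expected_false_matches(crowd):,.0f} false matches")

Even generous improvements to that error rate still leave the math looking ugly once every passerby becomes a database query.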
Like so much surveillance equipment sold to law enforcement agencies, Digital Barriers' offering was developed and tested in one of our many war zones. The head of the company is inordinately proud of the product's pedigree, which leads to a statement that could be taken as bigoted if it weren't merely nonsensical.
Mr Magalhaes continued: “If we can overcome facial recognition issues in the Middle East, we can solve any facial recognition problem here in the United Kingdom.”
Hopefully, this just refers to the sort of issues normally found in areas of conflict (hit-and-miss communications infrastructure, harsher-than-usual working conditions, etc.), rather than hinting Middle Eastern facial features are all kind of same-y.
Taking the surveillance tech out of the Middle East isn't going to solve at least one logistical problem keeping this from becoming a day-to-day reality for already heavily-surveilled UK citizens. As officers point out in the discussion thread, Digital Barriers' real-time face scanning is going to need far more bandwidth than is readily available to law enforcement. One commenter notes they can't even get a strong enough signal to log into their email out in the field, much less perform the on-the-fly facial recognition Digital Barriers is promising.
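For a sense of the gap, here's a rough sketch. Every number in it is an assumption chosen for illustration (stream bitrate, field uplink speed, face-crop size, detection rate) rather than anything from Digital Barriers' spec sheets, and the crop-upload approach is just one hypothetical way to shrink the load:

# All figures are illustrative assumptions, not vendor numbers.
VIDEO_BITRATE_MBPS = 2.0   # assumed: a modest 720p body-cam stream
FIELD_UPLINK_MBPS = 1.0    # assumed: a weak mobile uplink in the field
FACE_CROP_KB = 30          # assumed: one JPEG face crop sent for matching
FACES_PER_MINUTE = 50      # assumed: faces detected on a busy street

video_share = VIDEO_BITRATE_MBPS / FIELD_UPLINK_MBPS
crop_kbps = FACE_CROP_KB * 8 * FACES_PER_MINUTE / 60  # kilobits per second

print(f"Streaming full video needs {video_share:.0%} of the available uplink.")
print(f"Uploading only face crops needs ~{crop_kbps:.0f} kbps "
      f"({crop_kbps / (FIELD_UPLINK_MBPS * 1000):.0%} of the uplink).")

Either way, that's a sustained per-camera load on connections that, per the officers quoted, already struggle to deliver email.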
The other pressing issue -- according to the law enforcement members discussing the post -- is one far more aligned with the general public's concerns. A couple of members point out that no one PNCs entire crowds (referring to the UK's law enforcement database, the Police National Computer) and that doing so might not even be legal.
Unfortunately, the rank-and-file rarely get to make these decisions. These choices will be made by people who think the public needs to give 'til it hurts when safety and security are on the line. Dropping this capability into body cameras will make them more of an intrusion on the lives of citizens and far less likely to result in police accountability. Faces being linked automatically to databases full of personal info creates complications in obtaining camera footage. It won't result in improved policing, even though there are plenty of supporters who mistakenly believe "easier" is synonymous with "better."
Filed Under: body cameras, face recognition, police
Reader Comments
Like a one-way mirror
I would hope that the UK police are more 'camera friendly' than the US ones if they plan on rolling something like this out. Beyond the privacy concerns it would be just a titch hypocritical if the police objected to people recording them while they constantly record and check anyone they interact with.
Papers Please. No, forget the please, just give me your damn documents that verify your right to exist.
At some point there WILL be revolt, and it may spread worldwide. Then what do governments do? Are they really as short-sighted as the auto-trading algorithms that Wall Street uses?
I think yes.
Re: Papers Please. No, forget the please, just give me your damn documents that verify your right to exist.
Just because they say I'm going to do it before the end times, doesn't mean I am.
They've screwed everything up way more than I ever could have.
Re: Re: Papers Please. No, forget the please, just give me your damn documents that verify your right to exist.
This could be taken as you want to be a fraction. :)
Perhaps this isn't such a bad idea...
What if there were an app for smartphones, linked to a facial recognition database of police officers accused of excessive violence, all-around being a dick, harassment, etc., that immediately puts the camera into record mode while calling a lawyer?
I'd say we could scale that back to convicted police officers, but since they're rarely if ever held accountable for anything, that database would be mostly useless.
If a person can do it, a computer should be able to do it.
If the police were told 'Bob Jones is a bail jumper wanted for murder, here's his picture, be on the lookout' and then a cop pulls over someone who meets that description, I'd say the officer is using his best judgement. I don't think anyone would fault the officer for doing this. I would also bet money that the officer is wrong much more than 15% of the time. I can see this technology being great at catching criminals, assuming they don't take it too far.
I alluded to license plate scanners... if you just scan and dump when there is no reason to keep the information (or even a transient 72-hour hold), I don't see a big deal with this. A cop could do the exact same thing with his brain; you are just automating it. But the problem comes in when you begin to perpetually store this stuff and start using that data to cross-check and query... THAT is where it crosses a line.
I would also say giving the officer personal information about a person he doesn't have a reason to know is also a step too far... hooking facial recognition into Facebook or a database of non-violent felons, for example. Assuming they are just doing this stuff behind the scenes in a computer somewhere (and the data is dumped after a period), I'm OK with this.
However, this isn't the software or technology's fault; it is how it is implemented and used.
And yes... I know that they will totally be abusing this... but assuming they put in proper safeguards (which they probably won't...) I would be fine with this.
Re: If a person can do it, a computer should be able to do it.
I want this for my own use.
Re: If a person can do it, a computer should be able to do it.
All of the railing is hollow, however. Until people get sufficiently pissed off to do something about it nothing will be done. And by the time people get sufficiently pissed off it may well be too late.
How accurate?
Seems like it would be just a matter of time until the wrong person is identified, runs because they're scared, and gets killed.
Re: How accurate?
The article above states a false positive rate of 15%. To put that in perspective, CenturyLink Field in Seattle has a listed max capacity of 67,000 people, which means that over the course of a Seahawks game, as many as 10,050 people would be misidentified.
Not a problem - qualified immunity covers that scenario. Besides, "only guilty people run". /s
Re: Re: How accurate?
According to Wikipedia, Emirates Stadium, home of Arsenal FC in London, England, has a capacity of "over 60,000".
Assuming a 15% false positive rate, grade-school math says ~9,000 people per Arsenal game would be misidentified.
Re: Re: How accurate?
15% seems unacceptably high... I wonder how often it fails to identify someone, I bet that's even higher.
...which leads to a statement that could be taken as bigoted if it weren't merely nonsensical.
Hopefully, this just refers to the sort of issues normally found in areas of conflict (hit-and-miss communications infrastructure, harsher-than-usual working conditions, etc.),
So which is it? Is it nonsensical, or is it potentially referring to very real problems with deploying new technology in real-life situations?
I'm willing to give Tim the benefit of the doubt on leveling thinly supported racism accusations against his "opponent," but claiming the statement is nonsensical beforehand and then providing several perfectly reasonable and easily deducible explanations immediately afterward is pushing the bounds of credulity.
Re:
Yes, there is a problem with deploying the technology, and _technologically_ it is irrelevant against whom you are deploying it.
It could simply mean that the Middle East was pretty much their only market before now, but even that reading is pretty much predicated on bigotry in the first place, whether Magalhaes was just parroting a bit of endemic cultural racism or intentionally othering "Middle Easterners".
Was it worth noting? I probably would have paused on it for a moment myself. There are a lot of baked-in cultural bigotries, some of them subtle, and the responses when a potential moment of bigotry (conscious or unconscious) gets pointed out are generally a bit telling.
It is already going to be applied
Response to: Anonymous Coward on May 22nd, 2017 @ 3:22pm
There are people who keep being given the pavement taste test by police because a criminal has stolen their identity. Every time their name is removed from police records, it gets added back in when police share data with another jurisdiction.
Now it'll only take roughly the same bone structure in your face. And with daily crowd-level scanning by large numbers of police, there are going to be A LOT of false positives.
Digital Barriers - Soon to be the answer to the trivia question "How are so many Britons able to identify an area based on the taste of the pavement?"
Facial Recognition
Not that I am concerned about it today. It wasn't hard to see where it was going, and what it could possibly be used for. Where it went after that is anyone's guess. I simply moved on, but I do know it existed. I worked on it!
I didn't find 15% acceptable. We were in the 3-5% false rate, and I didn't find that acceptable. In those cases, it simply said no match. It could distinguish between identical twins! That always amazed me. At that time, given a large enough database, that certainly beat the current software. By now it could have been much better!
Re: Facial Recognition
We're talking here about cameras on cops in crowds -- say, 30 cops walking around downtown London or at a major sporting event or concert, with each camera scanning 2,000 faces an hour.
Even at a 2% false-positive rate, that's 1,200 false positives an hour. Compared to, what, one actual wanted person a week? That's still not practical.
It reminds me of a review of some speech-to-text software a few years back: Voice artists might get better results, but the average schmuck was only going to get a "mere" 98% accuracy. The hassle of correcting the software every 50 words was still enough to render it not worth using.
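For anyone who wants to redo that arithmetic with different assumptions, here's a quick sketch; the camera count and faces-per-hour figure are just the numbers kicked around in this thread, run against a few plausible false-positive rates:

# Parameters are the assumptions discussed in this thread, not vendor specs.
def false_alerts_per_hour(cameras, faces_per_hour, fp_rate):
    return cameras * faces_per_hour * fp_rate

cameras, faces_per_hour = 30, 2000
for rate in (0.02, 0.05, 0.15):
    alerts = false_alerts_per_hour(cameras, faces_per_hour, rate)
    print(f"false-positive rate {rate:.0%}: ~{alerts:,.0f} false alerts per hour")

At the article's 15% figure that's 9,000 false alerts an hour, all chasing what might be a single genuine hit in a week.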
Re: Re: Facial Recognition
Speech-to-text software works remarkably well now for some applications. I use it daily for sending text messages because I can compose a 100-character message faster with my voice than with my fingers. There might be an error or two in the text, but most of the time the errors aren't substantial.
Re: Facial Recognition
I suppose it doesn't matter since cops can always claim they are better than FR at a thousand meters in the dark in rain and fog when it turns out either or both misidentified someone. Don't care what FR, DNA tests, ID, or anything else says, he's the guy! I had to empty three clips into him. He was gonna turn me into a newt.
It does not matter; they can just say their extensive training allows them to determine you need a good beatin' and whatnot. Jeffy boy Sessions (who looks like Granny from the Beverly Hillbillies) is here to save us all from the depraved criminals out there who belong in prison ... gotta protect those dividends.
I think it's more a reference to the commonness of beards and face coverings than a 'they all look alike' thing.
And now the pendulum has swung the other way. Enjoy your fleeting privacy, Cushing. You asked for it.
Re: My_Name_Here
Why don't you just shut your fucking cakehole?
Cheers… Ishy
Re:
And, as you've pointed out, if they have power, they abuse it, as Tim has been saying for ages.
Another reason it's good
Facial Recognition
I do agree wireless "real time" transmissions to a processing site would be the main bottleneck, even with compression of the data.