Yet Another Bad Idea: Dropping Facial Recognition Software Into Police Body Cameras

from the Citizen-Rolodex dept

The FBI (along with other US government agencies) is already moving forward with facial recognition technology which, if everything goes to plan, will allow law enforcement to scan people like license plates. So far, consultation with the meddling public has been kept to a minimum, as have any government efforts to address civil liberties concerns.

Just because the public's been kept out of the loop (except for, you know, their faces and other personal information) doesn't mean the private sector isn't working hard to ensure police officers can start running faces like plates, even when there's no legitimate law enforcement reason for doing so.

Digital Barriers, a somewhat ironically-named tech company, is pushing its latest law enforcement offering -- one that supposedly provides real-time face scanning.

The software can pick out and identify hundreds of individual faces at a time, instantly checking them against registered databases or registering unique individuals in seconds.

Demonstrating the software at the Forensics Europe Expo 2017, vice president of Digital Barriers Manuel Magalhaes said the company was introducing the technology to UK forces.

He said: “For the first time they (law enforcement) can use any surveillance asset including a body worn camera or a smartphone and for the first time they can do real time facial recognition without having the need to control the subject or the environment.

“In real time you can spot check persons of interests on their own or in a crowd."

But why would you? Just because it can be done doesn't mean it should be done. This will basically allow officers to run records checks on everyone who passes in front of their body-worn cameras. There is nothing in the law that allows officers to run checks on everyone they pass. They can't even stop and/or frisk every member of the public just because they're out in public. Expectations of privacy are lowered on public streets, but that doesn't make it reasonable to subject every passerby to a records check. And that's without even factoring in the false positive problem. Our own FBI seems to feel a 15% bogus return rate is perfectly acceptable.

Like so much surveillance equipment sold to law enforcement agencies, Digital Barriers' offering was developed and tested in one of our many war zones. The head of the company is inordinately proud of the product's pedigree, which leads to a statement that could be taken as bigoted if it weren't merely nonsensical.

Mr Magalhaes continued: “If we can overcome facial recognition issues in the Middle East, we can solve any facial recognition problem here in the United Kingdom.”

Hopefully, this just refers to the sort of issues normally found in areas of conflict (hit-and-miss communications infrastructure, harsher-than-usual working conditions, etc.), rather than hinting Middle Eastern facial features are all kind of same-y.

Taking the surveillance out of the Middle East isn't going to solve at least one logistical problem keeping this from becoming a day-to-day reality for already heavily-surveilled UK citizens. As officers point out in the discussion thread, Digital Barriers' real-time face scanning is going to need far more bandwidth than is readily available to law enforcement. One commenter notes they can't even get a strong enough signal to log into their email out in the field, much less perform the on-the-fly facial recognition Digital Barriers is promising.

The other pressing issue -- according to the law enforcement members discussing the post -- is one far more aligned with the general public's concerns. A couple of members point out that no one PNCs entire crowds (referring to the UK's law enforcement database, the Police National Computer) and that doing so might not even be legal.

Unfortunately, the rank-and-file rarely get to make these decisions. These choices will be made by people who think the public needs to give 'til it hurts when safety and security are on the line. Dropping this capability into body cameras will make them more of an intrusion into the lives of citizens and far less likely to result in police accountability. Automatically linking faces to databases full of personal info will also complicate obtaining camera footage. It won't result in improved policing, even though there are plenty of supporters who mistakenly believe "easier" is synonymous with "better."



Filed Under: body cameras, face recognition, police


Reader Comments



  1. That One Guy (profile), 22 May 2017 @ 12:30pm

    Like a one-way mirror

    I would hope that the UK police are more 'camera friendly' than the US ones if they plan on rolling something like this out. Beyond the privacy concerns it would be just a titch hypocritical if the police objected to people recording them while they constantly record and check anyone they interact with.


  2. Anonymous Anonymous Coward (profile), 22 May 2017 @ 12:31pm

    Papers Please. No, forget the please, just give me your damn documents that verify your right to exist.

    Just how many steps away are we from having government-required serial numbers tattooed on our foreheads at birth, worldwide?

    At some point there WILL be revolt, and it may spread worldwide. Then what do governments do? Are they really as short-sighted as the auto-trading algorithms that Wall Street uses?

    I think yes.


  3. Anonymous Coward, 22 May 2017 @ 1:31pm

    Perhaps this isn't such a bad idea...

    What if there was an app for smartphones, linked to a facial recognition database of police officers accused of excessive violence, being an all-around dick, harassment, etc., that immediately puts the camera into record mode while calling a lawyer?

    I'd say we could scale that back to convicted police officers, but since they're rarely if ever held accountable for anything, that database would be mostly useless.


  4. JoeCool (profile), 22 May 2017 @ 2:01pm

    Re: Papers Please. No, forget the please, just give me your damn documents that verify your right to exist.

    Don't be silly! The tattoo's a BAR CODE, not a serial number. Much easier to scan that way.


  5. Anonymous Coward, 22 May 2017 @ 2:08pm

    Time for the west to adopt the Asian style of surgical/dust masks in public?


  6. aerinai (profile), 22 May 2017 @ 2:10pm

    If a person can do it, a computer should be able to do it.

    So I'm definitely going to swim against the current on this one a little bit... This technology isn't inherently bad; what matters is how it's used (like license plate scanners).

    If the police were told 'Bob Jones is a bail jumper wanted for murder, here's his picture, be on the lookout' and then a cop pulls over someone who meets this description, I'd say the officer is using his best judgement. I don't think anyone would fault the officer for doing this. I would also bet money that the officer is wrong far more than 15% of the time. I can see this technology being great at catching criminals, assuming they don't take it too far.

    I alluded to license plate scanners... if you just scan and dump when there is no reason to keep the information (or even a transient 72-hour hold), I don't see a big deal with this (rough sketch at the end of this comment). A cop could do the exact same thing with his brain. You are just automating it... But the problem comes in when you begin to perpetually store this stuff and start using that data to cross-check and query... THAT is where it crosses a line.

    I would also say giving the officer personal information about a person he doesn't have a reason to know about is also a step too far... like hooking facial recognition into Facebook or a database of non-violent felons. Assuming they are just doing this stuff behind the scenes in a computer somewhere (and the data is dumped after a period), I'm OK with this.

    However, this isn't the software or technology's fault; it is how it is implemented and used.

    And yes... I know that they will totally be abusing this... but assuming they put in proper safeguards (which they probably won't...) I would be fine with this.
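    Since "scan and dump" is really just a data-retention rule, here is a minimal Python sketch of the idea under stated assumptions -- the names and structure are invented for illustration and don't reflect any actual agency system: keep a record only for watch-list hits, and expire everything else once the transient hold lapses.

    ```python
    import time
    from dataclasses import dataclass, field

    HOLD_SECONDS = 72 * 3600  # the transient 72-hour hold mentioned above


    @dataclass
    class ScanLog:
        # each entry is (timestamp, face_id, hit)
        entries: list = field(default_factory=list)

        def record(self, face_id: str, watch_list: set) -> bool:
            """Check one scanned face against the watch list and log the result."""
            hit = face_id in watch_list
            self.entries.append((time.time(), face_id, hit))
            return hit

        def purge(self) -> None:
            """Dump every non-hit scan once it ages past the hold window."""
            cutoff = time.time() - HOLD_SECONDS
            self.entries = [e for e in self.entries if e[2] or e[0] >= cutoff]


    log = ScanLog()
    log.record("face_0421", watch_list={"face_1234"})  # no hit -> nothing worth keeping
    log.purge()                                        # non-hits vanish after 72 hours
    ```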


  7. Anonymous Coward, 22 May 2017 @ 2:25pm

    Re: Papers Please. No, forget the please, just give me your damn documents that verify your right to exist.

    I think governments are pretty happy with the current situation. Most people carry around a tracking device and use cashless forms of payment. A revolt is unlikely while everyone continues to fund their own surveillance.


  8. Kay (profile), 22 May 2017 @ 2:26pm

    How accurate?

    I'm curious how accurate this particular implementation of the technology is...

    Seems like it would be just a matter of time until the incorrect person is identified, runs because they're scared and gets killed.


  9. Anonymous Coward, 22 May 2017 @ 2:29pm

    ...which leads to a statement that could be taken as bigoted if it weren't merely nonsensical.

    Hopefully, this just refers to the sort of issues normally found in areas of conflict (hit-and-miss communications infrastructure, harsher-than-usual working conditions, etc.),

    So which is it? Is it nonsensical, or is it potentially referring to very real problems with deploying new technology in real-life situations?

    I'm willing to give Tim the benefit of the doubt on leveling thinly supported racism accusations against his "opponent," but claiming the statement is nonsensical beforehand, and then providing several perfectly reasonable and easily deducible explanations immediately afterward is pushing the bounds of credulity.


  10. Anonymous Coward, 22 May 2017 @ 2:48pm

    It is already going to be applied

    Don't we already know that a prominent company is offering to supply cops around the country with free bodycams, in return for exclusive rights to the video from those cams? Do you honestly think they aren't planning on using any and every tool at their disposal to data-mine that video for as much as they possibly can? Soon we will see CCTV cams going up everywhere with similar terms, turning public spaces into privacy-free zones.


  11. Anonymous Coward, 22 May 2017 @ 2:49pm

    I predict a future in which face-concealing clothing becomes the standard for everyone.


  12. That Anonymous Coward (profile), 22 May 2017 @ 2:51pm

    Re: Papers Please. No, forget the please, just give me your damn documents that verify your right to exist.

    I am really annoyed with people crediting me with the whole number assigned to people thing.
    Just because they say I'm going to do it before the end times, doesn't mean I am.

    They've screwed everything up way more than I ever could have.


  13. Anonymous Anonymous Coward (profile), 22 May 2017 @ 3:06pm

    Re: Re: Papers Please. No, forget the please, just give me your damn documents that verify your right to exist.

    I am really annoyed with people crediting me with the whole number assigned to people thing.

    This could be taken as you want to be a fraction. :)


  14. Anonymous Coward, 22 May 2017 @ 3:16pm

    Re: If a person can do it, a computer should be able to do it.

    It's inevitable. The technology is going to rapidly get less expensive and better.

    I want this for my own use.


  15. Anonymous Coward, 22 May 2017 @ 3:22pm

    Middle East difficulty may be a reference to the common use of burkas that obscure most of the face


  16. Anonymous Coward, 22 May 2017 @ 3:23pm

    Response to: Anonymous Coward on May 22nd, 2017 @ 3:22pm

    And thick beards


  17. Roger Strong (profile), 22 May 2017 @ 4:01pm

    There are people who keep being given the pavement taste test by police because a criminal has stolen their identity. Every time their name is removed from police records, it gets added back in when police share data with another jurisdiction.

    Now it'll only take roughly the same bone structure in your face. And with daily crowd-level scanning by large numbers of police, there are going to be A LOT of false positives.

    Digital Barriers - Soon to be the answer to the trivia question "How are so many Britons able to identify an area based on the taste of the pavement?"


  18. Anonymous Coward, 22 May 2017 @ 4:17pm

    Re: If a person can do it, a computer should be able to do it.

    If the technology exists it will be abused. For now people rail against the technology. Then, when it is inevitably abused, people will rail against the abuse instead.

    All of the railing is hollow, however. Until people get sufficiently pissed off to do something about it nothing will be done. And by the time people get sufficiently pissed off it may well be too late.


  19. sigalrm (profile), 22 May 2017 @ 4:57pm

    Re: How accurate?

    "I'm curious how accurate this particular implementation of the technology is..."

    The article above states a false positive rate of 15%. To put that in perspective, Centurylink Field in Seattle has a listed max-capacity of 67,000 people, which means that over the course of a Seahawks game, as many as 10,050 people would be misidentified.

    "Seems like it would be just a matter of time until the incorrect person is identified, runs because they're scared and gets killed."

    Not a problem - qualified immunity covers that scenario. Besides, "only guilty people run". /s


  20. Faceguy, 22 May 2017 @ 5:06pm

    Facial Recognition

    Don't fool yourself. There is software with a far better false positive rate than 15%. It would have been available back in 2008, but the big crash stopped any financial backing to build the company. A year later, it went belly up. I know it worked.

    Not that I am concerned about it today. It wasn't hard to see where it was going, and what it could possibly be used for. Where it went after that is anyone's guess. I simply moved on, but I do know it existed. I worked on it!

    I didn't find 15% acceptable. We were in the 3-5% false rate range, and I didn't find even that acceptable; in those cases, it simply said no match. It could distinguish between identical twins! That always amazed me. At that time, given a large enough database, that certainly beat the current software. By now it could have been much better!


  21. sigalrm (profile), 22 May 2017 @ 5:06pm

    Re: Re: How accurate?

    I just realized the figure above is for American football, and Digital Barriers is a UK company, so let's try this:

    According to Wikipedia, Emirates Stadium, home of Arsenal FC in London, England, has a capacity of "over 60,000".

    Assuming a 15% false positive rate, grade-school math says ~9,000 people per Arsenal game would be misidentified.
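    For anyone who wants to check that grade-school math, a quick Python sketch using the stadium capacities above, treating the article's 15% figure as a per-scan false positive rate (that framing is an assumption; the vendor hasn't published its error rates):

    ```python
    # Expected misidentifications from one full scan of a crowd, assuming the 15%
    # false positive rate applies independently to each face scanned.

    def expected_false_positives(crowd_size: int, false_positive_rate: float) -> int:
        return round(crowd_size * false_positive_rate)

    for venue, capacity in [("CenturyLink Field", 67_000), ("Emirates Stadium", 60_000)]:
        print(f"{venue}: ~{expected_false_positives(capacity, 0.15):,} people misidentified")

    # CenturyLink Field: ~10,050 people misidentified
    # Emirates Stadium: ~9,000 people misidentified
    ```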


  22. Anonymous Coward, 22 May 2017 @ 5:08pm

    Re: Re: If a person can do it, a computer should be able to do it.

    You need to vote - all of you, no excuses!


  23. Anonymous Coward, 22 May 2017 @ 5:10pm

    Re:

    The future is now


  24. Anonymous Coward, 22 May 2017 @ 5:15pm

    Which has a higher rate of false positives ... facial recognition or roadside drug tests?

    It does not matter; they can just say their extensive training allows them to determine you need a good beatin' and whatnot. Jeffy boy Sessions (who looks like Granny from the Beverly Hillbillies) is here to save us all from the depraved criminals out there who belong in prison ... gotta protect those dividends.


  25. Anonymous Anonymous Coward (profile), 22 May 2017 @ 5:23pm

    Re:

    I thought he looked more like Recep Tayyip Erdogan, and you know, that creature from that movie.


  26. Roger Strong (profile), 22 May 2017 @ 5:28pm

    Re: Facial Recognition

    Suppose they get the false rate down to 2%.

    We're talking here about cameras on cops in crowds. Say, 30 cops walking around downtown London or at a major sporting event or concert. The cameras each scanning 2000 faces an hour.

    That's 1200 false positives an hour. Compared to, what, one actual wanted person a week? That's still not practical.

    It reminds me of a review of some speech-to-text software a few years back: Voice artists might get better results, but the average schmuck was only going to get a "mere" 98% accuracy. The hassle of correcting the software every 50 words was still enough to render it not worth using.
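    A small Python sketch of that crowd-scanning arithmetic, using the figures above; the "wanted faces per million" prevalence is an invented assumption, there only to show how thoroughly false alarms would swamp genuine matches:

    ```python
    def hourly_scan_outcomes(officers: int, faces_per_camera_per_hour: int,
                             false_positive_rate: float, wanted_per_million: float = 1.0):
        scans = officers * faces_per_camera_per_hour         # 30 * 2000 = 60,000 faces/hour
        false_alarms = scans * false_positive_rate           # 60,000 * 0.02 = 1,200 per hour
        true_hits = scans * wanted_per_million / 1_000_000   # assumed prevalence: ~0.06 per hour
        return scans, false_alarms, true_hits

    scans, fp, tp = hourly_scan_outcomes(officers=30, faces_per_camera_per_hour=2000,
                                         false_positive_rate=0.02)
    print(f"{scans:,} scans/hour -> ~{fp:,.0f} false alarms vs ~{tp:.2f} genuine matches")
    # 60,000 scans/hour -> ~1,200 false alarms vs ~0.06 genuine matches
    ```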


  27. orbitalinsertion (profile), 22 May 2017 @ 5:30pm

    Re:

    It's a standard problem with all manner of things which are designed or studied against the dominant culture (or WEIRD groups), which has inherent bigoted assumptions given they always want to deploy or generalize to the whole world. In fact, usually not against the group they took their assumed metrics from whatsoever. Even if the guy isn't making some kind of allusion to "they all look alike", he is still implying it because why bother mentioning "in the Middle East"?

    Yes, there is a problem with deploying the technology, and _technologically_ it is irrelevant against whom you are deploying it.

    It could simply mean that usage in Middle East was pretty much their only market prior, but that there is pretty much predicated on bigotry in the first place, even if Magalhaes didn't just happen to process a bit of endemic cultural racist script or outright intentionally other "Middle Easterners".

    Was it worth noting? I probably would have noticed for a moment myself. But there is a lot of baked-in cultural bigotries, sometimes subtle, but generally the responses to potential moments in bigotry (conscious or unconscious) being pointed out are a bit telling.


  28. orbitalinsertion (profile), 22 May 2017 @ 5:32pm

    Why don't they just troll phonebooks or go door to door every day?


  29. Anonymous Coward, 22 May 2017 @ 5:33pm

    Re: If a person can do it, a computer should be able to do it.

    Let's just attach a minigun to the camera and be done with it.


  30. orbitalinsertion (profile), 22 May 2017 @ 5:37pm

    Re: Facial Recognition

    So they are/were leaps and bounds ahead of everyone else whose implementations still give significant false positive rates?

    I suppose it doesn't matter since cops can always claim they are better than FR at a thousand meters in the dark in rain and fog when it turns out either or both misidentified someone. Don't care what FR, DNA tests, ID, or anything else says, he's the guy! I had to empty three clips into him. He was gonna turn me into a newt.


  31. Agammamon, 22 May 2017 @ 6:11pm

    >rather than hinting Middle Eastern facial features are all kind of same-y.

    I think it's more a reference to the commonness of beards and face coverings than a 'they all look alike' thing.


  32. My_Name_Here, 22 May 2017 @ 7:00pm

    I previously warned that insisting on increased usage of body cameras on police would cause problems. Footage of victims would be stored because Tim Cushing wants his thrills of catching police in "gotcha!" moments.

    And now the pendulum has swung the other way. Enjoy your fleeting privacy, Cushing. You asked for it.


  33. mhajicek (profile), 22 May 2017 @ 9:45pm

    Re: Papers Please. No, forget the please, just give me your damn documents that verify your right to exist.

    Don't be silly. Existing isn't a right, it's a privilege. Have you paid your license fee?


  34. mhajicek (profile), 22 May 2017 @ 9:49pm

    Re:

    It will be outlawed.


  35. Anonymous Coward, 23 May 2017 @ 12:19am

    Another reason it's good

    Cops will have a reason to keep their cameras on, instead of finding all the reasons for them to be off.


  36. Anonymous Coward, 23 May 2017 @ 6:23am

    Re: Re:

    Hoodie == Terrorist


  37. Anonymous Coward, 23 May 2017 @ 6:30am

    Re: Re: Facial Recognition

    Facial recognition software can be tuned. I'm guessing for civilian use, they are going to set the dials to accept some false negatives to avoid false positives.

    Speech-to-text software works remarkably well now for some applications. I use it daily for sending text messages because I can compose a 100-character message faster with my voice than with my fingers. There might be an error or two in the text, but most of the time the errors aren't substantial.
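    Going back to the "set the dials" point above, a toy Python sketch of what that tuning trade-off looks like: raise the match threshold and false accepts fall while false rejects climb. The score distributions are invented for illustration; only the direction of the trade-off carries over to real systems.

    ```python
    import random

    random.seed(0)
    impostor_scores = [random.gauss(0.30, 0.10) for _ in range(10_000)]  # different people
    genuine_scores = [random.gauss(0.70, 0.10) for _ in range(10_000)]   # same person

    for threshold in (0.45, 0.55, 0.65):
        false_accepts = sum(s >= threshold for s in impostor_scores) / len(impostor_scores)
        false_rejects = sum(s < threshold for s in genuine_scores) / len(genuine_scores)
        print(f"threshold {threshold:.2f}: "
              f"false accepts {false_accepts:.1%}, false rejects {false_rejects:.1%}")
    ```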


  38. Kay (profile), 23 May 2017 @ 8:23am

    Re: Re: How accurate?

    Thanks, I didn't see the false positive rate in the article I read. I even went to their web site, which doesn't seem to have any mention of this technology being deployed for LEOs.

    15% seems unacceptably high... I wonder how often it fails to identify someone; I bet that's even higher.


  39. Anonymous Coward, 23 May 2017 @ 9:42am

    Re: Re: If a person can do it, a computer should be able to do it.

    I assume stalking is one of your hobbies?


  40. Faceguy, 23 May 2017 @ 10:21am

    Facial Recognition

    In my experience, the higher the camera resolution, the lower the fail rate. I really doubt an officer's chest cam has enough storage for a day's worth of video at that quality. Now, the new 4K cameras would have cut the 2008 cameras' fail rate down to the 1-2 percent range. The system I worked on used two cameras, so image storage would have been doubled.

    I do agree wireless "real time" transmissions to a processing site would be the main bottleneck, even with compression of the data.
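    As a rough back-of-envelope on that bottleneck (every number below is an assumed ballpark, not anything Digital Barriers has published), continuously streaming compressed video needs far more uplink per officer than shipping only detected face crops to a matching server:

    ```python
    MEGABIT = 1_000_000  # bits

    def full_stream_mbps(video_bitrate_mbps: float = 4.0) -> float:
        """Continuously streaming compressed 1080p video (assumed ~4 Mbps)."""
        return video_bitrate_mbps

    def face_crops_mbps(faces_per_second: float = 1.0, crop_kilobytes: float = 30.0) -> float:
        """Uploading only JPEG face crops (assumed ~30 KB each) for server-side matching."""
        bits_per_second = faces_per_second * crop_kilobytes * 1000 * 8
        return bits_per_second / MEGABIT

    print(f"full video stream: ~{full_stream_mbps():.1f} Mbps per officer")
    print(f"face crops only:   ~{face_crops_mbps():.2f} Mbps per officer")
    # full video stream: ~4.0 Mbps per officer
    # face crops only:   ~0.24 Mbps per officer
    ```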


  41. Ishtiaq (profile), 23 May 2017 @ 11:30am

    Re: My_Name_Here

    @my name here (is shit)

    Why don't you just shut your fucking cakehole?

    Cheers… Ishy


  42. Anon, 24 May 2017 @ 5:31am

    Re: Re: Facial Recognition

    The false positive rate of state-of-the-art face rec systems is under 0.5% FR at 1% FA and dropping all the time.


  43. Wendy Cockcroft, 25 May 2017 @ 5:41am

    Re:

    As a matter of fact, he just wants them to be subject to the law, not above it. Why is that so hard for you to understand?

    And, as you've pointed out, if they have power, they abuse it, as Tim has been saying for ages.


