Facial Recognition Company Says It Won't Sell To Law Enforcement, Knowing It'll Be Abused
from the taking-a-stand dept
We just recently wrote about employees at Amazon speaking out inside the company to complain about Amazon selling its facial recognition tool, "Rekognition," to law enforcement. That prompted Brian Brackeen, CEO of facial recognition software maker Kairos, to publicly state that his company will not sell to law enforcement.
The full article is worth reading, but he notes that the technology will be abused and misused by law enforcement -- and often in a way that will lead to false arrests and murder:
Having the privilege of a comprehensive understanding of how the software works gives me a unique perspective that has shaped my positions about its uses. As a result, I (and my company) have come to believe that the use of commercial facial recognition in law enforcement or in government surveillance of any kind is wrong — and that it opens the door for gross misconduct by the morally corrupt.
To be truly effective, the algorithms powering facial recognition software require a massive amount of information. The more images of people of color it sees, the more likely it is to properly identify them. The problem is, existing software has not been exposed to enough images of people of color to be confidently relied upon to identify them.
And misidentification could lead to wrongful conviction, or far worse.
As he states later in the piece:
There is no place in America for facial recognition that supports false arrests and murder.
It's good to see this, and whether you support the police or not, we should appreciate this moment -- just as we should appreciate the people at Amazon who stood up and complained. Too often lately, the tech industry gets slammed for rushing to push innovation forward at any cost without taking into account the impact of its technology. I've always felt that narrative is a bit exaggerated: I talk to a lot of entrepreneurs who really do think quite a lot about how their technology may impact the world, both good and bad. But it's good to see people in the industry speaking out publicly about how that harm might happen, and why they need to make sure not to oversell the technology in ways that are likely to cause it.
Thank you for reading this Techdirt post. With so many things competing for everyone’s attention these days, we really appreciate you giving us your time. We work hard every day to put quality content out there for our community.
Techdirt is one of the few remaining truly independent media outlets. We do not have a giant corporation behind us, and we rely heavily on our community to support us, in an age when advertisers are increasingly uninterested in sponsoring small, independent sites — especially a site like ours that is unwilling to pull punches in its reporting and analysis.
While other websites have resorted to paywalls, registration requirements, and increasingly annoying/intrusive advertising, we have always kept Techdirt open and available to anyone. But in order to continue doing so, we need your support. We offer a variety of ways for our readers to support us, from direct donations to special subscriptions and cool merchandise — and every little bit helps. Thank you.
–The Techdirt Team
Filed Under: brian brackeen, face recognition, law enforcement
Companies: kairos
Reader Comments
Tragic
Re: Tragic
The evidence that we have seen doesn't really show that facial recognition will make people safer; in fact, there is a fair indication that the contrary would be true.
Re: Re: Tragic
And my comment wasn't specific to face recognition.
Re: Re: Re: Tragic
But, for the rest, I would counter that some tools cannot be used safely by anyone, even with best intentions.
Re: Re: Re: Re: Tragic
That said, a tool like facial recognition can be useful even with a very high error rate, simply because it can direct a human to take a closer look at a few faces in a sea of thousands of faces. From what I have seen, that is how the UK police use it: as a means of directing a trained officer to take a closer look, not as an indication that they should go and arrest a face. U.S. police practices, on the other hand, differ, and the technology becomes much more dangerous in their hands.
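The commenter's triage argument comes down to base-rate arithmetic: when a watchlist hit is rare, even a seemingly low false-positive rate means most alerts are wrong, so the system can only ever be a pointer for human review, never grounds for an arrest. A minimal sketch (all the rates and counts here are illustrative assumptions, not figures from any real deployment):

```python
# Illustrative base-rate arithmetic for facial recognition triage.
# Every number below is an assumption chosen for the example.

crowd_size = 50_000          # faces scanned at a large event
watchlist_present = 10       # people of interest actually in the crowd
true_positive_rate = 0.90    # chance a wanted face gets flagged
false_positive_rate = 0.01   # chance an innocent face gets flagged

# Expected alert counts.
true_alerts = watchlist_present * true_positive_rate
false_alerts = (crowd_size - watchlist_present) * false_positive_rate

# Precision: of all alerts raised, what fraction are actually hits?
precision = true_alerts / (true_alerts + false_alerts)

print(f"total alerts: {true_alerts + false_alerts:.0f}")   # → total alerts: 509
print(f"precision:    {precision:.1%}")                    # → precision:    1.8%
```

Under these assumptions roughly 98 out of every 100 alerts point at an innocent person, even though the per-face error rate is only 1%. That is consistent with the distinction drawn above: useful for telling an officer where to look, dangerous as a basis for action on its own.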
But will they use resellers who do?
More likely this is just a publicity stunt, and they've already filed a second set of incorporation papers for the company that will be used exclusively for government business. And that company will, of course, only hire people who have active secret clearances, so that if anybody talks about the product, they will get federal jail time.
This is how things are done now. Businesses that do business with the state compartmentalize to allow the state to implement more brutal oversight over their employees. The company externalizes its internal security costs to the state, and the state gets to keep its accelerating slide into fascism a secret a little longer.
"Facial Recognition Company Says It Won't Sell To Law Enforcement"
Because they have already sold it to all of them.
Even in the event of a compartmentalised corporate structure that allows abusive trade to be laundered out (as per AC's comment above), the mere acknowledgement of the issues is something that would inevitably come back to bite them later on. Few such secrets ever seem likely to last forever.
Today, this man and his company show every sign of having conducted themselves with honour. A measure of respect is something they've earned. :)
Re:
Well, not with that attitude it won't! Clearly the tech would work perfectly, if not better, if those lazy slackers would just nerd harder!
I feel much better
So I will only be abused by commercial interests, then?