Congress Members Demand Answers From, Investigation Of Federal Facial Rec Tech Users
from the software-terrible-with-names,-worse-with-faces dept
The ACLU's test of Amazon's facial recognition software went off without a hitch. On default settings, the software declared 28 Congressional members to be criminals after being "matched" with publicly-available mugshots. This number seemed suspiciously low to cynics critical of all things government. The number was also alarmingly high, as in an incredible number of false positives for such a small data set (the members of the House and Senate).
Amazon argued the test run by the ACLU using the company's "Rekognition" software was unfair because it used the default settings -- 80% "confidence." The ACLU argued the test was fair because it used the default settings -- 80% confidence. Amazon noted it recommends law enforcement bump that up to 95% before performing searches, but nothing in the software prompts users to select a higher setting for more accurate results.
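To put numbers on why that default matters, here's a toy sketch (this is not Rekognition's actual API; the face IDs and similarity scores are invented) of how the same candidate pool shrinks when the confidence bar moves from 80 to 95:

```python
# Toy illustration of confidence-threshold filtering. The face IDs and
# similarity scores below are invented; real systems return a
# per-candidate confidence percentage and keep whatever clears the bar.
def matches(scores, threshold):
    """Return the IDs of candidates whose similarity meets the threshold."""
    return [face_id for face_id, score in scores if score >= threshold]

candidates = [("rep_a", 81.2), ("rep_b", 96.4), ("rep_c", 84.9), ("rep_d", 79.5)]

default_hits = matches(candidates, 80)  # the 80% default keeps three "matches"
strict_hits = matches(candidates, 95)   # the recommended 95% keeps one
```

Nothing in a setup like this stops a user from passing 80, or 50, or 20; the burden of choosing a sane threshold sits entirely with the operator, which is the ACLU's point.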
This upset members of Congress who weren't used to being called criminals… at least not by a piece of software. More disturbing than the false positives was the software's tendency to falsely match African-American Congressional reps to criminal mugshots, suggesting the act of governing while black might be a criminal activity.
Congressional members sent a letter to Amazon the same day the ACLU released its report, demanding answers from the company for this abysmal performance. Ron Wyden has already stepped up to demand answers from the other beneficiaries of this tech: federal law enforcement agencies. His letter [PDF] reads like an expansive FOIA request, only one less likely to arrive with redactions and/or demands that the scope of the request be narrowed.
Wyden is asking lots of questions that need answers. Law enforcement has rushed to embrace this technology even as multiple pilot programs have generated thousands of bogus matches while returning a very small number of legitimate hits. Wyden wants to know which federal agencies are using the software, what they're using it for, and what they hope to achieve by using it. He also wants to know who's supplying the software, what policies are governing its use, and where it's being deployed. Perhaps most importantly, Wyden asks if agencies using facial recognition tech are performing regular audits to quantify the software's accuracy.
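The "thousands of bogus matches, few legitimate hits" pattern is exactly what base rates predict. As a rough sketch (every figure below is illustrative, not drawn from any actual pilot program), even a matcher with seemingly strong accuracy numbers produces flags that are overwhelmingly false when genuine targets are rare:

```python
# Base-rate sketch: scanning a large crowd for a handful of wanted
# faces. All numbers are made up for illustration.
crowd = 100_000             # faces scanned
wanted = 20                 # actual persons of interest in the crowd
true_positive_rate = 0.95   # chance a wanted face is flagged
false_positive_rate = 0.01  # chance an innocent face is flagged

true_hits = wanted * true_positive_rate
false_hits = (crowd - wanted) * false_positive_rate
precision = true_hits / (true_hits + false_hits)  # fraction of flags that are real
```

With those made-up numbers, fewer than 2% of the people flagged are actually wanted -- exactly the kind of figure a regular audit would surface.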
That isn't the only facial recognition letter-writing Wyden has signed his name to. The Hill reports Congressional reps have also sent one to the Government Accountability Office, asking it to open an investigation into facial recognition software use by federal agencies.
"Given the recent advances in commercial facial recognition technology - and its expanded use by state, local, and federal law enforcement, particularly the FBI and Immigration and Customs Enforcement - we ask that you investigate and evaluate the facial recognition industry and its government use," the lawmakers wrote.
The letter, signed by Rep. Jerrold Nadler and Sens. Ron Wyden, Cory Booker, Christopher Coons (D-Del.) and Ed Markey (D-Mass.), asks the GAO to examine "whether commercial entities selling facial recognition adequately audit use of their technology to ensure that use is not unlawful, inconsistent with terms of service, or otherwise raise privacy, civil rights, and civil liberties concerns."
The public has a right to know what public surveillance methods are being deployed against it and how accurate or useful these tools are in achieving agencies' stated goals. Privacy expectations all but vanish when the public goes out in public, but that doesn't mean their daily movements can automatically be considered grist for a government surveillance mill. Whatever privacy implications there are have likely not been addressed pre-deployment, if recent surveillance tech history is any indication. Before the government wholeheartedly embraces tech with an unproven track record, federal agencies need to spend some quality time with the people they serve and the overseers who act as a proxy for direct supervision.
Filed Under: congress, criminals, face recognition, matching, rekognition, ron wyden
Companies: amazon
Reader Comments
I'd expect law enforcement to turn down confidence to 20%
Racial profiling gets a dim view from the courts not because it wouldn't deliver results, but because it has nothing to do with individual justice and personal responsibility, and because the results it delivers are partly a self-fulfilling prophecy.
Now, maximizing short-term law enforcement results when faced with one population with a 2% crime rate and another with a 1% crime rate does not mean focusing 2/3 of your attempts on group 1 and 1/3 on group 2, but rather focusing 100% of your attempts on group 1.
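To make that concrete with hypothetical numbers (a 2% group, a 1% group, and 300 stops to allocate):

```python
# Sketch of the allocation argument above. The crime rates and stop
# counts are illustrative only.
def expected_hits(stops_g1, stops_g2, rate_g1=0.02, rate_g2=0.01):
    """Expected number of hits from splitting stops across two groups."""
    return stops_g1 * rate_g1 + stops_g2 * rate_g2

proportional = expected_hits(200, 100)  # the "fair-looking" 2/3 vs 1/3 split
all_on_g1 = expected_hits(300, 0)       # every stop aimed at the 2% group
```

Proportional allocation yields an expected 5 hits; putting every stop on the higher-rate group yields 6. Naive maximization always picks the corner.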
That's troubling. It's also one of the reasons prejudice is not just stupid but actually effective, and still we cannot afford to entertain it systematically in a society based on individual responsibility and justice.
At any rate, the problem of being misidentified that is worrying the given congress members is that such misidentification comes with physical danger: you may get beaten up to the point of death or shot dead. And that's not actually something that should happen to even someone correctly identified as a criminal.
Because the job of the police is to deliver the suspects to justice, not to deliver justice. If the consequences of misidentification weren't considerably more fatal in the U.S. than in civilized countries, this would be easier to shrug off.
Side wagers, anyone?
Re:
That should get rid of most false positives.
But.. can they demand that they turn it up... or presumably don't turn it down? Or are they given free rein via a contract that just says they're not responsible for false positives if it's set too low?
That's going to be one of several problems going forward with this tech. Like tasers, speed cameras, body cameras and so forth, law enforcement in the US does seem to have a tendency to misuse the tools. If this becomes popular, I'd expect numerous court battles where lawyers for one side try to prove that the setting was too low, while Amazon has to decide whether to protect its existing clients or its reputation.
Then, of course, the big one - when somebody's recognised, how do the cops react? In a decent world, they would simply use this as a tool to help them do things they couldn't normally do easily, like identifying suspects in large crowds. But we all know it'll be another case of "the computer says X" replacing common sense. It should be like a GPS marking out a route more quickly and accurately than a person with a map, but it may end up being more like people driving into rivers because the GPS told them to.
The racial aspect is the final one - the tech itself will improve as more people are scanned, and the discrepancy is more likely an artifact of the data collected than a deliberate bias. But those using it could certainly wish to be biased, and when you combine being able to set "we don't really care about accuracy" with "we'll react as if it's 100% accurate every time", you have some dangerous and deadly situations coming up.
Re:
... SuperSenator Ron Wyden is fully on this case and will vanquish Amazon within a fortnight, just like he did with NSA.
Someone stock up on Rorschach masks....
Someone's gonna make a killing on the Rorschach mask business. ;)
https://www.vanityfair.com/news/2018/07/can-mark-zuckerberg-beat-fake-news-before-it-breaks-us
combined with AI facial recognition software.
They are more upset about the bad PR than the fake identifications. Their ROI may be in jeopardy as a result of these revelations - so sad.
Default Settings
The other question that pops up is: if Amazon recommends that law enforcement use a 95% confidence rating, and law enforcement is their primary sales target, then why isn't the default setting 95%?
Re:
I can tell you exactly what's going to happen;
Cop: [Has blurry image of suspect] Run this guy through facial rec.
Tech: Sorry, no hits.
Cop: Well try turning down the confidence.
Tech: It's at 80% and there's still no hits.
Cop: So go lower!
Tech: If I go any lower, it's even more likely to match some random person.
Cop: Just do your damn job and get me a match!
Tech: OK, it says this is your suspect. 45 year old father of two with no record.
Cop: Hey guys, we got a name for the asshole who took a shot at Jimmy last night. Let's go fuck him up!
Re: Someone stock up on Rorschach masks....
Wearing a mask in public is actually illegal in some areas. And if people start doing it on a regular basis, you can be sure it will be made illegal everywhere else as well.
Re:
I had a boss who was notorious for exactly this. They did not care at all about something until it directly affected them.
Childcare support? Flex hours to take care of personal needs? Those were an abuse of company resources and signs of greedy employees. Employees should pay out of their own pocket for those things since the company did not benefit from the employee using them. Yeah, this boss was a selfish asshole.
I'll give you one guess what happened when that boss finally had a kid of their own.
Did you guess that suddenly those perks were one of the best things to ever happen to the company? Why yes, you would be right! They very loudly patted themselves on the back for coming up with those company perks.
I'll give you one more guess what happened to those perks once the boss no longer needed to pay for childcare when their kids went to public school.
Re: Default Settings
Because the software will only be as good as the input it gets, so a 100% setting likely means that any photo it receives with any kind of artefact or blur will never return a match. That would make it effectively worthless, since anything taken outside of a studio would likely have imperfections that make it a 99.99% match at best.
Think of when you search on Google and you misspell a word - the software can detect that to a degree and return what it thinks you're looking for. Now imagine if it always returned zero results every time you did that instead.
"The other question that pops up is: if Amazon recommends that law enforcement use a 95% confidence rating, and law enforcement is their primary sales target, then why isn't the default setting 95%?"
I'd imagine it's because the software performs best at the lower setting, but for law enforcement purposes fewer false positives are preferred. Setting things that high for other purposes might make the software not reliable enough for customers who have less stringent requirements.
Re: Re: Default Settings
I'd imagine because the software performs best at the lower setting, but for law enforcement purposes less false positives are preferred.
Ideally, yes, however more false positives means more excuses to engage in a search and/or 'have a chat' with someone. Much like 'probable cause on four legs' I suspect that a low accuracy would not be seen as a negative by a good number of those making use of the tech.
Re: Re: Re: Default Settings
The problem is that I had to say "when" and not "if" in the above sentence, and sadly that's not something this software can solve one way or another, unless they make it useless. Which could also be a good thing depending on your point of view, but they are not going to do that.
Re: I'd expect law enforcement to turn down confidence to 20%
And yet it happens every day
Thanks for that!
Re: Default Settings
Sorta like those crosswalk buttons.
"regular audits to quantify the software's accuracy"
If we should be having frequent audits for reasonable-suspicion mechanisms that are sciencier than a drug dog, we should have them for drug dogs. And cheap field tests.
That we don't have them is indicative that the court system doesn't want its methods scrutinized too closely, yet it's been thoroughly established that they cannot be trusted without oversight.
So where's our oversight of the DoJ and the courts?
Why is that comma?
Comma
{Congress members demand} {answers from} [and] {Investigation of} {Federal Facial-Recognition Technology Users.}
I think the comma is there to help parse. It's an awkward sentence trying to combine multiple ideas.
Re:
That comma stands for "as well as an", and it's not like the distraction of an overdone (I'd downcase "From" and "Of") headline capitalization helps in picking the grammar apart.
So formally this is grammatical, and the grammar nazis will just rough up the headline writer and leave him in the gutter for mocking them rather than actually charging him with an actionable crime.
Re: Re:
<trying and failing to imagine what it would take to tarnish a politician's image further than it already is>
Re: Re: Default Settings
You Keep Using That Word, I Do Not Think It Means What You Think It Means
-- Inigo Montoya
Wearing a mask is actually illegal
How about wearing a hijab with a face covering such as a bushiyya? After SCOTUS has issued rulings that freedom of religion is more important than protecting employees or public accommodations¹, are we going to say that the police's need to scan your face is more important than religion?
Then there's the matter of facepaint, given that Juggalo clown paint defeats facial recognition. Another intersection between First Amendment rights and [the color of] national security.
¹ SCOTUS' ruling opinions emphatically insisted this is not what we're saying but its rulings have triggered new lawsuits based on implications nonetheless, ones that have not been dismissed straight away based on the SCOTUS disclaimers.
Re: Re: Re:
And given that many of them are minorities, I sincerely doubt this is the first time they've been racially profiled.