Congress Members Want Answers After Amazon's Facial Recognition Software Says 28 Of Them Are Criminals
from the but-they're-all-crooks-amirite dept
Hey, American citizens! Several of your Congressional representatives are criminals! Unfortunately, this will come as completely expected news to many constituents. The cynic in all of us knows the only difference between a criminal and a Congressperson is a secured conviction.
We may not have the evidence we need to prove this, but we have something even better: facial recognition technology. This new way of separating the good and bad through the application of AI and algorithms is known for two things: being pushed towards ubiquity by government agencies and being really, really bad at making positive identifications.
At this point it's unclear how much Prime members will save on legal fees and bail expenditures, but Amazon is making its facial recognition tech ("Rekognition") available to law enforcement. It's also making it available to the public for testing. The ACLU took Amazon up on the offer, spending $12.33 to obtain a couple dozen false hits using shots of Congressional mugs.
In a test the ACLU recently conducted of the facial recognition tool, called “Rekognition,” the software incorrectly matched 28 members of Congress, identifying them as other people who have been arrested for a crime.
The members of Congress who were falsely matched with the mugshot database we used in the test include Republicans and Democrats, men and women, and legislators of all ages, from all across the country.
The bad news gets worse.
The false matches were disproportionately of people of color, including six members of the Congressional Black Caucus, among them civil rights legend Rep. John Lewis (D-Ga.).
And here's the chilling lineup of usual suspects according to Amazon's Rekognition.
Using 25,000 publicly available mugshots and Rekognition's default settings, the ACLU picked up a bunch of false hits in very little time. This is only a small portion of what's available to law enforcement using this system. Agencies have access to databases full of personal info and biometric data for hundreds of thousands of people, including people who've never been charged with a crime in their lives.
The obvious downside to a false hit is, at minimum, the unjustified distribution of identifying info to law enforcement officers to confirm/deny the search results. At most, it will be the loss of freedom for someone wrongly identified as someone else. Recourse takes the form of lawsuits with a high bar for entry and slim likelihood of success, thanks to several built-in protections for law enforcement officers.
Amazon continues to market this system to law enforcement agencies despite its apparent shortcomings. Very little has been written about the successes of facial recognition technology. There's a good reason for this: there aren't that many. There certainly haven't been enough to justify the speedy rollout of this tech by a number of government agencies.
This little experiment has already provoked a response from members of Congress, who are demanding answers from Amazon about the ACLU's test results. Amazon, for its part, claims the ACLU's test was "unfair" because it used the default 80% "confidence" setting, rather than the 95% recommended for law enforcement. The ACLU has responded, noting this is the default setting on Rekognition and nothing prompts the user -- who could be a law enforcement officer -- to change this setting to eliminate more false positives. In any event, at least Congress is talking about it, rather than nodding along appreciatively as federal agencies deploy the tech without public consultation or mandated privacy impact assessments.
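To picture what the threshold dispute means in practice, here is a minimal sketch with invented similarity scores (not real Rekognition output): raising the cutoff from the default 80 to the recommended 95 discards the borderline hits.

```python
# Hypothetical similarity scores (0-100) for one probe photo compared
# against a mugshot gallery -- illustrative numbers only.
scores = {"mugshot_017": 81.2, "mugshot_294": 96.4, "mugshot_511": 84.9}

def matches(scores, threshold):
    """Return the gallery IDs whose similarity meets the threshold."""
    return sorted(m for m, s in scores.items() if s >= threshold)

print(matches(scores, 80))  # default setting: all three come back as "hits"
print(matches(scores, 95))  # recommended setting: only the strong match survives
```

Nothing in the software forces a user toward the stricter cutoff, which is the ACLU's point: the default behaves like the first call above.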
Thank you for reading this Techdirt post. With so many things competing for everyone’s attention these days, we really appreciate you giving us your time. We work hard every day to put quality content out there for our community.
Techdirt is one of the few remaining truly independent media outlets. We do not have a giant corporation behind us, and we rely heavily on our community to support us, in an age when advertisers are increasingly uninterested in sponsoring small, independent sites — especially a site like ours that is unwilling to pull punches in its reporting and analysis.
While other websites have resorted to paywalls, registration requirements, and increasingly annoying/intrusive advertising, we have always kept Techdirt open and available to anyone. But in order to continue doing so, we need your support. We offer a variety of ways for our readers to support us, from direct donations to special subscriptions and cool merchandise — and every little bit helps. Thank you.
–The Techdirt Team
Filed Under: congress, face recognition, john lewis, rekognition
Companies: aclu, amazon
Reader Comments
Goose, Gander, Good, can you feel now?
It is too bad that they weren't actually taken into custody and held for some time (it would be illegal to arrest a congress person on their way to a vote) in order for someone up there in the ethereal levels of government to take notice. If they are as vulnerable as the rest of us, they might put aside their quest for power, and do something for the rest of us.
I have little hope, but this might give them a nudge in the right direction.
Re: Goose, Gander, Good, can you feel now?
...you...have no idea who John Lewis is, do you.
Re: Re: Goose, Gander, Good, can you feel now?
A little, but he wasn't subject to false facial recognition then. There were other, also ill-considered, reasons for his arrests.
Re: Re: Re: Goose, Gander, Good, can you feel now?
I would guess that quite a few of the people in that list are familiar with those things.
Famous people getting profiled
I've seen the effect more often from famous people getting caught up in police shenanigans when they're just trying to duck the crowds to do some light shopping.
The old Henry V trick of kings wandering about engaging their subjects while disguised in order to pick up the pulse of the common public has appeared in history a few times. It would make sense for our officials to try it occasionally if they cared about public opinion.
Only 28? What happened to the rest? Why were they not branded too?
Are they concerned about the alarmingly low identification rate?
Need more info
Does anyone know whether or not Rekognition uses skin color for matching? I think most facial recognition techniques do not use skin color at all [citation needed].
Re: Need more info
Facial recognition software has no race bias. If the results are "disproportionate" then the database of facial structures must represent the biased class in greater numbers. Don't blame the tool.
Re: Re: Need more info
https://en.wikipedia.org/wiki/Algorithmic_bias
It's easy to say the computer doesn't care so it's impartial - but it often isn't.
Re: Re: Need more info
In theory, it should have no racial bias.
However, in practice, it assigns a probability that two images are of the same face based on how likely it is that two people would have the same [insert list of facial features].
If the algorithm fails to account for multicollinearity (that is, the fact that two data points often show up together and thus the existence of the second doesn't prove much once you know the existence of the first), then it can absolutely be racially biased. A poorly programmed algorithm, trained mostly on white faces, could easily conclude that two black people who share features uncommon to white faces, but common among black faces, look enough alike to be flagged as the same person. To have, really, a bias that all black people look alike, which would be incredibly racist.
Does this algorithm have that kind of bias? I don't have enough information to know. It's certainly happened in the past.
But categorically ruling it out seems foolish. And contending that it's because the sample is larger is especially so: these kinds of algorithm are more accurate when they have more data to train themselves on, so underrepresented racial groups are more likely to trigger false positives than overrepresented ones.
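The multicollinearity point above can be sketched with a toy matcher (feature names are invented for illustration): counting shared features as independent evidence lets a cluster of correlated traits push two different people over a match cutoff.

```python
# Toy sketch of the flaw described above: score faces by counting shared
# features, treating each one as independent evidence.
def naive_similarity(face_a, face_b):
    """Jaccard overlap of two feature sets -- every shared feature counts
    fully, even when the features almost always co-occur."""
    return len(face_a & face_b) / max(len(face_a | face_b), 1)

# Two distinct people who share a cluster of correlated features.
person_1 = {"feature_a", "feature_b", "feature_c", "narrow_jaw"}
person_2 = {"feature_a", "feature_b", "feature_c", "wide_jaw"}

# If features a, b and c nearly always appear together in the training
# data, they carry much less than three features' worth of evidence, yet
# the naive score counts all three -- so distinct people clear the cutoff.
print(naive_similarity(person_1, person_2) >= 0.5)  # flagged as a "match"
```

A matcher trained mostly on one group would treat that group's common feature clusters correctly but overweight correlated features in underrepresented groups, which is exactly the failure mode being described.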
Re: Re: Re: Need more info
That's a very good point
Re: Re: Need more info
So tacking on "with a computer" disperses all blame? A lot of prejudice is statistically valid and individually unjust. Because our judicial system punishes people individually, rather than for having features correlated with criminals, being racially unbiased (treating everybody as an individual until proven differently) is hard work. It's also necessary in order not to cause self-fulfilling prophecies, and to have society progress as a whole, reward individual virtue, and be visible in similar ways to all constituents so that they can vote and campaign in a qualified manner.
"with a computer" does not magically disperse the bias reflecting our current society and its history. Nor does "with statistics".
Re: Need more info
False positives would be expected to be "disproportionately of people of color" whenever that segment of the population is disproportionately criminal.
The widespread adoption of facial recognition by police could spark a boom in "defensive" plastic surgery, especially of the extreme variety. While the "old" Michael Jackson might have resembled many common criminals, the "new" Michael Jackson really didn't resemble any other human on the planet, making any potential false positive extremely unlikely.
https://www.forbes.com/sites/mzhang/2015/07/01/google-photos-tags-two-african-americans-as-gorillas-through-facial-recognition-software/
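The base-rate claim above can be made concrete with a hedged sketch (the composition numbers are invented): if every photo carries the same small chance of a false match, the false hits simply mirror who is in the database.

```python
# Sketch of the base-rate argument: with a uniform per-photo error rate,
# expected false matches per group scale with that group's headcount.
def expected_false_hits(db_composition, per_photo_rate):
    """Expected number of false matches for each group in the gallery."""
    return {group: count * per_photo_rate
            for group, count in db_composition.items()}

# Hypothetical 25,000-photo gallery, split unevenly between two groups.
db = {"group_a": 15000, "group_b": 10000}
hits = expected_false_hits(db, 0.001)  # same small error rate for every photo
print(hits)  # group_a draws 1.5x the false hits of group_b, purely from headcount
```

Which is why the makeup of the mugshot database matters as much as the algorithm: a skewed gallery produces skewed errors even from an otherwise unbiased matcher.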
Re: Re: Need more info
...or if that segment of the population is disproportionately criminalIZED. Consider: they are matching to convicted criminals, right? So if, say, black people are more likely to be convicted of the same crime than white people? If, just for a hypothetical, black people are more likely to be arrested in the first place on marijuana-related charges, despite drug use actually being relatively even among most segments of society? Or, say, minorities being less likely to get out of convictions due to having poorer legal representation and more bias against them in a courtroom setting?
Sigh.
Re: Need more info
My brother and I grew up thinking for a while that our grandfather was black because of the dark tan he always had year-round. From what my mother said, it built up over time because of the lack of sunscreen products that worked back in his day.
Re: Need more info
One known issue is that some algorithms use the light reflecting off the nose as a factor and darker skin results in less reflectivity and thus less variance in the reflection. In general, facial recognition algorithms consistently have the best results on the majority ethnicity of the country where they were designed.
Re: Need more info
And I bet what's chapping the buttocks of the leadership class is they are being confused with the other class.
Re: Re: Need more info
Already addressed this downthread, but here it is again:
It would probably be helpful to actually look up the people who have been snagged by this racially-biased mismatch and find out who they are and what they're about rather than make ignorant generalizations about people in Congress. If you seriously believe that John Lewis's real problem with being racially profiled is that he doesn't want to be mistaken for a poor person, then congratulations on having no fucking idea who John Lewis is.
I'd say it got it just right.
So if I'm a prime member..
Sign up for Prime, and don't be recognized as a Felon! What a sales tactic! :)
5 of 435 is 1.2%. The ACLU is a few hundred trouble-makers
out of 200 million Americans#. So this non-story is at best driven by 0.000-something% and then 1.2%, while you ignore a hundred items of high importance. Typical Techdirt Tempest-in-a-Thimble.
And anyway, WHAT THE HELL IS THE POINT OF CRITICIZING BETA SOFTWARE?
Next story, please. Probably have to wait 2 hours for another ginned-up fanboy-feeding re-write from several days ago.
Using reasonable definition of "American": doesn't include you antis who want it changed to European feudalism or globalism, incoherent malcontents-without-a-cause (not even up to rebels, just grrr and stuff), nor those here illegally.
Re: 5 of 435 is 1.2%. The ACLU is a few hundred trouble-makers
Societies prosecute social outliers. The Pareto principle scales, even if you prune data. Which is to say that there will ALWAYS be outliers, even when you get rid of all of the current people you thought were outliers.
People who think they haven't perpetrated at least one felony in their life haven't read much law. What we're talking about here is a technical system that is objectively evaluating candidates for prosecution within a subjective sociological system, and that has a false positive ratio of 100%. Because all of us are criminals.
There is no question that this is going to go completely off the rails. There is a ratio beyond which the accuracy of law enforcement cannot scale without serious consequences. The law is simply not fit to become cybernetically enhanced, regardless of how good the tech is. As the pressure mounts, the most likely outcome is that discrimination on non-legal bases will become the pressure valve. Racism, sexism, etc. will be determinative more than the crime, because the field of prosecution will be abundant, and the prosecutors will therefore be compelled to choose. And you can be assured they won't choose people like them.
"not fit to become cybernetically enhanced"
The FBI was formed to target the 20th-century mobs and foreign espionage elements on US soil. Since the Cold War ended, it has not transitioned well, which is why it now singles out mentally disabled people and frames them for terrorist-like activities.
Similarly, cannabis is becoming decriminalized and cocaine and meth have dropped off. Heroin is on the rise thanks to the opiate crisis, but arresting people who were hooked by their own doctors doesn't look good. So the DEA has also turned to entrapment and busts with false evidence.
Part of the problem is that we have these agencies which were meant to attack certain types of crime. But if they succeed in actually reducing that crime (or the crime reduces on its own due to other circumstances) then they lose sweet, sweet budget money, and they have to maintain a high conviction rate, even if it's manufactured.
Re: 5 of 435 is 1.2%. The ACLU is a few hundred trouble-makers
To get bugs fixed?
That’s your 1,000,002 anomaly. Congratulations!!!
Re: That’s your 1,000,002 anomaly. Congratulations!!!
False positives is how his favorite copyright enforcement corporations make bank by suing anyone and everyone.
"At most, it will be the loss of freedom..." Hardly.
Only 28?
Really? Looks to me like at least half appear to be white people.
Re:
And if you take a group of people that is 81% white and extract a sample that is only half white, that's called disproportionate.
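The arithmetic behind "disproportionate" is worth spelling out, using the comment's own figures (an 81% white group and a half-white sample of false matches):

```python
# How far the false-match sample departs from the group's makeup,
# using the figures cited in the comment above.
group_white = 0.81   # share of the overall group that is white
sample_white = 0.50  # share of the false-match sample that is white

nonwhite_in_sample = 1 - sample_white  # half of the false matches
nonwhite_in_group = 1 - group_white    # under a fifth of the group

# Nonwhite members appear among the false matches about 2.6 times as
# often as their share of the group would predict.
print(round(nonwhite_in_sample / nonwhite_in_group, 2))
```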
How does this compare to humans?
Re: How does this compare to humans?
Are you saying that pictures of Congress persons aren't mugshots?
I'd want answers too!
Nice Amazon!
Re: Re:
80% confidence means that for every "positive" there is a 20% chance that it's a "false positive". That, on average, is a 20% error rate or 2 in 10 false positives.
Re:
No, the confidence level is what percentage of a person's face matches that of another photo. Think of it this way:
If you were to tell a program to match words with 60% the same letters, it would match the words "Aloha" and "Alone", however if you increased the threshold to 80% or higher, the words would no longer match, because only 60% of the letters are the same.
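The word analogy can be run directly; a tiny sketch using position-by-position letter agreement as a stand-in for feature overlap:

```python
# Position-by-position letter agreement, echoing the "Aloha"/"Alone"
# analogy in the comment above.
def letter_match(a, b):
    """Fraction of positions at which two equal-length words agree."""
    assert len(a) == len(b)
    return sum(x == y for x, y in zip(a, b)) / len(a)

score = letter_match("Aloha", "Alone")
print(score)          # 0.6: 'A', 'l', 'o' agree; 'h'/'n' and 'a'/'e' differ
print(score >= 0.60)  # accepted at a 60% threshold
print(score >= 0.80)  # rejected at an 80% threshold
```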
Congress Wants Answers after Amazon's Facial Recognition Software Says 28 Of Them Are Criminals
Question: Why wasn't the number of members of Congress identified as criminals much higher than 28?
Re: Re: Google Face Recognition
...based on self-reported data. If one suspected Raul to be a crook, one might wish to check a different source of data.
I haven't read the comments, I'm sure this hasn't been said yet
At best Amazon uses it for advertising purposes (acquitted of manslaughter? Check out our Amazon Basics 22-piece kitchen knife set!) At worst people with the same name as felons can't shop online.
HOLD IT..
And if any one of US was Identified..
Let's put them in jail and send them to court, and have them PROVE they are NOT who we KNOW they are..
80% is enough...
I'm sure that Amazon's CEO will be appearing in a congressional hearing over this once they learn of what happened.
Precognition, Phrenology and Politicians
Congress Members Want Answers After Amazon's Facial Recognition Software Says 28 Of Them Are Criminals
Mayhap Amazon's Facial Recognition Software also has a precognition feature built in and is able to discern which Congress Member will turn to a life of political crime in the future.
I feel pretty sure that there are way more than 25k mugshots that could be used, even if publicly available ones were all that LEO had access to.
Makes you wonder how many would have been flagged if a larger set than 25k had been used.
Re:
That joke is way funnier the sixth time!
False Identification
Congress will gleefully carve out an exemption for itself, and maybe for judges, "senior" government officials, and certain other "elites," just as they have done for the TSA and other measures, by passing a law requiring such exemptions be built into any scanning systems.
It's the training data
If you're training the system on mugshots, then what is it learning to recognize? Criminal types.
So then everyone acts all surprised when it recognizes congress critters.
Criminal types
If you're training the system on mugshots, then what is it learning to recognize? Criminal types.
That assumes that those people who end up processed in the legal system are actually criminal. There's a lot of evidence that a significant number of arrests and convictions may be false. (We have no system to test it, and the current prison system is very resistant to challenges to convictions.)
Though yes, it would reflect any patterns of profiling that the police use in choosing their suspects.
Re: It's the training data
That joke is way funnier the seventh time!
Re: Amazon facial recognition
That joke is way funnier the eighth time!