Facial Recognition Software That Returns Incorrect Results 20% Of The Time Is Good Enough For The FBI
from the 80%-of-the-time,-it-works-EVERY-time dept
When deploying technology that has the potential to put actual human beings behind bars, what should be the acceptable margin of error? Most human beings, especially those who haven't committed any crime (thanks to their natural aversion to being housed with actual criminals), would prefer (as if they had a choice) that this number be as close to zero as humanly (and technologically) possible.
The FBI, on the other hand, which possesses the technology and power to nudge people towards years of imprisonment, apparently feels a one-in-five chance of bagging the wrong man (or woman) is no reason to hold off on the implementation of facial recognition software.
Documents acquired by EPIC (Electronic Privacy Information Center) show the FBI rolled out a ton of new tech (under the name NGI -- "Next Generation Identification") with some very lax standards. While fingerprints are held to a more rigorous margin of error (5% max -- which is still a 1-in-20 "acceptable" failure rate), facial recognition is allowed much more leeway. (The TAR [True Acceptance Rate] details begin on page 247.)
NGI shall return the correct candidate a minimum of 85% of the time when it exists in the searched repository, as a result of facial recognition search in support of photo investigation services.

NGI shall return the incorrect candidate a maximum of 20% of the time, as a result of facial recognition search in support of photo investigation services.

The FBI's iris recognition program is subjected to a similar lack of rigor.

NGI shall return the correct candidate a minimum of 98% of the time when it exists in the searched repository, as a result of iris recognition search in support of iris investigation services.

NGI shall return the incorrect candidate a maximum of 10% of the time, as a result of iris recognition search in support of iris investigation services.

These documents date back to 2010, so there's every reason to believe the accuracy of the software has improved. Even so, the problem is that the FBI decided potentially being wrong 20% of the time was perfectly acceptable, and no reason to delay implementation.
Presumably, the FBI does a bit more investigation on hits in its NGI database, but it's worrying that an agency like this one -- one that hauls people in for statements wholly dependent on an FBI agent's interpretation (the FBI remains camera-averse and uses its own transcriptions of questioning as evidence) -- would so brazenly move forward with tech that potentially could land every fifth person in legal hot water, simply because the software "thought" the person was a bad guy.
Making this worse is the fact that the FBI still hasn't updated its 2008 Privacy Impact Assessment, despite the fact it told Congress in 2012 that it had a new assessment in the works.
On top of the brutal (but "acceptable") margin of error is the fact that the FBI has made a habit of deploying nearly every form of privacy-invasive technology without putting together even the most minimal of guidelines or privacy-aware policies. Apparently, these concerns only need to be dealt with when and if they're pointed out by OIG reports or lawsuits brought by privacy advocates.
Filed Under: facial recognition, failure, fbi
Reader Comments
80% is 100%
Damn that is AWESOME !!
So, 85% for ANY ONE TIME -- but if the person walks past 6 cameras, what are the odds that at least ONE of them will detect him?
Say the camera misses him when he walks in. What about the next time, when he walks out? Or do you think it would not search that face again -- or would not even search each face multiple times with each scan (or frame), each one having an 85% chance of detection?
See where I am coming from here?
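If you want to put numbers on that, here's a quick back-of-the-envelope sketch in Python. The 85% figure is the spec's; treating each scan as an independent event is a (generous) assumption of mine:

    # Chance that at least one of n scans flags a person, assuming each
    # scan is an independent event with an 85% true-accept rate.
    def detection_odds(per_scan_rate, scans):
        return 1 - (1 - per_scan_rate) ** scans

    for n in (1, 2, 6):
        print(f"{n} scan(s): {detection_odds(0.85, n):.4%}")
    # 1 scan(s): 85.0000%
    # 2 scan(s): 97.7500%
    # 6 scan(s): 99.9989%

By that idealized math, six looks at the same face make a miss vanishingly unlikely.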
All suspects are equal...
Instead of punishing "bad companies" why not help them get on the right side of the law?
I see an emphasis on punishment and little on actually helping solve any problems, to the point I don't think most people even understand why they are being punished.
I do not understand why they would focus on iris scanning. The only reasonable use would be a database of career criminals to be recognized at the station. Iris scanning is far too rarely used in civil society to yield significant clues anyway! Fingerprints, DNA and video/photo surveillance will be available on or around most crime scenes. Iris scans will not!
It really just comes down to a true lack of good algorithms for detecting someone's identity. However, I do not see this as a problem as long as a real person then takes the time to confirm the identity.
Re:
Sarcasm aside, you seem to miss this important part of the quote: "NGI shall return the incorrect candidate a maximum of 20% of the time"
In other words, 1 out of every 5 times, it's not that the system has failed to detect the person being looked for, it's that it identifies the wrong person. Now, consider not only the problems with sending (presumably) armed officers against innocent people, but the cost and wasted time involved in sorting out the incorrect data. Is it really worth it at this point compared to more traditional policing?
Important question here: what are these results used for? It sure as hell isn't imprisonment. That's what the broken court system is for. And hey, 20% is better than the 100% you get for not showing "respect" to a cop -- and guess what, that happens right now.
Well, I guess we know how to bring James Clapper to justice then
I figure we only need to find about 10 crimes with video footage to get at least 1 match to Clapper; then we can send him to jail for robbing the gas station! That's as close to charging him for his illegal spying on America and lying to Congress as we'll get.
This is total BS ..
The tool is not used to arrest anyone anyway. It is used as an investigative tool, in combination with other information. Even if our software gets a 99% hit on fingerprints -- something that can be beaten by wearing gloves -- an investigator has to look over the fingerprints and make the final determination. A computer cannot go to court and testify.
This whole thing is one of the stupidest things I have read on Techdirt.
You've got the statistics backwards.
This facial recognition system needs an error rate of more like 0.00002% before it's going to be much use for anything.
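To see where a number that small comes from, here's a minimal base-rate sketch in Python. The population figure is the rough 2013 US number, the suspect count is just an assumption, and reading the quoted percentages as per-person false-match rates is an interpretation, not something the documents spell out:

    # False hits vs. true hits when scanning everyone to find a small
    # number of actual suspects, at two different false-match rates.
    population = 316_000_000     # rough 2013 US population
    suspects = 1_000             # assumed number of people actually sought
    true_accept = 0.85           # NGI's floor for returning the right person

    for false_match in (0.20, 0.000_000_2):  # 20% vs. the 0.00002% above
        false_hits = (population - suspects) * false_match
        true_hits = suspects * true_accept
        print(f"rate {false_match:.7%}: {false_hits:,.0f} wrong hits, "
              f"{true_hits:,.0f} right ones")
    # rate 20.0000000%: 63,199,800 wrong hits, 850 right ones
    # rate 0.0000200%: 63 wrong hits, 850 right ones

At 20%, wrong hits outnumber right ones by roughly 74,000 to 1; only somewhere near the 0.00002% mark do the two land in the same ballpark.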
CreepyDOL add on
Re: You've got the statistics backwards.
Hey, if you want to make an omelet you gotta break a few eggs, and a little collateral damage never hurt anyone (important), right?
Re:
It depends on how it's used. If you're talking about access to special secure areas, OK.
If we're talking about systems that are surveilling people in public places, though, this is still a really huge problem. Not only because of the huge waste of time & energy incurred by having to have someone verify a person's identity in person, but because it would be more than a small inconvenience for a lot of innocent people.
What if you're walking down the street, are pegged as a potential terrorist by a camera, and a cop comes to check you out? First, that's a terrifying thing for most people right off the bat.
Second, what if you don't have any identification? Does the cop let you go or haul you into the police station for positive ID? If he just lets you go, then there's the out for any actual terrorists, and the entire system is instantly worthless.
Or are you arguing that we should be required to have ID on us at all times now? "Your papers, please..."
Re: Re:
How did I get a 5% success rate, you ask? It's all in how you fiddle with the numbers. Let me walk you through it.
There are 316,000,000 people living in the United States. 20% falsely identified is 15,800,000. I don't know how many people the FBI is looking for, but let's assume a generous 1,000,000 people. That's 850,000 people correctly identified. 850,000 people out of 15,800,000 flags. That's 5.379%.
This does assume a lot of things. One, I really don't think the FBI is looking for a million people, let alone has pictures of all of them clear enough to feed into the software. Two, this assumes that every single person only ever walks past one camera. The success rate drops dramatically as people walk past more cameras.
It would cost far too much money to use a system that in the end probably has a success rate lower than 1%.
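Here's the same calculation as a Python sketch, for anyone who wants to check the arithmetic. One caveat: 15,800,000 is one-twentieth of 316,000,000 (i.e. 5%, the fingerprint figure), not 20%; otherwise the inputs are as above:

    # Reproducing the back-of-the-envelope numbers above.
    false_flags = 15_800_000   # the false-identification count used above
    wanted = 1_000_000         # assumed number of people being sought
    true_hits = wanted * 0.85  # 850,000 correctly identified

    ratio = true_hits / false_flags                    # the ~5.38% above
    precision = true_hits / (true_hits + false_flags)  # strict precision: ~5.11%
    print(f"hits per false flag: {ratio:.2%}, precision: {precision:.2%}")
    # hits per false flag: 5.38%, precision: 5.11%

Using the spec's actual 20% (63,200,000 false flags) drives the ratio down near 1.3%, which only strengthens the point.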
Re: This is total BS ..
It doesn't worry me that the FBI is looking to adopt facial recognition and I probably agree with you that this article complains about its accuracy without knowing how it will be used. I am worried about how they will use it. Do not fool yourself into thinking the FBI will not use facial recognition to arrest someone. It may not be the only factor in the arrest but, as with fingerprints, law enforcement tends to be eagerly biased in favor of its usage and tends to disregard what science says about the level of doubt.
Completely and totally SDRAWKCAB
I would like to find some software that eliminates 85% of my busy work, too.
Re: Re: This is total BS ..
Once LEOs identify a suspect, they focus on proving that target guilty, not on actually investigating everything.
Re: Re:
That would be a very stupid system. It's simple. Humans are quite good at facial recognition. Someone in your proposed scenario is presumably already being shown a picture of the person that's been flagged. Just put the picture the camera's matched them to next to it and let the cop check for any differences that the computer missed. This should keep the rate of false positives at about the same level as when humans just watched the cameras without help.