Facial Recognition Software That Returns Incorrect Results 20% Of The Time Is Good Enough For The FBI

from the 80%-of-the-time,-it-works-EVERY-time dept

When deploying technology that has the potential to put actual human beings behind bars, what should be the acceptable margin of error? Most human beings, especially those who haven't committed any crime due to their natural aversion to being housed with actual criminals, would prefer (as if they had a choice) this number to be as close to zero as humanly (and technologically) possible.

The FBI, on the other hand, which possesses the technology and power to nudge people towards years of imprisonment, apparently feels a one-in-five chance of bagging the wrong man (or woman) is no reason to hold off on the implementation of facial recognition software.

Documents acquired by EPIC (Electronic Privacy Information Center) show the FBI rolled out a ton of new tech (under the name NGI -- "Next Generation Identification") with some very lax standards. While fingerprints are held to a more rigorous margin of error (5% max -- which is still a 1-in-20 "acceptable" failure rate), facial recognition is allowed much more leeway. (The TAR [True Acceptance Rate] details begin on page 247.)

    NGI shall return the correct candidate a minimum of 85% of the time when it exists in the searched repository, as a result of facial recognition search in support of photo investigation services.

    NGI shall return the incorrect candidate a maximum of 20% of the time, as a result of facial recognition search in support of photo investigation services.

The FBI's iris recognition program is subjected to a similar lack of rigor.

    NGI shall return the correct candidate a minimum of 98% of the time when it exists in the searched repository, as a result of iris recognition search in support of iris investigation services.

    NGI shall return the incorrect candidate a maximum of 10% of the time, as a result of iris recognition search in support of iris investigation services.

These documents date back to 2010, so there's every reason to believe the accuracy of the software has improved. Even so, the problem is that the FBI decided potentially being wrong 20% of the time was perfectly acceptable, and no reason to delay implementation.
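
To make those two numbers concrete, here's a minimal back-of-the-envelope sketch in Python. Only the 85% and 20% figures come from the quoted requirements; the search volume and the share of subjects actually enrolled are illustrative assumptions, not anything in the documents.

    # Hypothetical illustration of the quoted NGI photo-search spec.
    # TAR/FAR come from the requirements above; everything else is assumed.
    TAR = 0.85  # min. rate of returning the correct candidate when enrolled
    FAR = 0.20  # max. rate of returning an incorrect candidate

    searches = 100_000    # assumed number of photo searches
    enrolled_share = 0.5  # assume half the subjects are in the repository

    correct_returns = searches * enrolled_share * TAR
    wrong_returns = searches * FAR  # worst case the spec allows

    print(f"Correct candidates returned: ~{correct_returns:,.0f}")
    print(f"Incorrect candidates returned (worst case): ~{wrong_returns:,.0f}")

In the spec's worst case, that's a wrong candidate for one out of every five searches.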

Presumably, the FBI does a bit more investigation on hits in its NGI database, but it's worrying that an agency like this one -- one that hauls people in for statements wholly dependent on an FBI agent's interpretation (the FBI remains camera-averse and uses its own transcriptions of questioning as evidence) -- would so brazenly move forward with tech that could potentially land every fifth person in legal hot water, simply because the software "thought" the person was a bad guy.

Making this worse is the fact that the FBI still hasn't updated its 2008 Privacy Impact Assessment, despite the fact it told Congress in 2012 that it had a new assessment in the works.

On top of the brutal (but "acceptable") margin of error is the fact that the FBI has made a habit of deploying nearly every form of privacy-invasive technology without putting together even the most minimal of guidelines or privacy-aware policies. Apparently, these concerns only need to be dealt with when and if they're pointed out by OIG reports or lawsuits brought by privacy advocates.

NGI System Requirements [embedded document]

Filed Under: facial recognition, failure, fbi


Reader Comments

  1. Capt ICE Enforcer, 15 Oct 2013 @ 3:51am

    80% is 100%

    It is perfect. 80% of the time it works all the time, who cares about the other 27%. (Anchorman)

  2. Anonymous Coward, 15 Oct 2013 @ 4:02am

    I can't wait for the first NSA employee to be wrongfully identified as a terrorist...

  3. Anonymous Coward, 15 Oct 2013 @ 4:06am (flagged by the community)

    85% of the time, IF THE PERSON IS THERE, it finds them!!!!

    Damn, that is AWESOME!!

    So, 85% for ANY ONE TIME, and the person walks past 6 cameras, what are the odds that ONE of them will detect him?

    So the camera misses him when he walks in, what about next time when he walks out? Or do you think it would not search that face again, or would not even search each face multiple times with each scan (or frame), each one having an 85% chance of detecting him?

    See where I am coming from here?
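
    A quick sketch of the compounding-detection argument above. It assumes each scan is an independent 85% chance, which is this commenter's premise, not something in the spec:

        # Chance that at least one of n independent scans flags the subject,
        # using the spec's 85% per-scan detection rate. Independence across
        # scans is an assumption for illustration.
        p = 0.85
        for n in range(1, 7):
            print(f"{n} scan(s): {1 - (1 - p) ** n:.4%} chance of at least one hit")

    After six independent scans, the miss probability is 0.15^6, i.e. about one in 88,000.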

  4. Anonymous Coward, 15 Oct 2013 @ 4:12am

    Re:

    I'm guessing your [sic] EFFECIENT solar panels only work 80% of the time too, and you considered it a good thing that needed no further improvement.

  5. Anonymous Coward, 15 Oct 2013 @ 4:26am

    Re:

    85% accuracy is fine for showing personalised adverts. It will probably result in more deaths than those due to terrorism if it is used to send armed men through house doors.

  6. Anonymous Coward, 15 Oct 2013 @ 4:39am

    Re: 80% is 100%

    That doesn't make sense!

  7. RyanNerd (profile), 15 Oct 2013 @ 4:40am

    All suspects are equal...

    But some suspects are more unequal than others.

  8. Anonymous Coward, 15 Oct 2013 @ 4:43am

    Instead of catching "bad guys", why not educate them on ways they could get on the right side of the law before punishing them?

    Instead of punishing "bad companies", why not help them get on the right side of the law?

    I see an emphasis on punishment and little on actually solving any problems, to the point that I don't think most people even understand why they are being punished.

  9. Anonymous Coward, 15 Oct 2013 @ 4:54am

    Re:

    We are sorry we killed your son/daughter. He/she was erroneously flagged by our system as a dangerous person, and our teams took no chances when he/she tried to run. It was only afterward that we confirmed it was not the person we were looking for.

  10. Anonymous Coward, 15 Oct 2013 @ 4:57am

    Uhm. If it rejects a correct hypothesis 15% of the time, that is unproblematic. If it accepts a wrong hypothesis 20% of the time, it is a catastrophe for civil rights if it is the sole evidence! Normally anything below 75% is considered too random or plain wrong, and anything below 90% efficiency is speculative. Accepting a 20% false-accept rate is troubling, but hopefully it will be improved on the software side before becoming standard issue, which makes it less worrying.

    I do not understand why they would focus on iris scanning. The only reasonable idea would be a database of career criminals to get recognized at the station. Iris scanning is far too rarely used in civil society to grant significant clues anyway! Fingerprints, DNA and video/photo surveillance will be available on or around most crime scenes. Iris scans will not!
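
    A small sketch separating the two error types the commenter is distinguishing. The 85%/20% rates come from the quoted spec; the simulation itself is purely illustrative:

        import random

        random.seed(42)
        TAR, FAR = 0.85, 0.20   # rates from the quoted NGI requirements
        N = 100_000             # assumed number of searches (illustrative)

        # False reject: the system misses a subject who IS in the repository.
        false_rejects = sum(random.random() > TAR for _ in range(N))
        # False accept: the system returns a candidate who is NOT the subject.
        false_accepts = sum(random.random() < FAR for _ in range(N))

        print(f"False rejects (missed real subjects): {false_rejects / N:.1%}")
        print(f"False accepts (wrong candidates):     {false_accepts / N:.1%}")

    As the commenter notes, it's the false accepts, not the false rejects, that threaten civil rights when a match is treated as evidence.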

  11. Anonymous Coward, 15 Oct 2013 @ 4:59am

    I have written both facial and iris recognition programs, and the algorithms currently used in both of these biometric identification systems have about that failure rate.

    It really just comes down to a lack of good algorithms for determining someone's identity. However, I do not see this as a problem as long as a real person then takes the time to confirm the identity.

  12. Anonymous Coward, 15 Oct 2013 @ 4:59am

    of course it is! it doesn't matter whether the person caught is the right person or not, as long as someone is caught! at least then they can keep trumpeting how skilled they are, and whatever crap scheme they credit with helping them can be kept going!

  13. Anonymous Coward, 15 Oct 2013 @ 5:00am

    A 20% failure rate finding a needle in a haystack seems acceptable to me. As long as they go "Hmm... wrong needle, try again" when it gets it wrong, I see no problem with it (well, aside from the obvious privacy concerns).

  14. Ninja (profile), 15 Oct 2013 @ 5:01am

    I'm sure we would get a higher success rate if we chose a politician in the government and declared him/her to be corrupt, or someone in law enforcement and declared him/her to be a megalomaniac.

  15. PaulT (profile), 15 Oct 2013 @ 5:10am

    Re:

    So, as long as a potential terrorist keeps walking past monitored cameras, eventually you'll get him? What a great system, well worth the cost and loss of privacy for innocent citizens!

    Sarcasm aside, you seem to miss this important part of the quote: "NGI shall return the incorrect candidate a maximum of 20% of the time"

    In other words, 1 out of every 5 times, it's not that the system has failed to detect the person being looked for, it's that it identifies the wrong person. Now, consider not only the problems with sending (presumably) armed officers against innocent people, but the cost and wasted time involved in sorting out the incorrect data. Is it really worth it at this point compared to more traditional policing?

  16. Anonymous Coward, 15 Oct 2013 @ 5:27am

    Smells a lot like fear mongering here. Oh no, the big bad man is out to get everyone.

    Important question here: what are these results used for? It sure as hell isn't imprisonment. That's what the broken court system is for. And hey, 20% is better than the 100% you get for not showing "respect" to a cop, and guess what, that happens right now.

  17. Michael, 15 Oct 2013 @ 6:31am

    Re:

    Edward Snowden

  18. Anonymous Coward, 15 Oct 2013 @ 7:56am

    Well, I guess we know how to bring James Clapper to justice then

    I guess we know how to bring James Clapper to justice then. We charge him with a different crime committed by someone else that the facial recognition software says was done by him.

    I figure we only need to find about 10 crimes with video footage to get at least 1 match to Clapper, then we can send him to jail for robbing the gas station! That's as close to charging him for his illegal spying on America and lying to Congress as we'll get.

  19. dt, 15 Oct 2013 @ 8:16am

    This is total BS ..

    The fact that the system has 80% accuracy doesn't mean it breaks 20% of the time. It is contractual language to protect the developer of the software. I have worked on NGI for the FBI.

    The tool is not used to arrest anyone anyway. It is used as an investigative tool, to be combined with other information. Even if our software gets a 99% hit on fingerprints - something that can be beaten by wearing gloves - an investigator has to look over the fingerprints and make the final determination. A computer cannot go to court and testify.

    This whole thing is one of the stupidest things I have read on Techdirt.

  20. Chris-Mouse (profile), 15 Oct 2013 @ 8:16am

    You've got the statistics backwards.

    15% of the time, the software will incorrectly identify a terrorist as an innocent person. From a security standpoint, that's not a problem; the next camera will catch them.

    The problem is that 20% of the time, the system will flag an innocent person as a terrorist. That's a massive problem, and here's why. Boston's Logan airport handles about 2.5 million passengers every month. 20% of that is about 500,000 false alarms every month, or one false alarm about every five seconds, 24 hours a day, seven days a week. Just how long do you think the security people will put up with that before they start treating every alarm as a false alarm and ignoring it totally?

    This facial recognition system needs an error rate more like 0.00002% before it's going to be much use for anything.
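
    The commenter's arithmetic, as a sketch. The passenger figure is the commenter's estimate, not measured data:

        # Reproducing the Logan airport false-alarm math from the comment above.
        passengers_per_month = 2_500_000
        false_positive_rate = 0.20

        false_alarms = passengers_per_month * false_positive_rate  # 500,000
        seconds_per_month = 30 * 24 * 3600                         # 2,592,000

        print(f"False alarms per month: {false_alarms:,.0f}")
        print(f"One false alarm every {seconds_per_month / false_alarms:.1f} seconds")

    That works out to one false alarm roughly every 5.2 seconds, around the clock.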

  21. Anonymous Coward, 15 Oct 2013 @ 8:24am

    Re: You've got the statistics backwards.

    It wouldn't be so much an alarm as a bias list, which of course takes your attention off of the 15%, although there is reason to be concerned about that too. While there are perfectly sensible ways of using tools like these, the security state is manifestly /not/ sensible, with a level of hysteria that would be an insult to inaccurate 19th century stereotypes of women.

  22. alternatives(), 15 Oct 2013 @ 8:27am

    CreepyDOL add on

    So where are these public (or even pay per pic) repositories of pictures for us to submit the biometric info we obtain to link names to faces?

  23. Bergman (profile), 15 Oct 2013 @ 8:49am

    Re: Re: 80% is 100%

    And that is why you do not have a lucrative career as a government consultant.

  24. Bergman (profile), 15 Oct 2013 @ 8:53am

    Re: You've got the statistics backwards.

    Simple solution. Attach a 7.62mm "personal defense weapon" to every camera so that they can open fire on positive terrorist matches.

    Hey, if you want to make an omelet you gotta break a few eggs, and a little collateral damage never hurt anyone (important), right?

  25. Anonymous Coward, 15 Oct 2013 @ 9:34am

    Response to: Anonymous Coward on Oct 15th, 2013 @ 5:00am

    This would be like mistaking 20% of the haystack for needles. Are you beginning to see the issue?

  26. John Fenderson (profile), 15 Oct 2013 @ 9:55am

    Re:

    "However, I do not see this as a problem as long as a real person then takes the time to confirm the identity."

    It depends on how it's used. If you're talking about access to special secure areas, OK.

    If we're talking about systems that are surveilling people in public places, though, this is still a really huge problem. Not only because of the huge waste of time & energy incurred by having someone verify a person's identity in person, but because it would be more than a small inconvenience for a lot of innocent people.

    What if you're walking down the street, are pegged as a potential terrorist by a camera, and a cop comes to check you out? First, that's a terrifying thing for most people right off the bat.

    Second, what if you don't have any identification? Does the cop let you go or haul you into the police station for positive ID? If he just lets you go, then there's the out for any actual terrorists, and the entire system is instantly worthless.

    Or are you arguing that we should be required to have ID on us at all times now? "Your papers, please..."

  27. Anonymous Coward, 15 Oct 2013 @ 10:01am

    Maybe they can ask the casinos for help.

  28. Anonymous Coward, 15 Oct 2013 @ 10:17am

    Re: Re: Re: 80% is 100%

    (I think he was quoting the next line from Anchorman)

  29. Chronno S. Trigger (profile), 15 Oct 2013 @ 10:37am

    Re: Re:

    I would assume that the FBI is only using the system for high-security areas or on people they're already watching. A success rate of roughly 1% would just cost far too much if they tried to do this for everybody.

    How did I get a 1% success rate, you ask? It's all in how you fiddle with the numbers. Let me walk you through it.

    There are 316,000,000 people living in the United States. 20% falsely identified is over 63,000,000 people. I don't know how many people the FBI is looking for, but let's assume a generous 1,000,000 people. At an 85% hit rate, that's 850,000 people correctly identified, against some 63,000,000 false flags. That's about 1.3%.

    This assumes a lot of things. One, I really don't think the FBI is looking for a million people, let alone has pictures of all of them clear enough to feed into the software. Two, this assumes that every single person only ever walks past one camera. The success rate drops dramatically as people walk past more cameras.

    It would cost far too much money to use a system that in the end probably has a success rate lower than 1%.
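
    The same base-rate math, sketched in Python. The population and watchlist sizes are the commenter's assumptions; only the 85%/20% rates come from the quoted spec:

        # Base-rate sketch: what share of flags point at the right person?
        population = 316_000_000
        watchlist = 1_000_000   # the commenter's "generous" assumption
        TAR, FAR = 0.85, 0.20   # rates from the quoted NGI spec

        true_hits = watchlist * TAR                   # 850,000
        false_flags = (population - watchlist) * FAR  # 63,000,000

        precision = true_hits / (true_hits + false_flags)
        print(f"Flags that are actually correct: {precision:.2%}")

    That prints about 1.33%, matching the roughly 1.3% figure above.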

  30. John Fenderson (profile), 15 Oct 2013 @ 11:14am

    Re: Re: Re:

    I'm remembering recent misuses of face recognition technologies on a mass scale, such as at the Super Bowl. The clear intention is to, ultimately, scan everyone who passes by any camera automatically.

  31. Anonymous Coward, 15 Oct 2013 @ 11:46am

    Re: Re: You've got the statistics backwards.

    Some surveillance cameras come with a weapon attached; it's called a Hellfire missile, and they rely on long-range target recognition. The named people killed have an interest in staying 'dead' when a mistake is made, since the US stops looking for them. Makes you think, doesn't it?

  32. Jake, 15 Oct 2013 @ 12:46pm

    Just to be scrupulously fair to the FBI here, does anyone happen to know what percentage of false positives are generated by having a police officer and/or a witness pick a suspect out of a line-up or from a selection of photographs?

  33. aldestrawk (profile), 15 Oct 2013 @ 3:54pm

    Re: This is total BS ..

    The ethics of using an ID system that is not completely accurate depends entirely on how it is used. Yeah, 20% for false positives is a maximum, and real-world usage will likely show better results. But if you think the FBI is incapable of intentionally abusing an ID system or making gross mistakes, then look at the case of Brandon Mayfield. On the basis of fingerprint identification, and the fact that he converted to Islam in the late 1980s, he was arrested in 2004 and held for over two weeks as a material witness. The FBI first claimed his fingerprints were a 100% match with those found on a bag from the Madrid train bombing. It turned out, from information discovered during the lawsuit brought by Mayfield, that there were 20 individuals in the US whose fingerprints were SIMILAR to the one found in Spain. The FBI investigated all of them. Because of Mayfield's Islamic beliefs, he became the prime suspect despite not having left the US in over 10 years. Furthermore, before his arrest, Spanish authorities said his fingerprints were not a match. The FBI disregarded all this and arrested him anyway.

    It doesn't worry me that the FBI is looking to adopt facial recognition and I probably agree with you that this article complains about its accuracy without knowing how it will be used. I am worried about how they will use it. Do not fool yourself into thinking the FBI will not use facial recognition to arrest someone. It may not be the only factor in the arrest but, as with fingerprints, law enforcement tends to be eagerly biased in favor of its usage and tends to disregard what science says about the level of doubt.

  34. J Durocher, 15 Oct 2013 @ 8:17pm

    Completely and totally SDRAWKCAB

    FBI agents already manually compare suspect photos to their database. It's labor intensive. So they put out a spec saying we'd like to purchase some software that will eliminate *at least* 85% of this manual review.

    I would like to find some software that eliminates 85% of my busy work, too.

  35. btrussell (profile), 16 Oct 2013 @ 3:25am

    Re: Re: This is total BS ..

    "The FBI disregarded all this and arrested him anyway."

    Once LEOs identify a suspect, they focus on making that target guilty, not actually investigating everything.

  36. John Fenderson (profile), 16 Oct 2013 @ 11:44am

    Re: Completely and totally SDRAWKCAB

    I have an even better idea for reducing law enforcement's workload: fewer and better laws.

  37. Torg (profile), 18 Oct 2013 @ 4:27pm

    Re: Re:

    "What if you're walking down the street, are pegged as a potential terrorist by a camera, and a cop comes to check you out? First, that's a terrifying thing for most people right off the bat."

    That would be a very stupid system. It's simple. Humans are quite good at facial recognition. Someone in your proposed scenario is presumably already being shown a picture of the person that's been flagged. Just put the picture the camera's matched them to next to it and let the cop check for any differences that the computer missed. This should keep the rate of false positives at about the same level as when humans just watched the cameras without help.

  38. davidbarcomb, 3 Dec 2014 @ 6:23pm

    Re:

    I see. Thanks for the info
