FBI's Facial Recognition Database Still Huge, Still Inaccurate, And DOJ Shows Zero Interest In Improving It
from the ALL-YOUR-FACE-ARE-BELONG-TO-US dept
The FBI's biometric database continues to grow. Its Next Generation Identification system (NGI) is grabbing everything it can from multiple sources, compiling millions of records containing faces, tattoos, fingerprints, etc. from a blend of criminal and non-criminal databases. It went live in 2014, but without being accompanied by the Privacy Impact Assessment (PIA) it promised to deliver back in 2012.
Lawsuits and pressure from legislators finally forced the FBI to comply with government requirements. That doesn't mean the FBI has fully complied, not even two years past the rollout. And it has no interest in doing so in the future. It's currently fighting to have its massive database exempted from federal privacy laws.
Much of the information we have about the FBI's NGI database has come from outside sources. The EFF and EPIC have forced documentation out of the agency's hands via FOIA lawsuits. And now, the Government Accountability Office (in an investigation prompted by Sen. Al Franken) is turning over more information to the public with its review of the system.
What the GAO found is more bad news. The FBI is all about collecting data. It has little interest in ensuring the data is accurate or even usable.
The FBI’s system searches not just its own database, but also photo databases maintained by seven participating states, the US Department of State – which issues passports – and the US Department of Defense, shared among federal law enforcement agencies and the participating agencies, though access on the state level is obtained through the FBI.
This is only part of the NGI. To amass the 411 million photos it has collected to this point, the FBI dumps in the contents of a national criminal database.
[T]he GAO report found a much larger program, run by the criminal justice information services division of the FBI (CJIS), called Facial Analysis, Comparison and Evaluation, or Face, which “conducts face recognition searches on NGI-IPS and can access external partners’ face recognition systems to support FBI active investigations”.
The multiple inputs -- which allow criminal and non-criminal biometric data to intermingle -- still return an alarmingly high number of false positives. According to data obtained by EPIC, the facial recognition portion showed an error rate of 15-20% in the top 50 results returned from searches. That was the error rate in 2010. We might assume accuracy has improved since then, but we have no way of knowing the current error rate, because the FBI is uninterested in policing the accuracy of its own database.
From the GAO report [PDF]:
Prior to deploying NGI-IPS, the FBI conducted limited testing to evaluate whether face recognition searches returned matches to persons in the database (the detection rate) within a candidate list of 50, but has not assessed how often errors occur. FBI officials stated that they do not know, and have not tested, the detection rate for candidate list sizes smaller than 50, which users sometimes request from the FBI… Additionally, the FBI has not taken steps to determine whether the face recognition systems used by external partners, such as states and federal agencies, are sufficiently accurate for use by FACE Services to support FBI investigations
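To put those figures in rough perspective, here's a minimal back-of-the-envelope sketch in Python. The 85% detection rate and the 50-candidate list size are the numbers cited above; the search volume and the assumption that the person being sought is actually enrolled in the database are purely illustrative, not a model of how NGI-IPS actually behaves.

```python
# Back-of-the-envelope illustration of an 85% detection rate with 50-candidate lists.
# The detection rate and list size come from the figures cited in the article;
# the number of searches is an arbitrary, illustrative assumption.

DETECTION_RATE = 0.85      # true match appears somewhere in the candidate list
CANDIDATE_LIST_SIZE = 50   # default list size the FBI tested
SEARCHES = 1_000           # hypothetical searches where the subject IS in the database

missed_matches = SEARCHES * (1 - DETECTION_RATE)
found_matches = SEARCHES * DETECTION_RATE

# At most one candidate per list is the right person, so every other candidate
# returned is a non-match that a human reviewer has to sort through.
non_matches_returned = SEARCHES * CANDIDATE_LIST_SIZE - found_matches

print(f"Searches where the true match never shows up: ~{missed_matches:.0f} of {SEARCHES}")
print(f"Non-matching candidates handed to reviewers:  ~{non_matches_returned:,.0f}")
```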
The GAO report also points out the FBI has been severely delinquent in its obligations to the public. Reports it was supposed to deliver prior to rollout have only just recently appeared, including one release apparently prompted by the GAO's assessment of the NGI program.
NGI-IPS has been in place since 2011, but DOJ did not publish a System of Records Notice (SORN) that addresses the FBI's use of face recognition capabilities, as required by law, until May 5, 2016, after completion of GAO's review. The timely publishing of a SORN would improve the public's understanding of how NGI uses and protects personal information.
The GAO has made six recommendations to the agency, three of which are being disputed by the DOJ. According to the DOJ, the after-the-fact delivery of the mandatory reports doesn't need to be examined because the FBI "has established practices that protect privacy and civil liberties beyond the requirements of the law." This sounds like a claim that the FBI has "nothing to hide," which is at odds with the agency's unresponsiveness to demands for updated PIAs and SORNs over the last eight years.
The DOJ also disagrees that it should have to audit the facial recognition database's "hit rate," something that was only 80-85% accurate five years ago. (In fact, the FBI's specifications consider 85% accuracy to be acceptable when returning lists of possible suspects.) The DOJ claims the database can never return a false positive because it apparently has enough manpower and resources to chase down every bogus lead.
In its response, DOJ stated that because searches of NGI-IPS produce a gallery of likely candidates to be used as investigative leads instead of for positive identification, NGI-IPS cannot produce false positives and there is no false positive rate for the system.
The GAO understandably disagrees. Accuracy is important, especially if the FBI is going to put innocent people under investigation… or overlook potentially dangerous suspects.
Without actual assessments of the results from its state and federal partners, the FBI is making decisions to enter into agreements based on assumptions that the search results may provide valuable investigative leads. In addition, we disagree with DOJ’s assertion that manual review of automated search results is sufficient. Even with a manual review process, the FBI could miss investigative leads if a partner does not have a sufficiently accurate system.
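The GAO's point about manual review can be illustrated with the same sort of rough arithmetic. The partner detection rates below are hypothetical (the GAO's complaint is precisely that the FBI hasn't measured them); the sketch just shows that when a partner's system fails to return the true match, no amount of human review of the returned list can recover that lead.

```python
# Why manual review can't fix an inaccurate partner system: if the true match
# never appears in the candidate list, reviewers never get a chance to spot it.
# These detection rates are hypothetical; the GAO's finding is that the FBI
# hasn't measured the real ones.

SEARCHES = 1_000  # illustrative volume of FACE Services searches per partner

for partner_detection_rate in (0.95, 0.85, 0.70):
    leads_lost = SEARCHES * (1 - partner_detection_rate)
    print(f"Partner detection rate {partner_detection_rate:.0%}: "
          f"~{leads_lost:.0f} of {SEARCHES} true matches never reach a reviewer")
```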
The DOJ apparently still feels a 20% chance of putting the wrong person under investigation is acceptable. And it still believes it's so far ahead of the privacy curve that it doesn't need to apprise the public of the potential privacy implications of its massive biometric database. The information forced out of its hands by litigants and outside agencies shows the FBI is far more interested in collection than dissemination -- that it should be able to take all it wants from the public without having to hand out anything in return.
Filed Under: accuracy, doj, facial recognition, fbi, next generation identification, ngi, privacy
Reader Comments
Investigating the wrong people is fine
/s
When your job is to make sure you have at least a vague excuse for doing something, you're not interested in the accuracy of your data?
Why the call for accuracy?
it's a taboo..
So, can this database be used to convict someone?
Re: So, can this database be used to convict someone?
Re: Re: So, can this database be used to convict someone?
Be curious if people will roll over and accept it when they finally remove the 2nd Amendment that keeps getting in the way of their total control. By that I mean they cannot control an armed population through fear and terror tactics.
Re:
It seems to be working pretty well so far.
Re: Re:
Re: Re: Re:
Re: Re:
No armed rebellion could ever succeed in America; a) you couldn't get enough people to join in and b) you'd be outgunned, outnumbered, and out on every media platform as an evil terrorist and threat to national security, etc.
You're better off working towards a peaceful solution via the democratic system.
Why? An inaccurate database makes it much easier to get warrants for searches etc. based on a potential match. It also makes good faith much easier to justify for the same reason.
False positives are a feature, not a bug
If a system had a 60% accuracy rate, for example, it would return hits on an innocent person four out of every ten times.
To someone who prioritizes protecting the innocent over finding/punishing the guilty, that's four people searched/investigated who shouldn't have been.
For someone who prioritizes finding/punishing the guilty over protecting the innocent, on the other hand, that's four searches/investigations they otherwise wouldn't have been able to do, four more 'chances to find a criminal'.
You may not care that it's only the 2nd amendment, but if they get away with that, the others aren't far behind.
Soon, if you're under investigation the only right you'll have is the right to run for president.
Re:
"He had a gun! I saw it!"
If we were to disarm the people of the US...
Movies make shitty policy...
This isn't the first time we've seen this sort of magical thinking burn us. Billions of dollars have been wasted on programs that just don't work in reality, because those championing them are sure that if it worked in the movies, we can do it IRL.
Where is the pushback against these failed bad-movie-plot elements? Why do they keep spending money on magical things whose sellers can't actually deliver what they promised? Why don't they demand working examples run by outside firms, rather than computer renderings of how they imagine it could work after 20 years of unlimited spending?
Facial recognition
Re: Facial recognition