Oversight Committee Finds FBI's Facial Recognition Database Still Filled With Innocent People, Still Wrong 15% Of The Time
from the bigger-but-no-better dept
The House Oversight Committee finally took on the FBI's Facial Recognition Program and discovered what critics have been saying about it for years: it's broken, filled with innocent Americans, and completely out of control.
Approximately half of adult Americans’ photographs are stored in facial recognition databases that can be accessed by the FBI, without their knowledge or consent, in the hunt for suspected criminals. About 80% of photos in the FBI’s network are non-criminal entries, including pictures from driver’s licenses and passports. The algorithms used to identify matches are inaccurate about 15% of the time, and are more likely to misidentify black people than white people.
These aren't new criticisms. While the accuracy of database searches has gotten (slightly) better over the past seven years (it was only 80% "right" in 2010), nothing else has changed. The FBI is working with state and local agencies, like driver's license issuers, to ensure its facial recognition database is continually stocked with non-criminal entries.
The database continues to expand, as does its application. As was covered during the hearing, multiple body camera vendors are offering products that provide real-time face scanning, which turns routine patrol work into low-key surveillance. As it stands now, biometric databases and face scanning are the real Wild West, only it's rogue law enforcement roaming the lawless frontier, rather than the outlaws. Not only did the FBI deploy its biometric database well ahead of its Privacy Impact Assessment, it did so with nothing in the way of legal guidance. Several years later, this still hasn't changed.
“No federal law controls this technology, no court decision limits it. This technology is not under control,” said Alvaro Bedoya, executive director of the center on privacy and technology at Georgetown Law.
The Government Accountability Office's take on the FBI's facial recognition database hasn't improved much since last year. The FBI is still adding as many state databases as possible to its central biometric storage, while its oversight -- both the Inspector General's office and its Congressional overseers -- is being stiff-armed and stonewalled by an agency extremely averse to attempts to curb its powers. As it expands the database -- and as more vendors and government agencies make use of the collected data -- the number of false positives will only increase. This could make life extremely difficult for any number of Americans.
“It doesn’t know how often the system incorrectly identifies the wrong subject,” explained the GAO’s Diana Maurer. “Innocent people could bear the burden of being falsely accused, including the implication of having federal investigators turn up at their home or business.”
The FBI's testimony attempted to downplay this aspect by claiming the database is only used for "investigative leads," rather than identification of suspects. But that doesn't do anything to change the scenarios presented by the GAO. A mistaken lead could easily turn into a false accusation, and being under investigation would certainly result in the feds dropping by a person's home or workplace.
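To put rough numbers on the GAO's concern, here's a back-of-the-envelope sketch in Python. Only the ~15% error rate and the ~80% non-criminal share come from the article; the annual search volume is an invented figure, purely for illustration.

```python
# Back-of-the-envelope: how many bad "investigative leads" a 15% error rate produces.
# ANNUAL_SEARCHES is an assumed, illustrative figure -- not a number from the hearing.

ERROR_RATE = 0.15          # searches that return the wrong person (per the article)
NONCRIMINAL_SHARE = 0.80   # share of database photos that are non-criminal entries
ANNUAL_SEARCHES = 50_000   # hypothetical number of searches run in a year

bad_leads = ANNUAL_SEARCHES * ERROR_RATE
bad_leads_on_noncriminals = bad_leads * NONCRIMINAL_SHARE

print(f"Expected wrong matches per year: {bad_leads:,.0f}")
print(f"...of which would land on people with no criminal record: {bad_leads_on_noncriminals:,.0f}")
```

Whatever the real search volume turns out to be, the arithmetic is linear: every additional agency or vendor querying the database multiplies the absolute number of innocent people who surface as "leads."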
It's been nearly a decade since the FBI began working on this database and there's nothing to show for it but a slight uptick in accuracy. Civil liberties concerns remain unaddressed while the agency focuses on what's important to it: adding as many people as possible to the database. It has yet to demonstrate the system's real-world effectiveness, appearing to be far more interested in the "collect it all" tactics of the intelligence agency it clearly idolizes and emulates.
Filed Under: face recognition, fbi, house oversight committee
Reader Comments
As for the black people being misidentified, I guess black people all look the same even to the computer program?
Re: Re:
If you are suspected of being dangerous, many behaviors that seem innocent from your point of view can be misinterpreted by a fearful cop.
Re: Re:
Reacting in the wrong manner when confronted by police gets you shot.
Sometimes.
Sometimes all it takes is breathing to get shot by a cop.
Let me put this another way: A lot of folks are afraid of being killed by a terrorist. But the statistics show that the random person is three hundred times (times, not percent) more likely to be killed by a beat cop than a terrorist.
Logically, I should be much more afraid of my neighborhood cop than of Osama Bin Laden (even if he weren't dead). And I am.
This is far too nice a way of describing the problem. It would be more accurate to say that "No Federal law authorizes the FBI to recklessly create a database so likely to be abused, yet the FBI has done so anyway." The mere existence of this program demonstrates that the FBI has decided to follow the adage that it is better to beg forgiveness than to ask permission. History tells us that such forgiveness from its supposed overseers is almost guaranteed.
Re:
I've read that darker skin has lower contrast than lighter skin, which makes it more difficult for facial recognition algorithms.
Just saying.
The issue is the 15% false match rate.
misidentify more black people than white people
So the FBI's algorithm says "they all look alike to me"?
Maybe it is a legitimate problem like inadequate training data for the AI. Or that the AI inaccurately makes certain facial measurements that are compared. It seems to me that if the developers of this can make it work for white people, they can make it work for black people. It's just a technical problem.
Regardless of race or color, for the entire 15% that are misidentified, maybe someone should be looking at WHY they are misidentified. What metrics or other factors caused two different people's photographs to be considered the same person? Can the algorithm be tweaked for that? Or can these pairs be introduced into the training data as a definite mismatch to improve the AI training?
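(What this commenter describes is roughly what's known as hard negative mining: confirmed false matches get fed back into training labeled as different people. Below is a minimal, hypothetical sketch of that bookkeeping step, not the FBI's actual pipeline; the embeddings, record IDs, and similarity threshold are all invented for illustration.)

```python
# Hypothetical sketch of "hard negative mining": confirmed false matches are
# recorded as definite mismatches so a future training run can learn from them.
# Embeddings, record IDs, and the threshold are illustrative, not from any real system.
import numpy as np

rng = np.random.default_rng(0)
MATCH_THRESHOLD = 0.8  # cosine similarity above which the system declares a "match"

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

probe = rng.normal(size=128)                                       # embedding of the search photo
candidates = {f"record_{i}": rng.normal(size=128) for i in range(1000)}
candidates["record_42"] = probe + rng.normal(scale=0.1, size=128)  # plant one near-duplicate

# Records the system would flag as the same person as the probe.
flagged = {rid: emb for rid, emb in candidates.items()
           if cosine_similarity(probe, emb) > MATCH_THRESHOLD}

# A human reviewer confirms the flagged records are NOT the person in the probe photo,
# so each pair becomes a labeled "definite mismatch" for the next training run.
hard_negatives = [(probe, emb, 0) for emb in flagged.values()]     # label 0 = different people
print(f"{len(hard_negatives)} confirmed false matches queued as hard negatives")
```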
Re:
If they did that, I wonder how low the error rate might go?
Re: misidentify more black people than white people
Maybe yes, maybe no. By necessity, this is data that they took from unwilling, and often uninformed, members of the public. Even if we assume that they took only curated imagery (i.e. driver license official photographs), they are still reliant on the state to collect a quality photograph. If the photography environment is poor (e.g. using only the overhead general light instead of a specific flash when the image is captured), the picture may not have enough quality for the algorithm to work well. This is particularly likely to show bias against dark-skinned subjects, since by definition they reflect less ambient light, so you need more light on them in order to capture defining features. If the algorithm is given only low-quality input photographs, low-quality analysis is much more likely. If dark-skinned people photograph poorly in environments not designed specifically for quality imagery, then photographs of blacks will "all look alike" in the database.
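One concrete way to act on that point would be a quality gate at enrollment: reject captures that are too dark or too flat before they ever enter the database. The sketch below is a hypothetical illustration of that idea, with invented thresholds; it isn't anything the FBI or the state agencies are known to do.

```python
# Hypothetical enrollment-time quality gate: reject photos too dark or too
# low-contrast to yield reliable facial features. Thresholds are illustrative.
import numpy as np

MIN_MEAN_BRIGHTNESS = 60.0  # 0-255 grayscale; below this, the face is underexposed
MIN_RMS_CONTRAST = 25.0     # std. dev. of pixel values; below this, features wash out

def acceptable_capture(gray_image: np.ndarray) -> bool:
    return (gray_image.mean() >= MIN_MEAN_BRIGHTNESS and
            gray_image.std() >= MIN_RMS_CONTRAST)

rng = np.random.default_rng(1)
well_lit = rng.normal(loc=120, scale=40, size=(480, 640)).clip(0, 255)      # stand-in for a good capture
underexposed = rng.normal(loc=35, scale=10, size=(480, 640)).clip(0, 255)   # stand-in for a dim one

print("well-lit capture accepted:", acceptable_capture(well_lit))           # True
print("underexposed capture accepted:", acceptable_capture(underexposed))   # False
```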
Re: Re:
Why restrict it to wrongfully convicted? Accusation alone is unfortunately quite damning. Wrongfully arrested, even if found not guilty, is worse still. Wrongful conviction beats both those, but all three groups have a legitimate grievance if the wrongful actions were triggered by the poor quality of this database.
Re: Re:
Welcome to "Big Data". (SEE False positive paradox)
Re: Re:
And, pray tell, how does one "react" properly?
Re: "three hundred times"
2. Terrorism hits US people pretty much at random (except with a bias toward urban concentrations of people). Killings by the police do not hit at anything like random. Judging from the Washington Post database, over 90% are straightforward cases of self defense. Even most of the causes célèbres of BLM involve felonious activity or struggling with or fleeing from the police.
So no, the random person is about as likely to be killed by terrorism as by police, not 300 times more likely, unless you mean "random" very literally, but misleadingly, to include in the selection those who tried to kill police.
Re: Re: Re:
The recent shooting in Texas of an unarmed man is a good example. The guy had PCP in his car. Random? Random is when you are walking down the street and get hit by a stray policeman's bullet.
Is there greater risk in leaving things as they are, or is there greater risk that 3/4 of the states will ratify some greater infringement of our liberties?
Recall that the first and best limit to government is a limit on the amount it has available to spend. That is why the first order of business must be how to stop government from having access to as much money as it wants to create out of thin air and limit it only to the amount of money the voters are willing to remit in taxes.
Re:
That is kind of how the whole thing works.
I would love to see new amendments on abortion and immigration. Let's just settle it once and for all: no more interpreting the Constitution, let us just decide whether we allow it or not. Convene a constitutional convention and settle it.