NIST Study Confirms The Obvious: Face Masks Make Facial Recognition Tech Less Useful, More Inaccurate
from the for-now... dept
At the end of last year, the National Institute of Standards and Technology (NIST) released its review of 189 facial recognition algorithms submitted by 99 companies. The results were underwhelming. The tech that law enforcement and security agencies seem to believe is a game changer turns out to be just more of the same bias we've been subjected to for years without any AI assistance.
Asian and African American people were up to 100 times more likely to be misidentified than white men, depending on the particular algorithm and type of search. Native Americans had the highest false-positive rate of all ethnicities, according to the study, which found that systems varied widely in their accuracy.
The faces of African American women were falsely identified more often in the kinds of searches used by police investigators where an image is compared to thousands or millions of others in hopes of identifying a suspect.
Who were the winners in NIST's facial recognition runoff? These guys:
Middle-aged white men generally benefited from the highest accuracy rates.
We have some good news and bad news to report from NIST's latest facial recognition study [PDF]. And the good news is also kind of bad news. (The bad news contains no good news, though.)
The bad news is that the COVID-19 pandemic is still ongoing. This leads to the good news: face masks -- now a necessity and/or requirement in many places -- are capable of thwarting facial recognition systems.
Using unmasked images, the most accurate algorithms fail to authenticate a person about 0.3% of the time. Masked images raised even these top algorithms’ failure rate to about 5%, while many otherwise competent algorithms failed between 20% and 50% of the time.
But that's also bad news for anyone relying on the tech: masks drive up the error rate, mostly in the form of false negatives. The tiny bit of good news is that masks generate mostly unusable images for passive systems (like those installed in the UK) that collect photos of everyone who passes by their lenses. The other small bit of good news in this bad news sandwich is this: face masks reduce the risk of bogus arrests/detainments.
While false negatives increased, false positives remained stable or modestly declined.
NIST also noticed a couple of other quirks in its study. Mask coverage obviously matters: the more of the face that's covered, the less likely it is the software will draw the correct conclusion. But color also matters. Black masks produced more bad results than blue masks.
Companies producing facial recognition tech (89 algorithms were tested by NIST for this project) aren't content to wait out the pandemic. Many are already working on algorithms that use fewer features to generate possible matches. This is also bad news. While the tech may be improving, working around masks by limiting the number of data points needed to make a match is just going to generate more false positives and false negatives. Meanwhile, companies are already training their AI on face-masked photos, many of which are being harvested from public accounts on social media websites. Dystopia is here to stay. The pandemic has only accelerated its arrival.
Filed Under: face masks, facial recognition, nist, pandemic
Reader Comments
support Techdirt
" Black masks produced more bad results than blue masks"
In case you were on the fence about getting a "The content of this mask are is no longer available due to a copyright claim" mask.
Remembering
I seem to remember reports from a few years back in which people would deliberately wear makeup or other head coverings which severely lowered the accuracy of facial recognition systems. It's good to see that not much has changed, and that they can still often be fooled. But now that people have a deliberate excuse to wear a mask, I can't wait to see masks and deliberately confusing features combined.
Picture a mask with an image of someone else's face on it.
Perhaps a way to convince the Anti-government types?
If you tell them that it's harder for the government to track you if you wear a mask, perhaps a few more COVIDiots will wear one. Who am I kidding, they're not anti-government or anti-surveillance state; they're against those things for themselves -- they encourage it for everyone else.
Re: support Techdirt
Already Got Mine
Re: Remembering
You're starting to get it, Koby. Keep at it and we'll welcome you with open (digital) arms…
Before anyone says "Whoa! 0.3% sounds pretty good! And 5% ain't too shabby either," let's remember the base rate fallacy involved in these numbers. (Example 3 has the least math.) A 0.3% failure rate is seriously bad, 5% is way worse, and the rest, with 20% to 50% failure rates, are in the "you can't be serious" category.
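(A rough back-of-the-envelope sketch of the point being made above, using the failure rates quoted in the article. The 100,000-scans-per-day checkpoint volume is an assumed, purely illustrative figure, not something taken from the NIST report.)
```python
# Back-of-the-envelope math on the failure rates quoted in the article.
# The per-attempt rates come from the article; the daily checkpoint
# volume is a made-up illustrative number, not from the NIST report.

failure_rates = {
    "top algorithms, unmasked": 0.003,            # ~0.3%
    "top algorithms, masked": 0.05,               # ~5%
    "other algorithms, masked (low end)": 0.20,   # 20%
    "other algorithms, masked (high end)": 0.50,  # 50%
}

scans_per_day = 100_000  # hypothetical checkpoint volume (assumption)

for label, rate in failure_rates.items():
    failed_matches = rate * scans_per_day
    print(f"{label}: ~{failed_matches:,.0f} failed authentications per day")
```
Even the best case, 0.3%, works out to roughly 300 failed matches a day at that volume; percentages that sound small stop sounding small once they're multiplied by real-world traffic.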
Soon your facemask will have a unique barcode for your easy identification.
Re: Re: Remembering
A disappointing comment. I reckon that I agree with about 80% of the articles written here. Am I not welcome here unless I meet some minimum threshold? 90%? 95%? Or do you not tolerate anyone who doesn't agree with you completely?
Re: Re: Re: Remembering
"I reckon that I agree with about 80% of the articles written here."
Can you point to one that wasn't dumb contrarianism or completely missing the point of the article (even sometimes missing the reality of your own argument)? Because usually you just say stupid shit, reply once or twice then disappear from the thread forever when people start telling you how wrong you are.
Re: Re: Re: Remembering
"Am I not welcome here unless I meet some minimum threshold? 90% 95%? "
It's not the percentage of agreement that's the problem. It's the fact that in the articles where you feel compelled to grind an axe, the arguments you bring to the table are more often than not flat-out lies.
"Disagreeing" is having an argument with multiple people where the same facts are hashed out and multiple viewpoints clash over the interpretation, or over where the line is drawn in a compromise.
What you often do, however, on a few topics like free speech online, is offer a line of argumentation that boils down to saying "Behold! My argument!" and then squatting down to take a dump.
THAT is the main issue here.
Masks...
F: Why do you wear a mask? Were you burned by acid or something like that?
W: Oh, no. It's just that they're terribly comfortable. I think everyone will be wearing them in the future.
Almost. Try RFID.