Clearview Finally Submits AI For Independent Testing; Only Tests Feature It Isn't Actually Selling
from the competing-for-a-high-score-no-one-cares-about dept
At long last, Clearview has had its AI tested by an independent party. It has avoided doing this since its arrival on the facial recognition scene, apparently content to bolster its reputation by violating state privacy laws, making statements about law enforcement efficacy that are immediately rebutted by law enforcement agencies, and seeing nothing wrong with scraping the open web for personal information to sell to government agencies, retailers, and bored rich people.
Kashmir Hill reports for the New York Times that Clearview joined the hundreds of other tech companies that have had their algorithms tested by the National Institute of Standards and Technology.
[M]ore than two years after law enforcement officers first started using the company’s app, Clearview’s algorithm — what allows it to match faces to photos — has been put to a third-party test for the first time. It performed surprisingly well.
In a field of over 300 algorithms from over 200 facial recognition vendors, Clearview ranked among the top 10 in terms of accuracy, alongside NTechLab of Russia, Sensetime of China and other more established outfits.
That seems to confirm CEO Hoan Ton-That's repeated claims that Clearview's AI is one of the most accurate in the business. But there's a huge caveat to his claims and these test results. This test does not reflect real-world use of Clearview's tech.
But the test that Clearview took reveals how accurate its algorithm is at correctly matching two different photos of the same person, not how accurate it is at finding a match for an unknown face in a database of 10 billion of them.
No one's buying access to Clearview to perform this task. And that certainly isn't what Clearview is selling or promising to potential customers when it pitches its 10 billion image database. So, Clearview calling this test result "an unmistakable validation" of its tech is, well, pure bullshit. It doesn't validate anything. All it possibly shows is that Clearview could be used to verify identity by matching faces -- something that might be useful for unlocking a phone or providing access to restricted areas.
What it doesn't show is Clearview's accuracy when it compares an uploaded photo to its billions of scraped images. Supposedly, Clearview will be allowing NIST to run a 1-to-many test of its AI (Ton-That says that will happen "shortly"). If that happens, we'll finally be able to see if Clearview's AI is as accurate as its CEO has repeatedly said it is.
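To make the distinction concrete, here's a minimal sketch of the difference between the two tasks. This is not Clearview's or NIST's code; the embedding vectors, threshold, and function names are all made up for illustration. It just shows why acing 1-to-1 verification (are these two photos the same person?) says little about 1-to-many identification (who, in a gallery of billions, is this unknown face?).

```python
import numpy as np

def cosine_similarity(a, b):
    """Compare two face embedding vectors; higher means more alike."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def verify(embedding_a, embedding_b, threshold=0.8):
    """1:1 verification -- the task NIST tested here: do these two
    photos depict the same person? (threshold is arbitrary)"""
    return cosine_similarity(embedding_a, embedding_b) >= threshold

def identify(probe_embedding, gallery, threshold=0.8):
    """1:N identification -- the task Clearview actually sells: search a
    probe face against a large gallery and return candidate matches,
    best first. False positives get likelier as the gallery grows."""
    scores = [(name, cosine_similarity(probe_embedding, emb))
              for name, emb in gallery.items()]
    return sorted((s for s in scores if s[1] >= threshold),
                  key=lambda s: -s[1])

# Toy usage with random vectors standing in for real model output.
rng = np.random.default_rng(0)
alice, bob = rng.normal(size=128), rng.normal(size=128)
print(verify(alice, alice))                           # True: same vector
print(identify(alice, {"alice": alice, "bob": bob}))  # only "alice" matches
```

The point of the sketch: a system can score well on `verify` while still producing plenty of wrong hits from `identify` once the gallery holds 10 billion faces, which is exactly the scenario the NIST test didn't measure.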
Even if it is accurate, it's still facial recognition tech -- something that comes with a lot of inherent drawbacks. Every AI tested by NIST showed some form of bias, like performing better when checking white male faces for matches. And it won't change the fact that Clearview's database is the product of web scraping -- something that's not illegal but definitely questionable. Internet users may agree to share information with others and the sites they use, but no one affirmatively agrees to allow Clearview to scoop up that information and sell it to government agencies. That's what has resulted in it being sued in a couple of states, and being kicked out of Canada entirely.
Clearview has no product without the unwilling and unaware contributions of millions of internet users. Even if its AI isn't as terrible as we're all free to believe it is, it will still just be the bottom feeder in a murky cesspool of facial recognition tech providers. Floating to the top of NIST's tank may give it better copy for its marketing materials, but it won't earn it any respect.
Filed Under: ai, facial recognition, privacy
Companies: clearview
Reader Comments
Clearview, the Theranos of facial recognition...
coward!
Just worth noting: while Clearview was kicked out of Canada, it took its database of Canadian images with it. To get your images removed, you needed to send them proof of identity. I attempted that three times, but could never provide exactly what they wanted without leaking even more PII to them in the process, and now the removal period is over.
So they've still got my photos and can use them everywhere in the world but Canada.
Thankfully, there appears to only be a single low-res image of me available to scrape, and a BUNCH of images of people with the same name as me (as well as a bunch of misattributed images of others associated with my name). So it's highly unlikely they've got a match in the first place.
Re:
'If you want us to remove your picture you must provide us even more detailed personal information', what could possibly go wrong with that...
Re: Stop scraping our images
And now across the pond, Database firm Clearview AI told to remove photos taken in Australia https://www.bbc.com/news/technology-59149236
Re: Re: Stop scraping our images
That's across the Pacific. "The pond" refers to the Atlantic specifically.
'But ONLY that part.'
It's like having your architectural abilities judged by officials inspecting one very specific part of a building you put together and nothing else; whether that part is good doesn't matter if the rest of it is a complete death-trap and those other parts are what you're actually selling to potential customers.
Re: 'But ONLY that part.'
It's more like having your structural brace tested to see how it handles horizontal load, when you market it on its vertical load capacity.
It's so cute, they are going to tout these results everywhere... and maybe just maybe in tiny 2pt font will mention what was tested.
Don't bother me with details!!!! NIST says it's in the top 10 and that's good enough for me... sign us up for the panopticon special!
The courts here in Australia have told them to knock it off, so we'll see, I suppose. The arrogance of that guy is breathtaking. What a self-entitled shit he is.