Law Enforcement's New Facial Recognition Toy Scrapes Photos From Websites, Serves Up 'Matches' In Seconds
from the be-your-own-worst-enemy dept
The biggest collection of biometric data isn't housed by any government agency. Nor is it owned by any single private company. It's the internet itself, which hosts billions of face photographs that one company is using to give law enforcement perhaps its sketchiest facial recognition tool yet. Kashmir Hill has the full report for the New York Times.
Until recently, Hoan Ton-That’s greatest hits included an obscure iPhone game and an app that let people put Donald Trump’s distinctive yellow hair on their own photos.
[...]
His tiny company, Clearview AI, devised a groundbreaking facial recognition app. You take a picture of a person, upload it and get to see public photos of that person, along with links to where those photos appeared. The system — whose backbone is a database of more than three billion images that Clearview claims to have scraped from Facebook, YouTube, Venmo and millions of other websites — goes far beyond anything ever constructed by the United States government or Silicon Valley giants.
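What the quoted description boils down to is a fairly standard pipeline: detect a face in the uploaded photo, reduce it to an embedding vector, and search a scraped gallery for the closest vectors and the pages they came from. Here's a minimal sketch of that general technique using the open-source face_recognition library -- the gallery, file paths, and URLs are invented for illustration, and none of this reflects Clearview's actual code, models, or matching thresholds.

```python
# A rough, hypothetical sketch of the general "embed and search" approach
# described above, using the open-source face_recognition library. Everything
# here (paths, URLs, the tiny gallery) is made up for illustration; it is not
# Clearview's implementation.
import face_recognition
import numpy as np

# Pretend gallery of photos scraped from public pages, keyed by source URL.
gallery = {
    "https://example.com/profiles/123/photo.jpg": "scraped/photo_001.jpg",
    "https://example.com/events/456/group.jpg": "scraped/photo_002.jpg",
}

# Precompute one 128-dimensional face embedding per gallery photo.
known_encodings, known_sources = [], []
for source_url, local_path in gallery.items():
    image = face_recognition.load_image_file(local_path)
    encodings = face_recognition.face_encodings(image)
    if encodings:  # skip photos where no face was detected
        known_encodings.append(encodings[0])
        known_sources.append(source_url)

# Embed the probe image (say, a still pulled from a bystander's video) and
# rank the gallery by distance -- smaller distance means a more similar face.
probe_image = face_recognition.load_image_file("probe/still_from_video.jpg")
probe_encoding = face_recognition.face_encodings(probe_image)[0]
distances = face_recognition.face_distance(known_encodings, probe_encoding)

for idx in np.argsort(distances)[:5]:
    print(f"{known_sources[idx]}  (distance={distances[idx]:.3f})")
```

At real scale the nearest-neighbor search would presumably sit behind an approximate index rather than a brute-force loop, but the basic shape -- embeddings computed from scraped photos, compared against a probe image, with source links returned -- is the same.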
The scraping of photos presumably continues. There's a market for this -- a growing market of law enforcement agencies drawn to Clearview's billions of images and lower cost. They're no longer limited to government databases or required to sign MOUs with other agencies to obtain access.
The law enforcement agencies that have used it love it. But that about ends the list of the app's fans. Certainly no one concerned about the unreliability of facial recognition tech is going to be pleased an unproven upstart is matching faces against the entire internet. The sites scraped by Ton-That's software aren't happy either, claiming the scraping violates their terms of service.
But that's something for the courts to decide! (Badly, most likely.) Until the CFAA suits start rolling in, Clearview is capitalizing on the success stories being passed around by law enforcement agencies. Not bad for an app first conceived as a way to vet babysitters or allow hotels to greet guests by name.
In February, the Indiana State Police started experimenting with Clearview. They solved a case within 20 minutes of using the app. Two men had gotten into a fight in a park, and it ended when one shot the other in the stomach. A bystander recorded the crime on a phone, so the police had a still of the gunman’s face to run through Clearview’s app.
They immediately got a match: The man appeared in a video that someone had posted on social media, and his name was included in a caption on the video. “He did not have a driver’s license and hadn’t been arrested as an adult, so he wasn’t in government databases,” said Chuck Cohen, an Indiana State Police captain at the time.
This is just one example. There are many more in Kashmir Hill's article. Clearview's pitch to cop shops talks about identifying sex offenders, John Doe corpses, and identity fraud suspects. The Gainesville, Florida police department ran cold cases against the app and came up with more than 30 suspects.
Clearview's accuracy is still unproven. That a detective ran the app and created a list of 30 suspects does not mean the Gainesville PD is any closer to solving these cases. But it may be much closer to rounding up innocent people and trying to put them behind bars. Clearview's AI was not tested by the National Institute of Standards and Technology in its recent examination of facial recognition tech, but that's probably a good thing. It would only have made the average results worse.
[T]he company said its tool finds matches up to 75 percent of the time. But it is unclear how often the tool delivers false matches, because it has not been tested by an independent party…
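To see why that untested false-match rate matters more than the 75 percent figure, it helps to do the arithmetic at database scale. The numbers below are assumptions (only the roughly three-billion-image database size comes from the reporting above); they're just to show how even a tiny per-comparison error rate turns into a pile of wrong answers on every search.

```python
# Back-of-the-envelope sketch with assumed numbers -- these are not measured
# Clearview figures. Only the database size is taken from the reporting above.
database_size = 3_000_000_000   # ~3 billion scraped photos
false_match_rate = 1e-7         # assumed: one bogus match per 10 million comparisons

# Each probe image is effectively compared against the entire gallery, so the
# expected number of incorrect candidates surfaced per search is:
expected_false_matches = database_size * false_match_rate
print(f"Expected false matches per search: {expected_false_matches:.0f}")  # -> 300
```

Hundreds of innocent people's photos per search, from an error rate that sounds negligible on paper -- which is exactly why independent testing of the false-match rate matters.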
Some of that unpredictability comes from Clearview's unique "database": images scraped from all over the internet and "matched" using an unknown set of constraints. With enough pushback from the sites "donating" their users' images to Clearview, the app would become useless -- no better than cops running reverse image searches from security camera footage and hoping for the best. This statement of confidence from the app's developer is less than reassuring.
“A lot of people are doing it [scraping sites],” Mr. Ton-That shrugged. “Facebook knows.”
Yep. And some of the scrapees might take steps to prevent future scraping. Or any number of competitors can scrape the same sites, dump the images into some serviceable AI software, and tell cops they've just identified the guy wanted for an unsolved murder from three decades ago. Whether or not that pans out, the cops at least have someone they can talk to, rather than a dusty file no one's bothered to look at for the last two decades.
Clearview is just another step towards the elimination of privacy in public. And it's being fed unwittingly by sites and services that offer a way for people to connect. Even Clearview's CEO seems taken aback by the implications of the software he's selling to government agencies. But it's not enough to stop him from doing it.
Filed Under: ai, facial recognition, hoan ton-that, law enforcement, peter thiel, scraping
Companies: clearview ai