Documents Show Hundreds Of Cops Have Run Clearview Searches, Often Without Their Employers' Knowledge Or Permission

from the doing-stuff-just-because-no-one-said-not-to dept

An impressive trove of public records obtained by BuzzFeed shows just how pervasive facial recognition tech is. Law enforcement agencies are embracing the tech, often with a minimum of accountability or oversight. That's how toxic tech purveyors like Clearview -- whose software relies on a database of billions of images scraped from the web -- get their foot in the door to secure government contracts.
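To make the mechanism concrete: tools in this class generally convert a face photo into a numeric "embedding" and then look for the closest embeddings in an index built from scraped photos. The sketch below is a hypothetical illustration of that general approach, not Clearview's actual code; the embedding function, index, and URLs are all made-up placeholders.

```python
# Hypothetical sketch of embedding-based face search against a scraped-photo index.
# Nothing here is Clearview's real code; embed_face() stands in for a face-embedding
# model, and the "scraped" index is random data used purely for illustration.
import numpy as np

EMBEDDING_DIM = 128  # assumed embedding size, typical for face-recognition models


def embed_face(image_pixels: np.ndarray) -> np.ndarray:
    """Stand-in for a face-embedding model: maps an image to a unit-length vector."""
    seed = abs(hash(image_pixels.tobytes())) % (2**32)
    vec = np.random.default_rng(seed).standard_normal(EMBEDDING_DIM)
    return vec / np.linalg.norm(vec)


def best_matches(query_vec, index_vecs, index_urls, top_k=5):
    """Rank every indexed photo by cosine similarity to the query face."""
    scores = index_vecs @ query_vec  # unit vectors, so dot product == cosine similarity
    order = np.argsort(scores)[::-1][:top_k]
    return [(index_urls[i], float(scores[i])) for i in order]


# Build a toy "scraped" index of 1,000 photos, then search it with one query image.
index_vecs = np.vstack([embed_face(np.random.rand(64, 64, 3)) for _ in range(1_000)])
index_urls = [f"https://example.com/photo/{i}" for i in range(1_000)]  # placeholder URLs
query_vec = embed_face(np.random.rand(64, 64, 3))
for url, score in best_matches(query_vec, index_vecs, index_urls):
    print(f"{score:.3f}  {url}")
```

The point of the sketch is simply that the "magic" is a similarity search over whatever photos happened to be scraped, which is why the quality and provenance of that scraped database matter so much.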

Despite being used for years, facial recognition tech has yet to prove it's capable of recognizing the right faces more often than the wrong ones. Accuracy gets even worse when the tech is asked to recognize the faces of women and minorities, and given law enforcement's history of disproportionate enforcement, it will be minorities who are harmed by the inaccurate tech more often than not.

What BuzzFeed has done with these Clearview records is compile a searchable database that allows readers to see if their local agencies have tried out the tech. Clearview's tech has yet to be subjected to outside review and its method of obtaining images -- scraping them from public posts on the web -- leaves a lot to be desired in terms of accuracy. (Unfortunately, as the EFF's Dave Maas points out, this doesn't mean BuzzFeed has made the dataset public -- only its interpretation of the data. But we'll take what we can get.)

The upshot? Lots and lots of experimentation. The downside? Very little oversight or explicit permission. According to the information BuzzFeed obtained, more than 335 US law enforcement agencies have at least tried out Clearview's facial recognition AI, and many of the searches their employees ran had nothing to do with investigations.

Several of the responding agencies appear to be paying little attention to the actions of their employees:

Officials at 34 of those organizations said they were unaware that their employees had signed up for free trials until our questions prompted them to look.

Meanwhile, others pretended they had no responsive documents until asked twice:

Officials at another 69 entities at first denied their employees had used Clearview but later determined that some of them had.

While a tally of 335 law enforcement agencies may seem minute in comparison to the total number of law enforcement agencies in the United States, it would be wise to remember this dataset is far from complete. Nearly 100 agencies refused to answer definitively whether or not their employees had explored Clearview's offerings. Nearly 1,200 agencies have yet to turn over requested documents.

Meanwhile, Clearview has been out there touting law enforcement successes few law enforcement agencies will acknowledge. It also encourages the perception it is currently partnering with hundreds of cop shops while trying to divert attention away from its willingness to grant access to anyone interested in its unproven tech, including government agencies in countries known for their human rights abuses.

This data shows there's more interest in Clearview than law enforcement (as a whole) is willing to admit publicly. It also shows cops are playing with unproven tech -- quite possibly using images of people who aren't suspected of anything -- without the knowledge of their supervisors or government officials charged with overseeing their actions. What's left unsaid -- or unresponded to -- is at least as concerning as what has been admitted publicly. There's a rogue AI on the loose that links cops to a database filled with billions of images scraped from websites without their -- or their end users' -- approval.

Clearview itself is problematic enough. But these agencies' willingness to examine and exploit this tech is even more so, considering the damage the careless use of unproven facial recognition tech can do to rights and civil liberties.



Filed Under: facial recognition, police, privacy
Companies: clearview


Reader Comments

    Chris-Mouse (profile), 7 Apr 2021 @ 3:04pm

    It would be interesting to see what would happen if the police department were to run photographs of all their officers through the Clearview search. I suspect the result would be a lot less enthusiasm for facial recognition afterwards.

      That Anonymous Coward (profile), 7 Apr 2021 @ 6:45pm

      Re:

      Sadly no.

Clearview only exists to take a photo of an unknown person & tell you who they are by doing some voodoo with the billions of scraped images.

Given the history of police using the information databases they already have access to in order to stalk exes, hit on cuties they pulled over, & commit other really creepy rule-violating actions (that end up with no real punishments), it's more likely they were using them to try and get laid.

Sadly, no one knows how good Clearview is. On the 'upside,' their training dataset includes multiple skin tones & facial features, but it's still a black box that shouldn't be anywhere near legal proceedings.
Their sketchy history & actions, combined with the founder being besties with notorious floor pooper Chuck, aren't helping.

Given how police like to omit facts from cases once they've decided on a target, one wonders if one of them was dumb enough to use this tech to secure a conviction. And given the departments that first claimed 'oh, we never did that' & then admitted 'oh, I guess we did,' one has to wonder: if they'll lie about verifiable facts, can they be trusted in court?

For all of these advancements in tech that they keep saying we need... can anyone actually produce cases where it worked?
We've got people railroaded into jail a few times now b/c someone relied on unproven tech with a racial bias, but no real success stories. I wonder if anyone's run it on the Jan 6 protesters the way they've used it in other cases... all of those white faces are perfect for the racist tech to ID.

    Pixelation, 7 Apr 2021 @ 6:54pm

    Color me shocked

    A slap on the wrist and a "Don't do it again, wink, wink". Until there are consequences for this, it will keep happening.

    Tanner Andrews (profile), 8 Apr 2021 @ 2:11am

    Optimist is Expecting Things to be Better than Real Life

    A slap on the wrist

    Such a harsh punishment is unlikely to be administered to police officers running Clearview searches.


