In Response To Getting Sued, Clearview Is Dumping All Of Its Private Customers

from the thanks-for-the-half-assed-largesse,-asses dept

This is the first good news we've heard from Clearview since its exposure by Kashmir Hill for the New York Times back in January. In response to a lawsuit filed against it in Illinois accusing it of breaking that state's privacy laws with its scraping of images and personal info from a multitude of social media platforms, Clearview has announced it's cutting off some of its revenue stream.

Clearview AI — the controversial face-tracking company known for scraping more than 3 billion photos from social media sites including Facebook and Twitter — said it is ending its relationships with non–law enforcement entities and private companies amid regulatory scrutiny and several potential class action lawsuits.

Responding to one of those lawsuits, Clearview claimed in legal documents filed in an Illinois federal court on Wednesday that it was taking those voluntary actions and would “avoid transacting with non-governmental customers anywhere.”

Clearview's motion to dismiss [PDF] argues this lawsuit is now moot since it will no longer be doing business with any Illinois entities and has apparently found some way to sort out personal info on Illinoisans from the rest of its scraped stash. Take a couple of grains of salt before reading the following:

Moreover, Clearview has taken steps to secure and implement limits regarding the retention of any Illinois photos. Clearview is terminating access rights to its app for all account holders based in Illinois and is terminating the accounts of any non-law enforcement or government entity.

This is better, but it's still nothing great. Governments can still use a privacy law-violating app, and that's still a big problem. While private companies may be using Clearview's app the most, it's the company's government customers that pose the bigger threat. Unproven tech may get you booted from the local mall for resembling a shoplifting suspect, but law enforcement agencies can ruin your life and take away several of your freedoms. And Clearview isn't exactly selective about who it sells to, so plenty of overtly abusive governments will still get to use untested facial recognition software to destroy lives without worrying about niceties like due process.

Presumably, this will end the use of Clearview's AI by Illinois government agencies. But reporters at BuzzFeed couldn't get any confirmation from the Chicago PD that its plug had been pulled.

While the Chicago Police Department did not return a request for comment, a spokesperson for the Illinois secretary of state confirmed the office had been using Clearview “for about six months” to “assist other law enforcement agencies.”

Law enforcement agencies make up most of Clearview's customers in the state. If Clearview's sworn statements are true, they should all find their accounts deleted in the very near future. But Clearview hasn't been too honest when dealing with judgments passed by the court of public opinion, so it's probably wise to hold our golf applause for this lawsuit-prompted pullout until more facts are in.

We also shouldn't read too much into Clearview's declaration that it will restrict gathering data on Illinois residents. Its one-state filtering system apparently relies on metadata -- much of which is stripped by sites when users upload photos. It will also avoid scraping data and images from sites with words like "Chicago" or "Illinois" in the URL, which will leave a lot of Illinois-based websites open for Clearview's scraping business, no matter what it may be asserting in court. Sure, this is Clearview "taking steps" to prevent violating state law, but these two specifics hardly add up to an effective filter.
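
To see how leaky that kind of filtering is, here's a minimal sketch of a geographic filter built on the two checks described in the filing: a URL keyword match and photo GPS metadata. Everything here is hypothetical -- the function names, the data layout, and the rough bounding-box numbers are ours, not Clearview's actual code. The point it illustrates: a photo of an Illinois resident hosted on facebook.com with its EXIF data stripped sails right past both checks.

    # Hypothetical sketch of a URL-keyword + GPS-metadata state filter.
    # Names, structure, and coordinates are illustrative assumptions only.

    ILLINOIS_KEYWORDS = ("illinois", "chicago")

    # Rough bounding box for Illinois (approximate, for illustration only).
    IL_LAT = (36.9, 42.6)
    IL_LON = (-91.6, -87.0)

    def looks_like_illinois(item: dict) -> bool:
        """Return True if either naive check flags the scraped item as Illinois-related."""
        # Check 1: URL keyword match. Misses Illinois residents whose photos live
        # on facebook.com, twitter.com, etc., where the state never appears in the URL.
        url = item.get("url", "").lower()
        if any(word in url for word in ILLINOIS_KEYWORDS):
            return True

        # Check 2: GPS metadata. Misses every photo whose EXIF data the hosting
        # platform stripped on upload -- which is most of them.
        gps = item.get("gps")  # expected as (lat, lon) or None
        if gps:
            lat, lon = gps
            if IL_LAT[0] <= lat <= IL_LAT[1] and IL_LON[0] <= lon <= IL_LON[1]:
                return True

        return False

    if __name__ == "__main__":
        samples = [
            {"url": "https://www.chicagotribune.com/photo/123", "gps": None},
            {"url": "https://www.facebook.com/photo/456", "gps": None},      # Illinois resident, slips through
            {"url": "https://example.com/pic.jpg", "gps": (41.88, -87.63)},  # caught only if EXIF survived
        ]
        for s in samples:
            print(s["url"], "->", "filtered" if looks_like_illinois(s) else "kept")

In other words, the filter only catches photos that advertise their location, and most photos on major platforms don't.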

The better news is this:

In addition, the startup said it was implementing an “opt-out mechanism” to allow people to exclude photos from its database.

Again, no congratulations until we see this implemented, and only if there's some outside auditing done to ensure Clearview is actually doing the things it says it is. For that matter, the AI still hasn't been independently examined, so we don't even know what sort of false positive rate Clearview's three-billion-image database is capable of generating.
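
For a sense of scale, here's a hypothetical back-of-the-envelope calculation. The per-comparison false positive rates below are made up, since no independently verified figures exist, but even a seemingly tiny error rate, multiplied across a database of more than three billion photos, produces hundreds or thousands of false matches for every face searched.

    # Illustrative only: hypothetical per-comparison false positive rates.
    # No audited figures for Clearview's system have been published.

    DATABASE_SIZE = 3_000_000_000  # "more than 3 billion photos," per the reporting above

    for fp_rate in (1e-5, 1e-6, 1e-7):
        expected_false_matches = DATABASE_SIZE * fp_rate
        print(f"per-comparison FP rate {fp_rate:.0e}: "
              f"~{expected_false_matches:,.0f} false matches per probe image")

Even at a one-in-ten-million error rate, that's roughly 300 wrong faces surfaced for every search.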

Clearview is still as sketchy as ever and only slightly less dangerous following this move. As long as it still sells its scraped database and unproven AI to governments, it's still a threat to millions of people.

Filed Under: facial recognition, governments, privacy, surveillance
Companies: clearview, clearview ai


Reader Comments

  1. Anonymous Coward, 11 May 2020 @ 10:49am

    Bullshit.

    Presumably, this will end the use of Clearview's AI by Illinois government agencies.

    At this site, we're already familiar with parallel construction. Illinois agencies that want data merely have to ask the FBI to "help a buddy out, eh?"

    A January New York Times article on the company quoted someone who said almost exactly that:

    Mr. Clement, now a partner at Kirkland & Ellis, wrote that the authorities don’t have to tell defendants that they were identified via Clearview, as long as it isn’t the sole basis for getting a warrant to arrest them. Mr. Clement did not respond to multiple requests for comment.


  2. Anonymous Coward, 11 May 2020 @ 11:21am

    They're currently shredding millions of documents and having hard drives pulverized to dust, because the stuff on them exposes Clearview to not just millions, not just billions, but TRILLIONS of dollars in damages.

    Categorized how-to manipulations against ethnic groups, LGBTQ people, etc. All would result in millions upon millions of lawsuits worldwide, arrests, piercing of the corporate veil, and imprisonment.

    So quick... pretend we dumped some customers, "destroy" their records... and have plausible deniability for why they just spent 30 million dollars having vast amounts of data erased beyond all hope of recovery.


  3. ECA (profile), 11 May 2020 @ 12:41pm

    Another lovely history

    "it's the company's government customers that are more of a problem. Unproven tech may get you booted from the local mall for resembling a shoplifting suspect,"

    WHO is old enough to remember being ostracized after getting out of jail??
    What were the opinions of people who had been in prison??
    What is said here has BIG ramifications...
    Either they are merging the data with state/fed data, which means they will need A LOT of pictures to put together and get right...
    OR they are going to have a 3rd party get the data, and the two sell the data to the gov/state/anyone with enough money, and WON'T tell how they ID'd you.
    99% of the problem is implementation... and TIME. Other countries have done this or TRIED to do it, and learned the problems, as well as how NOT to do the job.


  4. Koby (profile), 11 May 2020 @ 1:54pm

    Re: Another lovely history

    OR they are going to have a 3rd party get the data, and the two sell the data to the gov/state/anyone with enough money, and WON'T tell how they ID'd you.

    Sounds just like radar cameras. 3rd party manufacture, can't inspect the proprietary evidence, and no accountability. History repeating indeed.


  5. Anonymous Coward, 11 May 2020 @ 3:00pm

    "Clearview's motion to dismiss argues this lawsuit is now mooted since it will no longer be doing business with any Illinois entities"

    LMAO. Seriously? That's like someone who slanders you and then moves to dismiss your suit because they promise they won't do it again. That argument doesn't fly. Since they have already violated the privacy of so many Americans, they can still be sued for that. Holding up your hand and promising never to do it again doesn't protect you from being sued. Clearview could easily violate Americans' privacy again in the future.

    Hate to say it, but there's no way their motion to dismiss is granted, because there is nothing preventing Clearview from engaging in that same activity in the future. The only way to ensure that companies don't engage in deceptive behavior is through lawsuits.

    "I promise never to do it again" is not a valid enough reason for a court to dismiss a lawsuit. It's laughable.


  6. Anonymous Coward, 11 May 2020 @ 3:40pm

    It's a smart move. Clearview doesn't have to worry about pesky things like bad publicity, lawsuits from customers or even ethics. I'm surprised they didn't make this move earlier.


  7. BG (profile), 12 May 2020 @ 3:05am

    I'm sure it will be simple and uncomplicated ...

    "In addition, the startup said it was implementing an “opt-out mechanism” to allow people to exclude photos from its database."

    I'm certain they will take inspiration from The Hitchhiker's Guide to the Galaxy and make the opt-out process about as reasonable and straightforward as viewing the planning notice for destroying the Earth in the aforementioned book.


  8. anonymous, 13 May 2020 @ 1:28am

    I hope they delete the data, too

    Because then they'd be on the hook for spoliation of evidence, and judges really don't like it when defendants delete evidence.
