Lawsuit Says Clearview's Facial Recognition App Violates Illinois Privacy Laws

from the OR-DOES-IT-[dramatic-side-eye] dept

Clearview has gathered a whole lot of (negative) attention ever since its exposure by Kashmir Hill for the New York Times. The facial recognition app developed by Hoan Ton-That (whose previous app was a novelty that allowed users to superimpose President Trump's distinctive hairdo onto their own heads) relies on scraped photos to perform its questionable magic. Rather than being limited to law enforcement databases, cops can upload a photo and search a face against pictures taken from dozens of websites.

The company's marketing materials claim cops have access to 3 billion face photos via Clearview -- all pulled from public accounts linked to names, addresses, and any other personal info millions of unwitting social media users have uploaded to the internet.

Its marketing materials also claim it has been instrumental in solving current crimes and generating suspect lists for cold cases. So far, very few of these claims seem to be based on fact. That's only one of the company's issues. Another is the heat it's drawing from companies like Twitter and Facebook, which claim photo scraping violates their terms of service. That's one for the courts, and it's only a matter of time before someone sues.

Someone has sued, but it's not an affected service provider. It's some guy from Illinois trying to fire up a class action lawsuit against the company for violating his home state's privacy laws. Here's Catalin Cimpanu of ZDNet with the details:

According to a copy of the complaint obtained by ZDNet, plaintiffs claim Clearview AI broke Illinois privacy laws.

Namely, the New York startup broke the Illinois Biometric Information Privacy Act (BIPA), a law that safeguards state residents from having their biometrics data used without consent.

According to BIPA, companies must obtain explicit consent from Illinois residents before collecting or using any of their biometric information -- such as the facial scans Clearview collected from people's social media photos.

"Plaintiff and the Illinois Class retain a significant interest in ensuring that their biometric identifiers and information, which remain in Defendant Clearview's possession, are protected from hacks and further unlawful sales and use," the lawsuit reads.

Hmm. This doesn't seem to have much going for it. And, believe it or not, it's not a pro se lawsuit. Whether it's possible to violate a privacy law by scraping public photos remains to be litigated, but it would seem the word "public" is pretty integral here. Unless Clearview found some way to scrape photos not published publicly, the lawsuit is dead in the water.

It shouldn't take too long for a judge to declare public and private legally contradictory. This lawsuit was composed by a member of the bar, but it reads more like a Facebook note the lawyer published accidentally. From the lawsuit [PDF]:

Without obtaining any consent and without notice, Defendant Clearview used the internet to covertly gather information on millions of American citizens, collecting approximately three billion pictures of them, without any reason to suspect any of them of having done anything wrong, ever.

Wat? Just because Clearview is aggressively pitching LEOs doesn't mean Clearview can only scrape photos of people it suspects of wrongdoing. Yes, it's disturbing Clearview has decided to make its stalker-enabling AI available to people who can hurt, maim, jail, and kill you, but there's nothing on any law book that says collecting pictures of faces can only be done if the people are probably criminals -- even if the targeted end users of this software are people who go after criminals.

Putting it in a sworn document doesn't make it any less ridiculous. But it does get more ridiculous.

[A]lmost none of the citizens in the database has ever been arrested, much less been convicted. Yet these criminal investigatory records are being maintained on them, and provide government almost instantaneous access to almost every aspect of their digital lives.

Facebook is collecting photos of people, almost none of whom have been criminally charged. They reside in Facebook's database. Facebook is publicly searchable, and public profiles can be searched for photos, even by law enforcement officers. Is Facebook breaking state law by "collecting" photos of innocent people? No rational person would argue that it is. And yet, this is the same argument, and it's no less stupid just because an actual lawyer is involved.

Look, I also don't want Clearview pushing this "product," much less to people with the power to do incredible amounts of damage to anyone the AI mistakes for a criminal. But this isn't going to fix anything. The lawsuit makes better points about Clearview's end of the deal, which makes it easier for it to look over law enforcement's shoulder. Since Clearview hosts all the pictures on its own servers, it can see what cops are looking for and do its own digging into the personal lives of anyone cops might be thinking about targeting. That's an ugly byproduct of this service and Clearview hasn't said anything about siloing itself off from government queries.

The claims in this suit are almost certain to fail. Clearview streamlines processes cops can perform on their own, like reverse image searches and browsing of social media accounts. Actions you can perform one person at a time without violating the Constitution (or state law), you can most likely do in bulk. For now. A more realistic approach would be to take edge cases to the Supreme Court, which has been more receptive to expanding the boundaries of citizens' expectations of privacy in the digital era. This lawsuit may raise limited awareness about Clearview (and discovery could be very interesting) but it's not going to end Clearview's scraping or deter law enforcement from using it. And it's certainly not going to earn a payout for the plaintiff.



Filed Under: facial recognition, illinois, privacy, privacy law
Companies: clearview, clearview ai


Reader Comments



  • BIPA, 30 Jan 2020 @ 3:38pm

    @Tim Cushing
    You might want to check the Wikipedia article on that law.

    See especially reference 12 (Law360): Facebook has been sued successfully under the same law.

    And Facebook had a license for those images (granted by uploading); the company doing the scraping most probably didn't have any license for them.

    Just because somebody posted their picture publicly doesn't mean you can use it for whatever you want.


  • Norahc (profile), 30 Jan 2020 @ 7:10pm

    Next week's news:

    "Ring purchases Clearview and will merge it with Rekognition into a new program called ClearlyRekognized"


  • Anonymous Coward, 31 Jan 2020 @ 1:49am

    The problem is Facebook

    Providing an API that allows the automated download of details, possibly including pictures, for some large subset of its userbase. What could go wrong?

    Anyone remember Cambridge Analytica?


  • Graham Cobb (profile), 31 Jan 2020 @ 3:05am

    GDPR

    Whether or not it violates the Illinois law, this clearly violates GDPR and also moral rights.

    Me publishing my photo in public does not give you the right to process it. Personal data is owned by the person, and permission must be granted to process it.


    • BIPA, 31 Jan 2020 @ 6:37am

      Re: GDPR

      The question would be whether GDPR applies at all, which it only does if Clearview does business within the EU, i.e. has clients there.

      In other words, if they don't have clients in the EU and haven't offered their service to any, GDPR doesn't matter.

      Even if some people seem to think so, the EU never said its laws apply outside the EU; all it did was shift the relevant place of business from where the service provider is to where the client is.


  • Anonymous Coward, 31 Jan 2020 @ 7:37am

    That which is public is not private.

    Anything done in public has no expectation of privacy.

    Technology does not change that simple fact. The genie isn't going back in the bottle; people simply need to wise up.


    • Bruce C., 31 Jan 2020 @ 7:50am

      Re:

      Ah, but was it public? There is no indication whether Clearview was scraping data marked as "friends only" or "private" in its sources.

      And even if the photos were public, the Illinois law would seem to prohibit their use to generate proprietary/private biometric data.
      IANAL.


    • Graham Cobb (profile), 31 Jan 2020 @ 8:05am

      Re:

      GDPR isn't about privacy, it is about processing. Even if the pictures are public, that does not authorise anyone to process the pictures.

      The regulation does allow some cases where processing is permitted without agreement from the subject: specifically for domestic purposes (so you can keep an address book for your own family use with names, addresses and photos scraped from the Internet if you wish).

      Any other processing requires approval from the subject.

      In effect, GDPR makes the question not one about ownership but about purpose and intention.

      Moral rights (a concept not used in UK law as far as I know, but certainly valid in some European legal codes) are more about ownership, I believe.


