Clearview Says Section 230 Immunizes It From Vermont's Lawsuit Over Alleged Privacy Violations

from the assuming-the-judge-will-call-this-theory-'novel'-which-isn't-a-compl dept

Clearview is currently being sued by the attorney general of Vermont for violating the privacy rights of the state's residents. As the AG's office pointed out in its lawsuit, users of social media services agree to many things when signing up, but the use of their photos and personal information as fodder for facial recognition software sold to government agencies and a variety of private companies isn't one of them.

[T]he term "publicly available" does not have any meaning in the manner used by Clearview, as even though a photograph is being displayed on a certain social media website, it is being displayed subject to all of the rights and agreements associated with the website, the law, and reasonable expectations. One of those expectations was not that someone would amass an enormous facial-recognition-fueled surveillance database, as the idea that this would be permitted in the United States was, until recently, unthinkable.

Thus, when an individual uploads a photograph to Facebook for "public" viewing, they consent to a human being looking at the photograph on Facebook. They are not consenting to the mass collection of those photographs by an automated process that will then put those photographs into a facial recognition database. Such a use violates the terms under which the consumer uploaded the photograph, which the consumer reasonably expects will be enforced.

This is essentially the same point multiple companies have made with their (ultimately ineffective) cease-and-desist letters: we have not agreed to allow Clearview to harvest data from our sites and sell that collected data to others.

Whether or not selling this scraped collection to law enforcement agencies is unlawful in Vermont remains to be seen. But Clearview is fighting back in court, raising a truly questionable Section 230 defense against the AG's lawsuit.

Clearview is represented by Tor Ekeland, who has been truly useful in defending people against bogus prosecutions. But Ekeland appears to believe Section 230 is a net loss for the public, so it's interesting to see him raise it as a defense here.

Clearview's motion to dismiss [PDF] compares Clearview to Google, claiming its bots crawl the web and cache images (and other data) on its servers. However, Clearview claims it collects "far less data" than comparable search engines. According to its filing, Clearview does not collect any identifying info either -- at least not intentionally. It only harvests photos and their metadata. The company says only 10% of the photos in its 4-billion-photo database have any metadata attached.

However, this doesn't mean the software can't compile a staggering amount of information on a person and return this long list in response to an uploaded facial photo. To comply with California data privacy laws, Clearview has given state residents the opportunity to see what Clearview has gathered on them. It's a lot.

The depth and variety of data that Clearview has gathered on me is staggering. My profile contains, for example, a story published about me in my alma mater’s alumni magazine from 2012, and a follow-up article published a year later.

It also includes a profile page from a Python coders’ meetup group that I had forgotten I belonged to, as well as a wide variety of posts from a personal blog my wife and I started just after getting married.

The profile contains the URL of my Facebook page, as well as the names of several people with connections to me, including my faculty advisor and a family member…

Clearview's assertions about the personal information it intentionally gathers are meant to head off the Vermont AG's claims that the company is violating state privacy laws. They're also meant to portray Clearview as no more damaging to privacy -- and no more nefarious -- than a Google search. The problem is cops are less likely to trust a Google search and more likely to trust a company that says it has 4 billion images and 600 law enforcement "partners," even if the search results are equally questionable.

But on to the Section 230 argument, which is kind of amazing in its audacity.

Clearview is entitled to immunity under the CDA because: (1) Defendant is an interactive computer service provider or user; (2) Plaintiff’s claims are based on “information provided by another information content provider;” and (3) Plaintiff’s claims would treat Defendant as the “publisher or speaker” of such information.

These are the base claims. There's more to it. But this is a company raising a defense afforded to service providers who host third-party content. Here, there is no third-party content -- at least not in the sense that we're used to. The "third parties" Clearview deals with are government agencies, which contribute no content of their own and only search the database of scraped photos using uploaded images.

In addition, the Vermont AG is not seeking an injunction against Clearview because of any particular content in its database. For example, the lawsuit is not predicated on defamatory content a user created. Instead, it's suing Clearview because its method of database compilation ignores the state's privacy laws. It's hard to imagine how Section 230 fits this particular action, but this filing attempts to do exactly that.

First, Clearview asserts it's a search engine just like Google. And if Google can't be sued for violating privacy rights of users of other sites whose personal photos/information show up in Google searches, neither can Clearview.

The Attorney General seeks to prohibit Clearview from accessing or using publicly distributed photos, none of which Clearview AI created. Clearview AI's republication of third-party content is the result of its search engine algorithm, which in this instance happens to be a biometric facial algorithm. The underlying technology does not transform Clearview into an information content provider that would be ineligible for CDA immunity. The CDA protects the publication of search engine results.

[...]

Clearview's publication of its biometric facial algorithms results does not make it an information content provider any more than Google becomes one when it publishes its search algorithm results. Simply put, “[i]f a website displays content that is created entirely by third parties, ... [it] is immune from claims predicated on that content.”

If Clearview did not create the content, it cannot be held responsible for its use of it -- even if end users never specifically agreed to be part of a database accessible by law enforcement.

The motion also says Clearview cannot be viewed as a publisher, since all it has done is create (with scraped content) a searchable database of third-party content.

Vermont’s complaint cannot change the fact that it is targeting Clearview for performing the exact same functions as corporations like Google and Microsoft. Vermont claims that, “at a minimum, Clearview ‘must obtain the other party’s consent before’ using consumers’ photos from any website.” Google, in contrast, is said to “respect the Terms of Service of the websites they visit.” But Google searches are filled with information that individuals wanted to remain private, such as nonconsensually distributed intimate images. Nevertheless, Google has repeatedly been protected by §230 because courts have correctly viewed it as a publisher.

[...]

Clearview’s use of a complex algorithm does not negate the fact that it is performing the traditional role of a publisher. The Second Circuit emphatically rejected a claim that Facebook’s “matching” algorithm deprecated its status as a publisher. The Second Circuit has stated that, “we find no basis in the ordinary meaning of ‘publisher,’ the other text of Section 230, or decisions interpreting Section 230, for concluding that an interactive computer service is not the ‘publisher’ of third-party information when it uses tools such as algorithms that are designed to match that information with a consumer's interests."

The state has responded [PDF] to Clearview's Section 230 assertions and has made the obvious point: this legal action isn't being brought over content generated by third parties that would normally be met with immunity arguments. It's being brought over Clearview's acquisition and use of third-party content that state residents never agreed to being harvested/used by the facial recognition tech company.

The injurious information that Clearview claims gives it Section 230 immunity are the photographs that it screen-scraped. But the photographs themselves, as they were posted on the internet by their owners, are not injurious, and do not give rise to any of the State’s claims. Put another way, the State’s cause of action is not properly against millions of individuals who posted anodyne photographs on the internet.

The State’s claims are not based on the specific information at issue. The photographs themselves do not give rise to any of these claims. The State’s claims are for unfairness, deception, and fraudulent acquisition of data. Compl. ¶¶ 76-86. Specifically, Clearview’s conduct in acquiring the photographs through fraudulent means (see Section VI infra, discussing use of term “fraudulent”), storing them without proper security, applying facial recognition technology in a manner meant to violate privacy and infringe civil rights, and providing access to the database to whomever wanted it without concern for the safety or rights of the public, give rise to the State’s claims.

The state's counter-argument hinges on a close reading of Section 230 -- one that turns on a certain voluntary action by third parties.

Section 230 requires that the information at issue be provided by the third-party content provider. Again, the common thread in Section 230 cases is that the Information Content Provider posted the offensive information on the defendant’s servers. Here, no Vermont consumer could have intentionally provided any photographs to Clearview’s servers, because prior to the discovery in January of this year that Clearview had 3 billion photographs in a New York Times exposé, the general public did not know that Clearview existed.

It also points out how Clearview differs from the search engines it tries to favorably compare itself to.

Clearview is not a search engine like Google or Bing. Clearview’s App does something that no other company operating in the United States, including search engine companies, has ever done. In fact, search engine companies that are capable of creating a product like Clearview’s refused to do so for ethical reasons.

The state isn't impressed by Clearview's arguments and sums everything up with this:

For the fact pattern to apply, the photographs themselves would have to somehow be unfair or deceptive and the State’s claims would more properly be brought against the individuals.

In essence, the lawsuit isn't about objectionable content hosted by Clearview, but objectionable actions by Clearview itself. That's why Section 230 doesn't apply. I'm not sure how the local court will read this, but it would seem readily apparent that Section 230 does not immunize Clearview in this case.


Thank you for reading this Techdirt post. With so many things competing for everyone’s attention these days, we really appreciate you giving us your time. We work hard every day to put quality content out there for our community.

Techdirt is one of the few remaining truly independent media outlets. We do not have a giant corporation behind us, and we rely heavily on our community to support us, in an age when advertisers are increasingly uninterested in sponsoring small, independent sites — especially a site like ours that is unwilling to pull punches in its reporting and analysis.

While other websites have resorted to paywalls, registration requirements, and increasingly annoying/intrusive advertising, we have always kept Techdirt open and available to anyone. But in order to continue doing so, we need your support. We offer a variety of ways for our readers to support us, from direct donations to special subscriptions and cool merchandise — and every little bit helps. Thank you.

–The Techdirt Team

Filed Under: facial recognition, privacy, scanning, section 230, vermont
Companies: clearview, clearview ai


Reader Comments



  • Anonymous Coward, 1 Jun 2020 @ 2:52pm

    Conspiracy theory time: somebody with an interest in making §230 look bad is paying them to file nonsense like this.


    • Anonymous Coward, 1 Jun 2020 @ 7:33pm

      Re:

      Conspiracy theory: lawyers who pay hackers to defame litigious celebrities would be out of business if Section 230 were abolished, because the tech companies would be liable.


    • Anonymous Coward, 1 Jun 2020 @ 8:45pm

      Re:

      I like the idea behind this, but isn't it rather far-fetched, since the 230 nonsense is being brought up by the defense? But that doesn't rule it out. Absence of evidence ≠ evidence of absence, after all.


    • PaulT (profile), 1 Jun 2020 @ 10:03pm

      Re:

      That's nice, but we should just stick to the more likely explanation - when a defence is allowed, it will be attempted even by those who are clearly guilty. That is not a good excuse for having it removed for those who genuinely need it.


  • Anonymous Coward, 1 Jun 2020 @ 3:33pm

    I was unaware of the additional features attributed to section 230.


    • Anonymous Coward, 1 Jun 2020 @ 3:36pm

      Re:

      Additional nonexistent features of 230 are what the haters hate it for. The additional imaginary feature crowd is quite large.


      • Anonymous Coward, 1 Jun 2020 @ 7:02pm

        Re: Re:

        I find it amazing how some feel they can just make shit up and everyone will just accept it as real, no one will ever check - lol.


    • That One Guy (profile), 1 Jun 2020 @ 10:38pm

      Re:

      That's the funny thing about 230, it seems to have this strange effect on people wherein it causes a decent percentage of them to wildly hallucinate what is and is not in it.

      Usually this impacts those arguing against it but apparently it can also impact those attempting to use it as a legal defense.


      • Anonymous Coward, 2 Jun 2020 @ 8:28pm

        Re: Re:

        It's amazing what John Smith can do when he puts his mind to it, along with some vapor rub for his knees and a lack of need to breathe oxygen.


  • Anonymous Coward, 1 Jun 2020 @ 3:38pm

    How is this not a copyright violation for every picture?

    If you or I tried to copy every bit of data and pictures we could scrounge from the internet, we would be sued by the holders of the copyright, and rightfully so. Unless Clearview has a clear transfer of copyright for each and every picture, they are going to be sued out of existence. I know several lawsuits have involved tiny images generated from larger ones, but these are literally copies of work that other people own the rights to. They never authorized this company to receive a copy or gave them the right to profit from them.


    • Chris-Mouse (profile), 1 Jun 2020 @ 4:55pm

      Re: How is this not a copyright violation for every picture?

      That depends on what they do with it.
      If Clearview scrapes the pictures and then only feeds them into the facial recognition AI, then that would be a transformative use, similar to the Google Books usage that was found to be legal. But if Clearview included a copy of my picture in the report they send to a client, then that would be different. Would it be infringement? I suspect not, but only a court could decide.

      If you feel like taking them to court, good luck proving that your picture was not only scraped, but also sent onward to a Clearview client.


    • Anonymous Coward, 1 Jun 2020 @ 8:18pm

      Re: How is this not a copyright violation for every picture?

      They should argue implied license, which is how the search engines get away with it.


  • Anonymous Coward, 1 Jun 2020 @ 5:07pm

    One of those expectations was not that someone would amass an enormous facial-recognition-fueled surveillance database...

    Here are some more un-expectations:

    • that the photo would be pinned to a dart board and used as a target by a vengeful ex-
    • that the photo would be silk-screened on to a flag.
    • that the photo would be used as the basis for a commemorative gold coin.
    • that the photo would be used by law enforcement to determine if you were breaking the law.
    • that the photo would be used by national intelligence (rather than a private company) to build up a facial recognition database.

    I'm pretty sure "I didn't intend my photo to be used that way" isn't a very good legal argument.


  • Anonymous Coward, 2 Jun 2020 @ 2:28am

    I find the premise of Clearview deplorable but its practice unavoidable. You’re not going to stop unethical scrapers from building some kind of face-trained model for recognizing other faces on the web; they just might have a slightly smaller dataset to scrape. The biggest complaint I have with Clearview is that it’s not suitable for police as a basis for identification. It’s just too inaccurate. It’s not much better than running a face through Yandex (Google’s algorithm does not match similar-looking faces; I think this is on purpose).


  • Anonymous Coward, 2 Jun 2020 @ 8:43am

    "The biggest complaint I have with Clearview is that it’s not suitable for police as a basis for identification."

    Any specific reasons as to why?


    • Anonymous Coward, 2 Jun 2020 @ 9:14am

      Re:

      Facial recognition is nowhere near the standard required for police to use to decide whom to arrest, or visit with a SWAT team.


  • Anonymous Coward, 3 Jun 2020 @ 12:03pm

    VALID use for Clearview

    So with all the rampant police abuse being captured at the public protests (where the people show up to protest and the police show up to RIOT), we have a need to identify these 'gems of society' to publicly call them out on their BS.

    What better use of an unproven facial recognition system than to look for all the 'bad apples' in the bunches? And since we know just how reliable Clearview claims to be, we can just assume that any officer identified as a bad apple must really be a bad apple. I mean, there's no way unproven snake oil like Clearview could be wrong, am I right?

    I mean if it's okay for the police to use on us, it should be fine for us to use it on the Police in the current situation. Someone could create a huge public database of all the bad apples identified during the police riots (what is really happening right now) and publicly post and present that information to the world (not just the US, everyone should know about these people).

    And if they don't like being targeted in this manner, then perhaps the technology is really not ready for widespread use and shouldn't be used to support even larger police fishing expeditions...


