FTC's Latest Fine Of YouTube Over COPPA Violations Shows That COPPA And Section 230 Are On A Collision Course
from the this-could-be-an-issue dept
As you probably heard, earlier this week, the FTC fined Google/YouTube for alleged COPPA violations regarding how it collected data on kids. You can read the details of the complaint and proposed settlement (which still needs to be approved by a judge, but that's mostly a formality). For the most part, people responded to this the same way they responded to the FTC's big Facebook fine: basically, everyone hates it -- though for potentially different reasons. Most people hate it because they think it's a slap on the wrist that won't stop such practices and just isn't painful enough for YouTube to care. On the flip side, some people hate it because it will force YouTube to change its offerings for no good reason at all, in a manner that might actually lead to more privacy risks and less content for children.
They might all be right. As I wrote about the Facebook fine and other privacy issues, almost every attempt to regulate privacy tends to make things worse, in part because people keep misunderstanding how privacy works. Also, most of the complaints about how this "isn't enough" are really directed not at the FTC but at Congress, because the FTC can only do so much under its current mandate.
Separately, since this fine focused on COPPA violations, I'll note that COPPA has always been a ridiculous law that makes no real sense -- beyond letting politicians and bureaucrats pretend they're "protecting the children" -- while creating massive unintended consequences that do nothing to protect children or privacy, and do quite a bit to make the internet a worse place.
But... I'm not going to rehash all of that today. Feel free to dig into the past links yourselves. What's interesting to me is something specific to this settlement, as noted by former FCC and Senate staffer (and current Princeton professor) Jonathan Mayer: in this decision, the FTC appears to have significantly changed its interpretation of COPPA, and done so in a manner that sets up something of a clash with Section 230. What happened is a little subtle, so it requires some background.
The key feature of COPPA -- and the one you're probably aware of whether or not you know it -- is that it imposes specific rules on sites targeting children under the age of 13. This is why tons of sites (including us) say that you need to be over 13 to use them -- an attempt to avoid dealing with many of the more insane parts of COPPA compliance. In practice, this just means that many people lie. Indeed, as danah boyd famously wrote nearly a decade ago, COPPA seems to be training parents to help their kids lie online -- which is kinda dumb.
Of course, the key point under COPPA is not actually the "under 13" users, but whether a website or online service is "directed to children under 13 years of age." Indeed, in talking about it with various lawyers, we've been told that most sites (including our own) shouldn't even worry about COPPA, because it's obvious that such sites aren't "directed to children" as a whole, and therefore, even if a few kids sneak in, they still wouldn't be violating COPPA. In other words, the way the world has mostly interpreted COPPA is that it's not about whether any particular piece of content is aimed at children, but whether the larger site itself is.
This new FTC settlement agreement changes that.
There’s a subtle and key legal move in today’s FTC-YouTube privacy settlement. Online services often dodge COPPA by claiming that *in aggregate* they aren’t child-directed. But in this case, FTC said *specific* channels were *separate* child-directed services, and COPPA applies.
— Jonathan Mayer (@jonathanmayer) September 4, 2019
Basically, the FTC has decided that, under COPPA, it no longer needs to view a service as a whole, but can divide it up into discrete chunks and determine if any of those chunks are targeted at kids. To be fair, this is well within the law. The text of COPPA clearly says, in definitional section (10)(A)(ii), that "a website or online service directed to children" includes "that portion of a commercial website or online service that is targeted to children." It's just that, historically, most of the focus has been on the overall website -- or something more distinctly a "portion" than an individual user's channel.
Except that, under the law, it seems it should be the channel operator who is held liable for violations of COPPA on that channel, rather than the larger platform. In fact, back in 2013, the last time the FTC announced rules around COPPA, it appears to have explicitly stated that it would apply COPPA to the specific content provider whose content was directed at children, and not to the general platform they used. This text is directly from that FTC rule, which went through years of public review and comment before being agreed upon:
... the Commission never intended the language describing "on whose behalf" to encompass platforms, such as Google Play or the App Store, when such stores merely offer the public access to someone else's child-directed content. In these instances, the Commission meant the language to cover only those entities that designed and controlled the content...
But that's not what the FTC is doing here. And so it appears that the FTC is changing its definitions without the required comment and rulemaking process. Here, the FTC admits that channels are "operators," but then does a bit of a two-step to say that it's YouTube who is liable:
YouTube hosts numerous channels that are “directed to children” under the COPPA Rule. Pursuant to Section 312.2 of the COPPA Rule, the determination of whether a website or online service is directed to children depends on factors such as the subject matter, visual content, language, and use of animated characters or child-oriented activities and incentives. An assessment of these factors demonstrates that numerous channels on YouTube have content directed to children under the age of 13, including those described below in Paragraphs 29-40. Many of these channels self-identify as being for children as they specifically state, for example in the “About” section of their YouTube channel webpage or in communications with Defendants, that they are intended for children. In addition, many of the channels include other indicia of child-directed content, such as the use of animated characters and/or depictions of children playing with toys and engaging in other child-oriented activities. Moreover, Defendants’ automated system selected content from each of the channels described in Paragraphs 29-40 to appear in YouTube Kids, and in many cases, Defendants manually curated content from these channels to feature on the YouTube Kids home canvas.
Indeed, part of the evidence that the FTC relies on is the fact that YouTube "rates" certain channels for kids.
In addition to marketing YouTube as a top destination for kids, Defendants have a content rating system that categorizes content into age groups and includes categories for children under 13 years old. In order to align with content policies for advertising, Defendants rate all videos uploaded to YouTube, as well as the channels as a whole. Defendants assign each channel and video a rating of Y (generally intended for ages 0-7); G (intended for any age); PG (generally intended for ages 10+); Teen (generally intended for ages 13+); MA (generally intended for ages 16+); and X (generally intended for ages 18+). Defendants assign these ratings through both automated and manual review. Previously, Defendants also used a classification for certain videos shown on YouTube as “Made for Kids.”
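Just to make concrete how directly a rating taxonomy like that maps onto COPPA's under-13 line, here's a minimal, purely illustrative sketch in Python. To be clear, everything in it (the ContentRating enum, the MINIMUM_AGE table, the is_child_directed() helper) is my own construction for illustration, not YouTube's actual system or API -- it just expresses the age bands from the complaint as data:

```python
from enum import Enum

# Hypothetical model of the rating bands described in the FTC complaint.
# The names and structure here are illustrative assumptions, not
# YouTube's real internal rating system.
class ContentRating(Enum):
    Y = "generally intended for ages 0-7"
    G = "intended for any age"
    PG = "generally intended for ages 10+"
    TEEN = "generally intended for ages 13+"
    MA = "generally intended for ages 16+"
    X = "generally intended for ages 18+"

# Lowest intended age for each band (0 where no floor is stated).
MINIMUM_AGE = {
    ContentRating.Y: 0,
    ContentRating.G: 0,
    ContentRating.PG: 10,
    ContentRating.TEEN: 13,
    ContentRating.MA: 16,
    ContentRating.X: 18,
}

def is_child_directed(rating: ContentRating) -> bool:
    """COPPA's threshold is age 13: a band whose intended audience
    starts below 13 plausibly signals child-directed content."""
    return MINIMUM_AGE[rating] < 13
```

The point is simply that once content has been bucketed into bands like Y (ages 0-7), the under-13 determination falls out mechanically -- which is exactly the inference the FTC draws from the existence of the ratings.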
That's a key point the FTC uses to argue that YouTube knows its site is "directed at" children. But here's the problem with that. Section 230 of the Communications Decency Act -- specifically the often forgotten (or ignored) (c)(2) -- is explicit that no provider shall be held liable for any moderation actions, including "any action voluntarily taken in good faith to restrict access to or availability of material that the provider or user considers to be obscene, lewd, lascivious, filthy, excessively violent, harassing, or otherwise objectionable." One way to take such action is through content labeling policies, such as the ones YouTube used, as described by the FTC.
So here, YouTube is being found partially liable because of its content ratings, which the FTC presents as evidence that it's covered by COPPA. But CDA 230 makes it clear that no such liability can arise from such a rating system.
This won't get challenged in court in this case, since Google/YouTube have agreed to settle, but it certainly sets up a big potential future battle. And frankly, given the way some courts have been willing to twist and bend CDA 230 lately, combined with the general "for the children!" rhetoric, I have very little confidence that CDA 230 would win.
Filed Under: cda 230, coppa, ftc, moderation, privacy, section 230
Companies: google, youtube