Dangerous Ruling On DMCA Safe Harbors May Backfire On Hollywood
from the be-careful-what-you-wish-for,-mpaa dept
Late last week an important, but disappointing, ruling came down from the 9th Circuit appeals court. The ruling in the case of Mavrix Photographs v. LiveJournal found that volunteer moderators could be deemed agents of a platform, and thus it's possible that red flag knowledge of infringement by one of those volunteer moderators could lead to a platform losing its safe harbors. There are a lot of caveats there, and the ruling itself covers a lot of ground, so it's important to dig in.
The case specifically involved a site hosted on LiveJournal called "Oh No They Didn't" (ONTD), which covers celebrity news. Users submit various celebrity stories, and ONTD has a bunch of volunteer moderators who determine what gets posted and what does not. Some of the images that were posted were taken by a paparazzi outfit named Mavrix. Rather than send DMCA takedowns, Mavrix went straight to court and sued LiveJournal. LiveJournal claimed that it was protected by the DMCA safe harbors as the service provider, and the lower court agreed. This ruling sends the case back to the lower court, saying that its analysis of whether or not the volunteer moderators were "agents" of LiveJournal was incomplete, and suggests it try again.
There are a number of "tricky" issues involved in this case, starting with this: because ONTD became massively popular, LiveJournal itself got a bit more involved with the community, which may eventually prove to be its undoing. From the decision by the court:
When ONTD was created, like other LiveJournal communities, it was operated exclusively by volunteer moderators. LiveJournal was not involved in the day-to-day operation of the site. ONTD, however, grew in popularity to 52 million page views per month in 2010 and attracted LiveJournal’s attention. By a significant margin, ONTD is LiveJournal’s most popular community and is the only community with a “household name.” In 2010, LiveJournal sought to exercise more control over ONTD so that it could generate advertising revenue from the popular community. LiveJournal hired a then active moderator, Brendan Delzer, to serve as the community’s full time “primary leader.” By hiring Delzer, LiveJournal intended to “take over” ONTD, grow the site, and run ads on it.
As the “primary leader,” Delzer instructs ONTD moderators on the content they should approve and selects and removes moderators on the basis of their performance. Delzer also continues to perform moderator work, reviewing and approving posts alongside the other moderators whom he oversees. While Delzer is paid and expected to work full time, the other moderators are “free to leave and go and volunteer their time in any way they see fit.” In his deposition, Mark Ferrell, the General Manager of LiveJournal’s U.S. office, explained that Delzer “acts in some capacities as a sort of head maintainer” and serves in an “elevated status” to the other moderators. Delzer, on the other hand, testified at his deposition that he does not serve as head moderator and that ONTD has no “primary leader.”
It's this oversight by a paid employee of LiveJournal that makes things a bit sticky. The question is whether or not this oversight and control went so far that the volunteer moderators could also be seen as "agents" of LiveJournal, rather than independent users of the platform.
Evidence presented by Mavrix shows that LiveJournal maintains significant control over ONTD and its moderators. Delzer gives the moderators substantive supervision and selects and removes moderators on the basis of their performance, thus demonstrating control. Delzer also exercises control over the moderators’ work schedule. For example, he added a moderator from Europe so that there would be a moderator who could work while other moderators slept. Further demonstrating LiveJournal’s control over the moderators, the moderators’ screening criteria derive from rules ratified by LiveJournal.
The court doesn't fully answer the question, but sends it back to the lower court, saying there's a "genuine issue of material fact" that should be explored to determine whether LiveJournal was responsible, and thus would lose its safe harbors. The specific fact pattern and details here may mean that this ruling doesn't turn out to be a huge problem for safe harbors in the long run, but at least a few statements in the ruling are... concerning. For example:
... LiveJournal relies on moderators as an integral part of its screening and posting business model.
But... lots of sites rely on independent and volunteer moderators as a part of their business model. That alone shouldn't determine whether or not a volunteer is truly an agent of the company.
A larger issue may be this: even if a moderator is deemed to be an "agent" of a platform, most moderators are not experts in copyright, and it would be ridiculous to argue that their failure to stop infringement makes an entire company liable. That would doom many websites that rely on volunteer help. If a volunteer were to mess up, failing to grasp the vast nuances of copyright law, the liability for the platform could be immense. As Parker Higgins notes, the expectation here is unbalanced in a ridiculous way, especially as this very same court doesn't seem to think that the sender of a DMCA takedown should take as much responsibility for its actions:
Still, even if the moderator draws a paycheck from the platform, it seems unreasonable to expect them to approach thorny copyright questions with the nuance of a trained professional. That is especially true when you compare this ruling with the Ninth Circuit’s most recent opinion in Lenz v. Universal, the “dancing baby” case, which looks down the other end of the copyright gun at takedown notice senders. Notice senders must consider fair use, but only so far as to form a “subjective good faith belief” about it. If courts don’t require the people sending a takedown notice to form an objectively reasonable interpretation of the law, why should they impose a higher standard on the moderators at platforms handling staggering quantities of user uploads?
But if moderators are a platform’s “agents,” then it runs into trouble if they have actual or “red flag” knowledge of infringements. The Ninth Circuit has instructed the lower court to find out whether the moderators had either. Noting the watermarks on some of the copyrighted images in the case, the court phrased the question of “red flag” knowledge as whether “it would be objectively obvious to a reasonable person that material bearing a generic watermark or a watermark referring to a service provider’s website was infringing.” That’s an important point to watch. Copyright ownership and licensing can be extremely complex — so oversimplifying it to the idea that the presence of a watermark means any use is infringing would have profound negative consequences.
And this is why this ruling may backfire on Hollywood -- even as it pushed the court to rule this way. As EFF notes, at the very time that the MPAA is demanding that platforms do more to moderate content, the implications of this ruling may push platforms to do much less moderation:
The fact that moderators reviewed those submissions shouldn’t change the analysis. The DMCA does not forbid service providers from using moderators. Indeed, as we explained in the amicus brief (PDF) we filed with CCIA and several library associations, many online services have employees (or volunteers) who review content posted on their services, to determine (for example) whether the content violates community guidelines or terms of service. Others lack the technical or human resources to do so. Access to DMCA protections does not and should not turn on this choice.
The irony here is that copyright owners are constantly pressuring service providers to monitor and moderate the content on their services more actively. This decision just gave them a powerful incentive to refuse.
There are a few other issues in this case that are also potentially problematic. As Annemarie Bridy points out over at Stanford's Center for Internet & Society, the court seems to totally mess up the analysis of the DMCA's safe harbors by confusing Section 512(a) (which applies to network providers merely passing traffic through) with Section 512(c) (which applies to online service providers hosting content):
According to the court, the section 512(a) safe harbor covers users’ submission of material to providers, and section 512(c) covers the providers’ subsequent posting of that material to their sites. There is no such submission-posting distinction in section 512. On the face of the statute and in the legislative history, it’s quite clear that section 512(a) is meant to cover user-initiated, end-to-end routing of information across a provider’s network. A residential broadband access provider is the paradigmatic section 512(a) provider. Section 512(c) covers hosting providers like LiveJournal that receive, store, and provide public access to stored user-generated content. To characterize LiveJournal as a hybrid 512(a)/512(c) provider misapplies the statute and introduces into the case law a wrongheaded distinction between submitting and posting material.
Putting aside the peculiar submission-posting dyad, the dispositive question concerning LiveJournal’s eligibility for the section 512(c) safe harbor is whether the site’s moderator-curated, user-submitted posts occur “at the direction of users,” taking into consideration the nature of moderators’ review and the fact that only about one-third of user submissions are ultimately posted. That question can be answered entirely within the ambit of section 512(c) and the existing case law interpreting it, including the Ninth Circuit’s own decision in Shelter Capital. There was simply no need for the court to invoke section 512(a) in this case.
The court's analysis here is... just weird. It's on page 13 of the ruling, and it really does seem to take a totally uncharted path in arguing that the submission of content is covered by 512(a) while the posting is covered by 512(c). But... that's wrong:
The district court focused on the users’ submission of infringing photographs to LiveJournal rather than LiveJournal’s screening and public posting of the photographs. A different safe harbor, § 512(a), protects service providers from liability for the passive role they play when users submit infringing material to them.... The § 512(c) safe harbor, however, focuses on the service provider’s role in publicly posting infringing material on its site.
Among the other issues with this case, there's also the question of whether or not the identities of the anonymous volunteer moderators should be disclosed. As we've discussed in the past, because the First Amendment also protects anonymity, any move to unmask an anonymous speaker must be carefully weighed against that speaker's right to anonymity. The court here more or less brushes off this issue, saying that once the lower court determines the level of agency, that will answer the question of preserving anonymity:
Notwithstanding the deferential standard of review and complex issues of law that govern this discovery ruling, we vacate the district court’s order denying the motion and remand for further consideration. Whether the moderators are agents should inform the district court’s analysis of whether Mavrix’s need for discovery outweighs the moderators’ interest in anonymous internet speech. Given the importance of the agency analysis to the ultimate outcome of the case, and the importance of discovering the moderators’ roles to that agency analysis, the district court should also consider alternative means by which Mavrix could formally notify or serve the moderators with process requesting that they appear for their deposition at a date and time certain.
This is yet another important case in determining how online platforms can actually function today -- and rulings that undermine the DMCA's safe harbors frequently seem to be exactly what Hollywood wants -- but again, this may backfire. Making it harder for sites to function if they're actively involved in moderation only means they'll do much less of it.
Filed Under: 9th circuit, copyright, dmca, moderators, safe harbors
Companies: livejournal, mavrix
Reader Comments
Whom the gods would destroy...
It's worth remembering that copyright plaintiffs always argue that if the user-upload site they're suing can instruct its staff to actively and judiciously look out for child porn while turning a blind eye to "obvious" copyrighted content, then that's clear proof of willful ignorance -- or worse.
Re:
A lot of bad behavior gets permitted and overlooked online, simply because a website has the resources to monitor its users' behavior, but nowhere near the resources required to maintain the copyright regime's iron grip. As this story shows, if a site tries to monitor its users, all of a sudden they are drafted as copyright cops who can be sued for not doing "their" job.
Remember that when dealing with trolls and the like. They are 100% backed and supported by the MAFIAA, who would rather make the internet a hellhole than lose even a single penny to "copyright infringement".
I know it when I see it
Re: I know it when I see it
Re: Re: I know it when I see it
Can't see why; the standard they use is easy enough.
'If it's making us money and/or we control it, it's not infringing. If it's not making us money and/or we don't control it, it's infringing.'
Re: Re: Re: I know it when I see it
Re: Re: Re: Re: I know it when I see it
Re: I know it when I see it
Moderators? You think we're crazy enough to have those?
This strikes me as a case where it would almost be worth it for Hollywood to 'win'. If moderation means liability, then sites won't have moderators; they'll just let people post at will and keep an entirely hands-off approach, lest they open themselves up to a potential world of hurt (bringing things right back to the pre-safe-harbor days, when sites didn't dare moderate either, for the same reason).
Where before a site might have had moderators trying to keep the more obvious cases of infringement down, now none of it will be removed until the site receives a DMCA claim, leading to vastly more potential infringement being posted and staying up longer than before.
If I thought that Hollywood was capable of long-term planning I might actually suspect that they want something like this to point to as an example of why they need even harsher copyright law, but as it stands I'm pretty sure this is just another case of them being so shortsighted and eager to 'shoot' those dastardly pirates that they don't realize that their foot is also in the line of fire.
Re: Moderators? You think we're crazy enough to have those?
The conspiracy theorist in me thinks this might be part of the plan. Force all media-related sites to act as large businesses rather than the largely fan/independent majority that's always existed. Then with net neutrality abolished they can throttle traffic to all but those who pay the required ransom. The **AAs regain complete control of their industry, and they laugh all the way to the bank until stagnation and poor quality offerings with no competition eventually strangle the industry (which they won't care about, since that's long after the current crop have cashed in).
I don't necessarily think that this is what they're trying, but it's something that crosses my mind occasionally.
Re:
Re: Re:
Both of those outcomes mean less competition and an internet that's closer to TV (think "America's Funniest").
Which is, incidentally, a wet dream of the **AAs - just like PaulT said.
For every starving artist affected by piracy, there are at least two people being censored by "rightsholders" with the help of copyright.