UN Free Speech Expert: EU's Copyright Directive Would Be An Attack On Free Speech, Violate Human Rights
from the don't-let-it-happen dept
We've been writing a lot about the EU's dreadful copyright directive, both because it's so important to a variety of issues in how the internet works, and because it's about to go up for a vote in the EU Parliament's Legal Affairs Committee next week. David Kaye, the UN's Special Rapporteur on freedom of expression, has now chimed in with a very thorough report highlighting how Article 13 of the Directive -- the part about mandatory copyright filters -- would be a disaster for free speech and would violate the Universal Declaration of Human Rights, in particular Article 19, which (in case you don't know) says:
Everyone has the right to freedom of opinion and expression; this right includes freedom to hold opinions without interference and to seek, receive and impart information and ideas through any media and regardless of frontiers.
As Kaye's report notes, the upload filters of Article 13 of the Copyright Directive would almost certainly violate this principle.
Article 13 of the proposed Directive appears likely to incentivize content-sharing providers to restrict at the point of upload user-generated content that is perfectly legitimate and lawful. Although the latest proposed versions of Article 13 do not explicitly refer to upload filters and other content recognition technologies, they couch the obligation to prevent the availability of copyright protected works in vague terms, such as demonstrating “best efforts” and taking “effective and proportionate measures.” Article 13(5) indicates that the assessment of effectiveness and proportionality will take into account factors such as the volume and type of works and the cost and availability of measures, but these still leave considerable leeway for interpretation.
The significant legal uncertainty such language creates does not only raise concern that it is inconsistent with the Article 19(3) requirement that restrictions on freedom of expression should be “provided by law.” Such uncertainty would also raise pressure on content sharing providers to err on the side of caution and implement intrusive content recognition technologies that monitor and filter user-generated content at the point of upload. I am concerned that the restriction of user-generated content before its publication subjects users to restrictions on freedom of expression without prior judicial review of the legality, necessity and proportionality of such restrictions. Exacerbating these concerns is the reality that content filtering technologies are not equipped to perform context-sensitive interpretations of the valid scope of limitations and exceptions to copyright, such as fair comment or reporting, teaching, criticism, satire and parody.
Kaye further notes that copyright is not the kind of thing that an algorithm can readily determine, and the fact-specific and context-specific nature of copyright requires much more than just throwing algorithms at the problem -- especially when a website may face legal liability for getting it wrong. And even if the Copyright Directive calls for platforms to have remediation processes, that takes the question away from actual due process on these complex issues.
The designation of such mechanisms as the main avenue to address users’ complaints effectively delegates content blocking decisions under copyright law to extrajudicial mechanisms, potentially in violation of minimum due process guarantees under international human rights law. The blocking of content – particularly in the context of fair use and other fact-sensitive exceptions to copyright – may raise complex legal questions that require adjudication by an independent and impartial judicial authority. Even in exceptional circumstances where expedited action is required, notice-and-notice regimes and expedited judicial process are available as less invasive means for protecting the aims of copyright law.
In the event that content blocking decisions are deemed invalid and reversed, the complaint and redress mechanism established by private entities effectively assumes the role of providing access to remedies for violations of human rights law. I am concerned that such delegation would violate the State’s obligation to provide access to an “effective remedy” for violations of rights specified under the Covenant. Given that most of the content sharing providers covered under Article 13 are profit-motivated and act primarily in the interests of their shareholders, they lack the qualities of independence and impartiality required to adjudicate and administer remedies for human rights violations. Since they also have no incentive to designate the blocking as being on the basis of the proposed Directive or other relevant law, they may opt for the legally safer route of claiming that the upload was a terms of service violation – this outcome may deprive users of even the remedy envisioned under Article 13(7). Finally, I wish to emphasize that unblocking, the most common remedy available for invalid content restrictions, may often fail to address financial and other harms associated with the blocking of time-sensitive content.
He goes on to point out -- as we have -- that while large platforms may be able to deal with all of this, smaller ones are going to be in serious trouble:
I am concerned that the proposed Directive will impose undue restrictions on nonprofits and small private intermediaries. The definition of an “online content sharing provider” under Article 2(5) is based on ambiguous and highly subjective criteria such as the volume of copyright protected works it handles, and it does not provide a clear exemption for nonprofits. Since nonprofits and small content sharing providers may not have the financial resources to establish licensing agreements with media companies and other right holders, they may be subject to onerous and legally ambiguous obligations to monitor and restrict the availability of copyright protected works on their platforms. Although Article 13(5)’s criteria for “effective and proportionate” measures take into account the size of the provider concerned and the types of services it offers, it is unclear how these factors will be assessed, further compounding the legal uncertainty that nonprofits and small providers face. It would also prevent a diversity of nonprofit and small content-sharing providers from potentially reaching a larger size, and result in strengthening the monopoly of the currently established providers, which could be an impediment to the right to science and culture as framed in Article 15 of the ICESCR.
It's well worth reading the whole thing. I don't know if this will have more resonance with the members of the EU Parliament's Legal Affairs Committee, but seeing as they keep brushing off or ignoring most people pointing out these very same points, one hopes that someone in Kaye's position will at least get them to think twice about continuing to support such a terrible proposal.
Filed Under: article 13, copyright, copyright directive, david kaye, eu, free speech, intermediary liability, monitoring, un, upload filters
Reader Comments
That would be almost all of the content, unless the site is an archive of public domain works like Project Gutenberg.
The way things are being phrased makes it look like copyright is something reserved to the traditional publishers.
Re:
unless the site is an archive of public domain works like Project Gutenberg
Not even then, really. As you might remember, Project Gutenberg was recently forced to block German users entirely due to works entering the public domain in the US but not in Germany.
Re:
That is the general view of copyright already. If you set up a nice, tasteful, professional-looking photo shoot and try to get copies printed from your original image file, many places will refuse to print them or to let you print them yourself, because the photos look like they are copyrighted.
As if only high-end professional works could be.
where social media is headed
Whether they admit it or not, the ultimate goal is to create a system much like the ones Youtube, Twitter, and Facebook are becoming, in which there are two de facto standards. Big corporations and government officials are deemed "trustworthy" and exempt from moderation, while all the peons are deemed "untrustworthy" and subjected to severe moderation, generally of the shoot-first-and-ask-questions-later variety. That moderation can be triggered by user complaints (especially dogpiled ones) or by a cat-and-mouse series of ever-changing algorithms designed to get content taken down quickly if not immediately, mostly without human interaction.
We've seen an extreme case of this in Usenet posts in recent years, where legitimate content gets reported as copyright-infringing -- and automatically taken down -- because of the presence of keywords in the title, and sometimes for no identifiable reason whatsoever.
Already, authors are learning, through experience, to censor their own speech to avoid algorithm auto-takedowns.
https://www.youtube.com/watch?v=LlRFoYr-XuY
Any rules or laws regarding content takedowns MUST include penalties for false reporting.
Human Rights vs Copyright
Abolish copyright.
Re:
It will also be very hard for them to start charging everyone to access sites, and enforcing that will be next to impossible.
The internet will never die, and if you want to stop this, contact your MEP instead of coming on here and TorrentFreak and ranting all day without doing anything to fight it.
Democracy is dead in the world.
Unacceptable in a modern country
This is unacceptable! Either the European Union stops this madness now, or it should expect a revolution that will ensure its fall very soon. This is no joke: they are making a huge mistake if they think any group can just wipe out half of the internet overnight and get away with it.