EU Parliament Takes Up Its Next Attempt To Wipe Out An Open Internet: Terrorist Content Regulation Vote On Wednesday
from the can-we-make-this-disappear-in-an-hour dept
For the past few months I've been warning people that if you were worried about the EU Copyright Directive, you should be absolutely terrified about the EU Terrorist Content Regulation, which has continued to march forward with very little attention compared to the Copyright Directive. We've detailed the many, many problems with the Terrorist Content Regulation, starting with the requirement that any site (even a one-person blog hosted outside the EU) take down content within an hour of notification by an ill-defined "competent authority," but also covering other aspects, such as mandatory content filters.
When the EU Parliament's civil rights committee, LIBE, moved the proposal forward last week, it stripped out some of the worst aspects of the law, but left in the one-hour content removal requirement. And the largest group in the EU Parliament, the EPP, has already put forth amendments to bring back all the other bad stuff in the proposal. As MEP Julia Reda notes, the EU Parliament will now vote on the Terrorist Content Regulation on Wednesday, and that vote will include both the EPP's attempts to bring back the awful stuff and amendments that would, hopefully, remove the ridiculous and impossible one-hour takedown requirement. Reda explains why EU citizens should call on their MEPs to support an amendment removing the one-hour removal requirement:
Unfortunately, the unreasonable Commission proposal that illegal terrorist content must be taken down within one hour remains the default in the report adopted by the LIBE committee (see Article 4). The only exception to this rule is for the very first time a website owner receives a removal order from an authority, in which case they get 12 hours to familiarise themselves with the procedure and applicable deadlines. Afterward, regardless of platform size or resources, they must react within one hour in order to avoid harsh penalties. These penalties may amount to 4% of a platform’s turnover in case of persistent infringements (see Article 18).
A one-hour deadline is completely unworkable for platforms run by individuals or small providers, who have no capacity to hire staff to handle potential removal orders 24/7. No private website owner can be expected to stay reachable overnight and during weekends in the unlikely event that somebody uploads terrorist material. Their only realistic option would be the automation of removals: Terrorism filters through the back door.
Blindly deleting or blocking flagged content without review is bound to lead to the deletion of legal uploads, such as a news report on terrorism that shows footage from a war zone, which may indeed be illegal in another context. Already today, there are plenty of examples of overzealous administrative authorities flagging perfectly legal material as terrorist content (note: contrary to what the title of that post suggests, these notices didn't come from an EU agency, but from a French national authority). Thus it's imperative that websites of all sizes have the necessary time to review reports.
A joint attempt by the Greens/EFA and GUE groups to give providers more time to react was rejected by the LIBE Committee. Amendments by Greens/EFA, GUE and the S&D group will be put to the vote on Wednesday once more to try to get rid of the unworkable one hour deadline.
Separately, Reda discusses the issue of upload filters, which were in the original proposal but removed by LIBE, only to be re-introduced as an amendment by the EPP Group:
We managed to push back on upload filters, which are included in the Commission proposal in Article 6. The text adopted by the LIBE Committee makes explicit that the state authorities who can order platforms to remove material they consider terrorist content cannot impose obligations on web hosts to monitor uploads, nor to use automated tools for that matter.
Instead, the text calls for “specific measures” that hosts can take in order to protect their services (see Article 6). These measures can range from increasing human resources to protect their service to exchanging best practices (see Recital 16). But regardless of the measure chosen, hosts must pay particular attention to users’ fundamental rights. This clarification is a major victory, considering that the introduction of upload filters seems to be the main objective of the European Commission proposal.
The EPP is against this change and has tabled amendments that would re-introduce the possibility for authorities to force platforms to use upload filters. The plenary must reject the EPP’s pro-upload filter amendment 175 and adopt the LIBE position against upload filters (Amendments 84 to 89)!
Finally, there is the question of whether platforms should have their private Terms of Service elevated into the equivalent of law under the proposal. Again, LIBE got rid of this, but the EPP wants to bring it back:
In addition to having to act on removal orders, platforms were to receive "referrals" of content that may or may not be considered terrorist content, which they could then voluntarily assess not by standards of law, but by their self-set, arbitrary terms of service (see Article 5). Rightly, the LIBE Committee realised that this would set a dangerous precedent for the privatisation of law enforcement and deleted the provision. While platforms will still undoubtedly make mistakes when removing content, they will at least have to judge by the definitions of illegal terrorist content the EU set two years ago.
The EPP group has tabled amendments to try to re-introduce Article 5. On Wednesday, we will have to make sure that Article 5 stays deleted!
Given what we've already seen with the Copyright Directive, it should come as little surprise that there's a very real chance the EU Parliament will approve a horrific version of this bill, one that would make it effectively impossible for any smaller website to operate within the EU without facing massive liability. For a site like ours, under some of these proposals, we would literally be required to hire someone in the EU to be on call 24 hours a day to pull down content. And while I'm sure various "services" would spring up to represent non-EU-based websites, in what world am I going to be comfortable giving direct access to all of Techdirt's servers to some random person in the EU charged with pulling content down as soon as anyone requests it?
Reda and I spoke about this a bit on last week's podcast, and while I know there's understandable fatigue over all of these attacks on the internet (especially from the EU), it is really important to at least try to stop yet another awful law.
Filed Under: censorship, eu, eu terrorist content regulation, filters, takedowns, terrorist content, terrorist content regulation