EU Commission Says Social Media Companies Must Take Down 'Terrorist Content' Within One Hour
from the plus-more-internet-hobbling-guidelines dept
Once social media companies and websites began acquiescing to EU Commission demands for content takedown, the end result was obvious. Whatever was already in place would continually be ratcheted up. And every time companies failed to do the impossible, the EU Commission would appear on their virtual doorsteps, demanding they be faster and more proactive.
Facebook, Twitter, Google, and Microsoft all agreed to remove hate speech and other targeted content within 24 hours, following a long bitching session from EU regulators about how long it took these companies to comply with takedown orders. As Tim Geigner pointed out late last year, the only thing tech companies gained from this acquiescence was a reason to engage in proactive censorship.
Because if a week or so, often less, isn't enough, what will be? You can bet that if these sites got it down to 3 days, the EU would demand it be done in 2. If 2, then 1. If 1? Well, then perhaps internet companies should become proficient in censoring speech the EU doesn't like before it ever appears.
Even proactive censorship isn't enough for the EU Commission. It has released a new set of recommendations [PDF] for social media companies that sharply decreases the mandated response time. The Commission believes so-called "terrorist" content should be so easy to spot that companies will have no problem staying in compliance.
Given that terrorist content is typically most harmful in the first hour of its appearance online and given the specific expertise and responsibilities of competent authorities and Europol, referrals should be assessed and, where appropriate, acted upon within one hour, as a general rule.
Yes, the EU Commission wants terrorist content vanished in under an hour and proclaims, without citing authorities, that the expertise of government agencies will make compliance un-impossible. The Commission also says it should be easy to keep removed content from popping up somewhere else, because it's compiled a "Database of Hashes."
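To see what that confidence is actually resting on, consider what a hash database buys you. Here's a minimal sketch of the naive version, matching exact cryptographic digests of uploads against a shared blocklist; all names and data are hypothetical, and a real system may well use fuzzier perceptual matching, but the brittleness illustrated below is the core problem either way:

```python
# Minimal sketch of exact-hash matching against a shared blocklist.
# All names and data are hypothetical illustrations.
import hashlib

def sha256_hex(data: bytes) -> str:
    """Return the SHA-256 digest of an uploaded blob as a hex string."""
    return hashlib.sha256(data).hexdigest()

# The platform's copy of the shared "Database of Hashes".
known_bad = set()

original_upload = b"...bytes of a previously removed video..."
known_bad.add(sha256_hex(original_upload))

# A byte-identical re-upload is caught instantly:
print(sha256_hex(original_upload) in known_bad)   # True

# But re-encoding, cropping, or flipping a single byte evades it:
tweaked_upload = original_upload[:-1] + b"!"
print(sha256_hex(tweaked_upload) in known_bad)    # False
```

Exact digests only catch byte-identical re-uploads; any re-encode or trivial edit produces a new hash, which is why "we have a Database of Hashes" is a long way from "removed content stays removed."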
Another bad idea that cropped up a few years ago makes a return in this Commission report. The EU wants to create intermediary liability for platforms under the concept of "duty of care." It would hold platforms directly responsible for not preventing the dissemination of harmful content. This would subject social media platforms to a higher standard than that imposed on European law enforcement agencies involved in policing social media content.
In order to benefit from that liability exemption, hosting service providers are to act expeditiously to remove or disable access to illegal information that they store upon obtaining actual knowledge thereof and, as regards claims for damages, awareness of facts or circumstances from which the illegal activity or information is apparent. They can obtain such knowledge and awareness, inter alia, through notices submitted to them. As such, Directive 2000/31/EC constitutes the basis for the development of procedures for removing and disabling access to illegal information. That Directive also allows for the possibility for Member States of requiring the service providers concerned to apply a duty of care in respect of illegal content which they might store.
This would apply to any illegal content, from hate speech to pirated content to child porn. All of it is treated equally under certain portions of the Commission's rules, even when there are clearly different levels of severity in the punishments applied to violators.
In accordance with the horizontal approach underlying the liability exemption laid down in Article 14 of Directive 2000/31/EC, this Recommendation should be applied to any type of content which is not in compliance with Union law or with the law of Member States, irrespective of the precise subject matter or nature of those laws...
The EU Commission not only demands the impossible with its one-hour takedowns, but also holds social media companies liable when they inevitably fail to deliver it. On one hand, the Commission is clearly pushing for proactive removal of content. On the other, it wants tech companies to shoulder as much of the blame as possible when things go wrong.
Given that fast removal of or disabling of access to illegal content is often essential in order to limit wider dissemination and harm, those responsibilities imply inter alia that the service providers concerned should be able to take swift decisions as regards possible actions with respect to illegal content online. Those responsibilities also imply that they should put in place effective and appropriate safeguards, in particular with a view to ensuring that they act in a diligent and proportionate manner and to preventing [sic] the unintended removal of content which is not illegal.
The Commission follows this by saying over-censoring of content can be combated by allowing those targeted to object to a takedown by filing a counter-notice. It then undercuts this by suggesting certain government agency requests should never be questioned, but rather complied with immediately.
[G]iven the nature of the content at issue, the aim of such a counter-notice procedure and the additional burden it entails for hosting service providers, there is no justification for recommending to provide such information about that decision and that possibility to contest the decision where it is manifest that the content in question is illegal content and relates to serious criminal offences involving a threat to the life or safety of persons, such as offences specified in Directive (EU) 2017/541 and Directive 2011/93/EU. In addition, in certain cases, reasons of public policy and public security, and in particular reasons related to the prevention, investigation, detection and prosecution of criminal offences, may justify not directly providing that information to the content provider concerned. Therefore, hosting service providers should not do so where a competent authority has made a request to that effect, based on reasons of public policy and public security, for as long as that authority requested in light of those reasons.
These recommendations will definitely cause all kinds of collateral damage, mainly through proactive blocking of content that may not violate any EU law. It shifts all of the burden (and the blame) to tech companies with the added bonus of EU fining mechanisms kicking into gear 60 minutes after a takedown request is sent. The report basically says the EU Commission will never be satisfied by social media company moderation efforts. There will always be additional demands, no matter the level of compliance. And this is happening on a flattened playing field where all illegal content is pretty much treated as equally problematic, even if the one-hour response requirement is limited to "terrorist content" only at the moment.
Filed Under: censorship, content filtering, copyright infringement, eu, eu commission, free speech, takedowns, terrorist content
Reader Comments
The First Word
“Pointy-haired boss syndrome...
Where everything is easy when you don't have to do it.
Be interesting if the companies in question were to flat out call the politicians out on this. Demand that if it's so easy to spot the content in question that they provide examples.
Out of these 10 videos, which should be removed?
How about out of 100?
1,000?
10,000?
Keep in mind the allowed time-frame isn't any different from the first set to the last, you had one hour to go over ten videos, and you have one hour to go over ten thousand.
If it's really so easy then they should have no problem keeping up with the task as it scales up to what they are demanding the companies manage.
If the response when they fail is 'well hire more people!' demand that they flat out state how many people they think the companies should be obligated to hire, how many extra bodies they want warming seats, checking through everything to ensure only approved content is allowed to be posted.
Add to that the fact that the government is demanding direct, unquestionable censorship power, being able to state 'this is to be taken down, and you cannot contest that decision', and this goes from 'really bad' to 'disastrously bad' mighty quick.
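To put rough numbers on that scaling point: with a fixed deadline and a fixed per-item review time, the required headcount grows linearly with volume. A back-of-the-envelope sketch, where every figure is a hypothetical assumption rather than a real moderation metric:

```python
# Back-of-the-envelope staffing math for a fixed takedown deadline.
# The 5-minutes-per-review figure is an assumed illustration, not data,
# and the model ignores queueing, appeals, and peak/off-peak swings.
import math

def reviewers_needed(referrals_per_hour: int,
                     minutes_per_review: float = 5.0) -> int:
    """Reviewers required to clear each hour's referrals within that hour."""
    per_reviewer_per_hour = 60.0 / minutes_per_review
    return math.ceil(referrals_per_hour / per_reviewer_per_hour)

for volume in (10, 100, 1_000, 10_000):
    print(f"{volume:>6} referrals/hour -> {reviewers_needed(volume):>4} reviewers")

# Output:
#     10 referrals/hour ->    1 reviewers
#    100 referrals/hour ->    9 reviewers
#   1000 referrals/hour ->   84 reviewers
#  10000 referrals/hour ->  834 reviewers
```

The per-item review time is the assumption doing all the work here; whatever value you pick, the fixed deadline caps each reviewer's throughput, so headcount scales one-for-one with volume, which is exactly the commenter's point.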
The whiners will keep whining and pushing for laws until the end of time. And everybody will lose, including the platforms themselves.
Re:
It will keep climbing down until the mandated removal interval is "instantly".
Depending on what you mean, it could be worse than that. The next step after 'as soon as it's put up it needs to be taken down' is that it's never allowed up in the first place.
Everything must be vetted before it's allowed to be posted, no exceptions (well, other than 'official' content, of course...), so that no 'terroristic content' can possibly make it onto the platforms. That this will all but destroy them is a price the politicians involved are willing to have others pay.
Oh, but only if it were actually possible
Re: Oh, but only if it were actually possible
One-way communication from "reliable sources" was the best thing going, and regular plebs were limited to the "Letter to the Editor" or being interviewed on TV for expressing an outside (vetted) opinion.
By that logic, it'd be much safer for everybody if everyone's words were checked over first, like they were back then. We can't let "dangerous"/"offensive" content on the Internet, at any cost to speaking freely! /sarcasm
Re: Re: Oh, but only if it were actually possible
You mean the prolefeed?
Re: Re: Oh, but only if it were actually possible
Honestly, that is what they really want. Corporations tied closely to the way things worked pre-Internet would love it if the broadcast model were the only viable one left, while the governments would love it if they could control the narrative like they used to.
That's why there's such a divide between "silicon valley" and the telcos/government. The latter either don't understand the modern world or profited far easier from the way things used to be, while the former both understand it and have worked out how to leverage it to their benefit.
Re:
First of all, how would they populate their database? This content is created by individuals and groups who then post it to social media sites and video-sharing services. Therefore, the government is not going to get this content first; the platforms are.
Due to the government's own rules, this stuff has to be taken down within the hour, which is not enough time for the government to catch it, unless it issues an order that copies of all items taken down be made and sent to it, thereby raising the cost of compliance.
Therefore, there's no way of shifting the burden to the government as you describe it; they'd be reliant on ISPs to find, identify, store, and send the dodgy content in order to set up and maintain their database.
Governments, we demand you keep terrorists off our sites and services. Please drone bomb them within an hour of us - or you! - finding any terrorist or child porn content on our sites. Kthxbye.
So why is this only for "Social media content"?
Re: So why is this only for "Social media content"?
Keep the world moderately dangerous (or make words themselves "dangerous" under the law if your country is safer) and the censorship will make dumb people feel a false sense of security whilst they lose their freedoms.
Re:
Then, when their people revolt against their buffoonish overlords, they can come back.
Time Travel is NOT possible...
The internet removed government control of the messengers; thus removing control of the message.
And *that* genie is NOT going back into the bottle...
So keep on trying to kill the internet one piece at a time... and it will keep giving its normal response:
"Damage Detected. Re-Routing"
Re: Time Travel is NOT possible...
To which the government response will increasingly be:
"Non-Compliance Detected. Bankrupting / Litigating / Holding at gun point."
Granted we're not at the last one on a widespread basis yet, but believe me, those pushing censorship will happily implement it given enough time. They always do.
Freedom of speech
Keep trying to tell people that the harder you fight what you think is wrong, the more damage you wind up doing to yourself. By all means, go ahead and grab a pound of meat and then stick your hand into the grinder with it to make sure all of it has been ground up. That is how stupid people are, on all sides of this issue.
Right now, crazy is being called sane!
Re: Freedom of speech
A recent editorial claimed a large percentage of the public does not believe anything Donald has to say, and that this is dangerous. Many months ago I thought it was funny - now it is getting a bit scary.
Turn off Social Media in the EU...
Seriously though, turn off access to them from EU IP addresses.
If they want to use the sites, they'll have to VPN in, and then the EU laws won't apply.
Who is this group??
THIS ISN'T the EU..
This is the group that is supposed to be responsible for interactions and trade between the EU countries.
It's a bunch of WELL PAID persons, funded by each of the EU states, who are supposed to represent EACH EU state. (They love passing laws, for some stupid reason.)
1. This is one step in the concept of controlling NEWS/INFORMATION/Distribution.
2. WHO THE F.. IS THIS??
3. Whose idea was this??
NOW for the fun part..
HOW BIG is Google? Could we say that if Google wanted to, about half the internet would no longer be accessible to the EU??
Re: Re:
Quite frankly this sort of thing should make everyone in the UK glad that "Leave" won, no matter how rocky the change to trading with the EU under WTO rules proves to be. (Yes, I think it will come to that, precisely because of the "we are to be obeyed" attitude of the European Commission.)
Re: Re: Re:
Only if you're delusional enough to think that similar and worse rulings won't come out of Westminster. Frankly, if you look at the history of the EU and the Tories, the EU has actually rescued us from far worse things being made law, with fewer protections for the public. Things will get worse for us; the only thing that will change is that people like you will no longer have the EU boogeyman to blame (although, no doubt, your tabloids will find a way to tell you to blame them anyway).
Companies themselves to blame
The real reason is that these companies have taken responsibility for a larger amount of content than they can actually handle properly. The EU's demands are perfectly valid for smaller/quicker companies that don't have huge market reach. The large companies like Facebook, Google and Twitter were just greedy when they spread their networks to the whole world. If they can't handle their market reach, they have the alternative of reducing their global reach or hiring more people to handle the problems. But any idea that the EU's demands are somehow unreasonable simply because the large companies cannot fulfill the requirements is just crazy. It's the companies' own problem that they wanted the whole world to use their platforms.
Re: Re: Companies themselves to blame
The owner of the pub will still call the police every time you bring your semiautomatic machine gun to the party.
Re: Re: Re: Companies themselves to blame
That's a great example. And in your example, the owner of the pub wouldn't be held liable for the actions of the patron who brought the semiautomatic machine gun to the party, even if they had a habit of kicking out other people they'd noticed had guns before. Pub owners are also not responsible for finding, within an hour, all the semiautomatic machine guns their patrons might bring. And to bring the analogy closer to the truth, the pub holds a few million people, and thousands of them brought black-painted Nerf guns.
Re: Re: Re: Re: Companies themselves to blame
Well, they actually are responsible if some lunatic shoots people on their premises. This is why there are professional guards at the door, so that they detect the guns before letting people inside. Some parties even check for knives and other useful gadgets before letting partygoers into the venue.
Organizers of large gatherings are obviously responsible if, during the event, something bad happens and people get injured or killed.
Obviously the damage with social media sites is of a different nature, and terrorist material has a tendency to be spammed across large areas of the world in order to find the people who are manipulable enough to work for the terrorists. This is why the platforms need to be careful about content quality before spamming the material to large groups of people.
Re: Re: Re: Re: Re: Companies themselves to blame
So would you hold the school staff, and the cops guarding the place, responsible for the deaths in the Florida school shooting, since they failed to keep the gunman out?
Re: Re: Re: Re: Re: Re: Companies themselves to blame
Yes. If the professional security people can't do the job, who can? Of course it's a team effort in schools, so teachers can report it if they see people going in the wrong direction, but there are always people who are responsible when things don't go as planned.
There's a reason why security people get paid -- so that everyone else can feel safe and secure against large masses of people and whatever can happen when people break. Detecting and fixing the problems is the responsibility of the professional guards, teachers, police and everyone else who can detect the activity.
Social media companies are experts in social media's effects on the world, so they ought to be controlling whatever is happening in that area.
Note that a pub will have one professional guard per 50 people visiting its premises, and I'm not sure facebook and twitter have that many employees... Maybe they're running out of resources to handle their impact on the world.
Re: Re: Re: Re: Re: Re: Re: Companies themselves to blame
And frankly this isn't akin to the cops guarding the place being held responsible for the deaths in the Florida school shooting. This is the cops guarding the place being held responsible for not putting every single individual on campus through an x-ray machine, pat-down, and full search every hour they remain on the premises, with every word spoken scrubbed for anything law enforcement might not like, actual context, illegality, or threat to safety be damned.
Where two students talking about a movie can be taken out of context to make them the next shooters. Where expressing displeasure with the existing administration gets you treated as a terrorist. All because someone in charge is 'just being careful', lest they be held personally responsible for that one conversation being the one-in-a-million statement that precedes an attack.
I am not using an accurate statistic, but giving an indication of scale. It is apparently Facebook's responsibility to become an investigative agency. It is Twitter's job to personally vet every insignificant cat video and selfie. In the example of a bar, listening devices must be planted on every patron and every second of conversation listened to in real time, to ensure that the patrons don't say something that might annoy someone whose ACTUAL JOB IT IS to do investigative work.
Would you want to see your bar have to hire enough people to put your patrons under more scrutiny than a max-security prison? On your own dime? Because stopping one unacceptable comment or conversation is worth putting the dozens, hundreds, THOUSANDS of other conversations under the microscope?
In your example, the bar would shut down, the burden of liability too great to police profitably, or frankly at all. You'd either see 'No talking' signs posted everywhere with demands that every person entering the bar be strip-searched, or you'd see the next bar fight land the bar owner in jail on assault charges for not pre-empting a confrontation. Because 'they were responsible for making everyone feel safe and secure'.
One might say 'Well, they are being given an hour, it's not instantaneous', but (1) if companies bend to the 1-hour timeframe, you can bet the next step is for politicians to DEMAND everything be verified and vetted and approved before being put online, and (2) if you compare the scale of conversations happening on social media to conversations happening at the bar or school in your example, the burden is... frankly still not equivalent. Companies are still given the more impossible task.
I'm sick of hearing governments demand everyone else do law enforcement's job for it. Let's flip the scenario here.
A murderer got away with a crime? A theft occurred? Law enforcement has one hour -- hell, let's be generous, one day -- to solve the crime. Otherwise they are liable for the damage caused, especially if the same individual commits another crime. The burden is too great for law enforcement to keep up? Hire more people! Apparently it's that simple, if platform holders are being held to such a standard. I mean, that's why we hire law enforcement, right? If they can't do the job, who can? There are always people who are responsible when things don't go as planned. There's a reason why police get paid -- so that everyone else can feel safe and secure against large masses of people and whatever can happen when people break. Detecting and fixing the problems is the responsibility of law enforcement, who can detect the activity.
Law enforcement are experts in crime's effects on the world, so they ought to be controlling whatever is happening in that area.
Re: Re: Re: Re: Re: Re: Re: Re: Companies themselves to blame
Unfortunately, history has proven that when authorities are under pressure to get results, all it means is that innocent people are railroaded for crimes they didn't commit. Adding time pressure means they just try to get *any* conviction, not investigate the crime properly and find out who actually did it.
That's actually one of the problems here - social media companies will be forced to suppress a huge amount of legitimate speech because they're afraid of something slipping through without their control. It seems that the reality is beyond tp's ability to comprehend, but it's not a question of manpower or the site's ability to process actual complaints.
Re: Re: Re: Re: Re: Re: Re: Re: Re: Companies themselves to blame
While it is entirely reasonable to expect ISPs to remove questionable content in a timely fashion, it takes human eyeballs to decide what is or isn't acceptable within the company's TOS.
They're already frantically playing whack-a-mole and often deleting innocent content. It's unreasonable to expect them to speed the process up as it'd mean more innocent content gets taken down with the bad stuff.
Banning stuff is easy. It's a damn sight harder to tackle the social issues that drive people to terrorism, etc.
Re: Re: Re: Re: Re: Re: Re: Re: Re: Re: Companies themselves to blame
Well, that's really the issue, isn't it? Saying "you should remove content in a timely manner after it's been identified" is one thing, and an hour shouldn't be too much to ask under those circumstances.
However, what they're actually being asked to do is *identify* the content, then remove within an hour. That's a completely different request. This is where the disconnect lies, I think - some people don't realise that the first task takes a lot more than the second.
"Banning stuff is easy. It's a damn sight harder to tackle the social issues that drive people to terrorism, etc."
Which, frankly, is why we hear so much idiotic crap like this. It's far easier to point to a third party and blame them and/or pass the responsibility to them than it is to fix root causes. The real fixes are both very long-term and fairly boring. It's much easier to get voted in on easy "fixes" that do nothing than it is to tackle grassroots issues that will bear real fruit in a decade.
Re: Re: Re: Re: Re: Re: Re: Companies themselves to blame
Especially since the entire reason for your belief that fair use doesn't exist is that it takes too long to prove. You know what other bits of law take too long to prove? Try the whole damn thing...
Re: Re: Re: Re: Re: Re: Re: Companies themselves to blame
Not where I live, they don't. Your premise is either very wrong, or you live somewhere unusually dangerous that doesn't translate to the rest of the real world.
Re: Re: Re: Re: Re: Companies themselves to blame
As usual, thank fuck I don't live in a place where guns are so common that this is expected to be the norm. I'll happily wander in and out of my local pubs without having to pass through armed guards or risk being shot by other people who go there, thanks.
Re: Re: Re: Re: Re: Re: Re: Companies themselves to blame
But, he said pubs. Even with cultural differences, if you're going to a pub that can't have more than 50 people there without people packing deadly weapons, you're either in a very, very bad part of town, or you're a lying asshole. I think I know which one he is.
Who, aside from a microcephalic lunatic with terminal hydrophobia, would grant that?
It's most harmful in that brief period when only the people who already know about it can find it? Really? REALLY?
Citation needed: not that we would believe the citation, but at least we would know that the source of the citation was a professional long-nosed incendiary-trousers prevaricator.
Typo in original posting...
Likely you meant either "sharply DECREASES mandated response time," or "sharply increases mandated response SPEED."
Re:
There's a thing about cat videos turning evil, and furries being linked to ethics problems or being wrong on the internet; but whoever mentions Hitler first in a discussion is known to lose the discussion.
Censorship by degrees
And how long until it extends to copyrighted items? That's where it's headed, people. Historically, mission creep toward copyright enforcement has always been the pattern.
What is terrorist content?
For example, if someone says "Death to all Christians", then that could probably be a terrorist threat.
But if someone says "Death to all Muslims", then they're repeating what so many other people (and politicians) are thinking.
Yet saying "death to anyone" should be treated the same.