Dear Europe: Please Don't Kill Free Speech In The Name Of 'Privacy Protection'
from the think-about-this-carefully dept
About a year and a half ago, we wrote about how the new European "General Data Protection Regulation" (GDPR) was potentially very problematic for free speech. That is, well-meaning "data protection" folks wrote up the GDPR, but it appears they did so with little thought about what its impact might be on free speech. Specifically, when they include something like a right to "erasure" of certain information, you can understand, from a privacy standpoint, why people might want certain data and information deleted from certain databases. But bring that over to the open web, rather than private databases, and you're talking about a censorship tool in the form of a "right to be forgotten" system.
To deal with this kind of potential problem, rather than doing the smart thing and fixing and clarifying the GDPR, Europe has left it up to each member country to sort things out on its own, and to figure out how to set its own data protection rules in a manner that complies with the GDPR while also avoiding stomping out free expression. Unfortunately, it's unclear that many of those states are taking that balancing act very seriously. The UK quietly put up a request for comments, with all responses due by this Wednesday (and, of course, by the time this all gets sorted out, who's to say if the UK will even still be in the EU... but...).
Daphne Keller, who studies these things over at Stanford Law School's Center for Internet and Society, has both a longer paper and a shorter blog post discussing this, specifically in the context of serious concerns about how the Right To Be Forgotten (RTBF) under the GDPR will be implemented, and how it may stifle freedom of expression across Europe. Right now, of course, the RTBF applies only to search results, but under the GDPR it may expand to much more, including things like Twitter and Facebook:
Applying RTBF to platforms like Facebook, Dailymotion, or Twitter would be a big deal for Internet users’ expression and information rights. RTBF in its current form under Google Spain only covers search engines, and only requires “de-listing” search results – meaning that users will not see certain webpage titles, snippets, and links when they search for a data subject by name. Regulators have said that the RTBF is reconcilable with information and expression rights precisely because information is only de-listed, and not removed from the source page. But if social media or other hosts had to honor RTBF requests, much of the information they erased would not merely be harder to find – it would be truly gone. For ephemeral expression like tweets or Facebook posts, that might mean the author’s only copy is erased. The same could happen to cloud computing users or bloggers like artist Dennis Cooper, who lost 14 years of creative output when Google abruptly terminated his Blogger account.
Expanding the list of private platforms that must accept and adjudicate RTBF requests would directly affect users’ expression and information rights. But it is hard to pinpoint quite which GDPR articles speak to this issue. Is it purely a question of who counts as a controller under the GDPR’s definitions (Art. 4)? Might it be, as I have argued in other contexts, a question about the scope of objection and erasure rights (Arts. 17 and 21)? Do national expression and information rights shape a platform’s “responsibilities, powers and capabilities” under the Google Spain ruling (para. 38)? These are difficult questions. The answers will, in a very real way, affect the expression and information rights that Member State legislatures are charged with protecting.
And what happens if (as always happens) the process is abused and perfectly legitimate content is taken down? That is... once again... not at all clear:
The Article 29 Working Party has said that search engines generally shouldn’t tell webmasters about de-listings, and the Spanish DPA recently fined Google €150,000 for doing so. The data protection logic here is understandable. When a data subject tells a controller to stop processing her data, it seems perverse for the controller to instead process it more by communicating with other people about it.
But excluding the publisher or speaker from the platforms’ behind-closed-doors legal decisions puts a very heavy thumb on the scales against her. It effectively means that one private individual (the person asserting a privacy right) can object to a platform’s RTBF decision and seek review, while the other private individual or publisher (asserting an expression right) cannot. Other procedural details of the GDPR tilt the balance further. For example, a platform can reject a RTBF request that is “manifestly unfounded,” but only if the platform itself – which likely has little knowledge about or interest in the information posted by a user – assumes the burden of proof for this decision. (Art. 12.5)
This lopsided approach may be sensible for ordinary data erasure requests, outside the RTBF context. When a data subject asks a bank or online service to cancel her account, the power imbalance between the individual and the data controller may justify giving her some procedural advantages. But RTBF requests add important new rights and interests to the equation: those of other Internet users. Procedural rules should not always favor the data subject over other private individuals.
And, of course, as we've pointed out over and over again, the more liability you put on platforms for failing to delete content, the more those platforms will default to deleting all sorts of content just to avoid that liability. That would mean free and open spaces on the web get locked up fast, which should worry anyone who believes in the internet as a platform for everyone, rather than just big media companies.
Research and common sense tell us that when platforms face legal trouble for failing to remove user expression, they are likely to remove too much. Claimants consistently ask platforms to remove more information than the law requires: studies say that 38% of copyright removal requests to Google Image Search raise invalid legal claims; Google and Bing both report that over 50% of RTBF requests do as well. But as the studies show, platforms often err on the side of caution, taking down lawful or lawfully processed information. Incentives to play it safe and simply comply with RTBF requests are strong under the GDPR, which permits penalties as high as 4% of annual global turnover or €20 million. (Art. 83) National law should account for this dynamic, putting procedural checks in place to limit over-removal by private platforms. Civil society recommendations like the Manila Principles offer a menu of options for doing just this. For example, the law can penalize people (or businesses, governments, or religious organizations) if they abuse notice-and-takedown to target other people’s lawful expression.
The GDPR does not provide meaningful procedural barriers to over-removal. In many cases, it appears to strongly tilt the playing field in favor of honoring even dubious RTBF requests – like ones Google received from priests trying to hide sexual abuse scandals, or financial professionals who wanted their fraud convictions forgotten.
As various countries in Europe look to put in place regulations to abide by the GDPR, it would be nice if they actually considered this stuff. I fear they may not. If you have some time in the next day or two, consider taking part in the UK comment period, and hope that they get it right (even if they're on their way out of the EU).
Filed Under: data protection, europe, free speech, gdpr, intermediary liability, privacy, uk
Reader Comments
This case is rather obvious, but sometimes we will only realize the true value of historical data many, many years or even decades after the fact. Do we really want to go down that slippery slope and make stuff vanish like that? Sure, we need privacy, but this is not the right path.
Dealing with the increasing surveillance conducted by governments in the EU sounds like a good start to improving privacy.
You all just don't understand...
These Democracies?
something something, founding fathers of the USA, comment on suicide and democracy...
does this fit here or is this unapproved speech here at TD?
Europe is falling apart and for the past several years people have been wanting to be just like them. There will not be a Europe in 40 years that looks even remotely like what it is now.
The path is clear, use discontent and problems to take liberty away at every possible step. If you have to take it away under the guise of something good, all the better!
Re: These Democracies?
Yes, just like in North America. Not perfectly, again like in North America. But no-one expects ANY system of government to work out perfectly.
Something something, founding fathers nevertheless made the USA a democracy, comment on republic and democracy not being mutually exclusive...
The EU may be falling apart, though probably not. Keep in mind that the end result of that would look like North America. Independent countries with trade agreements.
Like "Make America Great Again."
It's Europe
Re: It's Europe
Re: These Democracies?
One needs to be more specific if they want others to take them seriously. What, exactly, do you mean by your all-encompassing generalization?
"There will not be a Europe in 40 years that looks even remotely like what it is now."
Why would anyone expect a continent/country/city to remain static (not growing) for any amount of time?
"use discontent and problems to take liberty away at every possible step"
This looks like a GOP platform item, is it?
Re: It's Europe
Re: It's Europe
Inquisition and Crusades - valid point, but I'm not convinced the same thing could not happen here. In fact, it is more likely today than it has been for some time.
Google only believes in the right to be forgotten when it involves one of their own.
"Dear Europe: Please don't damage Google's business model!"
Re:
Re: It's Europe
Nobody's perfect. :)
There are a few aspects here:
- if the information is truly irrelevant, then who cares? By insisting on removing it, you **make it** relevant. This is your own, personal shot in the foot.
- if the information is really irrelevant, but some moron considers it true - don't deal with the moron. It is harmful whatever you do, and this law will not help.
- if I, as a potential employer, see that a prospective employee wants to hide something from the public, I will at least think twice, because
- this is a perfect bullying/blackmail opportunity - you are explicitly telling the whole world you **have** something to hide. This would make your employer vulnerable,
- this is also a perfect business opportunity - screening job candidates or business partners for what they want to hide gives nice leverage.
On the whole, using this law will mostly hurt. But then again, people learn better when it hurts to err.