How The EU's Proposed New 'Privacy' Rules Will Be A Tool For Massive Censorship

from the takedown-takedown-takedown dept

We recently wrote about some concerns about the new data protection rules being set up in Europe. The law is driven by people with good intentions: looking to better protect the privacy of European citizens. Privacy protection is an important concept -- but the current plans appear so focused on privacy protection that they give very little regard to the unintended consequences of the way they've been set up. As we wrote in our last post, Daphne Keller at Stanford's Center for Internet and Society is writing a series of blog posts raising concerns about how the new rules clash with basic concepts of free speech. She's now written one about the immensely troubling setup of the "notice and takedown" rules included in the General Data Protection Regulation (GDPR). For years, we've been concerned by problematic notice and takedown procedures -- we've seen the DMCA frequently abused to stifle speech, rather than used for genuine copyright disputes. But, for some reason, people often immediately leap to "notice and takedown" solutions for any kind of content they don't like, and the drafters of the GDPR are no different.

Except, it's worse. Because whoever drafted the notice-and-takedown portion of the GDPR actually made the process worse than the notice and takedown rules found elsewhere. Here's the GDPR process, as explained by Keller:
  1. An individual submits a removal request, and perhaps communicates further with the intermediary to clarify what she is asking for.
  2. In most cases, prior to assessing the request’s legal validity, the intermediary temporarily suspends or “restricts” the content so it is no longer publicly available.
  3. The intermediary reviews the legal claim made by the requester to decide if it is valid. For difficult questions, the intermediary may be allowed to consult with the user who posted the content.
  4. For valid claims, the intermediary proceeds to fully erase the content. (Or probably, in the case of search engines, de-link it following guidelines of the Costeja “Right to Be Forgotten” ruling.) For invalid claims, the intermediary is supposed to bring the content out of “restriction” and reinstate it to public view -- though it’s not clear what happens if it doesn’t bother to do so.
  5. The intermediary informs the requester of the outcome, and communicates the removal request to any “downstream” recipients who got the same data from the controller.
  6. If the intermediary has additional contact details or identifying information about the user who posted the now-removed content, it may have to disclose them to the individual who asked for the removal, subject to possible but unclearly drafted exceptions. (Council draft, Art. 14a)
  7. In most cases, the accused publisher receives no notice that her content has been removed, and no opportunity to object. The GDPR text does not spell out this prohibition, but does nothing to change the legal basis for the Article 29 Working Party’s conclusions on this point.
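Treated purely as a workflow, the sequence Keller describes can be sketched in code. This is a hypothetical illustration only -- every name here (`Content`, `Outcome`, `handle_removal`) is invented, and nothing in it quotes or implements the actual regulation. What the sketch makes concrete is the ordering problem: restriction happens before any legal assessment, the intermediary (not a court) judges validity, and the poster is never notified.

```python
# Hypothetical sketch of the GDPR takedown sequence as described above.
# All names are invented for illustration; this is not the regulation's text.
from dataclasses import dataclass


@dataclass
class Content:
    poster: str
    body: str
    restricted: bool = False
    erased: bool = False


@dataclass
class Outcome:
    erased: bool
    poster_disclosed: bool
    poster_notified: bool


def handle_removal(content: Content, claim_is_valid: bool) -> Outcome:
    # Step 2: restrict BEFORE any legal assessment -- the content goes
    # dark on mere receipt of a request.
    content.restricted = True

    # Steps 3-4: the intermediary, not a court, judges the claim.
    if claim_is_valid:
        content.erased = True
    else:
        # Reinstatement is what a compliant intermediary would do; per the
        # steps above, it's unclear what happens if it doesn't bother.
        content.restricted = False

    # Step 6: the poster's identity may be handed to the requester.
    # Step 7: the poster herself receives no notice of any of this.
    return Outcome(erased=content.erased,
                   poster_disclosed=True,
                   poster_notified=False)
```

Note that even a bogus claim takes the content offline for the duration of the review, and the requester walks away with the poster's identity either way -- which is exactly the abuse surface discussed below.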
If you don't see how that process is likely to lead to widespread abuse and the censorship of perfectly legal speech, you haven't been paying much attention to the internet over the last decade plus. To be fair, you can understand why the drafters think this process makes sense. They're thinking solely about truly problematic and embarrassing information. If, say, your personal medical records have been posted online, it makes sense to look for a way to have that info removed as quickly as possible. But, given how frequently people use these processes in the copyright context to take down content they simply "don't like" (and how often people admit they do so because it's the only way to get such content down), you know it's going to get massively abused for issues that have nothing to do with privacy protection.

Once again, it seems like regulators focus solely on solving for the "worst case" scenario, with little thought towards how that will be applied in much more common cases, and what that means for free speech and society.

And that's not all that's dangerous about the current rules. They also deal a huge blow to anonymous speech and privacy:
A second glaring problem with the GDPR process is its requirement that companies disclose the identity of the person who posted the content, without any specified legal process or protection. This is completely out of line with existing intermediary liability laws. Some have provisions for disclosing user identity, but not without a prescribed legal process, and not as a tool available to anyone who merely alleges that an online speaker has violated the law. It’s also out of line with the general pro-privacy goals of the GDPR, and its specific articles governing disclosure of anyone’s personal information -- including that of people who put content on the Internet.
Yes, that's right. In an effort to protect privacy, the drafters are so focused on a single scenario that they don't consider how the process will be abused to weaken the privacy rights of others. Want to know who said something anonymously that you don't like? File a privacy complaint and the service provider is just supposed to cough up their name. Again, given how often we've seen bogus defamation claims made solely for the purpose of trying to identify those who speak anonymously, this is a major concern.

There are ways to create a better process for the removal of truly illegal information, but the GDPR simply wipes most of those away in the interest of expediency in trying to prevent a worst case scenario. And the end result may be an even worse situation, in which free speech and privacy rights are broadly stripped from many people by handing a powerful censorship tool, with privacy-destroying elements, to anyone who wants to go after someone else's speech. I've long been in favor of using "notice and notice" type systems that allow whoever posted content to protest before content is taken down, but even for those going with a notice and takedown system, there are much better ways to implement one that at least includes some semblance of due process. In addition, there should be strong penalties for those who abuse these notice and takedown procedures.

As Keller writes:
Notice and takedown laws also exist to protect people who are harmed by online content. But protecting those people does not require laws to prioritize removal with little concern for the rights of online speakers and publishers. A good notice and takedown process can help people with legitimate grievances while incorporating procedural checks to avoid disproportionate impact on expression and information rights. Valuable information that would be gone under existing laws but for these checks -- importantly including transparency about what content has been removed -- spans religious, political, and scientific content, along with consumer reviews. Crafting the law to better protect this kind of content from improper removal is both important and possible.
It would be nice to see the drafters of the GDPR at least recognize the harm they may be about to cause.


Filed Under: anonymous speech, data protection, eu, free speech, gdpr, notice and takedown, privacy, safe harbors


Reader Comments



  1. Anonymous Coward, 29 Oct 2015 @ 10:58am

    Notice and takedown will always result in things being taken down without any review ever taking place, as those receiving the notices will be overloaded by the demand. The only ways of getting a review will be raising such a storm that the problem comes to the attention of somebody in the company that made the takedown, or being in a position to contact the company directly. That is, you have to have enough followers and fans to raise the storm on social media, or be a member of the club of the rich and powerful. Everybody else will just have to suffer having things taken down without any hope of redress.

  2. Anonymous Coward, 29 Oct 2015 @ 11:09am

    Someone should write a bot to send notice on every single thing any of the drafters ever post.

  3. Anonymous Coward, 29 Oct 2015 @ 11:38am

    You want real privacy rules? They should protect you against government surveillance. Otherwise they're just a tool for censorship.

  4. Anonymous Coward, 29 Oct 2015 @ 11:38am

    Re:

    (well, not necessarily but mostly)

  5. Anonymous Coward, 29 Oct 2015 @ 11:46am

    "There are ways to create a better process for the removal of truly illegal information ..."

    Yes. It's called a court order.

    "I've long been in favor of using "notice and notice" type systems that allow whoever posted content to protest before content is taken down..."

    How does that fix the privacy issue? You still have to identify yourself in your notice, and if you don't provide notice your content is still taken down.

  6. Anonymous Coward, 29 Oct 2015 @ 11:52am

    Response to: Anonymous Coward on Oct 29th, 2015 @ 11:09am

    Might work, because it sounds like they've given notice senders every benefit of the doubt.

  7. Ninja (profile), 29 Oct 2015 @ 12:01pm

    Re:

    The first stage could be intermediated by the service provider. The ISP receives the demand and forwards it to the target without providing details. The targeted party responds, and the ISP returns the response to the requester, stripping any identifying details. If the requester still thinks they are harmed, they could start a legal proceeding, which a judge would analyze before the other party is contacted, to determine whether it should proceed or not. The process would still protect the defendant's identity until the decision is delivered.

    This would be a more functional, fair system.

  8. Anonymous Coward, 29 Oct 2015 @ 12:08pm

    Free Speech

    Either you have it or you don't.

  9. Anonymous Coward, 29 Oct 2015 @ 12:17pm

    The right to make forget.

  10. Anonymous Coward, 29 Oct 2015 @ 12:37pm

    Re: Re:

    All these systems are fine for dealing with a single notice, but they run into problems at the scale of the Internet. About the only affordable system for an Internet-scale company is an automatic one, with minimal processing by minimum-wage labour for the notices that do not come via the automatic system.

    The problem with any system is expecting content hosting companies to employ people to read and evaluate every notice. While that is viable for a few notices a week, or even a day, it becomes very expensive at hundreds or thousands of notices an hour. Most people do not get the scale issues of the Internet, and while politicians should have an inkling of it from dealing with their electorates, they still underestimate the problem by many orders of magnitude.

  11. Anonymous Coward, 29 Oct 2015 @ 3:14pm

    We need to abuse this process and make 'forget me' requests for every single politician up for re-election that voted for this shit, leaving only candidates visible that will undo the mess.

  12. Anonymous Coward, 29 Oct 2015 @ 4:08pm

    You are crediting them with ignorance instead of wondering how much of this is intentional design.

  13. Anonymous Coward, 30 Oct 2015 @ 7:45am

    We need a new version of the internet.

  14. Ticket Monster, 30 Oct 2015 @ 2:00pm

    Re: new version

    The problem is not the internet, it is the legal framework society has built around it. We need a new framework that takes into account the scale and international nature of society and culture today.

    But every politician we elect has a vested interest in keeping things mostly the way they are. They would only nibble around the edges. And with more law being tied and twisted by international obligations, about the only way to change the system would be to have the Martian Ice Warriors attack. And how likely is that?

