The DOJ Is Conflating The Content Moderation Debate With The Encryption Debate: Don't Let Them

from the it's-not-the-same dept

As we've detailed a lot over the last week, the DOJ has decided that, after years of failing to get backdoors mandated by warning about the "terrorism" bogeyman, it will pick up the FOSTA playbook and instead start focusing on child porn -- or what "serious people" now refer to as Child Sexual Abuse Material (CSAM). It did this last week with an assist from the NY Times, which published an article full of (legitimately) scary stories but somehow blamed the internet companies... because they actually report such content when they find it on their networks. I've seen more than a few people, even some who have generally been strong voices in the encryption debate and against backdoors, waver a bit on this particular subject and suggest that maybe there shouldn't be encryption on social media networks, because it might (as the narrative says) help awful people hide their child porn.

Except... that's confusing a few different things. Mainly, it's mixing up the content moderation debate with the "lawful access" or "backdoors" debate. Yes, encryption makes it harder for the police to get in and see certain things, but that's by design. We live in a country with the 4th Amendment, in which we believe it should be difficult for law enforcement to snoop deeply into our lives -- and that has always meant that some people will do and plot bad stuff out of the sight and hearing of law enforcement. Yet if you look at law enforcement over the past 100 years, it has many times more access to information about people today than it ever had in the past. The claim of "going dark" is laughable when you compare the information law enforcement can get today to what it could get even 15 or 30 years ago.

But, importantly, bringing CSAM into the debate muddies the water by pretending -- incorrectly -- that in an end-to-end encrypted world you can't do any content moderation, and there's simply no way for platforms to block or report certain kinds of content. Yet, as Princeton professor Jonathan Mayer highlights in a new paper, content moderation is not impossible in an encrypted system. It may be different than it is today, but it's still very much possible:

Much of the public discussion about content moderation and end-to-end encryption over the past week has appeared to reflect two common technical assumptions:

  1. Content moderation is fundamentally incompatible with end-to-end encrypted messaging.
  2. Enabling content moderation for end-to-end encrypted messaging fundamentally poses the same challenges as enabling law enforcement access to message content.

In a new discussion paper, I provide a technical clarification for each of these points.

  1. Forms of content moderation may be compatible with end-to-end encrypted messaging, without compromising important security principles or undermining policy values.
  2. Enabling content moderation for end-to-end encrypted messaging is a different problem from enabling law enforcement access to message content. The problems involve different technical properties, different spaces of possible designs, and different information security and public policy implications.

You can read the whole thing, but as the paper notes, user reporting of such content still works in an end-to-end encrypted world, as does hash matching if done at the client end. There's a lot more in there as well, but what you realize in reading the paper is that while law enforcement has now latched onto the CSAM issue as its hook to break encryption (in part, I've been told by someone working with the DOJ, because they found it "polled well"), it's an entirely different problem. This is yet another "but think of the children" argument, which ignores the technical and societal realities.
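To make the client-side idea concrete, here's a minimal sketch (in Python, with made-up data) of what hash matching at the client end might look like. A real deployment would use perceptual hashes like PhotoDNA supplied by a clearinghouse rather than plain SHA-256, but the structure is the same: the check runs on the device, against the plaintext, before anything is encrypted -- the encryption itself is never weakened.

```python
import hashlib

# Hypothetical set of hashes of known prohibited images. A real system
# would use perceptual hashes (e.g., PhotoDNA) from a clearinghouse, not
# SHA-256, so that re-encoded copies of an image still match.
KNOWN_BAD_HASHES = {
    hashlib.sha256(b"example prohibited file").hexdigest(),
}

def matches_known_content(attachment: bytes) -> bool:
    """Hash the plaintext on the sender's device, before encryption,
    and compare it against the known-content list. Nothing about the
    end-to-end encryption has to change for this check to run."""
    return hashlib.sha256(attachment).hexdigest() in KNOWN_BAD_HASHES

if __name__ == "__main__":
    print(matches_known_content(b"example prohibited file"))  # True
    print(matches_known_content(b"an ordinary photo"))        # False
```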


Filed Under: chris wray, content moderation, doj, encryption, going dark, jonathan mayer, william barr
Companies: facebook


Reader Comments



  1. Anonymous Anonymous Coward (profile), 8 Oct 2019 @ 8:40am

    Out of sight, out of their minds

    Isn't content moderation concerned with things other people can see, while encryption is about preventing other people from seeing what is encrypted? I can understand government types getting this wrong, but technologists?


  2. anon, 8 Oct 2019 @ 9:42am

    NSLs and hashing

    So, when is the DoJ going to start sending NSLs to Google, Microsoft, and Mozilla to have them add image hash checks to the browsers? Or are they already there, hidden behind 'safebrowsing'?


  3. Anonymous Coward, 8 Oct 2019 @ 10:03am

    It's true that the mere existence of encryption in social media messenger apps might help one or more child abusers evade detection, despite the fact that said encryption might also help keep children away from those who would abuse them.

    It's also true that allowing any non-children to breathe would enable some of those non-children to sexually abuse children.

    However, for some reason I don't see anyone proposing a breathing ban.


  4. ECA (profile), 8 Oct 2019 @ 10:44am

    Let's see.....

    How could I really piss off people? How about we go to the major locations in the world where child abuse is dominant, like Bangkok, install remote cameras through the cellphone system, and take pictures. Let's monitor the people who take plane flights to those areas and link the pictures with the flights. We could also match up incoming flights to those areas and keep tabs on those people. Why not? Well, every nation has its own laws, and what's legal here may not be legal there -- like chewing gum in public (yep, it's illegal in a few places). So with all this tracking of certain people, what could we find? Could we send a bot to their phones and see who else is around? Do some of these folks have enough money to pay us off so we don't do anything?

    How much of FOSTA is logical? The numbers presented seem to be picked out of the air, slammed together from any and every source of missing children, without verifying that those children are now home. Even when verified, the numbers tend to be less than 0.1% of the guesstimate. This seems more like a lost law that will never be enforced, because it's already enforced in other ways. The only real problem here is giving those in need access to what they need, and I don't see that happening.


  5. ECA (profile), 8 Oct 2019 @ 10:45am

    Re: Let's see.....

    Safe houses and safe places? Where did those go? We used to have signs in many homes marking places to find safe harbor.


  6. James Burkhardt (profile), 8 Oct 2019 @ 10:48am

    Re: Out of sight, out of their minds

    Primarily, content moderation is concerned with publicly facing content. That's where the debate is focused.

    However, content moderation does also cover content shared between users, which comes with its own differences and challenges.


  7. A Guy, 8 Oct 2019 @ 11:24am

    Re: Re: Out of sight, out of their minds

    That's not exactly true; content moderation between users requires either the receiver reporting the sender, or an automated filter on the receiver's device.

    If the sender and receiver agree not to report the content, then it gets through more often.

    An automated filter on the device can do a lot of keyword filtering and message rejection without human intervention.
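    A minimal sketch of the kind of on-device filter described above, assuming an invented keyword list; it runs on the recipient's device after decryption, so no intermediary ever sees the plaintext:

```python
# Hypothetical on-device filter: runs after the client decrypts a message
# and before it is displayed. The keyword list is made up for illustration.
BLOCKED_KEYWORDS = {"expletive1", "expletive2"}

def filter_incoming(plaintext: str):
    """Return the message if it passes, or None to reject it outright.
    Everything happens locally; the ciphertext in transit is untouched."""
    if BLOCKED_KEYWORDS & set(plaintext.lower().split()):
        return None  # rejected without any human intervention
    return plaintext
```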


  8. JdL (profile), 8 Oct 2019 @ 12:18pm

    Nonsense

    Properly encrypted data can NOT be content-moderated, no matter what Professor Mayer says. Of course, a company could set up an app that looks at clear-text before it is encrypted, but anything already encrypted by the sender before that app sees it can't be examined.


  9. Anonymous Coward, 8 Oct 2019 @ 12:30pm

    Gee, it must have been tough back in the day, when people weren't sending messages or sharing content that couldn't be intercepted. I wonder how they busted child abusers, pedophiles, ephebophiles, and illegal porn producers and consumers before the net.


  10. A Guy, 8 Oct 2019 @ 12:30pm

    Re: Nonsense

    You can put a filter, right on the user's device, that examines and accepts or rejects the message after it is decrypted but before it is displayed to the user.


  11. Anonymous Anonymous Coward (profile), 8 Oct 2019 @ 12:52pm

    Re: Re: Nonsense

    Why would anyone want to do that?

    Now, if you're talking about being forced to, or it being done clandestinely, then that's something else, and not legal in this country. China, maybe.


  12. A Guy, 8 Oct 2019 @ 1:04pm

    Re: Re: Re: Nonsense

    A child's app might filter the keyword "fuck" from their potty-mouthed uncle, for example.


  13. Anonymous Coward, 8 Oct 2019 @ 1:12pm

    Re: Out of sight, out of their minds

    "Enabling content moderation for end-to-end encrypted messaging is a different problem..."

    That very "problem statement" demonstrates a major difficulty - even smart, informed, well-intentioned technologist conflate issues.

    There is no issue of end-to-end encryption involved in content moderation. There's no mid-stream moderation that requires access. Moderation happens at an end-point, operating on decrypted, clear-text content (text, images, media, etc.).

    User-to-user moderation works the same way as public forum moderation: the receiving user (or their filter proxy-app) deletes undesirable content and possibly blocks the sender. Again, no need to decrypt mid-stream.
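    A toy sketch of that receiver-side flow, with invented names; the block list lives on the recipient's device, and the ciphertext is only ever decrypted there:

```python
# Hypothetical receiver-side moderation: after local decryption, the client
# consults a local block list and drops content from blocked senders.
blocked_senders = set()

def block_sender(sender_id: str):
    """Add a sender to the local block list (e.g., after a user report)."""
    blocked_senders.add(sender_id)

def on_decrypted_message(sender_id: str, plaintext: str):
    """Called after the client decrypts an incoming message.
    Returns the text to display, or None if the content is dropped."""
    if sender_id in blocked_senders:
        return None  # deleted at the endpoint; no mid-stream access needed
    return plaintext
```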


  14. Anonymous Coward, 8 Oct 2019 @ 1:39pm

    Re: Re: Out of sight, out of their minds

    Let's keep the term "moderation" for when a person or organization decides what other people can see, and "filtering" for when a person decides what content they themselves want to see. The distinction is important: moderation imposes values on an audience, which is not always bad, while filtering is a personal decision that does not impact what other people see.


  15. A Guy, 8 Oct 2019 @ 2:32pm

    Re: Re: Re: Out of sight, out of their minds

    Moderation for inappropriate language, especially expletives aimed at children, is often done by a filter, as I noted below. Your definitions do not comport with the normal definitions of those words.


  16. A Guy, 8 Oct 2019 @ 3:26pm

    Re: NSLs and hashing

    You're looking for the porn/adult material filters you can buy from Disney or something. It does come with the browser.


  17. A Guy, 8 Oct 2019 @ 3:50pm

    Re: Re: NSLs and hashing

    That was... it does not come with the browser.


  18. Anonymous Coward, 8 Oct 2019 @ 4:20pm

    Re:

    Could it be... their stated rationale is false?
    Fake laws.


  19. Anonymous Coward, 8 Oct 2019 @ 4:36pm

    Re:

    They didn't. Where do you think the current crop of politicians, CFOs, and media moguls came from? (From the child abusers, pedophiles, and illegal porn producers.)

    And why do you think "they" want to pass all these 'save the children' laws to prevent anyone else from doing what they have done? (Their own guilt/obsessions...)

    What will the public do about it? (Literally gives no fucks...)

    This is also why there are no serious mental-health-related 'issues' in the USA; we wouldn't want to have to lock up politicians and CFOs because they are psychopaths with narcissistic personality disorders, now would we?


  20. nasch (profile), 9 Oct 2019 @ 8:38am

    Re: Re: Re: Nonsense

    "Why would anyone want to do that?"

    Because there are things on the internet that you don't want to see.


  21. A Guy, 9 Oct 2019 @ 9:50am

    Re: Re: Re: Re: Nonsense

    I wish I never saw goatse


  22. nasch (profile), 9 Oct 2019 @ 12:55pm

    Re: Re: Re: Re: Re: Nonsense

    "I wish I never saw goatse"

    See, this guy gets it!


  23. Anonymous Coward, 9 Oct 2019 @ 2:19pm

    Ever wonder just what happened to all the bullies and imbeciles we all had to deal with in junior high school? They became judges and prosecutors!


  24. urza9814 (profile), 10 Oct 2019 @ 10:38am

    Re: Re: Out of sight, out of their minds

    There are different kinds of moderation, and plenty of sites DO use mid-stream moderation that requires access. Facebook, for example.

    I like the idea posted by the AC below distinguishing moderation vs. filtering, although those words already have other uses, so they probably aren't the best choice. I'd call it something like "policy moderation" vs. "user moderation". Policy moderation is like Facebook: you set a bunch of rules about what is and is not allowed, you let users file reports about specific content, but then you have hired moderators who review that content and determine whether it actually violates the rules. Some sites also use immediate policy moderation, where your post is reviewed by a human for compliance before it is ever visible. Some use a mix, with automated filters that decide whether a comment should be held for human review.

    But all of those options require administrators at the company to be able to review the posted content. So either the company needs to be able to decrypt everything, or at the very least it needs to insert code that takes the decrypted message from the user and passes it back to the company unencrypted. Either way, they're getting unencrypted access. And obviously you can't count on any automatic filtering at the client end -- for example, if you have automatic filtering flag a comment as requiring human review, the client can easily prevent that code from running on their end. You can use client-side filtering to prevent things from being viewed, but not from being posted and distributed.

    For "user moderation", you just count downvotes and hide anything with enough of them. That could be done without the company having direct access to the decrypted content. But it doesn't let you set any kind of consistent rules, and it often gets abused, especially in larger communities. Things will get flagged because people just don't like the opinion expressed or the person expressing it... and there's not much you can do to prevent that.


