Post No Evil: Content Moderation Decisions Are Always Trickier Than You Think

from the it-gets-worse-and-worse dept

Two years ago, we told anyone who wanted to understand the impossibility of content moderation to listen to an episode of the podcast/radio show Radiolab. With content moderation questions back in the news, Radiolab recently re-released the episode with some updated content. Most of it is the same, but there's new material at the end relating it to the latest news: the various attacks on social media coming from the president, the DOJ, and Congress.

Once again, I cannot recommend anything more than listening to this entire discussion. It's full of great examples of the impossible nature of content moderation. In particular, it highlights why the various proposals from Congress demanding that social media companies have explicit rules for what is and is not allowed are simply not practical in the real world, where there are so many edge cases, and so many times when a policy has to be adapted to a new one. Here's just a little bit of the transcript to whet your appetite for the whole damn thing. It's an early example from the story, highlighting the difficulty of dealing with breastfeeding pictures under Facebook's nudity policy:

[NEWS CLIP: A social networking website is under fire for its policy on photos of women breastfeeding their children.]

SIMON: Big time.

STEPHANIE MUIR: 12,000 members participated, and the media requests started pouring in.

[NEWS CLIP: The Facebook group called, "Hey Facebook: Breastfeeding is Not Obscene."]

STEPHANIE MUIR: I did hundreds of interviews for print. Chicago Tribune, Miami Herald, Time Magazine, New York Times, Washington Post ...

[ARCHIVE CLIP, Dr. Phil: You know, the internet is an interesting phenomenon.]

STEPHANIE MUIR: ... Dr. Phil. It was a media storm. And eventually, perhaps as a result of our group and our efforts, Facebook was forced to get much more specific about their rules.

SIMON: So for example, by then nudity was already not allowed on the site. But they had no definition for nudity. They just said no nudity. And so the Site Integrity Team, those 12 people at the time, they realized they had to start spelling out exactly what they meant.

KATE KLONICK: Precisely. All of these people at Facebook were in charge of trying to define nudity.

FACEBOOK EMPLOYEE: So I mean yeah, the first cut at it was visible male and female genitalia. And then visible female breasts. And then the question is well, okay, how much of a breast needs to be showing before it's nude? And the thing that we landed on was, if you could see essentially the nipple and areola, then that's nudity.

SIMON: And it would have to be taken down. Which theoretically at least, would appease these protesters because, you know, now when a picture would pop up of a mother breastfeeding, as long as the child was blocking the view of the nipple and the areola, they could say, "Cool, no problem."

KATE KLONICK: Then you start getting pictures that are women with just their babies on their chest with their breasts bare. Like, for example, maybe baby was sleeping on the chest of a bare-breasted woman and not actively breastfeeding.

FACEBOOK EMPLOYEE: Okay, now what? Like, is this actually breastfeeding? No, it's actually not breastfeeding. The woman is just holding the baby and she has her top off.

JAD: Yeah, but she was clearly just breastfeeding the baby.

ROBERT: Well, maybe just before.

SIMON: Well, I would say it's sort of like kicking a soccer ball. Like, a photo of someone who has just kicked a soccer ball, you can tell the ball is in the air, but there is no contact between the foot and the ball in that moment potentially. So although it is a photo of someone kicking a soccer ball, they are not, in fact, kicking the soccer ball in that photo.

ROBERT: [laughs]

JAD: [laughs] That's a good example.

SIMON: And this became the procedure or the protocol or the approach for all of these things, was we have to base it purely on what we can see in the image.

KATE KLONICK: And so they didn't allow that to stay up under the rules, because it could be too easily exploited for other types of content, like nudity or pornography.

FACEBOOK EMPLOYEE: We got to the only way you could objectively say that the baby and the mother were engaged in breastfeeding is if the baby's lips were touching the woman's nipple.

SIMON: So they included what you could call, like, an attachment clause. But as soon as they got that rule in place ...

FACEBOOK EMPLOYEE: Like, you would see, you know, a 25-year-old woman and a teenage-looking boy, right? And, like, what the hell is going on there?

KATE KLONICK: Oh, yeah. It gets really weird if you, like, start entering into, like, child age. And I wasn't even gonna bring that up because it's kind of gross.

FACEBOOK EMPLOYEE: It's like breastfeeding porn.

JAD: Is that a thing?

ROBERT: Are there sites like that?

SIMON: Apparently. And so this team, they realized they needed to have a nudity rule that allowed for breastfeeding but also had some kind of an age cap.

FACEBOOK EMPLOYEE: So -- so then we were saying, "Okay. Once you've progressed past infancy, then we believe that it's inappropriate."

SIMON: But then pictures would start popping up on their screen and they'd be like, "Wait. Is that an infant?" Like, where's the line between infant and toddler?

FACEBOOK EMPLOYEE: And so the thing that we landed on was, if it looked like the child could walk on his or her own, then too old.

SIMON: Big enough to walk? Too big to breastfeed.

ROBERT: Oh, that could be 18 months.

JAD: Yeah, that's like a year old in some cases.

SIMON: Yeah. And, like, the World Health Organization recommends breastfeeding until, you know, like, 18 months or two years, which meant there were a lot of photos still being taken down.

KATE KLONICK: Within days, we were continuing to hear reports from people that their photographs were still being targeted.

SIMON: But ...

[NEWS CLIP: Facebook did offer a statement saying ...]

FACEBOOK EMPLOYEE: You know, that's where we're going to draw the line.

Suffice it to say, that is not, in fact, where Facebook drew the line. Indeed, just last year we wrote about the company still having issues with drawing the line around this particular issue.
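The rule evolution described in the transcript above — a bright-line nudity definition, then an "attachment clause," then a walking-age cap — is, in effect, a decision procedure, and writing it out as one makes its fragility obvious. Here's a minimal sketch; the predicate names are my own invention, not Facebook's actual rule language:

```python
# Hypothetical sketch of the breastfeeding-photo rules as described in the
# episode. Every predicate here is itself a subjective call that a human
# reviewer must make from pixels alone -- which is exactly the problem.

from dataclasses import dataclass

@dataclass
class Photo:
    nipple_or_areola_visible: bool   # the team's working definition of "nudity"
    lips_touching_nipple: bool       # the "attachment clause"
    child_looks_able_to_walk: bool   # the proxy for "past infancy"

def allowed(photo: Photo) -> bool:
    """Return True if the photo may stay up under the rules as described."""
    if not photo.nipple_or_areola_visible:
        return True        # not "nudity" under the working definition
    if not photo.lips_touching_nipple:
        return False       # bare chest, not "actually breastfeeding"
    if photo.child_looks_able_to_walk:
        return False       # the age cap
    return True            # breastfeeding exception applies

# The sleeping-baby-on-bare-chest case from the transcript: clearly a
# breastfeeding mother, but the rules take the photo down.
print(allowed(Photo(True, False, False)))   # False
```

Each new edge case forced another clause, and each clause introduced a new judgment call — which is why the protesters kept seeing photos taken down even after the rules got "specific."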

So if you want to talk intelligently about these issues, you should first listen to the Radiolab broadcast. It's only a little over an hour, and well worth your time.


Thank you for reading this Techdirt post. With so many things competing for everyone’s attention these days, we really appreciate you giving us your time. We work hard every day to put quality content out there for our community.

Techdirt is one of the few remaining truly independent media outlets. We do not have a giant corporation behind us, and we rely heavily on our community to support us, in an age when advertisers are increasingly uninterested in sponsoring small, independent sites — especially a site like ours that is unwilling to pull punches in its reporting and analysis.

While other websites have resorted to paywalls, registration requirements, and increasingly annoying/intrusive advertising, we have always kept Techdirt open and available to anyone. But in order to continue doing so, we need your support. We offer a variety of ways for our readers to support us, from direct donations to special subscriptions and cool merchandise — and every little bit helps. Thank you.

–The Techdirt Team

Filed Under: content moderation, impossible choices, post no evil


Reader Comments



  • Anonymous Coward, 8 Jul 2020 @ 3:52pm

    It could be CSAM!


  • Anonymous Coward, 8 Jul 2020 @ 5:04pm

I mean, when it comes to dropping the hammer on horrible people who should've had the banhammer dropped on them ages ago, it seems like the decision isn't really always tricky; it's just a matter of public pressure and optics versus the platform's desire to profit off of bullshit. Graham Linehan and more recently Stefan Molyneux got their asses tossed off of Twitter. Stefan, alongside other racist turds like David Duke and Richard Spencer, was given the boot from YouTube as well.

    In plenty of instances, you can easily start with the biggest offenders and assholes and then work your way down to the trickier situations, or have a moderation group that takes care of the most flagrant violations while another group handles the trickier stuff, and let the two commingle so that some tricky & tough choices might become easier as they become more informed.


    • Mike Masnick (profile), 8 Jul 2020 @ 10:50pm

      Re:

      In plenty of instances, you can easily start with the biggest offenders and assholes and then work your way down to the trickier situations, or have a moderation group that takes care of the most flagrant violations while another group handles the trickier stuff, and let the two commingle so that some tricky & tough choices might become easier as they become more informed.

      I think plenty of people think this is true, but it is rarely true. The deeper you look, it always gets more complicated. "Biggest offenders." Offenders of what? You seem to think that there's some global standard by which these things are measured, and there's not. And as soon as you ban someone you think is the "biggest offender" a bunch of people will point to someone else, insisting that "that guy over there did much worse." And then you're off and fighting again.

      It ain't at all easy.


      • Ben (profile), 9 Jul 2020 @ 1:06am

        Re: Re:

        And by 'biggest offender', do you mean 'most followers who might be offended', or 'most frequent offensive content?'

        Under the first, many people would say that the Donald is in that category. Do you really want to open that can of worms? (Twitter are playing with the can opener right now!)


        • Uriel-238 (profile), 9 Jul 2020 @ 11:21am

          Twitter and the can

          Twitter's playing with the can of worms, but the biggest problem is not the fair management of public figures who need moderating — it's the threat of reprisal from those public figures.

          In 1996 the internet was regarded as a forum of the people, outside the jurisdiction of nations. And our world leaders have taken to it like the First Coalition to post-revolution France.

          Something tells me they're unprepared for the Grande Armée that will rise if they kick at the net too hard.


      • Anonymous Coward, 9 Jul 2020 @ 9:38am

        Re: Re:

        For the love of Christ, these are high-profile people who the raw majority of reasonable individuals around the world would agree are sacks of shit and don’t deserve a massive platform to spread their bigoted garbage. Graham Linehan is an outspoken transphobic asshole. Stefan Molyneux is a well-known bigot. Richard Spencer is a Nazi. David Duke is motherfucking former KKK and continues to spread white supremacist views.

        When I say “biggest offenders”, I mean people like these who by all measures of human decency should’ve been given the fucking boot from most major platforms years ago. When you say ”offenders of what?” when I’m specifically talking about overt bigots, Nazis, and white supremacists who wear their hate on their sleeves and openly wish that certain groups of people weren’t alive, it just makes me do a double take. Just ban them. If someone points out that there’s someone who did something worse, and you can see in their tweets or YouTube videos that it is worse or roughly equivalent, then you ban that fucker too.

        You seem to want to finagle the discussion of straightforward concepts like "we should ban the Nazis and flagrantly bigoted/fascist assholes" so that it becomes about huge moral quandaries when it’s not. It’s just... fuckin’ not.


        • Celyxise (profile), 9 Jul 2020 @ 10:28am

          Re: Re: Re:

          When I say “biggest offenders”, I mean people like these who by all measures of human decency should’ve been given the fucking boot from most major platforms years ago

          Yeah, well that's just like, your opinion, man.

          You seem to want to try and finagle the discussion about straightforward concepts like "We should ban the Nazis and flagrantly bigoted/fascist assholes" in a way that it becomes about these huge moral quandaries when it’s not. It’s just... fuckin’ not.

          Except that it literally fuckin' is. In the example above, just swap out the breastfeeding mothers for skinheads with swastikas — then wonder about all the bald biker dudes who are trying to take back a symbol of peace. Your opinions on what is offensive are not necessarily shared with everyone else, and your lack of awareness of this shows you haven't put much thought into it.

          The entire point of this post, and of the podcast it is about, is that these things aren't as clear-cut as many people, including you, want them to be.


          • Anonymous Coward, 9 Jul 2020 @ 12:24pm

            Re: Re: Re: Re:

            Breastfeeding mothers and bald biker dudes with swastikas who just so happen to be trying to reclaim the symbol from the Nazis is tricky.

            Banning well-known bigots, white supremacists, and Nazis whose reputations precede them isn't tricky.

            Your opinions on what is offensive are not necessarily shared with everyone else

            My opinion is that if you're looking at outspoken assholes with track records of spewing hate and other bullshit, the decision to give them the boot isn't tricky.

            The people who don't share that opinion, and think that banning the likes of David Duke and Richard Spencer should be a matter of intense debate and discussion amongst a Trust & Safety team, seem to be either speech absolutists who think that all speech deserves to be treated equally and with due process, or people who agree with David Duke and Richard Spencer's wishes for a whiter America and want those wishes spread as far and wide as possible. (If there really is a "Marketplace of Ideas" of the sort that speech absolutists love to harp on about, you'd think they'd agree that some ideas have fallen out of value in that marketplace and become worthless, not even worth considering.) Either way, neither the absolutists' nor the Nazis' opinions are worth considering, because they give an air of legitimacy and equal weight to the kinds of ideologies that would see others killed, locked up in camps, or both.


  • Pixelation, 8 Jul 2020 @ 6:24pm

    There are a lot of people with a black-and-white viewpoint in a world with many shades of grey. Context is decisive, and everyone has their own context. We're not going to make everyone happy. Someone will always complain. "Too much of this and not enough of that!" or "Too much of that and not enough of this!" Bottom line, people need jobs to occupy their minds.


  • Uriel-238 (profile), 8 Jul 2020 @ 6:34pm

    I'd have more respect for Facebook...

    ...if they hid most might-be-offensive things behind a spoiler screen that could be lifted with a click, and only may-actually-be-illegal stuff was blocked.

    Is it called a spoiler screen? I think of spoiler text which is invisible until moused-over, for fiction spoiler information. Some kind of device which allows the user to consent to viewing the questionable content. Heck, as a Facebook non-user I don't even know if Facebook does this.

    Not that may-actually-be-illegal helps much. What's actually illegal in the US varies from county to county. A website like Facebook can afford to track the IP address location of each user and customize user experience accordingly, but smaller sites might find it too expensive and would err on the side of showing too little.
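The jurisdiction-aware blocking described here could be sketched roughly as follows. The IP-prefix table, region names, and per-region rules are all invented for illustration; a real site would use a commercial geolocation database (such as MaxMind's) rather than prefix matching:

```python
# Hypothetical sketch of jurisdiction-aware content gating: resolve the
# viewer's IP to a region, then check which content categories that region
# restricts. All data below is made up for illustration.

GEO_DB = {                # fake IP-prefix -> region mapping
    "203.0.113.": "region_a",
    "198.51.100.": "region_b",
}

RESTRICTED = {            # fake per-region restricted categories
    "region_a": {"nudity", "gambling"},
    "region_b": {"gambling"},
}

def region_for(ip: str) -> str:
    """Crude lookup; a real implementation would use a geolocation DB."""
    for prefix, region in GEO_DB.items():
        if ip.startswith(prefix):
            return region
    return "unknown"

def visible(ip: str, category: str) -> bool:
    """May a viewer at this IP see content in this category?"""
    region = region_for(ip)
    if region == "unknown":
        # The commenter's point: without good geodata, a cautious site
        # errs on the side of showing too little.
        return False
    return category not in RESTRICTED[region]

print(visible("203.0.113.7", "nudity"))    # False
print(visible("198.51.100.9", "nudity"))   # True
```

Even with perfect geodata this is fragile: VPNs, proxies, and mobile carriers all make the IP-to-location mapping unreliable, which is the objection raised in the reply.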


    • Anonymous Coward, 9 Jul 2020 @ 4:59am

      Re: I'd have more respect for Facebook...

      track the IP address location of each user

      And what about VPNs, Tor?


  • Crafty Coyote, 8 Jul 2020 @ 7:08pm

    And in the end, our desire to express ourselves through freedom of speech will ultimately be no match for Disney's copyright lawyers and the Helen "Won't somebody please think of the children?" Lovejoys of the world.

    So, the First Amendment is pie-in-the-sky compared to the way things truly are.


    • nasch (profile), 10 Jul 2020 @ 4:45am

      Re:

      So, the First Amendment is pie-in-the-sky compared to the way things truly are.

      The First Amendment has nothing to say about a private company's moderation decisions.


  • Ben (profile), 9 Jul 2020 @ 1:13am

    The discussion at Facebook didn't even start to consider international differences in culture as to what is offensive or not. In majority-Muslim countries, a woman showing any flesh other than hands and face is offensive to some pretty powerful people, so none of the images posited would be permissible.
    But other countries (Sweden, perhaps) are cool with toplessness in general, so the line between acceptable and unacceptable falls somewhere else entirely.
    And this is before we get to personal standards of acceptability and offence!
    I know Facebook is an American (woohoo!!) company. But its customer base (the advertisers) is international, as is its product (the members' attention). Its standards of moderation cannot be limited to some narrow band of WASP sensibilities (the predominant make-up of Congress, the White House, and the judiciary).


  • Peter (profile), 9 Jul 2020 @ 1:32am

    And the other question is ..

    ... why would anyone see a picture of a breastfeeding woman unless they were specifically looking for it?

    Could it be, by any chance, that the people who complain about pictures of breastfeeding women have been showing a strong interest in the topic, and - willingly or not - suggested to Facebook's algorithms they want to see more of them?

    Could it be, that instead of censoring Facebook, the people carrying a grievance would just need to, say, click on a few pictures of tropical beaches or cats playing guitar to prompt Facebook to drop the breastfeeding for some lighter content?


    • Strawb (profile), 9 Jul 2020 @ 3:51am

      Re: And the other question is ..

      ... why would anyone see a picture of a breastfeeding woman unless they were specifically looking for it?

      Because someone on their friend list either posted or interacted with a post of a breastfeeding woman.


