Techdirt's think tank, the Copia Institute, is working with the Trust & Safety Professional Association and its sister organization, the Trust & Safety Foundation, to produce an ongoing series of case studies about content moderation decisions. These case studies are presented in a neutral fashion, not aiming to criticize or applaud any particular decision, but to highlight the many different challenges that content moderators face and the tradeoffs they result in. Find more case studies here on Techdirt and on the TSF website.

Content Moderation Case Study: Facebook's AI Continues To Struggle With Identifying Nudity (2020)

from the ai-is-not-the-answer dept

Summary: Since its inception, Facebook has attempted to be more "family-friendly" than other social media services. Its hardline stance on nudity, however, has often proved problematic, as its AI (and its human moderators) have flagged accounts for harmless images and/or failed to consider context when removing images or locking accounts.

The latest example of Facebook's AI failing to properly moderate nudity involves garden vegetables. A seed business in Newfoundland, Canada, was notified that its image of onions had been removed for violating the terms of service. The picture apparently set off the auto-moderation system, which flagged the image for containing "products with overtly sexual positioning." A follow-up message noted that the picture of a handful of onions in a wicker basket was "sexually suggestive."

Facebook's nudity policy has been applied inconsistently since its inception. Male breasts are treated differently than female breasts, resulting in some questionable decisions by the platform. The policy has also caused problems for definitively non-sexual content, such as photos posted by breastfeeding groups and breast cancer awareness videos. In this case, the round shapes and flesh tones of the onions appear to have tricked the AI into treating garden vegetables as overtly sexual content, showing the AI still has a lot to learn about human anatomy and sexual positioning.
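To make that failure mode concrete, here is a minimal, purely hypothetical sketch of a threshold-only image check. The names and cutoff (classify_nudity, NUDITY_THRESHOLD) are invented for illustration and do not describe Facebook's actual system; the point is that a classifier acting on a single confidence score has no notion of context, so skin-toned, rounded produce can land on the wrong side of the cutoff.

```python
# Hypothetical sketch only -- not Facebook's pipeline. Names and thresholds
# are invented for illustration.
from dataclasses import dataclass


@dataclass
class ModerationResult:
    action: str   # "allow" or "remove"
    reason: str
    score: float


NUDITY_THRESHOLD = 0.80  # assumed cutoff; real systems tune this per policy


def classify_nudity(image_bytes: bytes) -> float:
    """Stand-in for an image model returning P(nudity) for one image.

    A model trained largely on skin tones and rounded shapes can score a
    basket of onions highly, because it never sees the caption, the
    advertiser, or the product category.
    """
    raise NotImplementedError("placeholder for a real classifier")


def moderate(image_bytes: bytes) -> ModerationResult:
    score = classify_nudity(image_bytes)
    if score >= NUDITY_THRESHOLD:
        # Removal is triggered by the score alone: no human review and no
        # context signals are available to catch a false positive.
        return ModerationResult("remove",
                                "products with overtly sexual positioning",
                                score)
    return ModerationResult("allow", "no violation detected", score)
```
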

Decisions to be made by Facebook:

  • Should more automated nudity/sexual content decisions be backstopped by human moderators? (A rough sketch of such a backstop follows this list.)
  • Is the possibility of over-blocking worth the reduction in labor costs?
  • Is over-blocking preferable to under-blocking when it comes to moderating content?
  • Is Facebook large enough to comfortably absorb any damage to its reputation or user goodwill when its moderation decisions affect content that doesn't actually violate its policies?
  • Is it even possible for a platform of Facebook's size to accurately moderate content and/or provide better options for challenging content removals?
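The first question in the list above is essentially a routing question: act automatically only when the classifier is very confident, and send borderline scores to a person. A minimal sketch, with thresholds and labels that are assumptions for illustration rather than anything Facebook has published:

```python
# Hypothetical routing of classifier scores; thresholds are invented.
AUTO_REMOVE_THRESHOLD = 0.95   # assumed: only very confident calls act alone
HUMAN_REVIEW_THRESHOLD = 0.60  # assumed: borderline scores go to a person


def route(score: float) -> str:
    """Decide what happens to a post given a nudity-classifier score."""
    if score >= AUTO_REMOVE_THRESHOLD:
        return "auto_remove"         # high confidence: act immediately
    if score >= HUMAN_REVIEW_THRESHOLD:
        return "human_review_queue"  # borderline: a moderator decides
    return "allow"                   # low score: leave the post up
```

Widening the human-review band would catch false positives like the onion ad, but at the cost of the labor and latency raised in the other bullets.
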
Questions and policy implications to consider:
  • Is the handling of nudity in accordance with the United States' historically more puritanical views really the best way to moderate content submitted by users all over the world?
  • Would it be more useful to users if content were hidden -- but not deleted -- when it appears to violate Facebook's terms of service, allowing posters and readers to access the content if they choose to after being notified of its potential violation? (A sketch of that approach follows this list.)
  • Would a more transparent appeals process allow for quicker reversals of incorrect moderation decisions?
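The "hidden but not deleted" option in the second question above could look something like an interstitial: the post stays up, but flagged content is covered with a notice until the viewer chooses to click through. A rough sketch, under the same caveat that everything here is an invented illustration rather than an actual Facebook feature:

```python
# Hypothetical "cover, don't delete" rendering; not an actual Facebook feature.
def render(image_url: str, flagged: bool, flag_reason: str,
           viewer_clicked_through: bool) -> str:
    """Return what a viewer sees for one image post."""
    if flagged and not viewer_clicked_through:
        # The content still exists; the poster and readers are told why it is
        # covered and can choose to reveal it.
        return f"[Hidden: may violate policy ({flag_reason}). Tap to view.]"
    return image_url
```
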
Resolution: The seed company's ad was reinstated shortly after Facebook moderators were informed of the mistake. A statement from Facebook raised at least one more question, as its spokesperson did not clarify exactly what the AI thought the onions actually were, leaving users to speculate about what the spokesperson meant and about how the AI would react to future posts it mistook for, "well, you know."

"We use automated technology to keep nudity off our apps," wrote Meg Sinclair, Facebook Canada's head of communications. "But sometimes it doesn't know a walla walla onion from a, well, you know. We restored the ad and are sorry for the business' trouble."

Originally posted at the Trust & Safety Foundation website.


Filed Under: ai, content moderation, nudity
Companies: facebook


Reader Comments



  1. Samuel Abram (profile), 11 Dec 2020 @ 3:29pm

    I don't know…

    I will say that those onions do turn me on…

    *drools like Homer Simpson*


  2. GHB (profile), 11 Dec 2020 @ 3:55pm

    It is as hard as tumblr

    I think the AI thought the onions were breasts. The lighting and shadows caused a false positive on the AI.

    Techdirt, TheMysterousMrEnter's Technocracy episode on Tumblr may agree with you on this one. Policing the internet at scale is downright impossible in general, even for a big tech company.


  3. Anonymous Coward, 11 Dec 2020 @ 4:17pm

    products with overtly sexual positioning.

    Well, there is a lot of skin-to-skin contact in the photo.


  4. Anonymous Coward, 11 Dec 2020 @ 4:19pm

    Those luscious, juicy mounds of flesh look so sweet, it brings a tear to my eye.


  5. Samuel Abram (profile), 11 Dec 2020 @ 4:26pm

    Re: It is as hard as tumblr

    Thanks for that. That youtube vid is pure goodness that would feel right at home at TechDirt.


  6. Anonymous Coward, 11 Dec 2020 @ 5:21pm

    Re:

    There are a lot of layers to that joke.


  7. Crafty Coyote, 11 Dec 2020 @ 7:18pm

    Nudity filters- at a time when men in European style swimsuits are seen as too revealing, I sure am glad Facebook is keeping us from seeing advertisements of Canadian onions. Those flesh-colored breast-shaped vegetables from the North are ruining America's youth.


  8. Anonymous Coward, 12 Dec 2020 @ 5:48am

    Thank god somebody is finally doing something about all this nudity everywhere; why, there are literally no other more pressing issues at this time. Good to see that our priorities are in order.


  9. JoeCool (profile), 12 Dec 2020 @ 8:54am

    Re:

    Well, since they cured COVID-19, ended world hunger, eliminated war, reversed global warming, and stamped out racism and sexism, what else was there to do?
    ;)


  10. Canuck, 12 Dec 2020 @ 2:16pm

    Re:

    Nice false dilemma fallacy you've got there.


  11. Scary Devil Monastery (profile), 14 Dec 2020 @ 12:35am

    Whether to laugh or cry...

    I'm normally of the opinion that real stupidity beats artificial "intelligence" any day...
    ...but as the OP demonstrates, we're getting there.


  12. John85851 (profile), 22 Dec 2020 @ 2:08pm

    I'm an adult, I should see nudity if I want

    Here's an idea: how about if Facebook treats people like adults and has a "nudity" checkbox when they sign in: check yes if you don't mind seeing nude images, check no to not see them. Then only show nude images to people who checked the box.
    Then there's no need for AI or automated moderation: if someone reports a nude image and they checked "yes", then Facebook rejects the report because the user opted in.

    Just imagine the kinds of groups that could form if they allowed nudity! And more groups mean more users on the site, which means more user engagement, which means higher ad rates, and so on.
    Heck, Facebook could even mine people's data just by seeing which groups with nudity they join (which they probably do already).
    And continuing with this argument, how much money is Facebook leaving on the table by not allowing nudity and adult groups?


  13. DonutAtwork.com (profile), 24 Jan 2021 @ 9:02am

    I second this

    FB should keep doing its best on this to make it a pleasant social network for users of any age. Right now, it just leaves it to the power of users to report content and then takes action accordingly. The current AI's OCR is actually very capable and accurate enough. It really is up to them to do so.

    On top of that, FB should also be more stringent about allowing "incredible" businesses to take out FB Ads. It's crazy to see so many "scam" ads on FB looking to pick up their victims on the world's largest network. Perhaps while we wait for FB's solutions, FB users here, please report whenever you see nudity or scam-looking ads on FB. Thank you!


