Facebook Rejects GRIS Launch Trailer For Being Sexually Suggestive When It Clearly Is Not

from the this-is-stupid dept

It should be well understood at this point that attempts by internet platforms to automagically do away with sexualized content on their sites via algorithms are... imperfect, if we want to be kind. The more accurate description is that these filters are so laughably horrible at actually filtering out objectionable content that they seem farcical. When, for instance, Tumblr can't tell the difference between porn and pictures of Super Mario villains, and Facebook can't tell porn from bronze statues or educational breast cancer images made of stick figures... well, it's easy to see that there's a problem.

Notably, some of the examples above, and many others, are years old. You might have thought that in the intervening years, the most prominent sites would have gotten their shit together. You would be decidedly wrong, as evidenced by Facebook's refusal to allow Devolver Digital, publisher of the forthcoming video game GRIS, to post the game's launch trailer, citing its sexual content.

Did you spot the sexual content? I know you probably think you did. Or, at least, you think you know what confused the filters, and you probably think it had something to do with the close-up on the female character's face.

Well, ha ha, the joke's on all of us, because it was this image, for... reasons?

[Image: the trailer's outline shot of a crumbling statue]
Yes, the outline image of a crumbling sculpture is what set off Facebook's puritanical alarms. Now, Devolver Digital appealed this with Facebook, but, amazingly, that appeal was rejected by Facebook, which argued for some reason that it "doesn't allow nudity." Except, of course, there is no damned nudity in the trailer. In fact, there isn't anything even remotely close to nudity. This is about as clean as it gets.

Let's go to the folks at Devolver Digital for a reaction to the failed appeal.

A Devolver representative tells Kotaku “this is stupid”.

I could try to add something to that, but why bother? Facebook filters: this is stupid.



Filed Under: content moderation, gris
Companies: facebook


Reader Comments

  1. That One Guy (profile), 9 Jan 2019 @ 5:50pm

    'Outline of a statue? Oh you better believe that's porn!'

    If that is something a person considers 'explicit' I can only say that they are one seriously pent-up puritanical pinhead, and need to be removed from any job involving a judgement call on what is and is not 'explicit'.

  2. Anonymous Anonymous Coward (profile), 9 Jan 2019 @ 6:02pm

    Underdeveloped

    The answer is so clear. They need artificial intelligence (or machine learning) to reconfigure their algorithms. Every time the algorithm screws up, some human goes in and tells the AI that it screwed up. Then, over time, the machine will learn what the humans actually think is wrong, which will improve its correct score. That is, until the human is replaced with someone secular, who is then replaced by someone fundamentalist, who is then replaced by someone with severe sexual phobias, and then a Neanderthal, and then a rhesus monkey.

    Look, it will work for music and videos and political commentary as well, whatever you want. Just give it enough data and it will sort out all we need to have sorted out. Just ask it.

    /s

  3. Gary (profile), 9 Jan 2019 @ 7:32pm

    Underage

    Facebook realized this game hadn't been out for 18 years yet, so therefore the character was a minor and shouldn't be in an adult video!!

  4. ryuugami, 9 Jan 2019 @ 7:52pm

    Hmm. Looking at that picture... maybe Facebook's filters have tentacle-porn-related PTSD, so they're overreacting. The algorithmic equivalent of the "I've seen enough hentai to know where this is going" meme. Has anyone tested how many octopus and squid pictures get blocked?

  5. ryuugami, 9 Jan 2019 @ 7:54pm

    Re: Underdeveloped

    That is, until the human is replaced with someone secular, who is then replaced by someone fundamentalist, who is then replaced by someone with severe sexual phobias, and then a Neanderthal, and then a rhesus monkey.

    Or maybe this has already happened, and we've already reached the "rhesus monkey" stage.


  7. Phoenix84 (profile), 9 Jan 2019 @ 8:06pm

    Yeah, I can see it.
    If you look at the image very small (like in the thumbnail of the RSS feed), it looks like a woman leaning forward while sitting on a toilet.

  8. Vic, 9 Jan 2019 @ 9:14pm

    Facebook = porn!

    Did you notice that Facebook's presence is usually symbolized by the letter F? I have a very good idea what it stands for... Does anybody else?

  9. Rocky, 9 Jan 2019 @ 9:14pm

    Re:

    > If you look at the image very small (like in the thumbnail of the RSS feed), it looks like a woman leaning forward while sitting on a toilet.

    You mean like almost any picture of Rodin's The Thinker taken from the right angle looks like a guy taking a dump? Double standards again...

    In general, FB is run by a bunch of hypocritical assholes and it shouldn't be a surprise to anyone that they can't get their shit together as long as they can make a buck by exploiting their users' info.

  10. Anonymous Coward, 9 Jan 2019 @ 9:38pm

    If you are still using Facebook you haven't a clue, unless you are a corporate shill, that is. Twitter and Facebook are blocked on our machines.

  11. Lawrence D’Oliveiro, 9 Jan 2019 @ 9:45pm

    Computer Says No

    Foolish mortals, haven’t you realized yet that resistance against your cyber-overlords is futile?

    (I can remember stories about the consequences of wrong decisions made by automated systems going back over 40 years ... back when it was considered science fiction.)

  12. Christenson, 9 Jan 2019 @ 10:24pm

    Wanna see how *impossible* it is to make good filters???

    OK, let's set up an image:

    (*) (*)

    Oooh, you dirty-minded prude, that is a pair of nipples!
    Whaddya mean those are footnote markers??

    Ya can't have a computer saying something *is* or *is not* explicit without *context*.. and that is before we get into discussing whether that explicitness *is* or *is not* appropriate!

    It just isn't gonna happen (looking at you, instagram) without more *human* involvement.

  13. PaulT (profile), 10 Jan 2019 @ 12:11am

    Re: Underdeveloped

    "They need artificial intelligence (or machine learning) to reconfigure their algorithms."

    I see your sarcasm, but I'll bet the reason this came up is that they're doing exactly that, and some people have been trying to game the system by altering the colour scheme of the nudes they post to try to get through.

  14. PaulT (profile), 10 Jan 2019 @ 12:18am

    "Facebook, which argued for some reason that it "doesn't allow nudity."

    "Facebook filters: this is stupid."

    Erm, Tim, you seem to think that a human being was involved at some point in the communication here. This seems to be the kind of thing that's automated until the second or third complaint. The first time a human being was made aware was probably when Kotaku picked up the story.

    On the flip side, this is a reverse Streisand effect situation. I wasn't aware of this game, now I am and I think it looks pretty cool. When it makes it to a platform I own, I'll definitely be interested in checking it out.

  15. hij (profile), 10 Jan 2019 @ 3:06am

    Training matters

    This is what happens when your programmers are all fixated on hentai.

  16. Anonymous Coward, 10 Jan 2019 @ 3:25am

    I guess Facebook's AI has become so intelligent that it reacts like any human who was forced to watch tons of porn: it starts seeing naked bodies everywhere.

  17. Capt ICE Enforcer, 10 Jan 2019 @ 4:07am

    Seriously?

    You seriously can't see it? After watching this video I have no f'ing idea what is going on. Which is f'ing my brain, and we all know that f'ing is sometimes sexual. I feel violated...

  18. Gary (profile), 10 Jan 2019 @ 6:21am

    Re: Re:

    Getting everything right isn't really FB's job in life. Serving as many posts as possible and as many ads as possible is all they need to do.
    Doing a perfect job of moderating content at that scale is impossible.

  19. Rocky, 10 Jan 2019 @ 6:40am

    Re: Re: Re:

    I'm not expecting them to do a perfect job; what I'm expecting them to do is have consistent standards. When they repeatedly use double standards, I think they have earned the right to be called hypocrites.

  20. PaulT (profile), 10 Jan 2019 @ 7:20am

    Re: Re: Re: Re:

    Are they using double standards?

    Seems to me more like they're automating something that cannot be done by human beings due to the sheer volume of content, and getting occasionally tripped up by something that's totally subjective.

    Do you have a better solution for them, accepting the fact that we still do not have AI technology at the level where something that's "obvious" to a human being is also obvious to a software algorithm?

  21. Anonymous Coward, 10 Jan 2019 @ 7:33am

    Re: 'Outline of a statue? Oh you better believe that's porn!'

    They didn't say "explicit," they said "suggestive," which is another thing entirely—and inherently subjective, and I can kind of barely see that. I'm not nearly as certain as Timothy, either, that the person depicted is wearing clothes (I don't see any, so it could certainly suggest nudity).

  22. Anonymous Coward, 10 Jan 2019 @ 7:49am

    Re: Re: 'Outline of a statue? Oh you better believe that's porn!'

    I'm not nearly as certain as Timothy, either, that the person depicted is wearing clothes (I don't see any, so it could certainly suggest nudity).

    The same way that characters in XKCD cartoons are naked, eh?

  23. Anonymous Coward, 10 Jan 2019 @ 8:08am

    Re: Re: Underdeveloped

    (agreement): With our current orangutan-in-chief, it's likely that the rhesus is already in a government agency.

  24. Michael, 10 Jan 2019 @ 8:11am

    Re: Underdeveloped

    "Every time the algorithm screws up, some human goes in and tells the AI that it screwed up."

    Except this time, and every other time just like it (I'm certain there are millions). This time, the appeal resulted in the human being just as useless as the AI at determining there were no naked people in the images.

  25. PaulT (profile), 10 Jan 2019 @ 8:20am

    Re: Re: Underdeveloped

    "(I'm certain there are millions)"

    Of course, there are millions of posts to Facebook every minute so even a tiny rounding error worth of false positives will reach millions in a short amount of time. That's the reason AI is being used in the first place, since it would be literally impossible to have humans do the work.

    "This time, the appeal resulted in the human also being as useless as the AI"

    Has it been confirmed that a human being has been involved at any point? The original appeal could also have been automated; sometimes these things don't reach a person until at least the second appeal.

    You're not wrong if a human being did look and send the original rejection, but I'm not convinced that was the case.

  26. Anonymous Coward, 10 Jan 2019 @ 9:52am

    Re: Re: Re: 'Outline of a statue? Oh you better believe that's porn!'

    Technically naked, the best kind of naked.

  27. Anonymous Coward, 10 Jan 2019 @ 9:54am

    Re: Facebook = porn!

    Are you implying that "the F word" isn't "Facebook"?

  28. Rocky, 10 Jan 2019 @ 10:07am

    Re: Re: Re: Re: Re:

    I'm not talking specifically about content moderation but about FB in general.

    In this specific instance they blocked something that IS art, although in a form that the mainstream art establishment probably wouldn't consider to be art. If the same clip had been created by a famous artist I doubt FB would have blocked it, and they would definitely have unblocked it on appeal, i.e. double standards.

    When the terror attack took place in Paris a while back, FB added an option for all users to overlay the French flag on their profile picture, yet no such option appears when terrorists blow up something in the Middle East, i.e. double standards.

    They say their users' privacy is important while at the same time allowing almost anyone access to users' private information without any oversight, i.e. double standards.

    Users of religion A write critical posts about religion B and get blocked, but users of religion B who write critical posts about religion A don't get blocked, i.e. double standards.

    There are numerous instances where FB arbitrarily blocks users for posts where others go scot-free.

    Appealing a moderation decision is a crap shoot; whether it's turned down or not depends on which moderator looks at your appeal.

  29. Anonymous Coward, 10 Jan 2019 @ 10:08am

    I guess this means those disgustingly suggestive Disney Lion King clips will not be impacted at all.

  30. ECA (profile), 10 Jan 2019 @ 12:56pm

    dear face book.

    I've watched the trailers, I've watched gameplay..
    For anyone getting a woody from any of this, I would suggest they STOP watching anime so much.. They might lose a part of their body from playing with it so much..
