Content Moderation At Scale Is Impossible To Do Well: Series About Antisemitism Removed By Instagram For Being Antisemitic
from the hate-vs.-reporting-on-hate dept
I've written a lot about the impossibility of doing content moderation well at scale, and there are lots of reasons for that. But one of the most common is the difficulty both AI and human beings have in distinguishing hateful/trollish/harassing behavior from reporting on that behavior. We've pointed this out over and over again in a variety of contexts. One classic example is social media websites pulling down posts by human rights activists highlighting war crimes, claiming they're "terrorist content." Another is the many examples of people on social media who talked about racism, and about being victims of racist attacks, only to have their accounts and posts shut down over claims of racism.
And now we have another similar example. A new video series about antisemitism posted its trailer to Instagram... where it was removed for violating community guidelines.
Thank you Instagram for proving the point @Yair_Rosenberg and we at @JewishUnpacked are trying to make. Yes, #antisemitism is bad, BUT educating people about it isn't. You taking our video down shows that this work is more important than ever. https://t.co/PVCItAm1pc pic.twitter.com/dBg3izUGfo
— Johnny Kunza (@johnkunza) August 4, 2021
You can see the video on YouTube, and it's not difficult to figure out how this happened. The message from Instagram says the video violates the platform's community guidelines against "violence or dangerous organizations." The video in question, all about antisemitism, does include some Nazi imagery, obviously to make the point that in its extreme form, antisemitism can lead to the murder of Jews. But Instagram has banned all Nazi content, in part because of those who complained about antisemitism on Instagram.
And that leads to a dilemma. If you're banning Nazi content, you have to realize that content about Nazis (criticizing them, or warning about what they might do) may get banned as well. And, again, this isn't new. Earlier this year we had a case study on how YouTube's similar ban took down historical and educational videos about the Holocaust.
The point here is that there is no easy answer. You can say that it should be obvious to anyone reviewing it that this trailer (highlighting how bad antisemitism is) is different from actual antisemitism, but that's a lot harder in practice at massive scale. First, you need people who actually understand the difference, and you have to be able to write rules, simple enough to go out to thousands of moderators, that explicitly make that difference clear. You also need to give reviewers enough time to actually understand the context, which is effectively impossible given the volume of content that needs to be reviewed. In such situations, the "simpler" versions of the rules are often what get written: "No Nazi content." That's clear and scalable, but it leads to these kinds of "mistakes."
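To make the "simpler rules" problem concrete, here's a minimal sketch in Python, assuming nothing more than a keyword blocklist (an illustration of the failure mode, not a claim about how Instagram's actual tooling works; the blocklist, function name, and messages are all hypothetical). A rule like "No Nazi content," applied without context, flags reporting on hate just as readily as hate itself:

```python
# Toy sketch of a context-blind moderation rule. Everything here is
# hypothetical; real moderation systems are far more complex.

BLOCKED_TERMS = {"nazi", "swastika", "third reich"}

def naive_moderate(post_text: str) -> str:
    """Apply the blunt rule 'No Nazi content': flag any post that
    mentions a blocked term, with no notion of intent or context."""
    text = post_text.lower()
    if any(term in text for term in BLOCKED_TERMS):
        return "REMOVED: violence or dangerous organizations"
    return "OK"

# Actual hateful content gets caught...
print(naive_moderate("Join our Nazi rally this weekend"))
# ...but so does education *about* hate, because the rule can't see intent:
print(naive_moderate("Our new series explains how Nazi propaganda led to genocide"))
# Both print: REMOVED: violence or dangerous organizations
```

Real moderation pipelines layer classifiers and human review on top of rules like this, but as the article argues, the context problem the sketch illustrates doesn't go away at scale; it just gets more expensive.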
Filed Under: antisemitism, community standards, content moderation, hate speech, nazi content, scale
Companies: facebook, instagram
Reader Comments
Dumb
Don't post important things on social media platforms.
I'm surprised that the hate groups haven't gone to a "nudie cutie" solution to getting their hate message out by masking their content in a "documentary" style but focusing mostly on the targets of their hate. Maybe that's too subtle for their fans.
Rules Can Be Illegitimate
Of course, when the topic is something we can all agree on, such as fighting against antisemitism, the old mantra of "their platform, their rules" goes out the window.
-Getting censored proves that your opinion is the strongest.
Re: Rules Can Be Illegitimate
pile of horse shit.
Back from cheering on ISIS I see
-Repeatedly lying about being 'censored' because people keep showing you the door of their private property proves that you're not just a person no-one wants to be around but a dishonest one who refuses to own their own words and deeds and instead blames others.
Re: Rules Can Be Illegitimate
Did you even read the article? Please point to the sentence and/or paragraph where Mike is complaining about their right to moderate.
I'll wait.
It wasn't about "their platform, their rules". It was about how content moderation at scale is impossible to get 100% correct.
If you would get your head out of Trump's ass, maybe you would understand the difference, which is obvious to anybody who read the article and has a modicum of common sense.
How does it feel to be a supporter of ISIS, Nazis, racists, homophobes, bigots, Alex Jones, Nick Fuentes and every other asshole who has been moderated from social media? Doesn't your statement mean that they have the strongest opinions, and that by making it, you are supporting their opinions?
Re: Rules Can Be Illegitimate
Of course, when the topic is something we can all agree on, such as fighting against antisemitism, the old mantra of "their platform, their rules" goes out the window.
Where did I say that? It is still their platform and their rules. I have remained entirely consistent, unlike you.
Koby, I used to think you were just a confused, ignorant dupe. But now that you're actively making shit up, I can only conclude that you're a troll.
-Getting censored proves that your opinion is the strongest.
So... by this argument, supporters of Nazis have the strongest opinions?
You should try to find examples that they don't support
How does it feel to be a supporter of ISIS, Nazis, racists, homophobes, bigots, Alex Jones, Nick Fuentes and every other asshole who has been moderated from social media?
What makes you think being on their side would be a problem for Koby? He may not be honest enough to own his own position, but he's sure as hell not complaining about fiscal conservatives being shown the door when he whines about how social media is silencing people.
That is the dilemma. Nobody has the resources to effectively do context-sensitive moderation, so they're left with the choice to either categorically forbid all discussion of a sensitive topic, or not.
I disagree with Instagram's choice.
Re: Re: Rules Can Be Illegitimate
It means platforms' moderation decisions are the strongest opinions, since the only censorship going on in the real world is Republicans' constant attempts to silence constitutionally protected free speech.
Re: Rules Can Be Illegitimate
“Getting censored proves that your opinion is the strongest.”
So why do you like kiddie porn?
And let me make the situation MORE impossible...
So Instagram will probably restore the decent anti-antisemitism video, due to the public outcry.
For the sake of argument, let me construct a little series of partial embeds of all the parts of the video showing bad things, stripping the context, and adding my own: "this is good!". I'm sure we can all name a trumplestiltskin website where they'd love to host that....
Now, is that content on Instagram good or bad??? Should it be left up, given the definitely transformative fair use of turning it to a pro-Nazi purpose???
Especially given that the exact same transformation can be accomplished by taking the whole video, running it through a video editor, and posting the result, and some Nazi probably already did that.
Bad.
No.
Context is the key to this situation. You should know that.
Re:
Context is always the key to a situation.
Re: Rules Can Be Illegitimate
"the old mantra of "their platform, their rules" goes out the window"
Weirdly, that's nothing like what the article says. If it did, it would be an argument for Instagram to be forced to host content against its will just because you agree with what's being said, regardless of their rights to control their own property or exercise their own freedom of speech and association - which is an argument only idiots like you are making. It's never been made here by anyone with any credibility.
"-Getting censored proves that your opinion is the strongest."
No, it doesn't. It might mean that you're a loud obnoxious asshole who needs to be quietened in order for everyone else in the room to continue talking, but you can be an asshole and wrong at the same time. Loud != correct, no matter how many times Ben Shapiro tells you it is.
Faulty premise
The problem with this whole argument is it's all built upon what I think is likely a flawed premise that this removal was some kind of unintentional collateral damage. Given Face-tagram's history, I find that implausible.
If you consider the possibility that the moderation guidelines aren't meant to promote "good" content or police hate speech or anything like that, but are instead designed primarily to reduce controversy, then this makes a lot more sense. They don't want to kick out or drive away the Nazis, that's bad for business! They also don't want to be investigated and questioned by cops and government officials. They want a bland, sanitized, advertiser-friendly network. They aren't doing this to push social progress, they're doing it so they can be the place where Nazis and Antifa alike can chat with grandma and wish their college roommate a happy birthday and never encounter anything that might make them too uncomfortable.
Perhaps, with sufficient public outrage, they'll restore this video. Perhaps they'll decide the controversy of removing it is worse than the controversy of leaving it up. That still won't make them good people though...
Re: Rules Can Be Illegitimate
-Getting censored proves that your opinion is the strongest.
Hey Koby, why do you support Isis and Nazis?
Re: Re: Rules Can Be Illegitimate
"Getting censored proves that your opinion is the strongest."
Your assertion that the venom-spewing fuckwit standing at the head of a "Kill all the Jews" rally has more credibility than a history professor or humanitarian speaker is duly noted, Koby.
If I were you I'd find a more appropriate quote to try to back your bullshit with.
Re: Re: Rules Can Be Illegitimate
"Hey Koby, why do you support Isis and Nazis?"
Because both are overrepresented on those platforms he calls his own. Parler, Gab and GETTR are the homes of the last "true" Americans - Confederates, Russians, Nazis and the 9/11 guys.
Re: Faulty premise
The problem with this whole argument is it's all built upon what I think is likely a flawed premise that this removal was some kind of unintentional collateral damage.
Not really. The article concludes that this was most likely intentional collateral damage.
See:
In such situations the "simpler" versions of the rules often are what get written: "No Nazi content." That's clear and scalable, but leads to these kinds of "mistakes."
Re: Dumb
Where should they have posted it instead that it could have gotten a large audience?
Re: Faulty premise
Platforms like Facebook thrive on controversy. Controversy drives engagement, which drives revenue. Reducing controversy is the last thing they would ever try to do.
Re: Re: Faulty premise
For users, yes. For advertisers, no. And we all know who pays the bills at Facebook.
Re: Re: Re: Faulty premise
Advertisers want posts that get a lot of attention. That's largely controversial posts. They don't want to get any of the controversy on themselves of course, so they and FB want to suppress anything that makes them look bad. But their moderation is clearly not designed to prune controversial content.
Re: Re: Faulty premise
Probably the ultimate example of that is the fact that it took a bloody insurrection and failure to win an election for social media to kick Trump out.
self-consistency
Re:
"Context is the key to this situation. You should know that."
US Republicans don't believe in context. That's the only way we can interpret the way Koby and his less eloquent ilk keep trying to compare the government sending someone to jail for speaking up with a private entity tossing some asshole out of their own property.