Content Moderation At Scale Is Impossible: Some Republican Politicians Are Indistinguishable From Neo-Nazis
from the example-number-380414352908 dept
Over and over and over again we've pointed out that content moderation at scale is impossible to do well -- in part because at such scale, there are bound to be a huge number of errors, even if the percentage of errors is relatively small. We've also pointed out that a lot of the content decisions that moderators face fall into a terrible gray area, where it's not easy to craft scalable rules that can be applied fairly across the board -- in part because context matters and it's impossible to scale the reviewing and understanding of context.
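To make the scale problem concrete, here's a quick back-of-the-envelope sketch (the post volume and error rate are assumed round numbers, not Twitter's real figures): even a process that gets 99% of calls right still produces millions of mistakes a day.

```python
# Back-of-the-envelope illustration only: the post volume and error rate below
# are assumed round numbers, not Twitter's actual figures.
daily_posts = 500_000_000  # assume roughly half a billion posts per day
error_rate = 0.01          # assume moderation gets 99% of calls right

wrong_calls_per_day = daily_posts * error_rate
print(f"{wrong_calls_per_day:,.0f} mistaken moderation decisions per day")  # 5,000,000
```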
Motherboard recently had an excellent article detailing one manifestation of this problem, by noting that trying to apply rules across the board leads to some problematic results:
With every sort of content filter, there is a tradeoff, he explained. When a platform aggressively enforces against ISIS content, for instance, it can also flag innocent accounts as well, such as Arabic language broadcasters. Society, in general, accepts the benefit of banning ISIS for inconveniencing some others, he said.
In separate discussions verified by Motherboard, that employee said Twitter hasn’t taken the same aggressive approach to white supremacist content because the collateral accounts that are impacted can, in some instances, be Republican politicians.
And, as the Motherboard piece notes, "banning politicians wouldn't be accepted by society as a trade-off for flagging all the white supremacist propaganda."
Indeed, as it stands right now, we already have Republican politicians like Ted Cruz screaming about "conservative bias" at a point when it's clear that Twitter is actually bending over backwards not to unfairly go after "conservatives" (as described above). Of course, perhaps it all comes together when you look at the widely shared "study" claiming to "prove" conservative bias in Twitter bans, which really just showed Twitter banning a bunch of white supremacists and trolls. Apparently, at least some conservatives have trouble telling the difference between themselves and white supremacists. Which, you know, maybe is an issue...
Jillian York has a great Twitter thread detailing some of the issues here, basically highlighting how bad algorithms are at context. You should read the whole thing, but a short (slightly edited) excerpt:
Here's the thing: Algorithms, at least right now, cannot identify *how* text is being used. To give you a clear example, the word "dyke" is both an insult and a reclaimed word used by lesbians to describe themselves sometimes. We know that filters regularly censor non-hateful uses of the word "dyke." Instagram does it and in the 1990s Dick Van Dyke's name was often collateral damage of this type of censorship. We've even got a YouTuber claiming that he had to change his damn name because his content was being demonetized...as a result of his last name being Dyke.
The thread goes on to talk about the impossibility of filtering text when language is used in so many different (often creative) ways, and why it's simply unrealistic to think algorithms can filter out the "bad" people. It's the kind of thing you wish more people understood, so everyone should go read York's thread.
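To see why context matters here, consider a minimal, deliberately naive word-list filter (everything in this sketch -- the blocklist, the matching logic, the sample sentences -- is a hypothetical illustration, not how Twitter or Instagram actually works). It flags all four sentences identically, because it has no way to tell a slur from a reclaimed word, a surname, or a levee:

```python
import re

# A deliberately naive word-list filter of the kind York's thread describes.
# The blocklist, the matching logic, and the sample sentences are all
# hypothetical illustrations -- not any platform's actual system.
BLOCKLIST = {"dyke"}

def flag(text: str) -> bool:
    """Return True if any blocklisted word appears anywhere in the text."""
    words = re.findall(r"[a-z]+", text.lower())
    return any(word in BLOCKLIST for word in words)

examples = [
    "get out of here, dyke",                       # hostile use: flagged
    "proud dyke marching at Pride this weekend",   # reclaimed use: also flagged
    "Dick Van Dyke was great in Mary Poppins",     # a surname: also flagged
    "the storm overtopped the dyke by the river",  # ordinary noun: also flagged
]

for text in examples:
    print(flag(text), "-", text)  # prints True for every one of them
```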
But, either way, this continues to demonstrate just how fraught an issue content moderation is. Combine the fact that it's difficult to separate Nazis from some Republican politicians, the fact that the grandstanding Ted Cruzes and Louie Gohmerts of the party would absolutely flip out if an actual GOP politician were banned from Twitter, and the fact that plenty of people are making perfectly reasonable calls for Twitter to kick Nazis off its platform, and Twitter (and other platforms) is left in an impossible position. It's very much a damned if you do/damned if you don't situation. And it's not just that one group or another might be "upset" by either decision -- it's that people are positively livid about either decision and insisting that no one in their right mind would make such a call.
It's not a tightrope that's being walked -- it's a situation that's impossible to handle in a way that doesn't leave someone completely furious.
Filed Under: algorithms, bias, censorship, content moderation, filters, republican politicians, white supremacists
Companies: twitter