Once Again, Algorithms Can't Tell The Difference Between 'Bad Stuff' And 'Reporting About Bad Stuff'
from the stop-demanding-more-algorithms dept
We've discussed many times just how silly it is to expect internet platforms to actually do a good job of moderating the content on their own services. Can they do better? Yes, absolutely. Should they put more resources towards it? For the most part, yes. But there seems to be this weird belief among many -- often people who don't like or trust the platforms -- that if only they "nerded harder" they could magically build their way to better content moderation algorithms. And, in many cases, they're demanding such filters be put in place and threatening criminal liability for failing to magically block the "right" content.
This is all silly, because so much of this stuff involves understanding nuance and context. And algorithms still suck at context. For many years, we've pointed to the example of YouTube shutting down the account of a human rights group documenting war crimes in Syria, as part of demands to pull down "terrorist propaganda." You see, "terrorist propaganda" and "documenting war crimes" can look awfully similar. Indeed, it may be exactly the same footage. So how can you teach a computer to recognize which is which?
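To make that concrete, here's a toy sketch in Python -- with entirely made-up terms and thresholds, nothing resembling YouTube's actual systems -- of why a filter keyed to surface features flags the documentation right along with the propaganda:

# Hypothetical keyword-based "terrorist propaganda" filter. Everything
# here is invented for illustration; it is not how any real platform works.
FLAGGED_TERMS = {"execution", "airstrike", "militants", "propaganda"}

def looks_like_propaganda(description: str) -> bool:
    # Flag any video whose description contains two or more flagged terms.
    words = set(description.lower().split())
    return len(words & FLAGGED_TERMS) >= 2

propaganda_clip = "militants celebrate airstrike on town"
war_crimes_report = "human rights group documents militants airstrike on town"

print(looks_like_propaganda(propaganda_clip))    # True
print(looks_like_propaganda(war_crimes_report))  # also True

Both descriptions trip the filter, because the words are the same. Only the intent differs -- and the intent is exactly the thing the filter can't see.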
There have been many similar examples over the years, and here's another good one. The Atlantic is reporting that, for a period of time, YouTube removed a video that The Atlantic had posted of white nationalist Richard Spencer addressing a crowd with "Hail, Trump." You remember the video. It made all the rounds. It doesn't need to be seen again. But... it's still troubling that YouTube removed it, claiming it was "borderline" hate speech.
And, sure, you can understand why a first-pass look at the video might have someone think that. It's someone rallying a bunch of white nationalists and giving a pretty strong wink-and-a-nod towards the Nazis. But it was being done in the context of reporting. And YouTube (whether by algorithm, human, or some combination of both) failed to comprehend that context.
Reporting on "bad stuff" is kind of indistinguishable from just promoting "bad stuff."
And sometimes, reporting on bad stuff and bad people is... kind of important. But if we keep pushing towards a world where platforms are ordered to censor at the drop of a hat whenever anything offensive shows up, we're going to lose out on a lot of important reporting. And, on top of that, we lose out on a lot of people countering that speech -- responding to it, mocking it, and diminishing its power.
So, yes, I can understand the kneejerk reaction that "bad stuff" doesn't belong online. But we should be at least a bit cautious in demanding that it all disappear. Because it's going to remain close to impossible to easily determine the difference between bad stuff and reporting on that bad stuff. And we probably want to keep reporting on bad stuff.
Filed Under: algorithms, content moderation, takedowns
Companies: youtube
Reader Comments
censorship as a recruitment tool
Maybe this is their intent.
A bit off-topic, but...
Alternative platforms need a chance to thrive prior to getting inundated by people that are only there because they had no choice.
If both sides of the political spectrum don't fight against censorship, regardless of who is censored, there will not be a practical alternative available when they censor you.
Re: A bit off-topic, but...
Remember that at that point in time Communism was a murderous cult that was destroying Russia (starving the Ukraine to death) and attempting to take over the world. It looked like the authentic No. 1 evil in the world. Fascism at that point had not really shown its own evil face.
The implication then is that everyone's rights need to be defended, no matter how vile they seem to be - because the people with the power to persecute them always have the potential to become even worse.
Re: Re: A bit off-topic, but...
The actual problem with communism is that it does nothing to address human nature; like any other ostensibly harmless philosophy promising blue skies and rainbows in an anarchist Utopia it creates a power vacuum by demolishing the institutions that uphold our society. Like it or not, we need leaders, and when we see chaos all around and somebody promising to restore order, we'll take whatever's going. And that "somebody" is usually a strongman figure. We've seen this played out throughout history wherever an idealist fantasy didn't work out in practice.
This is why the Far Left and the Far Right have so much in common, including anti-Semitism; they're two sides of the same coin.
It's why I tend to gravitate towards a centrist conservative position: it's entirely reasonable to be afraid of extremists on any side and I prefer order to chaos.
Re:
Would you care to elaborate on any of those accusations or are you just a mud-slinging troll?
Re: Re:
What he bitched about has absolutely nothing to do with the article aside from a tenuous reference to Google.
Strict liability for such actions
It would only take a few million-dollar judgements against the censors to make them think twice about such practices.
FWIW
You build up an enemy -- say, a group of people who think that violence or discrimination based on racial stereotypes is a good thing for the world at large -- and then online platforms take their content down because it's "unacceptable".
Then it's gone. But people from those groups noticed it disappeared, and they're convinced even more so that their side is being unjustly repressed, no matter how ridiculous most people think the group is. It's a political Streisand effect: "Why suppress that group/video/comment/blog/website unless their enemies thought it was a valid threat?"
Let people who hold ridiculous views speak. Let them be judged by their own words, because they have no bite when we *choose* not to listen to them.
I'm not trapped on one web page, communication protocol or social media platform. If there's something I find offensive there, I laugh or cringe and go somewhere else; the Internet's a big place.
I'm not victimized, and the only offensive thing here is someone trying to "protect" me from things like this, as if I'll instantly get online PTSD or something. :P
Re:
Who knew that mean tweets can be as harmful to a person's long-term mental health as bullets and bombs on a battlefield?
http://www.dailymail.co.uk/news/article-2605888/Woman-claims-PTSD-Twitter-cyberstalking-says-bit-war-veterans.html
Re: Re:
The scumbags that own and run toxic tabloid media like the Daily Mail are far bigger contributors to the problem than Twitter etc.
Re: Re: Re:
Since these are what's keeping our government in power (plus the ineptitude on the part of Her Majesty's Opposition) I can't see that happening any time soon.
#2 reason why not: libel laws require the speech to be:
- untrue;
- presented as fact and not opinion;
- either causing actual harm or being "innately harmful."
"Your video was removed because it was determined to be in violation of our community guidelines" does not, in itself, cause actual harm. It's the removal of the video that would cause any problems -- which is not covered by libel.
It's also not untrue -- the video was determined to be in violation of their guidelines -- and the guidelines themselves are an opinion of what YouTube considers to be hate speech.
People in power make plans
Humanity Taken For Granted
On the other hand some people know exactly what they're doing. Their agenda is about controlling what others do. They know that perfection is impossible and overkill is their intent.