YouTube And Demonetization: The Hammer And Nail Of Content Moderation
from the not-so-easy dept
Last week we had a story about a bunch of Pokemon YouTubers discovering that their accounts had been dropped after YouTube confused their discussion of "CP" (Combat Points) with a very different "CP": child porn. The accounts were reinstated after a bit of an outcry.
It appears that this was part of a larger effort by YouTube to deal with a variety of problems on its platform (and, yes, its platform has problems). But some of the resulting stories suggest that YouTube is so focused on "demonetization" as a tool that it's losing sight of alternatives. The Pokemon takedowns, specifically, appear to have been part of a response to claims that child predators were using the comments on certain, perfectly normal videos of kids to, well, do bad stuff. The whole thing is pretty sickening and horrifying, and I have no interest in going into the details.
As the controversy over this -- quite reasonably -- gained attention, some pointed out that these videos with exploitative comments were, in many cases, being monetized, with big brand name ads appearing next to them. This type of complaint is... not new. People have been complaining about brand name ads appearing next to "bad" content or "infringing" content for many years. Of course, it's pretty much all matched by algorithm, and part of the problem is that, because people were gaming the system, the algorithm (and YouTube) hadn't quite caught on to what was happening. The public outcry -- especially about the monetization -- then resulted in advertisers pulling all their ads from YouTube. And, whether it was the advertisers leaving or the general public outcry (it was almost certainly both, though I'm sure most people will assume it was the advertisers bailing that really made the difference), YouTube launched a big new effort to pull ads from lots of videos.
And in doing so, it created a brand new controversy. Just as this started, a mother named Jessica Ballinger complained on Twitter that YouTube had demonetized videos of her 5-year-old son. YouTube responded on Twitter, noting that the demonetization was because of the comments on the video, obliquely referencing the stories discussed above.
Of course, this immediately created a new kind of backlash as people (rightfully!) pointed out that disabling monetization of a video based on the comments on that video just seems to empower and encourage trolls. Want to harm a YouTuber you don't like? Just post a sketchy comment on their videos and, boom, you can take away their money.
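To make the perverse incentive concrete, here's a deliberately naive sketch (in Python; every name here is invented for illustration, and none of this is YouTube's actual code or API) of what a comment-triggered demonetization rule boils down to:

```python
# Hypothetical illustration of a comment-triggered demonetization rule.
# None of these names correspond to a real YouTube API; this sketches
# the *logic* of the policy, not its implementation.

def looks_predatory(comment: str) -> bool:
    """Stand-in for whatever classifier flags a comment as predatory."""
    return "sketchy" in comment.lower()  # placeholder heuristic

def should_demonetize(video_comments: list[str]) -> bool:
    # The rule under criticism: one flagged comment anywhere in the
    # thread is enough to pull ads from the entire video.
    return any(looks_predatory(c) for c in video_comments)

# The uploader's revenue now hinges on input *anyone* can supply:
comments = ["great video!", "love this", "something sketchy here"]
print(should_demonetize(comments))  # True -- a single troll flips the switch
```

The flaw isn't in any one line; it's that the rule attaches a penalty against the uploader to behavior the uploader doesn't control.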
And, to be clear, this is not a new thing for Google. Just last month we noted that the company has a similarly silly policy with AdSense and blogs that have comments. If AdSense decides that some of your user-generated comments are "bad," it might demonetize the page that hosts those comments. As with the stories above, this is mostly to appease advertisers and avoid the sort of (slightly misguided) screaming about "big brand ads appearing next to awful comments." We found that policy to be silly in that situation, but it's even more ridiculous here.
As Dan Bull noted in response to all of this, a much simpler solution than demonetizing such videos would be to just remove the sketchy, awful, predatory comments:
I got a response from YouTube! Clarifying their demonetising of videos based on the comments.
What I don't understand is why they wouldn't just disable comments if the issue is predatory comments. Instead they leave the comments section up and disable ads. https://t.co/EfVvWXibS7
— Dan Bull 🍐 (@itsDanBull) February 22, 2019
That wouldn't work in the AdSense context, where the comments live on sites outside Google's control, but YouTube is Google's own damn platform. Of course, the reality is probably that YouTube freaked out following the initial reports about the predatory comments and did what it could do most quickly -- demonetize any video that might possibly have such content -- before doing a more thorough sweep of the awful comments.
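If that's what happened, the pattern is the familiar one of a fast, coarse action followed by a slower, more precise one. Here's a hedged sketch of the alternative Dan Bull suggests -- act on the comments, not the ads -- written as hypothetical Python (the Video type and every function name are made up for illustration; this is a model of the policy choice, not YouTube's system):

```python
# Hypothetical sketch of "moderate the comments, not the monetization."
# Nothing here is a real YouTube API; it models the policy trade-off.

from dataclasses import dataclass, field

def looks_predatory(comment: str) -> bool:
    """Stand-in for whatever classifier flags a comment."""
    return "sketchy" in comment.lower()  # placeholder heuristic

@dataclass
class Video:
    comments: list[str] = field(default_factory=list)
    comments_enabled: bool = True
    monetized: bool = True

def moderate(video: Video) -> None:
    # Fast, coarse step: close the comment section immediately, which
    # stops the harm without touching the uploader's revenue.
    video.comments_enabled = False
    # Slow, precise step: sweep out the flagged comments, then reopen.
    video.comments = [c for c in video.comments if not looks_predatory(c)]
    video.comments_enabled = True

video = Video(comments=["great video!", "something sketchy here"])
moderate(video)
print(video.comments, video.monetized)  # ['great video!'] True -- ads untouched
```

The design point being illustrated: both approaches share the same cheap "act first" step, but this one aims the penalty at the commenters rather than at the uploader.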
Still, this once again highlights the difficulty, perils, and trade-offs associated with content moderation on a massive platform. Every choice has trade-offs and creates different kinds of problems. And between the speed with which people freak out about stuff and the scale at which all of this is happening, you get situations like this one: people (rightly!) freak out about the awful things some people are doing in the YouTube comments (with the associated freak-out about how the videos are monetized), leading YouTube to rush to demonetize... leading to a freak-out from others whose totally innocent videos were demonetized even though they had done absolutely nothing wrong. You can see how each of these choices gets made in response to the situation at hand, and yet the end result is still a pretty messed up position.
Of course, demonetization is a perfectly reasonable tool for YouTube to use in some cases. Soon after the mess described above, YouTube also demonetized a bunch of quack anti-vaccine videos, after people became (reasonably) upset about the kind of nonsense that was showing up in their YouTube feeds (and making these anti-science quacks money). But, again, there's likely a cat-and-mouse game going on here, along with filters of questionable quality. It won't surprise me to find out that (1) anti-vaxxer conspiracy nuts figure out ways to get around this monetization block and (2) perfectly innocent videos get demonetized (while I haven't heard of any yet, it wouldn't surprise me if, for example, a legitimate news report about the anti-vaxxer movement got caught up in the sweep).
Again, there's nothing wrong with YouTube or other platforms making these kinds of decisions, but they're not easy decisions, and the platforms making them will (1) make mistakes and (2) create unintended consequences that are worth watching carefully as well. Right before YouTube did that demonetization, another story was making the rounds about how Pinterest took an even more extreme position, basically blocking all anti-vax content and searches on its platform. Especially given the latest stories of measles outbreaks caused entirely by gullible anti-vaxxers, there's a perfectly reasonable argument that Pinterest is doing the right thing here. But... as others like Christina Xu noted, this kind of move is difficult to separate from Chinese internet censorship:
1) This is DEFINITELY the right move on Pinterest's part because kids are dying of measles, AND
2) This is exactly how censorship is often enacted on the Chinese internet, SO
3) What systems of oversight do we need for these critical, complex choices? https://t.co/hlP9SE0bFJ
— Christina Xu is in 🗽 (@xuhulk) February 21, 2019
Indeed, it would not surprise me in the slightest to see stories appear in the Chinese media before long pointing to these examples as evidence that China is taking the right approach with its Great Firewall and its speech-suppressing censorship. After all, the Chinese government has done this in the past with other, similar stories.
And, thus, once again, there's a key point to make in all of this: anyone who thinks there are "easy" choices here is wrong. There are none. Almost any choice has massive problems and potentially huge consequences -- including the choice of doing nothing at all. And while many people use this to argue that it's a bad look for the internet platforms, the stronger argument is that it's a bad look for humanity. These platforms reflect humanity, including some pretty awful humans. Society hasn't been able to stop some humans from being awful, terrible people with awful ideas, and it's unfair to suggest that internet companies should magically be able to step in and solve that problem with ham-fisted tools like "demonetization."
Filed Under: comments, content moderation, demonetization, free speech, youtube, youtube comments
Companies: google, youtube