No, YouTube Cannot Reasonably Moderate All Content On Its Platform
from the that's-not-how-it-works dept
The UK Tech Editor of the Guardian, Alex Hern, is usually a very thoughtful and cogent commentator on all things digital, and I usually enjoy reading his takes on technology. However, he appears to have something of a blind spot concerning content moderation. A few weeks back we highlighted an odd tweet of his suggesting that if YouTube and Facebook simply employed ONE PERSON to search for "New Zealand terror attacks," they could have magically deleted all of the Christchurch shooting videos:
Spoiler alert: both companies employ way more content moderators than that, and as both companies admitted soon after, it was an "all hands on deck" situation in which they sought to block as many of those videos as quickly as they could.
Last week, Alex was back with another hot take on YouTube content moderation, arguing that it is, in fact, entirely possible to moderate every video on the platform. Alex regularly deletes his old tweets, so here's his thread:
And here is the text of what he said:
“There’s no way YouTube could pre-moderate every video that gets posted to its platform” is one of those things that’s said a lot, but… isn’t actually true.
YouTube sees ~400 hours of content uploaded every minute. That’s 24,000 minutes of content every minute. That means, let’s say, 30,000 people working at any given minute to watch content as it comes in.
Working eight hour shifts, and again rounding up, that’s 100,000 moderation employees. Let’s pay them well: the London living wage is just under £20k a year. So that’s a £2b staffing bill, or $2.61bn a year. Alphabet - YouTube’s parent company - reported $30bn profit last year.
So what does “can’t” mean in this context? Is it really unreasonable for Google to employ one fifth as many workers as Tesco? Is it really a requirement that every single video ever posted be visible within seconds of upload?
Or is it just about boosting net income as high as it will go, and pushing back against social norms that may threaten that?
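For reference, the arithmetic inside the thread itself mostly checks out. Here's a quick sketch of it (the pound-to-dollar rate is my assumption, picked to roughly match his $2.61bn figure):

```python
# Sanity check on the arithmetic in Hern's thread. The FX rate below is
# an assumption, chosen to roughly reproduce his $2.61bn figure.

minutes_per_minute = 400 * 60          # 24,000 minutes uploaded per minute
viewers_at_once = 30_000               # Hern rounds 24,000 up to 30,000
staff = viewers_at_once * (24 // 8)    # 90,000; Hern rounds up to 100,000

LONDON_LIVING_WAGE_GBP = 20_000        # "just under £20k a year", rounded
bill_gbp = 100_000 * LONDON_LIVING_WAGE_GBP   # £2bn staffing bill
bill_usd = bill_gbp * 1.305                   # ~$2.61bn at the assumed rate

print(f"£{bill_gbp / 1e9:.0f}bn ≈ ${bill_usd / 1e9:.2f}bn a year")
```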
The thread got tons of attention and retweets, with lots of people agreeing and promoting it. The only trouble is that it's utter nonsense. Let's go through the details. First, and most obviously, the 400 hours stat is old; it was based on a report from 2015. Much more recent testimony (as in, from last week before Congress) has YouTube saying "over 500 hours of video uploaded every minute." To be conservative, let's use 500, even though YouTube says it's higher than that, and in all likelihood the number continues to grow each year. That means we're already talking about 30,000 minutes of content added every minute. Hern talks about having 30,000 people employed at any one time to view all that content, but that rests on a few incorrect assumptions.
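Before getting to those assumptions, here's the baseline math with the updated figure (a rough sketch, using only the estimates above):

```python
# Redo the baseline with YouTube's more recent "over 500 hours" figure.
HOURS_PER_MINUTE = 500
minutes_per_minute = HOURS_PER_MINUTE * 60   # 30,000 minutes uploaded per minute

# Watching in real time means one viewer per minute-of-content per minute.
viewers_at_once = minutes_per_minute         # 30,000 people at any given moment

# Uploads arrive around the clock; an eight-hour shift covers a third of a day.
baseline_staff = viewers_at_once * 3         # 90,000 before breaks or re-watches

print(f"{minutes_per_minute:,} minutes/minute -> {baseline_staff:,} baseline staff")
```

Note that with the newer number, Hern's "rounded up" 30,000 isn't rounding up at all -- it's the bare minimum.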
First, the assumption that anyone can just sit there and watch content straight through an eight-hour shift is crazy. There have already been lots of discussions about the difficult position content moderators are in: reviewing content all day is incredibly stressful and often requires significant breaks. So even as Hern "rounds up" his numbers, we're likely talking about a much larger pool of reviewers to actually cover all this content in an eight-hour day. Hern also assumes that a single viewing by a single individual is all that content moderation would take. That makes no sense at all. Determining whether a video is appropriate often requires multiple viewings, and sometimes some level of research to understand the context, language, etc. of a video and whether or not it meets some criteria. And let's not leave out the ongoing training that would be required for moderators to keep up with the ever-changing nature of what is and isn't allowed under YouTube's terms of service. The Radiolab episode we discussed last year showed just how difficult a process it is to train moderators, and to continually update what's allowed, since so much content falls into a "gray zone" rather than being black and white (allowed/not allowed).
So, on those factors alone, you're probably talking about at least doubling the number of reviewers, just so you'd actually have enough time to view each video enough times to fully understand what's going on and to keep up with the rules. Now we're talking about at least 200,000 content moderators. The last report I've seen says that in 2018 Google had 98,771 employees. So this alone would nearly triple its workforce.
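In rough numbers (and to be clear, the 2x multiplier here is my conservative guess from the reasoning above, not hard data):

```python
# Double the baseline for breaks, repeat viewings, context research, and
# ongoing training. The 2x overhead factor is a rough, conservative assumption.
REVIEW_OVERHEAD = 2
moderators = 100_000 * REVIEW_OVERHEAD         # at least 200,000 moderators

GOOGLE_HEADCOUNT_2018 = 98_771                 # per the report cited above
growth = (GOOGLE_HEADCOUNT_2018 + moderators) / GOOGLE_HEADCOUNT_2018
print(f"~{moderators:,} moderators -> workforce ~{growth:.1f}x its 2018 size")
```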
Oh, and we're still assuming that a single person viewing each video is all that's needed. But that's wrong. Last year, when we ran our "You Make the Call" game at the Content Moderation Summit in DC, one of the things we noted was that on every example we had the audience vote on, there was no uniform agreement on whether the content should be allowed or disallowed -- even when we specifically highlighted the rule the content likely violated. On every single example, people disagreed, with strong arguments on both sides for why the content should stay up or come down.
So you'd want at least two people to review each piece of content, and then, if they disagreed, you'd probably want a third reviewer. And that assumes a sample size of three is actually reasonable. It probably isn't. Hell, our sample size of ~100 reviewers at the Content Moderation Summit couldn't agree on anything, so it's not clear how many people you'd actually need, but it's at least double the 200,000 employees we'd already reached. So now we're talking about at least 400,000 employees, almost quintupling the size of Google's workforce, solely because sometimes a few bad videos get through the existing process.
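Again in rough numbers (the doubling for redundant review is, if anything, an undercount, per the disagreement rates above):

```python
# Double again so every video gets at least two independent reviewers,
# with tiebreaker reviews folded into the same rough 2x factor.
REDUNDANCY = 2
total_moderators = 200_000 * REDUNDANCY        # at least 400,000 moderators

GOOGLE_HEADCOUNT_2018 = 98_771
growth = (GOOGLE_HEADCOUNT_2018 + total_moderators) / GOOGLE_HEADCOUNT_2018
print(f"~{total_moderators:,} moderators -> workforce ~{growth:.1f}x its 2018 size")
```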
This seems like overkill.
And for what benefit? We'd lose out on a lot. Contrary to Hern's suggestion, this would significantly delay the time it takes for videos to go up on YouTube, which decreases the value of the platform. It would also certainly lead to much more unnecessary censorship, as moderators who are unsure about a video are more likely to block it from going live than to risk the ire that comes with letting through a "bad" video. It would significantly limit the benefit of YouTube, while likely doing very little (if anything) to curb the fact that there are assholes posting asshole-ish content online.
I've joked in the past that if we combine all the big questions about "the future of work" with the big questions about "content moderation," one might create a Swiftian Modest Proposal to just employ everyone in the job of moderating everyone else's content, but that hardly seems reasonable or practical.
As I noted up top: I think Alex is one of the more reasonable and thoughtful commentators on tech. But he seems to have a significant blind spot about the realities of content moderation and how it works. I'd urge him to talk to some experts in the space, or to actually sit in with a content moderation team if they'd let him (these days, they're a lot more open to letting in reporters to see just how difficult the process really is), before coming up with more hot takes on the subject.
Filed Under: alex hern, content moderation, content moderation at scale, reviewers
Companies: youtube