Content Moderation Case Study: Roblox Tries To Deal With Adult Content On A Platform Used By Many Kids (2020)
from the for-the-children dept
Summary: Roblox is an incredibly popular online gaming platform, especially among younger users. In 2020, it was reported that two-thirds of all US kids between 9 and 12 years old use Roblox, as do one-third of all Americans under the age of 16. Anyone can develop games for the platform: Roblox provides an accessible development environment built around the scripting language Lua, and many of the games are created by Roblox’s young users themselves.
Given Roblox’s target market, the company has put in place a fairly robust content moderation program designed to stop content it deems inappropriate. This covers all kinds of profanity and “inappropriate” language, as well as any talk of “dating,” let alone sexual innuendo. The company also does not allow users to share personally identifiable information.
The content moderation extends beyond players on the Roblox platform to the many developers who create and release games there. Roblox reportedly uses AI moderation from a company called Community Sift as well as human moderators from iEnergizer, and recent reports say the company has a team of 2,300 content moderators.
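One concrete way moderation reaches into developer code: any user-generated text a game displays is expected to pass through Roblox’s text filtering service. The sketch below (in Lua, the language Roblox games are scripted in) shows a server-side helper filtering a player-submitted string. It is a minimal illustration, not Roblox’s internal moderation pipeline: the helper name is hypothetical, though TextService:FilterStringAsync and TextFilterResult:GetNonChatStringForBroadcastAsync are documented Roblox APIs.

```lua
-- Server-side sketch: filter player-submitted text before showing it
-- to other players. The function name is hypothetical; the TextService
-- calls are Roblox's documented filtering API.
local TextService = game:GetService("TextService")

local function filterForBroadcast(text, fromPlayer)
	-- FilterStringAsync yields and can error (e.g. if the filter
	-- service is unavailable), so wrap the call in pcall.
	local ok, result = pcall(function()
		return TextService:FilterStringAsync(text, fromPlayer.UserId)
	end)
	if not ok then
		return nil -- treat filter failures as "do not display"
	end
	-- GetNonChatStringForBroadcastAsync returns the text with
	-- disallowed words replaced by hashtags, suitable for all audiences.
	local ok2, filtered = pcall(function()
		return result:GetNonChatStringForBroadcastAsync()
	end)
	if ok2 then
		return filtered
	end
	return nil
end
```

Failing closed (returning nil and displaying nothing when the filter errors) reflects the platform’s kid-first posture described above, at the cost of occasionally suppressing harmless text.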
Given the competing interests and incentives, there are widespread reports both of adult content being easily accessible (including to children) and of developers having their games, projects, and accounts shut down over perfectly reasonable material, leading to complaints that the moderation system is completely arbitrary.
Roblox is thus left trying to figure out how to better handle adult content without upsetting its developers or angering parents who don’t want their children exposed to such content while playing.
Decisions to be made by Roblox:
- How do you monitor such a large volume of content to make sure adult content does not get through, and that kids are not exposed to it?
- If the moderation systems are too aggressive, will that drive developers (and possibly some users) away?
- Should all games go through a human review process before they can be offered through Roblox?
- Are there better ways to communicate how and why content is moderated?
- Which is the more important constituency: the kids and families using Roblox, or the developers who produce content for it? Is aggressive content moderation a way of striking a balance between those two groups?
- Is it worth “overbanning” if it means families feel safer using Roblox?
Originally published on the Trust & Safety Foundation website.
Filed Under: content moderation, kids
Companies: roblox