Content Moderation Case Study: Roblox Moderators Combat In-Game Reenactments Of Mass Shootings (2021)
from the modern-moderation dept
Online game platform Roblox has gone from a niche offering to a cultural phenomenon over its 15 years of existence. Rivaling Minecraft in its ability to attract young users, Roblox is played by over half of American children and has 164 million active users.
Roblox also gives players access to a robust set of creation tools, allowing users to create and craft their own experiences, as well as enjoy those created by others.
A surge in users during the COVID-19 pandemic created problems that Roblox’s automated moderation systems — as well as its human moderators — are still attempting to solve. Roblox employs 1,600 human moderators, who handle not only content flowing through in-game chat features but also content created with Roblox’s creation tools and shared with other users.
Users embraced the creation tools, some in healthier ways than others. If it happens in the real world, someone will try to approximate it online. Players have used the kid-focused game to create virtual red light districts where they can gather to engage in simulated sex with other players — an activity that tends to evade moderation because direct links to this content are shared on out-of-game chat platforms like Discord.
Perhaps more disturbingly, players are recreating mass shootings — many of them with a racial element — inside the game and inviting others to step into the shoes of mass murderers. Anti-Defamation League researcher Daniel Kelley was easily able to find recreations of the 2019 Christchurch mosque shootings in New Zealand.
While Roblox proactively polices the platform for “terrorist content,” the continual resurfacing of content like this remains a problem without an immediate solution. As Russell Brandom of The Verge points out, 40 million daily users generate more content than human moderators can manually review. And a keyword blocklist would leave users unable to discuss (or recreate) the New Zealand city itself.
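To see why a naive blocklist over-blocks, consider a minimal sketch of substring matching against a blocked term. This is a hypothetical illustration, not Roblox’s actual moderation code; the blocklist contents and function names are assumptions. The point is that a harmless mention of the city of Christchurch is flagged just as readily as a title referencing the shooting.

```python
# Hypothetical illustration only -- not Roblox's actual moderation system.
# A naive keyword blocklist flags any text containing a blocked term,
# so legitimate references to the city of Christchurch are caught too.

BLOCKLIST = {"christchurch"}

def is_blocked(text: str) -> bool:
    """Return True if any blocklisted term appears in the text."""
    lowered = text.lower()
    return any(term in lowered for term in BLOCKLIST)

# A violent recreation and an innocent travel post are treated identically:
print(is_blocked("Christchurch mosque shooting roleplay"))        # True
print(is_blocked("My school trip to Christchurch, New Zealand"))  # True
print(is_blocked("Visit the Botanic Gardens"))                    # False
```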
Company considerations:
- How does catering to a younger user base affect moderation efforts?
- What steps can be taken to limit access to or creation of content when users utilize communication channels the company cannot directly monitor?
- What measures can be put in place to limit unintentional interaction with potentially harmful content by younger users? What tools can be used to curate content to provide “safer” areas for younger users to explore and interact with?
Issue considerations:
- How should companies respond to users who wish to discuss, or otherwise engage with, content that involves newsworthy but violent events?
- How much can a more robust reporting process ease the load on human and AI moderation?
- Can direct monitoring of users and their interactions create additional legal risks when most users are minors? How can companies whose user bases are mostly children address potential legal risks while still giving users freedom to create and communicate on the platform?
Resolution:
Roblox updated its Community Standards to let users know this sort of content was prohibited. It also said it would engage in “proactive detection,” putting human eyes on content related to terms like “Christchurch,” allowing geographic references but not depictions of the mosque shooting.
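A minimal sketch of what that kind of “proactive detection” could look like in principle: instead of blocking a watchlisted term outright, matching content is routed to human review, which preserves legitimate geographic references. The term list, `triage` function, and `Decision` type are assumptions for illustration, not Roblox’s actual pipeline.

```python
# Hypothetical sketch: flag watchlisted terms for human review rather than block.
from dataclasses import dataclass

WATCHLIST = {"christchurch"}

@dataclass
class Decision:
    allowed: bool
    needs_human_review: bool

def triage(text: str) -> Decision:
    """Allow content by default, but queue watchlisted terms for a human."""
    lowered = text.lower()
    flagged = any(term in lowered for term in WATCHLIST)
    return Decision(allowed=True, needs_human_review=flagged)

print(triage("Christchurch city tour experience"))
# Decision(allowed=True, needs_human_review=True) -> a moderator decides
```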
Originally posted to the Trust and Safety Foundation website.
Filed Under: children, content moderation, mass shootings
Companies: roblox