Techdirt's think tank, the Copia Institute, is working with the Trust & Safety Professional Association and its sister organization, the Trust & Safety Foundation, to produce an ongoing series of case studies about content moderation decisions. These case studies are presented in a neutral fashion, not aiming to criticize or applaud any particular decision, but to highlight the many different challenges that content moderators face and the tradeoffs that result. Find more case studies here on Techdirt and on the TSF website.

Content Moderation Case Study: Roblox Moderators Combat In-Game Reenactments Of Mass Shootings (2021)

from the modern-moderation dept

Online game platform Roblox has gone from a niche offering to a cultural phenomenon over its 15 years of existence. Rivalling Minecraft in its ability to attract young users, Roblox is played by over half of American children and counts 164 million active users.

Roblox also gives players access to a robust set of creation tools, allowing users to create and craft their own experiences, as well as enjoy those created by others. 

A surge in users during the COVID-19 pandemic created problems Roblox’s automated moderation systems — as well as its human moderators — are still attempting to solve. Roblox employs 1,600 human moderators, who handle not only content flowing through in-game chat features but also content created and shared with other users via Roblox’s creation tools.

Users embraced the creation tools, some in healthier ways than others. If it happens in the real world, someone will try to approximate it online. Players have used this kid-focused game to create virtual red light districts where they can gather to engage in simulated sex with other players — an activity that tends to evade moderation because participants share direct links to the content through out-of-game chat platforms like Discord.

Perhaps more disturbingly, players are recreating mass shootings — many of them containing a racial element — inside the game and inviting other players to step into the shoes of mass murderers. Anti-Defamation League researcher Daniel Kelley was easily able to find recreations of the 2019 Christchurch mosque shooting in New Zealand.

While Roblox proactively polices the platform for “terrorist content,” the continual resurfacing of content like this remains a problem without an immediate solution. As Russell Brandom of The Verge points out, 40 million daily users generate more content than human moderators can manually review. And a keyword blocklist would leave users unable to discuss (or recreate) the New Zealand city itself.
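
To make the over-blocking problem concrete, here is a minimal sketch in Python (the blocklist and function names are illustrative assumptions, not Roblox's actual system) of why a naive substring filter cannot distinguish a reference to the city from a reference to the attack:

    # Illustrative only: a naive substring blocklist cannot tell a harmless
    # geographic reference apart from a reference to the attack.
    BLOCKLIST = {"christchurch"}

    def naive_filter(text: str) -> bool:
        """Return True if the text matches any blocked term."""
        lowered = text.lower()
        return any(term in lowered for term in BLOCKLIST)

    print(naive_filter("Reenactment of the Christchurch attack"))   # True (blocked)
    print(naive_filter("My favorite city in NZ is Christchurch"))   # True (also blocked)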

Company considerations:

  • How does catering to a younger user base affect moderation efforts?
  • What steps can be taken to limit access to or creation of content when users utilize communication channels the company cannot directly monitor? 
  • What measures can be put in place to limit unintentional interaction with potentially harmful content by younger users? What tools can be used to curate content to provide “safer” areas for younger users to explore and interact with?

Issue considerations:

  • How should companies respond to users who want to discuss newsworthy but violent events, or build content around them? 
  • How much can a more robust reporting process ease the load on human and AI moderation?
  • Can direct monitoring of users and their interactions create additional legal risks when most users are minors? How can companies whose user bases are mostly children address potential legal risks while still giving users freedom to create and communicate on the platform?

Resolution:

Roblox updated its Community Standards to make clear that this sort of content is prohibited. It also said it would engage in “proactive detection,” putting human eyes on content containing terms like “Christchurch,” allowing geographic references to the city but not depictions of the mosque shooting.
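
The reporting doesn't spell out how that detection works. As a rough, hypothetical sketch (the term list, function names, and queue are assumptions, not Roblox's actual pipeline), a term match might route content to a human review queue instead of blocking it outright:

    from queue import Queue

    # Hypothetical sketch of term-triggered human review: a keyword match
    # flags content for moderator eyes rather than auto-blocking it, so a
    # geographic reference can be approved while a depiction is removed.
    SENSITIVE_TERMS = {"christchurch"}
    review_queue = Queue()  # items awaiting a human decision

    def proactive_detect(content_id, text):
        """Queue any content containing a sensitive term for human review."""
        matched = {t for t in SENSITIVE_TERMS if t in text.lower()}
        if matched:
            review_queue.put((content_id, text, matched))

    proactive_detect("exp_1", "A sightseeing tour of Christchurch")
    proactive_detect("exp_2", "Play as the Christchurch mosque shooter")
    # Both land in the queue; a moderator approves the geographic
    # reference and removes the depiction of the shooting.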

Originally posted to the Trust and Safety Foundation website.

Filed Under: children, content moderation, mass shootings
Companies: roblox


Reader Comments

  • Anonymous Coward, 6 Jan 2022 @ 4:01am

    Perhaps they should release an adult version of the system so that players interested in more mature topics don't have to sneak into the kid version?

  • Anonymous Coward, 6 Jan 2022 @ 12:41pm

    Some of their moderation tasks are difficult, to say the least. However, these are compounded by other parts of their business model and behavior. Roblox is problematic all the way down.

    • Anonymous Coward, 7 Jan 2022 @ 4:04am

      Re:

      Having watched the recent exposé on Roblox's moderation and user protection policies, on top of how they run their market for cosmetics... yeah, it's pretty clear that Roblox does not have their users' best interests in mind.

  • ECA (profile), 6 Jan 2022 @ 1:26pm

    REALITY SUCKS.

    • ECA (profile), 6 Jan 2022 @ 1:36pm

      Re: REALITY SUCKS.

      But it's reality.
      Truth hurts? Only if you love the lies and can't deal with REAL.
      Now a few restrictions MIGHT be a nice thing. Like an 18+ section. But how are you going to monitor/sort that? How can you really tell who is who, and their age? There was a suggestion a while back about listing the AGE of those signed in and restricting access by AGE.

      Building constructions that mimic real life ISN'T BAD. It could show how things could have been better, if/maybe/choices were made. Simulating things isn't bad. It can show kids WHAT they could have/may have been able to do to SAVE themselves.

  • Anonymous Coward, 21 Jan 2022 @ 9:34am

    Not related, but automatic moderation isn't much better. There are stories making the rounds on Reddit about how Nintendo's automatic detection software for Super Smash Bros. Ultimate online is issuing warnings and bans over normal gameplay because its algorithms are incorrectly detecting "unsporting" gameplay:

    https://www.reddit.com/r/SmashBrosUltimate/comments/s9b6l1/when_you_buy_the_game_at_xmas_and_have_never/

    Users are complaining about receiving bans as a result:

    "I got banned for getting a Blade Beam spamming Cloud. I caught the rhythm and just stood there parrying every beam.

    I got the message that it's unsporting behavior to just stand still and do nothing."

  • SomeDude, 12 Feb 2022 @ 8:05am

    Wrong

    A lot of what is in here is very, very wrong indeed. Roblox only employs about 30 moderators total, not 1,600, as I have personally confirmed. A lot of what's reported or submitted for use on Roblox is run through a third-party filter. Roblox staff never oversee anything; they just sit back and let the bots take care of things. This is why all these 'mosque shooting' and related terrorism-centered games are still up on the site; staff simply do not care.
