Techdirt's think tank, the Copia Institute, is working with the Trust & Safety Professional Association and its sister organization, the Trust & Safety Foundation, to produce an ongoing series of case studies about content moderation decisions. These case studies are presented in a neutral fashion, not aiming to criticize or applaud any particular decision, but to highlight the many different challenges that content moderators face and the tradeoffs they result in. Find more case studies here on Techdirt and on the TSF website.

Content Moderation Case Study: Huge Surge In Users On One Server Prompts Intercession From Discord (2021)

from the moderating-game-stonks dept

Summary: A wild few days for the stock market resulted in some interesting moderation moves by a handful of communications/social media platforms.

A group of unassociated retail investors (i.e., day traders playing the stock market with the assistance of services like Robinhood) gathered at the Wall Street Bets subreddit and started a mini-revolution by refusing to believe GameStop stock was worth as little as some hedge funds believed it was.

The initial surge in GameStop's stock price was soon followed by a runaway escalation, some of it a direct response to a hedge fund's large (and exposed) short position. Melvin Capital -- the hedge fund targeted by Wall Street Bets denizens -- had announced its belief that GameStop stock wasn't worth its current price and had put its money where its mouth was by taking a large short position that would only pay off if the price continued to drop.

As the stock soared from less than $5/share to over $150/share, people began flooding into r/wallstreetbets. This forced the first moderation move. Moderators briefly took the subreddit private in an attempt to stem the flow of newcomers and get a handle on the issues such an influx brings with it.

Wall Street Bets moved some of the conversation over to Discord, which prompted another set of moderation moves. Discord banned the server, claiming users routinely violated guidelines on hate speech, incitement of violence, and spreading misinformation. This was initially viewed as another attempt to rein in vengeful retail investors who were inflicting pain on hedge funds: the Big Guys making sure the Little Guys weren't allowed on the playing field. (Melvin Capital received a $2.75 billion cash infusion after its GameStop short was blown up by the stock's unprecedented rise in price.)

But it wasn't as conspiratorial as it first appeared. The users who frequented a subreddit that described itself as "4chan with a Bloomberg terminal" were very abrasive, and adding voice chat to the mix on the Discord server made things worse by amplifying the noise -- noise that often included hate speech and plenty of insensitive language.

Discord dropped the ban and re-enabled the server, announcing that it would step in to moderate content and users more directly. With over 300,000 users, the server had apparently grown too large, too quickly, making it all but impossible for the Wall Street Bets moderators to handle on their own. This partially reversed the earlier narrative, turning Discord into the Big Guy helping out the Little Guys rather than the one silencing them permanently over the actions of their worst users.

Decisions to be made by Discord:

  • Do temporary bans harm goodwill and chase users from the platform? If so, is that an expected and acceptable outcome?

  • Is participating directly in moderation of heavily-trafficked servers scalable?

  • How much moderation should be left in the hands of server moderators? Should they be allowed more flexibility when moderating questionable content that may violate Discord rules but is otherwise still legal?

Questions and policy implications to consider:

  • Are temporary bans of servers more effective than other, more scaled escalation efforts? Are changes more immediate?

  • Is the fallout from bans offset by the exit of problem users? Or do server bans tend to entrench the worst users to the detriment of new users and moderators who are left to clean up the mess?

  • As more users move to Discord, is the platform capable of stepping in earlier to head off developing problems before they reach the point where a ban is warranted?

  • Does offloading moderation to users of the service increase the possibility of rules violations? If so, should Discord take more direct control earlier when problematic content is reported?

Resolution: The Wall Street Bets Discord server is still up and running. Its core clientele likely hasn't changed much, which means moderation is still a full-time job. An influx of new users following press coverage of this particular group of retail traders may dilute the user base, but it's unlikely to turn WSB into a genteel community of stock market amateurs. Discord's assistance will likely be needed for the foreseeable future.

Originally published on the Trust & Safety Foundation website.



Filed Under: content moderation, game stonks, short selling, stock trading, wall street
Companies: discord, gamestop, reddit


Reader Comments



  • sumgai (profile), 24 Mar 2021 @ 7:06pm

    Copia Institute frequently tries to "file an Amicus Curiae brief" with other businesses.

    Decisions to be made by Copia:

    see below

    Questions and policy implications to consider:

    see below

    Resolution:

    Copia could do a better job of trying to sway leaders of other businesses/sites to their way of seeing things. For example: Instead of "Things to consider", which is rightfully taken as "Hey dummy, you didn't anticipate this, didya?", Copia might say "We have some insight we'd like to share with you, Mr. Business Owner, could we make some time to get together, please?"

    IOW, it's the old 'honey rather than vinegar' trick, I'm sure you understand.

    And yes, I'm not afraid to sign my name to this, I've felt this way since Copia's first post of this nature. I don't doubt for a minute that their intentions are good, but the way they're going about it just strikes me as pretty near to picking a fight, and not a way that one engages in meaningful discussion.


    • Mike Masnick (profile), 24 Mar 2021 @ 11:03pm

      Re:

      What? I've read this comment three times through and I still have no idea at all what you're trying to say?


      • sumgai (profile), 25 Mar 2021 @ 8:58am

        Re: Re:

        It's simple, Mike. I'm saying, in the same written form as the posting, that Copia is attempting to interject their views on how a business should perform. In my view, they aren't asking Business X to review and take under advisement "these things", they are expressing in no uncertain terms that Business X should (nearly must) execute on these bullet-list items. As I said, they could be doing this in a positive manner, i.e. politely asking for a sit-down to discuss the various concerns.

        But I do admit, for all I know, they might be playing nice with the big boys, and the postings are simply abridged in such a manner as to make it look like they're..... curt, that's the word I'm looking for. As in, not abrupt or rude, simply not nuanced - curt describes that nicely. If that's the case, please let me/us know, and I'll gladly retract my statements and sentiments.


        • Anonymous Coward, 25 Mar 2021 @ 6:45pm

          Re: Re: Re:

          Yeah, I'm not seeing it. Also I find that the questions are food for thought and not some "just asking questions" troll. I believe they are authentic questions and not crypto-insinuations. Further, it's a thing to think about for anyone in a related position, not just the outfit(s) mentioned in any particular CM study.

          Tealdeer: I feel like it is pretty honest and good-faith stuff.

          And quite possibly useful for people who either haven't bothered to analyze these things or think about them more generally, or for people who are stuck in a culture/climate where maybe they do not naturally have the opportunity to think about them.


        • Mike Masnick (profile), 25 Mar 2021 @ 10:46pm

          Re: Re: Re:

          "It's simple, Mike. I'm saying, in the same written form as the posting, that Copia is attempting to interject their views on how a business should perform."

          Uh, no, not at all. Did you not read the part which says: "These case studies are presented in a neutral fashion, not aiming to criticize or applaud any particular decision, but to highlight the many different challenges that content moderators face and the tradeoffs they result in."

          "In my view, they aren't asking Business X to review and take under advisement 'these things', they are expressing in no uncertain terms that Business X should (nearly must) execute on these bullet-list items."

          Uh, no, not at all. The whole point is to help people who do not live in the content moderation world to understand the kinds of questions that trust & safety teams need to deal with all the time, and the larger policy questions raised. These are not telling the companies what to do, and pretty clearly make no recommendations at all. They're designed to highlight the difficult/impossible choices trust & safety teams make.

          "As I said, they could be doing this in a positive manner, i.e. politely asking for a sit-down to discuss the various concerns."

          You really are reading these all wrong. These are supportive of the companies, and noting the impossible decisions they need to make.


