Techdirt's think tank, the Copia Institute, is working with the Trust & Safety Professional Association and its sister organization, the Trust & Safety Foundation, to produce an ongoing series of case studies about content moderation decisions. These case studies are presented in a neutral fashion, not aiming to criticize or applaud any particular decision, but to highlight the many different challenges that content moderators face and the tradeoffs those decisions involve. Find more case studies here on Techdirt and on the TSF website.

Content Moderation Case Study: YouTube Doubles Down On Questionable 'Graphic Content' Enforcement Before Reversing Course (2020)

from the moderation-road-rage dept

Summary:

YouTube creators have frequently complained about the opaque and frustrating nature of the platform’s appeals process for videos that are restricted or removed for violating its Community Guidelines. Beyond simply removing content, these takedowns can be severely damaging to creators, as they can result in “strikes” against a channel. Strikes incur temporary restrictions on the user’s ability to upload content and use other site features, and enough strikes can ultimately lead to permanent channel suspension.

Creators can appeal these strikes, but many complain that the response to appeals is inconsistent, and that rejections are deemed “final” without providing insight into the decision-making process or any further recourse. One such incident in 2020 involving high-profile creators drew widespread attention online and resulted in a rare apology and reversal of course by YouTube.

On August 24, 2020, YouTube creator MoistCr1TiKaL (aka Charlie White, who also uses the handle penguinz0), who at the time had nearly six million subscribers, posted a video in which he reacted to a viral 2014 clip of a supposed “road rage” incident involving people dressed as popular animated characters. The authenticity of the original video is unverified, and many viewers suspect it was staged for comedic purposes: the supposed “violence” it portrays appears to be fake, and the target of the “attack” appears uninjured. Soon after posting his reaction video, White received a strike for “graphic content with intent to shock” and the video was removed. On September 1, White revealed on Twitter that he had appealed the strike, but the appeal was rejected.

White then posted a video expressing his anger at the situation, and pointed out that another high-profile YouTube creator, Markiplier (aka Mark Fischbach), had posted his own reaction to the same viral video nearly four years earlier but had not received a strike. Fischbach agreed with White and asked YouTube to address the inconsistency. To the surprise of both creators, YouTube responded by issuing a strike to Fischbach’s video as well.

The incident resulted in widespread backlash online, and the proliferation of the #AnswerUsYouTube hashtag on Twitter, with fans of both creators demanding a reversal of the strikes and/or more clarity on how the platform makes these enforcement decisions.

Company considerations:

  • If erroneous strikes are inevitable given the volume of content being moderated, what are the necessary elements of an appeals process to ensure creators have adequate recourse and receive satisfactory explanations for final decisions?
  • What are the conditions under which off-platform attention to a content moderation decision should result in further manual review and potential reversals outside the normal appeals process?
  • How can similar consideration be afforded to creators who face erroneous strikes and rejected appeals, but do not have large audiences who will put off-platform pressure on the company?

Issue considerations:

  • How can companies balance the desire to directly respond to controversies involving highly popular creators with the desire to employ consistent, equitable processes for all creators?
  • How should platforms harmonize their enforcement decisions when they are alerted to clear contradictions between the decisions on similar pieces of content?

Resolution:

On September 2, a few hours after Fischbach announced his strike and White expressed his shock at that decision, the TeamYouTube Twitter account replied to White and to Fischbach with an apology, stating that it had restored both videos and reversed both strikes and calling the initial decision “an over-enforcement of our policies.” Both creators expressed their appreciation for the reversal, while also noting that they hope the company makes changes to prevent similar incidents from occurring in the future. Since such reversals by YouTube are quite rare, and apologies even rarer, the story sparked widespread coverage in a variety of outlets.

Originally posted to the Trust and Safety Foundation website.



Filed Under: case study, content moderation, enforcement, road rage
Companies: youtube


Reader Comments



  • Anonymous Coward, 16 Feb 2022 @ 6:14pm

    Small creators don't have the leverage to put on pressure.

    How can similar consideration be afforded to creators who face erroneous strikes and rejected appeals, but do not have large audiences who will put off-platform pressure on the company?

    This is the most important thing to consider. If online services hosting user-generated content can't come up with a solution for this problem, then they should lean toward under-enforcement rather than over-enforcement. The bigger the company is, the more conservative the moderation should be.


    • Anonymous Coward, 17 Feb 2022 @ 4:12am

      Re: Small creators don't have the leverage to put on pressure.

      How many times have you heard a story where some company was being obstructionist ... right up to the point where it became a news story, whereupon all the objections fell away and the company suddenly wanted to do the right thing?

      Yeah. That's how small creators will be able to put on pressure, and no other way. Even big YouTube stars weren't able to get YouTube to pay attention until they made it a news story. That is, it wasn't simply "I've got 6 million subscribers and I'd like you to reconsider." It was 6 million subscribers DDOSing their phone bank ... er, Twitter channel.


      • nasch (profile), 17 Feb 2022 @ 3:50pm

        Re: Re: Small creators don't have the leverage to put on pressure.

        That is, it wasn't simply "I've got 6 million subscribers and I'd like you to reconsider." It was 6 million subscribers DDOSing their phone bank ... er, Twitter channel.

        Right, so how does someone with 6,000 or 600 subscribers get a fair shake?


    • Anonymous Coward, 19 Feb 2022 @ 12:23am

      Re: Small creators don't have the leverage to put on pressure.

      I agree with you there, but to be fair, if they did that, a politician / media outlet would scream about how "YouTube doesn't care" about whatever their pet issue of the day is.

      They are in an unenviable position where no matter what they do they'll be the "bad guy".


  • Anonymous Coward, 16 Feb 2022 @ 7:58pm

    How easy to moderate this comment... Too easy for some maybe.


  • Anonymous Coward, 16 Feb 2022 @ 8:00pm

    lol was the original video on yt, and still up? i might have to look into that.


  • TheComputerGuy (profile), 16 Feb 2022 @ 9:34pm

    Banning All Graphic

    Next YouTube will ban any graphics from the site. The logo will be removed, then the thumbnails, and eventually, it will become SoundCloud.


  • me, 17 Feb 2022 @ 5:07am

    They replied with an apology

    Until.... it happens the next time, and the next and the next. They want creators to provide content but then make life miserable for them when they do.


  • Rekrul, 18 Feb 2022 @ 7:29pm

    Charlie's tweet is spot-on: No human actually reviews appeals, they simply get automatically rejected by a bot.


