Techdirt's think tank, the Copia Institute, is working with the Trust & Safety Professional Association and its sister organization, the Trust & Safety Foundation, to produce an ongoing series of case studies about content moderation decisions. These case studies are presented in a neutral fashion, not aiming to criticize or applaud any particular decision, but to highlight the many different challenges content moderators face and the tradeoffs those decisions involve. Find more case studies here on Techdirt and on the TSF website.

Content Moderation Case Study: Social Media Services Respond When Recordings Of Shooting Are Uploaded By The Person Committing The Crimes (August 2015)

from the real-time-decision-making dept

Summary: The ability to instantly upload recordings and stream live video has made content moderation much more difficult. Uploads to YouTube have surpassed 500 hours of content every minute (as of May 2019), making comprehensive moderation effectively impossible.

The same goes for Twitter and Facebook. Facebook's user base exceeds two billion worldwide. Over 500 million tweets are posted to Twitter every day (as of May 2020). Algorithms and human moderators are incapable of catching everything that violates terms of service.

When the unthinkable happened on August 26, 2015, Twitter and Facebook responded swiftly. But even swift action wasn't enough. The videos posted by Vester Lee Flanagan, a disgruntled former employee of CBS affiliate WDBJ in Virginia, showed him tracking down a WDBJ journalist and cameraman and shooting them both.

Both platforms removed the videos and deactivated Flanagan's accounts. Twitter's response took only minutes. But the videos had already begun to spread, leaving moderators to track down duplicates before they could be viewed and copied yet again. Many of those copies ended up on YouTube, where moderation efforts still left several reuploads intact. This prompted an FTC complaint against Google, filed by the father of the journalist Flanagan killed. Google responded by stating that it was still removing every copy of the videos it could locate, using a combination of AI and human moderation.
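Platforms don't publish the details of their matching pipelines, but the basic approach to catching reuploads is well understood: fingerprint content that moderators have already removed, then compare new uploads against those fingerprints. Here is a minimal sketch of the idea in Python; the names are illustrative, it uses exact-match hashing only, and real systems use perceptual hashes so that re-encoded or trimmed copies still match:

    import hashlib

    # Fingerprints of content a moderator has already removed.
    known_bad_hashes = set()

    def fingerprint(video_bytes):
        """Exact-match fingerprint of an uploaded file."""
        return hashlib.sha256(video_bytes).hexdigest()

    def record_removal(video_bytes):
        """Called when a moderator takes a video down."""
        known_bad_hashes.add(fingerprint(video_bytes))

    def is_known_bad(video_bytes):
        """True if an upload duplicates already-removed content."""
        return fingerprint(video_bytes) in known_bad_hashes

Exact hashing fails the moment a copy is cropped or re-encoded, which is part of why duplicates kept resurfacing here and why Google paired automated matching with human review.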

Users of Facebook and Twitter raised a novel complaint in the wake of the shooting, demanding "autoplay" be made opt-in -- rather than the default setting -- to prevent them from inadvertently viewing disturbing content.

Moderating content as it is created continues to pose challenges for Facebook, Twitter, and YouTube -- all of which allow live-streaming.

Decisions to be made by social media platforms:

  • What efforts should we put in place to better handle moderation of streamed content?
  • What efforts -- AI or otherwise -- could potentially prevent the streaming of criminal acts, and which ones should we adopt?
  • Once notified of objectionable content, how quickly should we respond?
  • Are there different types of content that require different rapid-response procedures?
  • What is our internal process for making moderation decisions about breaking news during a stream?
  • While the benefits of auto-playing content are clear for social media platforms, is making it the default a responsible decision -- not just because of potentially objectionable content, but for users on limited mobile data plans?

Questions and policy implications to consider:

  • Given increasing Congressional pressure to moderate content (and similar pressure from other governments around the world), are platforms willing to "over-block" content to demonstrate their compliance with these competing demands? If so, will users seek out other services if their content is mistakenly blocked or deleted?
  • If objectionable content is the source for additional news reporting or is of public interest (like depictions of violence against protesters, etc.), do these concerns override moderation decisions based on terms of service agreements?
  • Does the immediate removal of criminal evidence from public view hamper criminal investigations? 
  • Are all criminal acts of violence considered violations of content guidelines? What if the crime is being committed by government agents or law enforcement officers? What if the video depicts a criminal act committed by someone other than the person filming it?

Resolution: All three platforms have made efforts toward faster, more accurate moderation. Live-streaming presents new challenges for each, and those challenges are being met with varying degrees of success. All three deal with millions of uploads every day, which ensures some objectionable content will slip through and be seen by hundreds, if not thousands, of users before it can be identified and taken down.

Content like this is a clear violation of terms of service agreements, making removal -- once notified and located -- straightforward. But being able to "see" it before dozens of users do remains a challenge.


Thank you for reading this Techdirt post. With so many things competing for everyone’s attention these days, we really appreciate you giving us your time. We work hard every day to put quality content out there for our community.

Techdirt is one of the few remaining truly independent media outlets. We do not have a giant corporation behind us, and we rely heavily on our community to support us, in an age when advertisers are increasingly uninterested in sponsoring small, independent sites — especially a site like ours that is unwilling to pull punches in its reporting and analysis.

While other websites have resorted to paywalls, registration requirements, and increasingly annoying/intrusive advertising, we have always kept Techdirt open and available to anyone. But in order to continue doing so, we need your support. We offer a variety of ways for our readers to support us, from direct donations to special subscriptions and cool merchandise — and every little bit helps. Thank you.

–The Techdirt Team

Filed Under: case study, content moderation, live video, moderation, streaming, video streaming


Reader Comments



  1. Pixelation, 7 Aug 2020 @ 7:25pm

    Could they delay "live streams" to create extra time for moderation?
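    Mechanically, such a delay is just a buffer. A minimal sketch of the idea, with hypothetical names, assuming timestamps from time.time() and a per-frame moderation check:

        import collections
        import time

        DELAY_SECONDS = 60  # broadcast-style delay before frames air

        def relay(incoming, is_acceptable, publish):
            """Hold each frame for DELAY_SECONDS so a moderation check
            can kill the stream before anything reaches viewers.

            incoming:      iterator of (timestamp, frame) pairs
            is_acceptable: callable(frame) -> bool, the moderation check
            publish:       callable(frame), sends a frame to viewers
            """
            buffer = collections.deque()
            for timestamp, frame in incoming:
                if not is_acceptable(frame):
                    return  # drop the stream; buffered frames never air
                buffer.append((timestamp, frame))
                # Release only frames that have aged past the delay.
                while buffer and time.time() - buffer[0][0] >= DELAY_SECONDS:
                    publish(buffer.popleft()[1])

    The buffering is the easy part; as the replies below note, the open question is whether is_acceptable can be computed at all at this scale.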


  2. Anonymous Coward, 7 Aug 2020 @ 8:24pm

    Re:

There are far too many live streams, and there are even more than usual due to COVID. How many thousands of churches are closed or semi-closed or serving self-isolating members via live streaming of services? How many amateur or wannabe-pro entertainers are streaming because live audiences are not available? College lectures ... the list goes on and on. All of Google's employees couldn't monitor everything in real time.


  3. Anonymous Coward, 7 Aug 2020 @ 8:46pm

    Re:

    No. They can't even get the moderation done correctly with regular videos. What makes you think that adding more time, when they can't do it with infinite time, will fix the issue? Let alone fix the issue for raw and unedited content that the streamer may have no control over?

This isn't an issue of time. It's an issue of viewer sensibilities being violated on the internet. Which, given that this can't be prevented reliably in the real world either, shouldn't surprise people.


  4. Anonymous Coward, 7 Aug 2020 @ 8:50pm

    Re:

    Couldn't people not watch a thing that might upset them? These streams & videos are not like some calming ambient music in a garden setting with a jump scare.


  5. Anonymous Coward, 7 Aug 2020 @ 10:59pm

    Re:

Think about how much content that is. To put that 500 hours per minute into a different context: just 4 minutes of uploads to YouTube would provide an entire year of employment for a moderator who does absolutely nothing but spend 100% of their time at work reviewing videos. No meetings. No performance reviews. No professional development. Nothing except 8 hours of continuous review of uploaded videos, every working day.

    Now, if you assume a more reasonable 4 hours of reviewing videos per day, YouTube would require over a quarter of a million employees whose only job is to review videos. Not gonna happen.
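    The arithmetic holds up. A quick check, assuming a 2,000-hour working year (8 hours x 250 days) and uploads arriving around the clock; the 7/5 weekend factor is my own assumption:

        UPLOAD_HOURS_PER_MINUTE = 500

        # One working year: 8 hours/day * 250 days = 2,000 hours,
        # which is exactly 4 minutes of YouTube uploads.
        work_year_hours = 8 * 250
        minutes_per_work_year = work_year_hours / UPLOAD_HOURS_PER_MINUTE  # 4.0

        # Daily volume: 500 * 60 * 24 = 720,000 hours uploaded per day.
        daily_upload_hours = UPLOAD_HOURS_PER_MINUTE * 60 * 24

        # 4 review-hours per moderator-day; uploads arrive 7 days a week
        # but moderators work 5, hence the 7/5 factor.
        moderators_needed = daily_upload_hours * (7 / 5) / 4  # 252,000.0

        print(minutes_per_work_year, moderators_needed)

    Even without the weekend factor the answer is 180,000 full-time reviewers, so "not gonna happen" is about right.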


  6. PaulT (profile), 8 Aug 2020 @ 1:53am

    Re:

No. The problem is not that they don't have time to moderate; the problem is the volume of streams. Plus, delaying the stream can make the entire thing pointless if people are streaming time-sensitive events.


  7. seedeevee (profile), 8 Aug 2020 @ 6:09pm

    Moderating = Censoring

    and they are already censoring too much.

    And leave your "only The Govt Can Censor" BS at the keyboard.


  8. Stephen T. Stone (profile), 8 Aug 2020 @ 8:05pm

    Moderation is a platform/service owner or operator saying “we don’t do that here”. Personal discretion is an individual telling themselves “I won’t do that here”. Editorial discretion is an editor saying “we won’t print that here”, either to themselves or to a writer. Censorship is someone saying “you won’t do that anywhere” alongside threats or actions meant to suppress speech.

    Show me which one Twitter admins do when they boot someone from Twitter for violating the Terms of Service.


  9. Stephen T. Stone (profile), 8 Aug 2020 @ 8:08pm

    Oh, and before you try to whine about “censorship of conservatives”, you should know that Facebook tried to appease conservatives by refusing to punish or reversing punishments for major conservative pages for posts containing misinformation. That seems like a pro-conservative bias, if’n you ask me.


  10. PaulT (profile), 8 Aug 2020 @ 11:31pm

    Re: Moderating = Censoring

    No, leave your "moderation = censorship" BS at the door.

The main difference is that while it's hard to move countries, you can take your whiny ass to any other website in seconds if you find that the one you use is telling people like you to STFU and GTFO too much for your tastes. Stop crying and do it, and tell the friends of yours who are also being told to shut their childish mouths to join you. Problem solved. Every one of these sites has a bunch of competition; the main problem is people insisting on trying to force sites that don't want them to host them, rather than going to a place that actually wants them there.


  11. PaulT (profile), 8 Aug 2020 @ 11:33pm

    Re:

    That is one of the more hilarious things that's been reported recently. These people whine about people attacking them, yet the only credible report of actual bias recently is one where they're being defended. In other words - if they were being treated equally, they'd be told to shut up even more than they are now. Sadly, it won't enter their thick skulls that they are the problem.


  12. Anonymous Coward, 9 Aug 2020 @ 1:38am

    Re: Re: Moderating = Censoring

The problem is not that they will not go to other sites, but rather that the audience they want to reach will not follow them there. They do not want debate, but rather to forcibly convert people to their point of view, or at least to drown out all opposing views.


  13. Anonymous Coward, 9 Aug 2020 @ 5:03am

    Re: Moderating = Censoring

    "And leave your "only The Govt Can Censor" BS at the keyboard."

I do not recall the phrase you quoted being a common mantra on TD; perhaps you have an example.

    I suspect you are complaining about the commonly required explanation of the First Amendment and your associated rights.

    Yes, one can be censored by anyone else. It, by definition, does not have to be the government doing it, but that is what these fine people are trying to convey and you know it.

    /twocents


  14. Anonymous Coward, 9 Aug 2020 @ 5:05am

    Re: Re: Moderating = Censoring

    ... is not ...
    what these fine people are trying to convey and you know it

    grrrr - more coffee



  16. Anonymous Coward, 10 Aug 2020 @ 4:25am

    Autoplay video should be made opt-in everywhere and enforced at the browser level, just for being incredibly obnoxious as a concept.


  17. Anonymous Coward, 10 Aug 2020 @ 7:21am

Uploads to YouTube have surpassed 500 hours of content every minute (as of May 2019), making comprehensive moderation effectively impossible.

    Too big to succeed is too big to exist.


  18. Anonymous Coward, 10 Aug 2020 @ 8:49am

    Re:

YouTube is succeeding quite well, thank you. What is impossible, however, is getting everyone in town to agree on what should be moderated, never mind everyone in the whole world. YouTube is trying to moderate the world, and so it will always have problems doing that.

If you prefer small platforms that are moderated closer to your tastes, find them, or create your own. However, do not complain about a lack of content if you do.


  19. That One Guy (profile), 10 Aug 2020 @ 10:25am

    Re:

Awesome, time to axe the mail system, public roads, phones, the banking system, the internet that you just used to post that comment...


  20. That One Guy (profile), 10 Aug 2020 @ 10:30am

    Re: Re:

It does rather show that the whining about 'persecution' is sadly working, as you've got companies bending over backwards to give them special treatment lest they throw yet another childish tantrum. Companies need to stop walking on eggshells trying to spare the feelings of the poor, put-upon conservatives and just flat out tell them that if they get hammered more often by moderation aimed at cutting down on assholes and the lies they tell, it's because they are lying assholes.


