Techdirt's think tank, the Copia Institute, is working with the Trust & Safety Professional Association and its sister organization, the Trust & Safety Foundation, to produce an ongoing series of case studies about content moderation decisions. These case studies are presented in a neutral fashion, not aiming to criticize or applaud any particular decision, but to highlight the many different challenges that content moderators face and the tradeoffs those decisions involve. Find more case studies here on Techdirt and on the TSF website.

Content Moderation Case Study: Suppressing Content To Try To Stop Bullying (2019)

from the not-a-good-solution dept

Summary: TikTok, like many social apps that are mainly used by a younger generation, has long faced issues around how to deal with bullying done via the platform. According to leaked documents revealed by the German site Netzpolitik, one way that the site chose to deal with the problem was through content suppression -- but specifically by suppressing the content of those the company felt were more prone to being victims of bullying.

The internal documents showed the different ways in which the short videos TikTok is famous for could be rated for visibility. Content could be chosen to be “featured” (i.e., seen by more people), but it could also be marked “Auto R,” a form of suppression. Content rated Auto R was excluded from the “for you” feed on TikTok after reaching a certain number of views. Because the “for you” feed is how most people watch TikTok videos, this rating effectively put a cap on views. The end result was that the “reach” of content categorized as Auto R was significantly limited, and such content was completely prevented from going “viral” and amassing a large audience or following.
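The view-cap mechanism described above can be sketched in a few lines of Python. To be clear, this is an illustrative reconstruction based only on Netzpolitik's description: the function name, the data layout, and the specific cap value are all assumptions, not TikTok's actual implementation.

```python
# Hypothetical sketch of the leaked "Auto R" mechanism as described by
# Netzpolitik. Names and the threshold below are illustrative assumptions.

AUTO_R_VIEW_CAP = 10_000  # assumed cap; the reporting did not confirm an exact figure

def eligible_for_for_you_feed(video: dict) -> bool:
    """Return True if a video may still appear in the 'for you' feed."""
    if video["rating"] == "featured":
        return True  # actively promoted content
    if video["rating"] == "auto_r":
        # Auto R content is dropped from the feed once it passes the cap,
        # effectively preventing it from going viral.
        return video["views"] < AUTO_R_VIEW_CAP
    return True  # ordinary content is unaffected

video = {"rating": "auto_r", "views": 25_000}
print(eligible_for_for_you_feed(video))  # False: capped out of the feed
```

The key design point is that the cap is tied to who made the video (via the rating), not to anything about the video's content, which is what made the policy discriminatory in practice.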

What was somewhat surprising was that TikTok’s policies explicitly suggested putting those who might be bullied in the “Auto R” category -- even saying that users who were disabled, autistic, or had Down syndrome should be put in this category to minimize bullying.

According to Netzpolitik, employees at TikTok repeatedly pointed out the problematic nature of this decision: it was itself discriminatory, punishing people not for any bad behavior but because of the belief that their differences might make them targets of bullying. However, they claimed they were prevented from changing the policies by TikTok’s corporate parent, ByteDance, which dictated the company’s content moderation policies.

Decisions to be made by TikTok:

  • What are the best ways to deal with and prevent bullying done on the platform?
  • What are the real world impacts of suppressing the viral reach of any content based on the type of person making the content?
  • Is it appropriate to effectively prevent those you think will be bullied from getting full access to your platform to prevent the possibility of bullying?
  • What data points are being assessed to justify the assumptions being made about “Auto R” being an effective anti-bullying tool?

Questions and policy implications to consider:

  • When policymakers strongly push platforms to “stop bullying,” will that lead to unintended consequences, such as effectively minimizing potential victims’ access to those platforms, rather than dealing with the root causes of bullying?
  • Will efforts to prevent a bad behavior merely be used to sweep that activity under the rug, rather than looking at how to actually make a platform safer?
  • What is the role of technology intermediaries in preventing bad behavior?
Resolution: TikTok admitted that these rules were a “blunt instrument,” put in place rapidly to try to minimize bullying on the platform -- but said that the company had since realized this was the “wrong” approach and had implemented more nuanced policies:

"Early on, in response to an increase in bullying on the app, we implemented a blunt and temporary policy," he told the BBC.

"This was never designed to be a long-term solution, and while the intention was good, it became clear that the approach was wrong.

"We have long since removed the policy in favour of more nuanced anti-bullying policies."

However, the Netzpolitik report suggested that this policy remained in place at least until September 2019, just three months before the reporting was published in December 2019. It is unclear exactly when the “more nuanced” anti-bullying policies were put in place, but they may have come about due to the public exposure and pressure generated by the reporting on this issue.



Filed Under: bullying, content moderation, suppression
Companies: tiktok


Reader Comments



  1. This comment has been flagged by the community.
    Hanoi Jane and COVID19 - "God's gift to Techdirt", 7 Oct 2020 @ 8:28pm

    Gee, right here, Maz, ya MIGHT try to control your rabid fanboys

    who are suppressing viewpoints (with your active help of course via the site coding and the wrongly used "report" button which an Admin then approves of the censoring).

    You allow your fanboys to bully all dissent, and then wonder why even GOOGLE has to mark you "dangerous and derogatory"!


  2. This comment has been flagged by the community.
    A big HOOTER., 7 Oct 2020 @ 8:34pm

    Oh, yeah, "Copia" is a great source! (That's just YOU, Maz...)

    LAST item there was "Posted on November 14, 2018 by Mike Masnick"

    Wow. What an active, influential "think tank" that is!


  3. Toom1275 (profile), 7 Oct 2020 @ 9:00pm

    Re: Gee, right here, Maz, ya MIGHT try to control your rabid fan

    [Ass-erts facts not in evidence]


  4. PaulT (profile), 7 Oct 2020 @ 11:59pm

    Re: Gee, right here, Maz, ya MIGHT try to control your rabid fan

    I admit, the fantasy version of the world that only exists in your head is very strange.


  5. PaulT (profile), 8 Oct 2020 @ 12:01am

    Re: Oh, yeah, "Copia" is a great source! (That's just YOU, Maz..

    How can someone so obsessively troll this site, yet miss the fact that all Copia articles have only been posted on this site for a long time? I mean, you only have to click on the link at the top of this article to see all of them...


  6. Stephen T. Stone (profile), 8 Oct 2020 @ 2:35am

    We don’t “bully all dissent”, Blueballs — only the dissent that is clearly in bad faith. You know, like “dissent” that concentrates on a decade-old slight that wasn’t even all that offensive (or funny) because you can’t let go of a grudge out of spite for…well, mainly yourself, at this point…or ignores the article at hand in favor of rattling off more grievances than a Donald Trump twitstorm (and with far less coherence than even his bullshit).

    Have you met Koby? I think you two would get along well.


  7. restless94110 (profile), 8 Oct 2020 @ 8:20am

    1787

    I can just see Ben Franklin, Madison, Jefferson, and the others going:

    We went to a fortune teller and she told us that in 250 years there was going to be this new word invented out of nothing called bullying and it would be used to destroy the 1st Amendment. What should we do?

    Ok, we got it. Make it God-given so that it can never be taken away by any government or citizen or business. There that should fix it.


  8. That Anonymous Coward (profile), 8 Oct 2020 @ 10:03am

    Better solution...
    Stop allowing humans to demand corporations do something, corporations are like large intellectually challenged teddy bears who might stumble on a solution after setting everything on fire first.

    Once upon a time, if someone's kid was bullying other kids, it was perfectly acceptable for any adult to slap the bully upside the head. It wasn't discussed; it was how community worked. We expected kids to behave, didn't force them to like people, but they knew going after someone meant your head would hurt & your parents would know about it... and well you might have problems sitting down for a bit.

    Now we have parents who will scream at you for looking at their little angel sideways as they run down the aisle ripping stuff off the shelves onto the floor.
    How DARE you tell me how to parent my child!!!
    Well maybe if you started parenting them I wouldn't have to say anything.

    My parents would have NEVER walked into the school and screamed at a teacher b/c I got a bad grade. I'd get a talking to (and being the defiant child I'd stay my course) but other adults were not disrespected to gain adoration from their kids.

    These kids MIGHT get bullied. News flash, those kids already knew this. Parents have blinders on that this never happens & when someone tries to tell them the truth they stop listening b/c their baby would never do that. In the grand scheme of things maybe just maybe lets stop abdicating the hard things we should be doing but won't to corporations to "take care of" for us.

    Imagine if teaching kids empathy mattered as much as having a sportsball season. We have parents demanding their kids get to play in the middle of a global pandemic, b/c their baby getting their moment in the spotlight matters more than your kid might end up in the ICU or with long term heart/lung damage.


  9. PaulT (profile), 8 Oct 2020 @ 11:16pm

    Re: 1787

    "in 250 years there was going to be this new word invented out of nothing called bullying"

    While you are an idiot and you address a fantasy world that has nothing to do with reality, I did wonder if even this tidbit was remotely true. Of course, it's not, the people you mention would have been familiar with the word.

    https://www.etymonline.com/word/bully

    "Make it God-given"

    Also, I'd read the actual document you're referring to, because it doesn't say what you're imagining it does.

