Techdirt's think tank, the Copia Institute, is working with the Trust & Safety Professional Association and its sister organization, the Trust & Safety Foundation, to produce an ongoing series of case studies about content moderation decisions. These case studies are presented in a neutral fashion, not aiming to criticize or applaud any particular decision, but to highlight the many different challenges that content moderators face and the tradeoffs those decisions involve. Find more case studies here on Techdirt and on the TSF website.

Content Moderation Case Study: Moderating An Anonymous Social Network (2015)

from the anonymity-challenge dept

Summary: Between roughly 2013 and 2015, so-called “anonymous” social networks surged in popularity. A few had existed before, but suddenly the market was full of them, with Whisper, Secret, and Yik Yak receiving the most attention. All of them argued that by allowing people to post content anonymously, they were helping people express their true thoughts rather than repress them. Whisper and Secret both worked by letting people anonymously post short text, which would be displayed over a background image.

In practice, many of the apps filled up with harassment, bullying, hateful content, and the like. Whisper, one of the bigger companies in the space, invested heavily in content moderation early on, setting up an outsourced team (via TaskUs in the Philippines) to handle it. However, scalability became an issue, and the company also built software to help with content moderation, called “The Arbiter.” In press reports, Whisper employees suggested “The Arbiter” was almost perfect:

On Whisper, “the golden rule is don’t be mean, don’t be gross, and don’t use Whisper to break the law,” says the company’s chief data officer, Ulas Bardak, who spearheaded development of the Arbiter along with data scientist Nick Stucky-Mack. That’s not a philosophy that you can boil down to a simple list of banned words. The Arbiter is smart enough to deal with an array of situations, and even knows when it’s not sure if a particular item meets the service’s guidelines.

However, even with The Arbiter, the company insisted that it needed human moderators, since the Arbiter learned from their decisions.

In its first few months of operation, the Arbiter has had a huge impact on how Whisper moderates itself. But even though there’s plenty of opportunity to fine-tune it over time, Whisper has no plans to eliminate the human touch in moderation altogether. After all, the only reason the Arbiter is effective is because it bases its decisions on those of human moderators. Which is why the company is continuing to shovel data from human-moderated Whispers into the software’s knowledge bank.

“There’s always going to be a hybrid approach,” says Heyward. “The truth is, the way we use people today is very different from the way we used them a year ago or six months ago.” With the Arbiter humming along and handling much of the grunt work, the humans can focus more on the material that isn’t an easy call. And maybe Whisper will be able to pull off the not-so-easy feat of improving the quality of its content even as its community continues to grow.
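Whisper never published the Arbiter's internals, but the description above — a classifier trained on human moderators' decisions that "knows when it's not sure" — matches a common pattern: score each post, act automatically only when the model is confident, and route everything else to a human review queue. A minimal sketch of that routing, with the scoring function stubbed out (the thresholds and the toy scorer are assumptions, not Whisper's actual system):

```python
# Hypothetical sketch of confidence-gated moderation routing.
# score_post() stands in for a real trained classifier; Whisper's
# actual model and thresholds were never made public.

APPROVE_THRESHOLD = 0.9   # confident the post is fine
REJECT_THRESHOLD = 0.9    # confident the post violates policy

def score_post(text: str) -> float:
    """Return an estimated P(violation). Toy stand-in for a trained model."""
    banned = {"kill", "hate"}
    hits = sum(word in text.lower() for word in banned)
    return min(1.0, hits / 2)

def route(text: str) -> str:
    p_violation = score_post(text)
    if p_violation >= REJECT_THRESHOLD:
        return "auto-reject"
    if 1 - p_violation >= APPROVE_THRESHOLD:
        return "auto-approve"
    # Not confident either way: send to a human moderator, whose
    # decision is later fed back into the model as training data.
    return "human-review"
```

This is also why the human moderators can never be eliminated entirely: the "human-review" branch is where the labeled data that trains the model comes from.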

Another article about Whisper’s approach to content moderation detailed how humans and the software work together.

Moderators look at Whispers surfaced by both machines and people: Users flag inappropriate posts and algorithms analyze text and images for anything that might have slipped through the cracks. That way, the company is less likely to miss cyberbullying, sex, and suicide messages. Moderators delete the bad stuff, shuffle cyberbullies into a “posts-must-be-approved-before-publishing” category, and stamp suicide Whispers with a “watermark” — the number for the National Suicide Hotline.

As you might imagine, the manpower and operational systems required for that execution are huge. Whisper’s content moderation manual is nearly 30 pages. The standards get into the nitty-gritty, specifying minutiae like whether a picture of a man shirtless outside is appropriate, but a shirtless selfie indoors is not.

When the TaskUs team comes across physical threats, it escalates the message to Whisper itself. “If someone posts, ‘I killed her and buried her in the backyard,’ then that’s a piece of content the company will report to the authorities,” TaskUs CEO Bryce Maddock says. “They’re going to pull the UID on your cell phone from Verizon or AT&T and the FBI and local police will show up at your door. It happens quite a bit.”

Even so, there was significant controversy over how Whisper handled bullying and hateful content on its site, as well as how it maintained actual anonymity for its users. Concerns were raised that the app was not truly anonymous and that it tracked its users. Whisper disputed some of these reports and claimed that some of the tracking was done with permission and for good reasons (such as researching how to decrease suicide rates).

Decisions to be made by Whisper:

  • How do you keep an anonymous social network from becoming abusive?
  • How aggressive should you be in moderating content on an anonymous social network?
  • What are the tradeoffs between tracking users to prevent bad behavior and providing true anonymity?
  • Can an algorithm successfully determine and block detrimental content on a platform like Whisper?

Questions and policy implications to consider:

  • Is an anonymous social media network net positive or net negative?
  • Does anonymity make content moderation more difficult?
  • How do you protect users on an anonymous social network?
Resolution: One interesting aspect of an anonymous social media application is that users may not even realize when their content is restricted. An academic paper from 2014 that explored Whisper’s content moderation found that the app deleted significantly more content than other social media platforms.

Anonymity facilitates free speech, but also inevitably fosters abusive content and behavior. Like other anonymous communities, Whisper faces the same challenge of dealing with abusive content (e.g., nudity, pornography or obscenity) in their network.

In addition to a crowdsourcing-based user reporting mechanism, Whisper also has dedicated employees to moderate whispers. Our basic measurements... also suggest this has a significant impact on the system, as we observed a large volume of whispers (>1.7 million) has been deleted during the 3 months of our study. The ratio of Whisper’s deleted content (18%) is much higher than traditional social networks like Twitter (<4%)

The research dug into what kinds of content were deleted and from what types of users. Part of what it found is that people whose content is deleted often try to repost it (and frequently have the reposts blocked as well).

Finally, we take a closer look at the authors of deleted whispers to check for signs of suspicious behavior. In total, 263K users (25.4%) out of all users in our dataset have at least one deleted whisper. The distribution of deleted whispers is highly skewed across these users: 24% of users are responsible for 80% of all deleted whispers. The worst offender is a user who had 1230 whisper deleted during the time period of our study, while roughly half of the users only have a single deletion….

We observed anecdotal evidence of duplicate whispers in the set of deleted whispers. We find that frequently reposted duplicate whispers are highly likely to be deleted. Among our 263K users with at least 1 deleted whisper, we find 25K users have posted duplicate whispers….
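The repost pattern the researchers describe is exactly what duplicate detection is built to catch: once a whisper is deleted, near-identical reposts can be blocked by comparing a normalized fingerprint of the text against fingerprints of previously deleted content. A minimal sketch, assuming simple text normalization (the paper does not describe Whisper's actual mechanism):

```python
# Hypothetical duplicate-repost filter: fingerprint deleted posts and
# block reposts whose normalized text matches. The normalization rules
# are illustrative; the study doesn't detail Whisper's approach.

import hashlib
import re

deleted_fingerprints: set[str] = set()

def fingerprint(text: str) -> str:
    # Lowercase, strip punctuation, and collapse whitespace so that
    # trivial edits don't defeat the match.
    normalized = re.sub(r"[^a-z0-9 ]", "", text.lower())
    normalized = " ".join(normalized.split())
    return hashlib.sha256(normalized.encode()).hexdigest()

def record_deletion(text: str) -> None:
    deleted_fingerprints.add(fingerprint(text))

def is_blocked_repost(text: str) -> bool:
    return fingerprint(text) in deleted_fingerprints
```

Exact-hash matching like this only catches near-verbatim reposts; catching paraphrases would require fuzzier techniques such as shingling or locality-sensitive hashing.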

As for Whisper itself, the company has gone through many changes. Its biggest competitors, Secret and Yik Yak, both shut down, but Whisper remains in business -- though not without problems: it laid off a significant portion of its staff, and all of its large institutional investors left the board in 2017.

In the spring of 2020, security researchers discovered that nearly all of Whisper’s content was available for download via an unsecured database, allowing researchers to search through all of the content posted on the site. While the company insisted that the database contained only the same data that was publicly available through the app, it conceded that the app itself did not let users run queries on that data. Even years after the app's peak popularity, it seems that concerns about anonymity and privacy remain.


Thank you for reading this Techdirt post. With so many things competing for everyone’s attention these days, we really appreciate you giving us your time. We work hard every day to put quality content out there for our community.

Techdirt is one of the few remaining truly independent media outlets. We do not have a giant corporation behind us, and we rely heavily on our community to support us, in an age when advertisers are increasingly uninterested in sponsoring small, independent sites — especially a site like ours that is unwilling to pull punches in its reporting and analysis.

While other websites have resorted to paywalls, registration requirements, and increasingly annoying/intrusive advertising, we have always kept Techdirt open and available to anyone. But in order to continue doing so, we need your support. We offer a variety of ways for our readers to support us, from direct donations to special subscriptions and cool merchandise — and every little bit helps. Thank you.

–The Techdirt Team

Filed Under: abuse, ai, anonymity, content moderation, free speech, harassment
Companies: whisper


Reader Comments



  • anonymous, 4 Nov 2020 @ 5:16pm

    do these authors live under a rock?

    If that's how they classify a social network, they need to rewrite that article after researching Christopher Poole and the forever infamous 4chan.org


    • Mike Masnick (profile), 5 Nov 2020 @ 11:10am

      Re: do these authors live under a rock?

      I wrote most of this one. I'm familiar with 4chan and know who Christopher Poole is... and... I do not understand what that has to do with the question of how Whisper did its content moderation.


  • PaulT (profile), 4 Nov 2020 @ 11:52pm

    "In total, 263K users (25.4%) out of all users in our dataset have at least one deleted whisper. The distribution of deleted whispers is highly skewed across these users: 24% of users are responsible for 80% of all deleted whispers. The worst offender is a user who had 1230 whisper deleted during the time period of our study, while roughly half of the users only have a single deletion"

    Obviously, I don't have figures for other networks, but I wouldn't be surprised if this describes a typical breakdown for other platforms. You have a majority of people rarely, if ever, posting anything objectionable. A minority of users regularly being offensive for whatever reason, be that deliberate trolling or just having behaviour that others find toxic, then one outright asshole trying to ruin it for everyone.

    Extrapolated to Facebook or Twitter, this would be in line with what I would assume to be true there - most people are going about their day, but you have deliberate dickheads and the occasional obsessive person trying to derail the entire site. So, it's better for everyone if the lone dickhead is kicked off and the others who are disrupting the site have their content moderated to improve the functionality for everyone.

    "Among our 263K users with at least 1 deleted whisper, we find 25K users have posted duplicate whispers…."

    Again, this would indicate a similarity with other platforms, where the people who get moderated aren't posting their own thoughts but sharing memes or retweeting something dumb. This could be one reason why we find certain groups of people complaining that they are being targeted - when one of them post something offensive, they all just copy each other to such a degree that they all get moderated even though there was really only one piece of offensive content posted.


