Techdirt's think tank, the Copia Institute, is working with the Trust & Safety Professional Association and its sister organization, the Trust & Safety Foundation, to produce an ongoing series of case studies about content moderation decisions. These case studies are presented in a neutral fashion, not aiming to criticize or applaud any particular decision, but to highlight the many different challenges that content moderators face and the tradeoffs those decisions involve. Find more case studies here on Techdirt and on the TSF website.

Content Moderation Case Study: Pinterest's Moderation Efforts Still Leave Potentially Illegal Content Where Users Can Find It (July 2020)

from the pin-this dept

Summary: Researchers at OneZero have been monitoring Pinterest's content moderation efforts for several months. The "inspiration board" website hosts millions of images and other content uploaded by users.

Pinterest's moderation efforts are unusual. Very little content is actually removed, even when it might violate the site's guidelines. Instead, as OneZero researchers discovered, Pinterest has chosen to prevent the content from surfacing by blocking certain keywords from generating search results.
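To make the mechanism concrete, here is a minimal sketch of query-time keyword suppression, assuming a simple denylist consulted only when a search is run. All names, terms, and data structures below are hypothetical illustrations, not Pinterest's actual code:

```python
# Hypothetical sketch of query-time suppression: a denylist is checked
# when a search runs, but the underlying pins are never touched.

BLOCKED_TERMS = {"blockedterm1", "blockedterm2"}  # placeholder terms

def search_pins(query: str, index: dict[str, set[str]]) -> set[str]:
    """Return matching pin IDs, or nothing if the query hits the denylist."""
    tokens = query.lower().split()
    if any(token in BLOCKED_TERMS for token in tokens):
        return set()  # the query is suppressed; the content itself remains
    hits: set[str] = set()
    for token in tokens:
        hits |= index.get(token, set())
    return hits
```

The key property of this design is that suppression lives entirely in the search path: nothing is deleted, and the index and content store are left intact.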

The problem, as OneZero noted, is that hiding content and blocking keywords doesn't actually prevent users from finding questionable content. Some of this content includes images that sexually exploit children.

While typical users may never encounter this content through Pinterest's built-in search tools, users more familiar with how search works can still access content that Pinterest feels violates its guidelines but hasn't actually removed from its platform. By navigating to a user's page, logged-out users can perform searches that appear to bypass Pinterest's keyword-blocking. Using Google to search the site -- instead of the site's own search engine -- can also surface content hidden by Pinterest.
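The failure mode follows directly from the sketch above: the denylist guards only one entry point, so any route to a pin that does not pass through the search function never consults it. Continuing the same hypothetical sketch:

```python
# Hypothetical continuation: direct-access paths skip the denylist entirely.

def fetch_pin(pin_id: str, store: dict[str, str]) -> str:
    return store[pin_id]           # no suppression check on direct access

def list_user_pins(user: str, boards: dict[str, list[str]]) -> list[str]:
    return boards.get(user, [])    # nor on a user's public page
```

And because those public pages are crawlable, an external search engine builds its own index of them, which is why a site-restricted Google search can return results the internal search suppresses.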

Pinterest's content moderation policy appears to be mostly hands-off. Users can upload nearly anything they want, with the company deleting (and reporting) only clearly illegal content. For everything else that's questionable (or potentially harmful to other users), Pinterest opts for suppression rather than deletion.

“Generally speaking, we limit the distribution of or remove hateful content and content and accounts that promote hateful activities, false or misleading content that may harm Pinterest users or the public’s well-being, safety or trust, and content and accounts that encourage, praise, promote, or provide aid to dangerous actors or groups and their activities,” Pinterest’s spokesperson said of the company’s guidelines.

Unfortunately, users who manage to bypass keyword filters or otherwise stumble across buried content will likely find themselves directed to other buried content. Pinterest's algorithms surface content related to whatever users are currently viewing, potentially leading users even deeper into the site's "hidden" content.
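This, too, is a predictable consequence of filtering only at the search layer: a related-content recommender built on co-engagement has no reason to consult the suppression rules unless it is explicitly told to. A hypothetical sketch, again not Pinterest's actual system:

```python
# Hypothetical sketch of the "related content" leak: recommendations rank
# pins by how often they are viewed together, ignoring suppression rules.

from collections import Counter

def related_pins(pin_id: str,
                 co_views: dict[str, Counter],
                 suppressed: set[str],
                 k: int = 10,
                 filter_suppressed: bool = False) -> list[str]:
    """Top-k pins most often viewed alongside pin_id.

    With filter_suppressed=False (the behavior described above), hidden
    pins are recommended whenever they co-occur with the seed pin. Setting
    it to True models the algorithmic change raised in the list below.
    """
    ranked = [p for p, _ in co_views.get(pin_id, Counter()).most_common()]
    if filter_suppressed:
        ranked = [p for p in ranked if p not in suppressed]
    return ranked[:k]
```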

Decisions to be made by Pinterest:

  • Is hiding content effective in steering users away from subject matter/content Pinterest would rather they didn't access?
  • Would deletion -- rather than hiding -- result in affected users leaving the platform?
  • Is questionable content a severe enough problem that the company should rethink its moderation protocols?
  • Should "related content" algorithms be altered to prevent the surfacing of hidden content?

Questions and policy implications to consider:
  • Does hiding -- rather than removing -- content potentially encourage users to use this invisibility to engage in surreptitious distribution of questionable or illegal content?
  • Does the possibility of hidden content resurfacing steer ad buyers away from the platform?
  • Will this approach to moderation -- hiding vs. deletion -- remain feasible as pressure for sites to aggressively police misinformation and "fake news" continues to mount?

Resolution: Pinterest's content moderation strategy remains mostly unchanged. As the company's spokesperson indicated, Pinterest appears to believe that hiding content addresses most of the concerns raised, even if it allows more determined users to locate content the site would rather they never saw.


Filed Under: content moderation, search
Companies: pinterest


Reader Comments



  • Anonymous Coward, 12 Sep 2020 @ 2:53pm

    So, it hides content from the users who want content to be hidden. Users who go out of their way to find the hidden content presumably want to see said content, so they aren't going to complain.


  • Tanner Andrews (profile), 13 Sep 2020 @ 6:52am

    secret code words

    Evidently they have secret code words for otherwise objectionable content. So, you search using "5g" for unreliable health information, or "we go all" to learn about the undesirability of darker-complected persons. And there is a different secret keyword, not mentioned in the article, for potentially illegal dirty pictures.

    It seems a fair trade off to me. If you want that content, you can have it. And if not, well.

    The problem comes up when you get suppressed content because it is related to your actual search. You go looking for information about car phones, and that leads to "5g", which leads to the unreliable health information, or, worse, information about the over-hyping and under-delivery of phone service.

    Perhaps a bug fix is in order. Unless you expressly ask for the suppressible information, normal exploring does not bring it up. Getting this "right" is what computer scientists call an NP-hairy problem.


    • Bruce C., 13 Sep 2020 @ 8:08am

      Re: secret code words

      The underlying problem is the "illegal" part. That means either a) geo-blocking (with its own limitations at least as bad as moderation), b) conforming to the requirements of the most restrictive governments (which probably conflict) or c) having a take-down policy.

      On the other hand, for "merely objectionable" content, Pinterest's model will probably hold up for most scenarios short of active trolling. Things could easily fall apart if some genius decides to start using politicians' names as code words for extreme adult content.


      • Tanner Andrews (profile), 14 Sep 2020 @ 7:05am

        Re: Re: secret code words

        decides to start using politicians' names as code words for extreme adult content

        Or, given one politician's expressed interest in his daughter, using that pol's name as a code word for otherwise-banished child porn content.


      • Anonymous Coward, 14 Sep 2020 @ 2:03pm

        Re: Re: secret code words

        Sounds like the old April Fools' Day joke about setting the "evil bit" for malware and spam.

        A cousin "naughty bit" could potentially work for merely poor-taste or advertiser-unfriendly content: it would let you post, say, dead baby jokes without them being included in general searches, while keeping them findable for people who know what they are looking for. But that kind of coexistence tends to work poorly with "moralists" who feel such content shouldn't be there at all, and disagreement over where any given thing falls on the line would be controversial. We already saw that with "gay or lesbian" being treated as adult content just from porn search-term collisions, and the rough consensus is that it's offensive to call families not family friendly for having two moms or dads.

        I wonder if, as a silly political ass-covering move, Pinterest would be better off allowing curation-algorithm weight codes that are as explicit and transparent as possible, based on an account's training, with the ability to copy them, revert to the default, or paste in one from another user. That way, if people start passing around a curator tuned into some unholy misinformation bubble, the company could wash its hands of it as a user-generated algorithm and bubble, not theirs.

        Empowering users in order to avoid any potential claims of responsibility certainly wouldn't be appreciated by detractors, even if it defused the shallow "their algorithm is deliberately causing people to do bad things / they have too much power!" talking points.


