Content Moderation Case Study: Pinterest's Moderation Efforts Still Leave Potentially Illegal Content Where Users Can Find It (July 2020)
from the pin-this dept
Summary: Researchers at OneZero have been monitoring Pinterest's content moderation efforts for several months. The "inspiration board" website hosts millions of images and other content uploaded by users.
Pinterest's approach to moderation is unusual. Very little content is actually removed, even when it might violate the site's guidelines. Instead, as OneZero researchers discovered, Pinterest has chosen to prevent the content from surfacing by blocking certain keywords from generating search results.
The problem, as OneZero noted, is that hiding content and blocking keywords doesn't actually prevent users from finding questionable content. Some of this content includes images that sexually exploit children.
While normal users may never see this content using Pinterest's built-in search tools, users more familiar with how search functions work can still access content Pinterest believes violates its guidelines but hasn't actually removed from its platform. By navigating to a user's page, logged-out users can perform searches that appear to bypass Pinterest's keyword-blocking. Using Google to search the site -- instead of the site's own search engine -- can also surface content hidden by Pinterest.
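To make the distinction concrete, here is a minimal sketch of why keyword suppression differs from removal. Everything in it -- the pin table, the blocked-term list, the function names -- is hypothetical and illustrative, not Pinterest's actual implementation; the point is only that a filtered search front door leaves the content itself on the server, reachable by direct URL or through an external index such as Google's.

```python
# Hypothetical model of keyword suppression vs. actual removal.
# All names and data are illustrative, not Pinterest's real code.

PINS = {
    101: {"title": "harmless recipe pin", "hidden": False},
    102: {"title": "guideline-violating pin", "hidden": True},  # suppressed, never deleted
}

BLOCKED_TERMS = {"guideline-violating"}

def site_search(query):
    """The 'front door': blocked keywords return no results at all,
    and hidden pins are excluded from whatever does match."""
    if any(term in query.lower() for term in BLOCKED_TERMS):
        return []
    return [pin_id for pin_id, pin in PINS.items()
            if query.lower() in pin["title"] and not pin["hidden"]]

def fetch_pin(pin_id):
    """Direct navigation: the pin is still served, hidden or not."""
    return PINS.get(pin_id)

print(site_search("guideline-violating"))  # [] -- the keyword is blocked
print(fetch_pin(102))                      # the pin itself is still there
```

An external crawler that has already indexed pin 102's page can keep pointing users at it indefinitely, because suppression changes only what the internal search returns, not what the server hosts.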
Pinterest's content moderation policy appears to be mostly hands-off. Users can upload nearly anything they want, with the company deleting (and reporting) only clearly illegal content. For everything else that's questionable (or potentially harmful to other users), Pinterest opts for suppression rather than deletion.
“Generally speaking, we limit the distribution of or remove hateful content and content and accounts that promote hateful activities, false or misleading content that may harm Pinterest users or the public’s well-being, safety or trust, and content and accounts that encourage, praise, promote, or provide aid to dangerous actors or groups and their activities,” Pinterest’s spokesperson said of the company’s guidelines.
Unfortunately, users who manage to bypass keyword filters or otherwise stumble across buried content will likely find themselves directed to other buried content. Pinterest's algorithms surface content related to whatever users are currently viewing, potentially leading users even deeper into the site's "hidden" content.
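The resurfacing problem has a simple shape: if the related-content logic ranks purely on topical similarity and never consults the suppression flag, one buried item leads straight to the next. A hypothetical sketch in the same illustrative model as above (again, none of these names or tags are Pinterest's):

```python
# Illustrative pin table: topic tags drive the 'related' logic.
PINS = {
    102: {"title": "guideline-violating pin", "hidden": True,
          "tags": {"questionable-topic"}},
    103: {"title": "another buried pin", "hidden": True,
          "tags": {"questionable-topic"}},
    104: {"title": "harmless recipe pin", "hidden": False,
          "tags": {"cooking"}},
}

def related_pins(pin_id):
    """Naive related-content ranking on shared tags alone. Because it
    never checks the 'hidden' flag, viewing one suppressed pin
    recommends other suppressed pins."""
    tags = PINS[pin_id]["tags"]
    return [other_id for other_id, other in PINS.items()
            if other_id != pin_id and tags & other["tags"]]

print(related_pins(102))  # [103] -- buried content surfaces more buried content
```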
Decisions to be made by Pinterest:
- Is hiding content effective in steering users away from subject matter/content Pinterest would rather they didn't access?
- Would deletion -- rather than hiding -- result in affected users leaving the platform?
- Is questionable content a severe enough problem that the company should rethink its moderation protocols?
- Should "related content" algorithms be altered to prevent the surfacing of hidden content?
- Does hiding -- rather than removing -- content potentially encourage users to use this invisibility to engage in surreptitious distribution of questionable or illegal content?
- Does the possibility of hidden content resurfacing steer ad buyers away from the platform?
- Will this approach to moderation -- hiding vs. deletion -- remain feasible as pressure for sites to aggressively police misinformation and "fake news" continues to mount?
Filed Under: content moderation, search
Companies: pinterest
Reader Comments
So, it hides content from the users who want content to be hidden. Users who go out of their way to find the hidden content presumably want to see said content, so they aren't going to complain.
secret code words
Evidently they have secret code words for otherwise objectionable content. So, you search using "5g" for unreliable health information, or "we go all" to learn about the undesirability of darker-complected persons. And there is a different secret keyword, not mentioned in the article, for potentially illegal dirty pictures.
It seems a fair trade off to me. If you want that content, you can have it. And if not, well.
The problem comes up when you get suppressed content because it is related to your actual search. You go looking for information about car phones, and that leads to "5g", which leads to the unreliable health information, or, worse, information about the over-hyping and under-delivery of phone service.
Perhaps a bug fix is in order. Unless you expressly ask for the suppressible information, normal exploring does not bring it up. Getting this "right" is what computer scientists call an NP-hairy problem.
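In code terms, the proposed fix is roughly a single opt-in flag on the related-content query. A hypothetical sketch reusing the shape of the illustrative PINS table from the article's examples above:

```python
# Same illustrative pin table as in the article's sketches.
PINS = {
    102: {"hidden": True,  "tags": {"questionable-topic"}},
    103: {"hidden": True,  "tags": {"questionable-topic"}},
    104: {"hidden": False, "tags": {"cooking"}},
}

def related_pins(pin_id, include_suppressed=False):
    """The 'bug fix': suppressed pins surface only when the caller
    explicitly asks for them; ordinary exploring never brings them up."""
    tags = PINS[pin_id]["tags"]
    return [other_id for other_id, other in PINS.items()
            if other_id != pin_id
            and tags & other["tags"]
            and (include_suppressed or not other["hidden"])]

print(related_pins(102))                           # [] -- normal exploring
print(related_pins(102, include_suppressed=True))  # [103] -- express request
```

Whether the full ranking problem is as hard as the commenter jokes is another matter; the guard itself is a single extra condition.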
Re: secret code words
The underlying problem is the "illegal" part. That means either a) geo-blocking (with its own limitations at least as bad as moderation), b) conforming to the requirements of the most restrictive governments (which probably conflict) or c) having a take-down policy.
On the other hand, for "merely objectionable" content, Pinterest's model will probably hold up for most scenarios short of active trolling. Things could easily fall apart if some genius decides to start using politicians' names as code words for extreme adult content.
Re: Re: secret code words
Or, given one politician's expressed interest in his daughter, using that pol's name as a code word for otherwise-banished child porn content.
Re: Re: secret code words
Sounds like the old April Fools' Day joke (RFC 3514) about setting the "evil bit" for malware and spam.
A cousin "naughty bit" could potentially work merely for "poor taste/advertiser unfriendly content" that lets you post say dead baby jokes without it being included being included in the general searches nit for people who know what they are looking for but such coexisting tends to work poorly with "moralists" who feel such content shouldn't be there and disagreement with who falls on what line would be controversial being other. We already saw that with "gay or lesbian" being considered adult content just from porn search term collisions and the rough consensus is it offensive to call families not family friendly for having two moms or dads.
I wonder if, in a silly political ass-covering move, Pinterest would be better off allowing curation-algorithm weight codes -- as explicit and transparent as possible, based on an account's training -- along with the ability to copy one, revert to the default, or paste one from another user. That way, if people start passing around their own curator that amounts to some unholy misinformation bubble, Pinterest could wash its hands of it as a user-generated algorithm and bubble, not theirs.
Empowering users in order to dodge potential claims of responsibility certainly wouldn't be appreciated by detractors, even if it removed the shallow "their algorithm is deliberately causing people to do bad things / they have too much power!" talking point.
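The mechanics the commenter describes could be as simple as a per-account weight table that can be exported, imported, or reset. A hypothetical illustration -- the class, field names, and JSON interchange format are all invented here, not any real Pinterest feature:

```python
import json

# Invented default topic weights; a real system would have many more.
DEFAULT_WEIGHTS = {"recipes": 1.0, "diy": 1.0, "politics": 1.0}

class CurationProfile:
    """A user-owned, inspectable set of feed weights. Hypothetical
    sketch of the commenter's proposal, not an actual API."""

    def __init__(self, weights=None):
        self.weights = dict(weights or DEFAULT_WEIGHTS)

    def export_code(self):
        """'Copy': serialize the profile so it can be shared."""
        return json.dumps(self.weights)

    @classmethod
    def from_code(cls, code):
        """'Paste': adopt another user's shared profile wholesale."""
        return cls(json.loads(code))

    def reset(self):
        """'Revert to default': discard the account's training."""
        self.weights = dict(DEFAULT_WEIGHTS)

# Because a shared profile is explicit and inspectable, the platform can
# point at it and argue the resulting bubble is user-built, not its own.
mine = CurationProfile({"recipes": 2.0, "diy": 0.5, "politics": 0.0})
theirs = CurationProfile.from_code(mine.export_code())
theirs.reset()
```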