Content Moderation Case Study: Pinterest's Moderation Efforts Still Leave Potentially Illegal Content Where Users Can Find It (July 2020)
from the pin-this dept
Summary: Researchers at OneZero have been monitoring Pinterest's content moderation efforts for several months. The "inspiration board" website hosts millions of images and other content uploaded by users.
Pinterest's moderation efforts are unusual. Very little content is actually removed, even when it might violate the site's guidelines. Instead, as OneZero's researchers discovered, Pinterest has chosen to prevent the content from surfacing by blocking certain keywords from returning search results.
The problem, as OneZero noted, is that hiding content and blocking keywords don't actually prevent users from finding questionable content. Some of this content includes images that sexually exploit children.
While normal users may never see this content using Pinterest's built-in search tools, users more familiar with how search engines work can still access content Pinterest feels violates its guidelines but hasn't actually removed from its platform. By navigating to a user's page, logged-out users can perform searches that seem to bypass Pinterest's keyword-blocking. Using Google to search the site -- instead of the site's own search engine -- can also surface content hidden by Pinterest.
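OneZero's findings are consistent with suppression being applied only at the search layer. A minimal sketch of that kind of architecture (all names, data, and structure here are hypothetical illustrations, not Pinterest's actual code) shows why a keyword blocklist doesn't keep the underlying content from being reached directly or through an outside index:

```python
# Hypothetical sketch: keyword suppression applied only at the
# search endpoint, while content stays stored and addressable.

BLOCKED_QUERIES = {"blocked term"}  # illustrative blocklist

# Pins are never deleted; each remains at a stable, public URL.
PINS = {
    "pin123": {"title": "a pin about a blocked term", "board": "user456/board"},
    "pin456": {"title": "an ordinary recipe pin", "board": "user789/food"},
}

def search(query: str) -> list[str]:
    """Site search: blocked queries return nothing, but the
    matching pins are only hidden, not removed."""
    if query.lower() in BLOCKED_QUERIES:
        return []  # suppressed at the search layer only
    return [pin_id for pin_id, pin in PINS.items()
            if query.lower() in pin["title"].lower()]

def fetch_pin(pin_id: str) -> dict | None:
    """The gap: direct retrieval does no blocklist check, so a
    crawler (or anyone with the URL) can still reach hidden pins."""
    return PINS.get(pin_id)
```

Because `fetch_pin` serves the page regardless of the blocklist, Google's crawler indexes it like any other page, and a `site:` search on Google returns what Pinterest's own search box won't.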
Pinterest's content moderation policy appears to be mostly hands-off. Users can upload nearly anything they want, with the company deleting (and reporting) only clearly illegal content. For everything else that's questionable (or potentially harmful to other users), Pinterest opts for suppression rather than deletion.
“Generally speaking, we limit the distribution of or remove hateful content and content and accounts that promote hateful activities, false or misleading content that may harm Pinterest users or the public’s well-being, safety or trust, and content and accounts that encourage, praise, promote, or provide aid to dangerous actors or groups and their activities,” Pinterest’s spokesperson said of the company’s guidelines.
Unfortunately, users who manage to bypass keyword filters or otherwise stumble across buried content will likely find themselves directed to other buried content. Pinterest's algorithms surface content related to whatever users are currently viewing, potentially leading users even deeper into the site's "hidden" content.
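This is the same gap one layer up: if the related-content path doesn't consult the suppression list the search path uses, every discovered pin becomes an entry point to more. A hypothetical sketch (the data, names, and the one-line fix are all illustrative assumptions, not Pinterest's actual system):

```python
# Hypothetical sketch: a "related pins" lookup keyed off the pin
# being viewed, with no suppression check of its own.

from collections import defaultdict

# pin_id -> related pin_ids (e.g., derived from co-saves)
RELATED = defaultdict(list)
RELATED["hidden_pin_1"] = ["hidden_pin_2", "hidden_pin_3"]

# The same set the search layer uses to hide content.
SUPPRESSED = {"hidden_pin_1", "hidden_pin_2", "hidden_pin_3"}

def related_pins(current_pin: str) -> list[str]:
    # No suppression check: one discovered hidden pin
    # recommends its way to more hidden pins.
    return RELATED[current_pin]

def related_pins_filtered(current_pin: str) -> list[str]:
    # The alteration the article's question implies: apply the
    # same suppression list to recommendations.
    return [p for p in RELATED[current_pin] if p not in SUPPRESSED]
```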
Decisions to be made by Pinterest:
- Is hiding content effective in steering users away from subject matter/content Pinterest would rather they didn't access?
- Would deletion -- rather than hiding -- result in affected users leaving the platform?
- Is questionable content a severe enough problem that the company should rethink its moderation protocols?
- Should "related content" algorithms be altered to prevent the surfacing of hidden content?
- Does hiding -- rather than removing -- content potentially encourage users to use this invisibility to engage in surreptitious distribution of questionable or illegal content?
- Does the possibility of hidden content resurfacing steer ad buyers away from the platform?
- Will this approach to moderation -- hiding vs. deletion -- remain feasible as pressure on sites to aggressively police misinformation and "fake news" continues to mount?
Filed Under: content moderation, search
Companies: pinterest