Putting Pinners First: How Pinterest Is Building Partnerships For Compassionate Content Moderation
from the rethinking-how-moderation-works dept
Last week, Santa Clara University hosted a gathering of tech platform companies to discuss how they actually handle content moderation questions. Many of the participants have written essays about the questions discussed at the event. Last week we published five of those essays, and this week we're continuing to publish more of them, including this one.
The way platforms develop content moderation rules can seem mysterious or arbitrary. At first glance, the result of this seemingly inscrutable process is a patchwork of guidelines across different platforms, with only a vague hint of an industry standard -- what is banned on one platform seems to be allowed on another. While each platform has its own nuances in how it creates meaningful content moderation rules, policy teams generally seek to align with the company's purpose, and use policies and guidelines to support an overarching mission. That different platforms deliver unique value propositions to their users accounts for much of the variation in content moderation approaches.
At Pinterest, our purpose is clear: we help people discover and do what they love by showing them ideas that are relevant, interesting, and personal. For people to feel confident and encouraged to explore new possibilities or try new things on Pinterest, it's important that the platform continues to prioritize an environment of safety and security. To accomplish that, a team of content policy professionals, skilled in collaborating across different technical and non-technical functions at the company, decides where we draw the lines on acceptable content and behavior. Drawing upon feedback from Pinterest users, and staying up to date on prevailing discourse about online content moderation, this team of dedicated content generalists brings diverse perspectives to bear on the guidelines and processes that keep divisive, disturbing, or unsafe content off Pinterest.
We know how impactful Pinterest can be in helping people make decisions in their daily lives, like what to eat or what to wear, because we hear directly from the Pinterest community. We've also heard how people use Pinterest to find resources for processing illness or trauma they may have experienced. Sometimes, the content that people share during these difficult moments can be polarizing or triggering to others, and we have to strike the right balance between letting people rely on Pinterest as a tool for navigating these difficult issues and living up to our goal of removing divisive, disturbing, or unsafe content. As a team, we have to consider the broad range of use cases for content on Pinterest. For example, historically important yet graphic images of war can be collected to learn about world events, or to glorify violence. Our team takes different contextual signals into account during the review process in order to make meaningful content moderation choices that ensure a positive experience for our community. If we wish to have the impact we hope to have in people's lives, we must also take responsibility for their entire experience.
Being responsible for the online environment our community experiences, and aware of how that experience connects in a concrete way to their lives offline, means cultivating the humility to recognize our team's limitations. We can't claim to be experts in fields like grief counseling, eating disorder treatment, or suicide prevention -- areas that many groups and individuals have dedicated their careers to -- so it's crucial that we partner with experts for the guidance, specialized skills, and knowledge that will enable us to better serve our community with respect, sensitivity, and compassion.
A couple of years ago, we began reexamining our approach to one particularly difficult issue -- eating disorders -- to understand the ways our image-heavy platform might perpetuate unhealthy stereotypes about the ideal body. We had already developed strict rules against content promoting self-harm, but we wanted to ensure we were being thoughtful about content from around the internet offering "thinspiration" or promoting unhealthy diets. To help us navigate this complicated issue, we sought out the expertise of the National Eating Disorders Association (NEDA) to audit our approach and to understand all of the ways we might engage with people using the platform in this way.
Prior to reaching out to NEDA, we put together a list of search queries and descriptive keyword terms that we believed strongly signaled a worrying interest in self-harm behaviors. We limit the search results we show when people seek out content using these queries, and we also use these terms as a guide for Pinterest's operational teams in deciding whether any given piece of self-harm-related content should be removed or hidden from public areas of the service. The subject matter experts at NEDA generously agreed to review our list to see whether our bar for problematic terms was consistent with their expert knowledge, and they provided the feedback we needed to ensure we were aligned. We were relieved to hear that our list was fairly comprehensive, and that our struggle with grey-area queries and terms was not unique.

Since that partnership began, NEDA has developed a rich Pinterest profile to inspire people by sharing stories of recovery, content about body positivity, and tips for self-care and illness management. By maintaining a dialogue with NEDA, the Pinterest team has continued to consider and operationalize innovative features that facilitate early intervention on the platform. For example, we show people searching for eating disorder content an advisory that links to specialized resources on NEDA's website, and we supported NEDA's campaign for National Eating Disorders Awareness Week. Through another partnership and technical integration with Koko, a third-party service that provides platforms with automated and peer-to-peer chat support for people in crisis, we're also able to provide people who may be engaging in self-harm behaviors with direct, in-the-moment crisis prevention.
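To make the mechanism concrete, here is a minimal sketch of how a curated term list might gate search results behind an advisory with links to expert resources. This is a hypothetical simplification, not Pinterest's actual system: the sample terms, advisory text, and function names are all placeholder assumptions for illustration.

```python
# A minimal sketch (NOT Pinterest's actual implementation) of gating search
# results behind an advisory when a query matches a curated term list.
from dataclasses import dataclass
from typing import Callable, Optional

# Hypothetical curated terms flagged as strong signals of self-harm intent;
# a real list would come from policy review and expert consultation.
SENSITIVE_TERMS = {"thinspiration", "thinspo"}  # illustrative placeholders

ADVISORY = (
    "If you or someone you know is struggling, help is available. "
    "Find resources at https://www.nationaleatingdisorders.org"
)

@dataclass
class SearchResponse:
    results: list               # results to display (empty if limited)
    advisory: Optional[str]     # advisory banner text, if any

def handle_search(query: str, fetch_results: Callable[[str], list]) -> SearchResponse:
    """Limit results and attach an advisory when the query matches the list."""
    normalized = query.casefold()
    if any(term in normalized for term in SENSITIVE_TERMS):
        # Suppress results and surface expert resources instead.
        return SearchResponse(results=[], advisory=ADVISORY)
    return SearchResponse(results=fetch_results(query), advisory=None)

if __name__ == "__main__":
    resp = handle_search("thinspo diet tips", fetch_results=lambda q: [])
    print(resp.advisory)  # prints the advisory for the flagged query
```

In practice, a system like this would involve fuzzier matching, human review of grey-area queries, and regular updates to the term list based on expert feedback, but the core flow of match, limit, and redirect to resources is the same.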
Maintaining a safe and secure environment in which people feel confident to try new things requires a multifaceted approach and diverse perspectives. Our team is well equipped to grapple with broad online safety and content moderation issues, but we have to recognize when we lack in-house expertise in more complex areas that require additional knowledge and sensitivity. We have much more work to do, but these kinds of partnerships help us adapt and grow as we continue to support people using Pinterest to discover and do the things they love.
Adelin Cai runs the Policy Team at Pinterest
Filed Under: compassion, content moderation, outside groups
Companies: pinterest