Techdirt Podcast Episode 275: The State Of Trust & Safety
from the what's-happening dept
For some reason, a lot of people who get involved in the debate about content moderation still insist that online platforms are "doing nothing" to address problems — but that's simply not true. Platforms are constantly working on trust and safety issues, and at this point many people have developed considerable expertise regarding these unique challenges. One such person is Alex Feerst, former head of Trust & Safety at Medium, who joins us on this week's episode to clear up some misconceptions and talk about the current state of the trust and safety field.
Follow the Techdirt Podcast on Soundcloud, subscribe via Apple Podcasts, or grab the RSS feed. You can also keep up with all the latest episodes right here on Techdirt.
Thank you for reading this Techdirt post. With so many things competing for everyone’s attention these days, we really appreciate you giving us your time. We work hard every day to put quality content out there for our community.
Techdirt is one of the few remaining truly independent media outlets. We do not have a giant corporation behind us, and we rely heavily on our community to support us, in an age when advertisers are increasingly uninterested in sponsoring small, independent sites — especially a site like ours that is unwilling to pull punches in its reporting and analysis.
While other websites have resorted to paywalls, registration requirements, and increasingly annoying/intrusive advertising, we have always kept Techdirt open and available to anyone. But in order to continue doing so, we need your support. We offer a variety of ways for our readers to support us, from direct donations to special subscriptions and cool merchandise — and every little bit helps. Thank you.
–The Techdirt Team
Filed Under: alex feerst, content moderation, podcast, trust and safety
Reader Comments
Maybe, but...
I find it interesting that the employees working in these positions are viewed internally as "helper" types.
This reminds me of police officers and politicians. The trouble I see is that positions of this type naturally attract helper types, but they also naturally attract people who want, or think they deserve, power over others, and those are, in my mind, the opposite of what you want in those positions.
In my mind it's not just about getting the right type of people in those positions, but about people knowing that you have the right people in those positions.
On transparency...
I am hopeful, or perhaps idealistic, that obfuscation is just a stopgap tool for moderation, as I see it being for tech security, and that eventually moderation policy will be solid enough that companies will be able to be open about how they do it without worrying that the loopholes are so bad they will be vastly exploited.