Trust & Safety Professional Association Launches: This Is Important
from the exciting-news dept
One of the most frustrating things out there is the idea that content moderation choices made on various platforms come directly from the top. Too often, I've seen people blame Jack Dorsey or Mark Zuckerberg for content moderation decisions, as if they're sitting at their laptops personally deciding who gets blocked and who doesn't. Over the last decade or so, an entire industry has been built up to figure out how to make internet services as usable as possible -- to deal with spam, abuse, and more. That industry is generally called "trust and safety," and it has grown up and professionalized quite a bit in that time -- though it rarely (if ever) gets the respect it deserves. As I mentioned on a recent episode of the Pivot podcast, many of the assumptions people make about content moderation unfairly malign the large crew of people working in trust and safety who aren't interested in political bias or silencing voices, but who are legitimately working very, very hard to balance the many, many tradeoffs involved in making internet services useful and welcoming to users.
That's why I'm really happy to see a new organization launch today, the Trust & Safety Professional Association, along with a sister organization, the Trust & Safety Foundation.
Today, we’re pleased to announce the Trust & Safety Professional Association (TSPA) and the Trust & Safety Foundation Project (TSF).* TSPA is a new, nonprofit, membership-based organization that will support the global community of professionals who develop and enforce principles and policies that define acceptable behavior online. TSF will focus on improving society’s understanding of trust and safety, including the operational practices used in content moderation, through educational programs and multidisciplinary research. Neither TSPA nor TSF are lobbying organizations, and will not advocate for public policy positions on behalf of corporate supporters or anyone else. Instead, we will support the community of people doing the work, and society’s understanding of it.
And I should note that the people behind this organization are incredible. If you told me about such an organization and asked me to suggest who should be involved, I would have named exactly the people who put this together, starting with Adelin Cai and Clara Tsao, who both have tremendous experience in the trust and safety space, and the knowledge and thoughtful, balanced approach necessary to build organizations like the two launched today. If you ever need someone to talk through the challenges involved in building a successful trust and safety team, I'd highly recommend both Adelin and Clara. The board also includes some names you may recognize, including Professor Eric Goldman, former Twitter/Google lawyer and White House deputy CTO Alex Macgillivray, and former Mozilla Chief Legal Officer/COO and current Stellar Development Foundation CEO Denelle Dixon.
And... one of the initial projects that the Trust & Safety Foundation has launched is an ongoing series of trust and safety case studies written by... us. Techdirt's think tank arm, the Copia Institute, will be providing a series of trust and safety case studies to the Trust & Safety Foundation, which will post them each week. We'll eventually be posting many of them to Techdirt as well, so you can expect those later this summer. The point of this library of case studies is to give people a better understanding of the impossible choices and tradeoffs that internet services need to make on a daily basis, and to highlight why what often seems like an "obvious" way to deal with some piece of content may not be so obvious once you explore it from all sides. Personally, I'm excited to help build out this library and to work with such a great team of people who are devoted to improving and professionalizing the space, while further educating everyone (both inside and outside the trust and safety field) on how trust and safety efforts actually work.
Filed Under: case studies, content moderation, education, professional association, trust & safety, trust and safety
Companies: copia institute, tsf, tspa
Reader Comments
So
Re: So
wut?
…fucking what
Membership Resources
This sounds like a really excellent idea. I hope, however, that amongst their activities they find a way to support those in the trenches who are suffering from the form of PTSD that comes from being directly exposed to some of the crap they have to make decisions on. This could be by finding and listing resources local to their membership (in their various locales), or even instigating such resources (which might be referrals to professional counselors, or groups where they can just get things off their chest, a la AA). Showing such concern for the difficult job the membership is dealing with would go a long way toward gaining loyal followers. At the same time, it would educe recognition of the conditions their members work in.
Along those lines, exposing a spectrum of the things they deal with to the public could help enormously. There are a few Senators and members of the Executive Branch (not to mention several Techdirt Troll Clowns) who could use some help in understanding what moderators do and are faced with. Elucidating what actually happens might also go a long way toward calming claims of bias. Unless they are under some kind of non-disclosure agreement, there is a great possibility that exposing what happens to what content, and why certain decisions are made (where the platforms themselves won't), could enlighten people. Even those who deny the facts relayed and dismiss them as fake news might be reached, so some documentation would be appropriate.
Re: So
What, and I mean this in all sincerity, the absolute fuck??
Re: Membership Resources
I applaud the idea. It does sound like quite an undertaking. I look forward to seeing how the TSPA progresses.
Best of luck.
The Respect it Deserves
though it rarely (if ever) gets the respect it deserves.
You globalist idiot assholes get more respect than you deserve. You hide behind your legal bullshit opinions and paid posters and phony pony names and idiotic recommendations and deserve to be SHOT full of BETTER ideas and BETTER politics and BETTER moderation and LESS censorship and MORE Great American traditions, values and points of view.
Amen.
Re:
You are such a clueless idiot, Stone. He was making a quite coherent point, any idiot could understand it and appreciate it and promote it and give it a prize and a monetary award and public recognition and forward it to Bill Barr and Donald J. Trump and recommend it for the Presidential Medal of Freedom. I have one. You don't. Clear enough?
Re: Re: So
It's NOT hard to UNDERSTAND. He was making a VERY CLEAR point and I Forwarded His Analysis to my personal friend Dan Bongino. Dan has been a little down, you leftie assholes gave him a Stomach Ulcer and when he ate Fatty Fish he had to Bend Over the Toilet while simultaneously Writhing in PAIN which is what I do every time I read your STUPID SHIT on this STUPID SITE frequented by Phony Pony idiots like YOU!
Clear enough?
Re: Membership Resources
Elucidating what actually happens might also go a long way toward calming claims of bias.
Right. Elucidating. Might. Be. A. Word. That. Says. Exactly. Nothing. At. All. In. Your. Bullshit. Post. But. Sounds. Nice.
Elucidate for us now. Elucidate ANYTHING AT ALL other than your BULLSHIT monologue and EMPTY VACUOUS argument and then CENSOR MY POST and then KISS MY ASS you leftie clown.
Re: Re: Membership Resources
Elucidate: Verb, to make lucid by explanation or analysis.
In English, it means to clarify.
How the person above wasn't being clear to you is baffling: they expressed their hope that this association would ensure moderators have professional help on standby and shine a light on what they deal with on a daily basis. Your response is either trollish, or you genuinely cannot empathize with people who are not yourself, in which case I urge you to seek some help of your own.
Re: Re: Re: So
So the more you read the higher chance of your ulcer dissolving you from the inside?
I can't see a downside to this, honestly. Here, have something else to read:
Shiva Ayyadurai didn't invent email, and he lost to a fake Indian, Elizabeth Warren. Oh SNAP!
Good News
New AI software????
I'm wondering if the ass clown who is commenting incoherently and profanely (thus getting flagged) is using this platform to try out some form of AI-generated comment shitpost??
Just sounds fake
....#FakeTroll
re: attribution errors
Yeah!
THIS: I've seen people blame Jack Dorsey or Mark Zuckerberg for content moderation decisions
I saw lots of folk blame Hitler for shit too. Leadership is a bitch.
Mike, your cognitive dissonance is (predictable)
Re: New AI software????
What a paranoid, discrediting rant you are spewing.
Maybe stop flagging these people, and let the "masses" decide for themselves? Oh, so Not-Techdirt!
But people like you are inherently “anti-democratic,” deploying tribal sectarians first, and AI/Moderator/FBI-bots after the fact.
Infragard, and you, suck dicks and dog nuts.