It is the best that Facebook can do, given its size. Other similarly large services don't fare much better.
Just excuses for FB. What they're doing now is far from the best they could do, given their tech capabilities and financial resources. Are they really putting their best tech resources on this problem? Hardly - those resources go to increasing user engagement (including engagement by users who spread hate) and improving ad effectiveness. Growth (a.k.a. user engagement) and profitability have been FB's no. 1 and no. 2 goals since it started. Moderation, content standards, and legal compliance trail far behind - though, of course, with just enough resources devoted to them that people can say they're doing the best they can.
Take the white supremacist groups that FB claims it can't identify, even as it matches those same groups to users who are primed to engage with supremacist content. That matching uses the data in a sophisticated way that demonstrates FB can indeed identify white supremacist content when user engagement is the goal. The question isn't whether FB could moderate this content more effectively; moderation is simply a lower priority for FB.
But yes, let's agree the goal is making moderation better, not perfect - so we should expect FB to improve moderation steadily, not just when it gets bad publicity for its failures in this area. How about FB provide regular audits of its moderation efforts, and report how much it's spending and what tech resources it's applying to the problem? We're talking about hate speech advocating violence, and that can cost people's lives.