Rights Groups Demand Facebook Set Up Real Due Process Around Content Moderation
from the seems-like-a-good-idea dept
For quite some time now, when discussing how the various giant platforms should manage the nearly impossible challenges of content moderation, one argument I've fallen back on again and again is that they need to provide real due process. This is because, while there are all sorts of concerns about content moderation, the number of false positives that lead to "good" content being taken down is staggering. Lots of people like to point and laugh at these mistakes, but any serious understanding of content moderation at scale has to recognize that when you need to process many, many thousands of requests per day, often involving complex or nuanced issues, many, many mistakes are going to be made. And thus, you need a clear and transparent process that enables review.
A bunch of public interest groups (including EFF) have now sent an open letter to Mark Zuckerberg, requesting that Facebook significantly change its content removal appeal process to be much clearer and much more accountable. The letter first covers how clear the notice should be about what content caused the restriction and why (a quick sketch of what such a notice might look like follows the list):
Notice: Clearly explain to users why their content has been restricted.
- Notifications should include the specific clause from the Community Standards that the content was found to violate.
- Notice should be sufficiently detailed to allow the user to identify the specific content that was restricted, and should include information about how the content was detected, evaluated, and removed.
- Individuals must have clear information about how to appeal the decision.
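Concretely, the notice the letter describes maps onto a pretty small set of fields. Here's a minimal sketch of what such a notification payload might look like, written in Go; the struct, field names, and example values are all my own illustration, not anything Facebook actually exposes:

```go
package main

import (
	"encoding/json"
	"fmt"
)

// TakedownNotice is a hypothetical notification covering the letter's three
// notice requirements: the specific rule violated, enough detail to identify
// the restricted content, and clear instructions for appealing.
type TakedownNotice struct {
	ContentID       string `json:"content_id"`       // identifies the restricted post
	ContentExcerpt  string `json:"content_excerpt"`  // the specific text or media at issue
	PolicyClause    string `json:"policy_clause"`    // the exact Community Standards clause cited
	DetectionMethod string `json:"detection_method"` // how the content was detected and evaluated
	AppealURL       string `json:"appeal_url"`       // where the user can contest the decision
}

func main() {
	notice := TakedownNotice{
		ContentID:       "post/123456",
		ContentExcerpt:  "[the flagged sentence or image]",
		PolicyClause:    "Community Standards: Hate Speech (illustrative)",
		DetectionMethod: "automated classifier, confirmed by human review",
		AppealURL:       "https://example.com/appeals/123456",
	}
	out, _ := json.MarshalIndent(notice, "", "  ")
	fmt.Println(string(out))
}
```

The particular format doesn't matter; the point is that every one of these fields is already known at the moment the takedown happens, so there's no technical reason not to share them with the user.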
The letter then goes into much more detail on how an appeal should work, requiring actual transparency, more detailed explanations, and assurance that an appeal actually goes to someone who didn't make the initial decision (again, a rough sketch follows the list):
Appeals: Provide users with a chance to appeal content moderation decisions.
- The appeals mechanism should be easily accessible and easy to use.
- Appeals should be subject to review by a person or panel of persons not involved in the initial decision.
- Users must have the right to propose new evidence or material to be considered in the review.
- Appeals should result in a prompt determination and reply to the user.
- Any exceptions to the principle of universal appeals should be clearly disclosed and compatible with international human rights principles.
- Facebook should collaborate with other stakeholders to develop new independent self-regulatory mechanisms for social media that will provide greater accountability.
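The independent-review requirement, in particular, is simple enough to express in code. Here's a minimal sketch of appeal routing in Go, under the entirely hypothetical assumption of a pool of named reviewers; the one hard rule is that whoever made the initial decision is excluded:

```go
package main

import (
	"errors"
	"fmt"
)

// Appeal carries the user's challenge, including any new evidence or
// material, per the letter's requirements. All names here are illustrative.
type Appeal struct {
	ContentID        string
	OriginalReviewer string
	NewEvidence      []string
}

// assignReviewer picks someone from the pool who was not involved in the
// initial decision, per the letter's independence requirement.
func assignReviewer(a Appeal, pool []string) (string, error) {
	for _, r := range pool {
		if r != a.OriginalReviewer {
			return r, nil
		}
	}
	return "", errors.New("no independent reviewer available")
}

func main() {
	appeal := Appeal{
		ContentID:        "post/123456",
		OriginalReviewer: "reviewer-7",
		NewEvidence:      []string{"context showing the quoted slur was itself a critique of filters"},
	}
	reviewer, err := assignReviewer(appeal, []string{"reviewer-7", "reviewer-12", "reviewer-31"})
	if err != nil {
		fmt.Println("escalate:", err)
		return
	}
	fmt.Printf("appeal for %s routed to %s\n", appeal.ContentID, reviewer)
}
```

None of this is hard engineering; the hard part, as the letter implies, is committing to do it at scale and on a prompt timeline.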
Frankly, I think this is a great list, and I am dismayed that the large platforms haven't implemented something like this already. For example, we recently wrote about Google deeming our blog post on the difficulty of content moderation to be "dangerous or derogatory." In that case, we initially got no further information beyond that claim, and the appeals process was totally opaque. The first time we appealed, the ruling was overturned (again, with no explanation), and a month later, when that article got dinged again, the appeal was rejected.
After we published that article, an employee from the AdSense team eventually reached out to us to explain that it was "likely" that some of the comments on that article were what triggered the problems. After we pointed out that there were well over 300 comments on the article, we were eventually pointed to one particular comment that used some slurs, though the comment used them to demonstrate the ridiculousness of automated filters, rather than as derogatory epithets.
However, as I noted in my response, my main complaint was not Google's silly setup, but the fact that it provided no actual guidance. We were not told that a comment was to blame until after our published article prompted someone higher up on the AdSense team to reach out. I pointed out that it seemed only reasonable for Google to tell us specifically which rule it felt we had violated and which content was the problem, so that we could then make an informed decision. Similarly, the appeals process was entirely opaque.
While the reasons that Google and Facebook have not yet created this kind of due process are obvious (it would be kinda costly, for one), it does seem like such a system will be increasingly important, and it's good to see these groups pushing Facebook on this in particular.
Of course, earlier this year, Zuckerberg had floated an idea of an independent (i.e. outside of Facebook) third party board that could handle these kinds of content moderation appeals, and... a bunch of people freaked out, falsely claiming that Zuckerberg wanted to create a special Facebook Supreme Court (even as he was actually advocating for having a body outside of Facebook reviewing Facebook's decisions).
No matter what, it would be good for the large platforms to start taking these issues seriously, not only for reasons of basic fairness and transparency, but because it would also help make the public comfortable with how this process works. As currently constructed, the process is a giant black box, and that leads to a lot more anger and conspiracy thinking about how content moderation actually works.
Update: It appears that shortly after this post went out, Zuckerberg told reporters that Facebook is now going ahead with creating an independent body to handle appeals. We'll have more on this once some details are available.
Filed Under: content moderation
Companies: aclu, eff, facebook
Reader Comments
Contradicting own statement Section 230 gives arbitrary power.
"And, I think it's fairly important to state that these platforms have their own First Amendment rights, which allow them to deny service to anyone."
https://www.techdirt.com/articles/20170825/01300738081/nazis-internet-policing-content-free-speech.shtml
So where do you find these new users' "rights" in the FLAT unqualified statement you made there?
Re: Contradicting own statement Section 230 gives arbitrary power.
Facebook (or Google, or Twitter, or any other platform) has the absolute legal right to moderate content in any way, on any basis, and with any (or no) degree of transparency they wish.
These two statements are entirely consistent with each other. But for some reason you seem to believe they contradict each other.
Re: Contradicting own statement Section 230 gives arbitrary power.
Nor did he ever use the term "user rights," because these are not "rights" that the user has which independently provide power over Facebook, but terms of a contract that Facebook might (or might not) voluntarily agree to.
Re: Contradicting own statement Section 230 gives arbitrary power.
Your failure to address this, combined with your combative tone, is why you get a flag.
Let's have some details on your own alleged "voting system".
The KEY one of course is whether an Administrator okays the censoring with added editorial warning that you euphemize as "hiding".
You call for others of vastly larger scale to be transparent but, to say the least, don't lead by example.
Re: Let's have some details on your own alleged "voting system".
I flagged you because you're a belligerent, often incomprehensible fool with massive holes in your understanding of everything, and yet you insult others for what you perceive (wrongly) to be holes in theirs.
There ya go. Now piss off.
Re: Let's have some details on your own alleged "voting system".
I flagged you because you never have anything useful to contribute, and your lack of substance has ceased to be amusing and moved into the realm of the tiresome.
Put simply, I'm telling you to shut up and go away. You won't listen, of course, which is why the flag is there.
Re: Techdirt has a great voting system
Blue demands "Transparency" and everyone laughs.
Because you can't email a transparency report to a nameless troll. How would Mike track the stats for the cowards and let them see the info without a logon and a working email?
Blue lies, and doesn't know what Common Law is.
The point being missed here is...
They don't want to boot ANYONE off for any reason - that's an inventory loss each time they do so.
BIG markets out there for every possible "group", even the most radical hate groups.
Facebook is a *company*. Companies exist to make money.
It's not that difficult to figure out.
Re: Re: The point being missed here is...
Facebook isn't going to kick off high-profit users/groups, even if they start a Nuke the Gay Whales organization.
If neo-Nazis suddenly stop buying tons of "memorabilia" and such crap, they'll get weeded out. If gays suddenly stop buying from Facebook ads, they'll get weeded out as well.
Facebook is a BUSINESS.
Re: Re: Re: The point being missed here is...
Your bizarre notion of how a business should be run reflects more on you than it does on Facebook.
Re: Re: Re: The point being missed here is...
Your strain of thought is a reason why they should do moderation: pruning off content that would drive away advertisers. But "should we do moderation" is not in debate. The debate is about how that moderation occurs, because of the clear and obvious variance in its results. Your assertion seems to be that the variance happens because they are this nebulous business entity that is just pumping short-term metrics, I guess? EFF, Techdirt, and people who understand that a business is a collection of people believe this variance occurs because individuals must make snap value judgments, often without the context to understand the content at issue. This forces personal biases to the forefront. The EFF is proposing a system that requires transparency, so that appeals can occur or corrective action be taken.
Re: Re: Re: The point being missed here is...
"Facebook is a BUSINESS."
A business that seeks to prosper long term will always have to consider issues other than immediate cash flow.
For instance, YouTube must have lost plenty of money after its decision a year or two ago to ban paid advertising from firearm manufacturers, advertising which specifically targeted viewers of YouTube's many gun channels. YouTube might not have realized that this policy change would have the ultimate effect of turning most gun reviews into paid promotions, as manufacturers switched from running YouTube ads to paying video makers directly. YouTube then cracked down a second time by banning links on YouTube pages to external gun sites. One thing YouTube has never taken any interest in is whether a product reviewer is actually a paid shill, an issue that cuts right to the core of ethical conduct, and the one type of moderation effort that viewers would actually welcome.
Given this continuous game of cat and mouse, it's rather obvious that YouTube is determined to kill off a highly profitable community of consumers rather than profit from their interests, presumably to show the world (as well as placate its leftist activist employees) that YouTube is on the "correct" side of a highly divisive political issue.
Re: Re: Re: Re: The point being missed here is...
"gun nuts" are a relatively small part of their user population, and a relatively small part of their advertising revenue.
YouTube has the same problem as any large newspaper or TV station did in the past: they can't afford to piss off the vast majority of their viewers or advertisers.
I think YouTube, as a corporation, is large enough to have become fairly amoral, and is mostly interested in not angering the majority.
The same can be said for a "fair" moderation process: unfairness, or the perception thereof, can drive away business.
I'm not criticizing. Just amused.
Re:
I get your point though. It's not a black and white situation.
You know you can use the User-Agent header in an HTTP request to permit/deny or even modify the response, right? It should be near-trivial to omit comments from requests made by spiders.
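For instance, here's a minimal sketch in Go of what that might look like; the handler, the spider list, and the page markup are all made up for illustration:

```go
package main

import (
	"fmt"
	"net/http"
	"strings"
)

// knownSpiders lists User-Agent substrings for common crawlers.
// Illustrative only; real crawler detection checks more signals than this.
var knownSpiders = []string{"Googlebot", "bingbot", "AdsBot-Google", "Mediapartners-Google"}

// isSpider reports whether the request appears to come from a known crawler.
func isSpider(r *http.Request) bool {
	ua := r.Header.Get("User-Agent")
	for _, s := range knownSpiders {
		if strings.Contains(ua, s) {
			return true
		}
	}
	return false
}

// articleHandler serves the post body to everyone, but omits the comment
// section when the request comes from a crawler, so user comments never
// enter the crawler's copy of the page.
func articleHandler(w http.ResponseWriter, r *http.Request) {
	fmt.Fprint(w, "<article>...post body...</article>")
	if !isSpider(r) {
		fmt.Fprint(w, `<section id="comments">...user comments...</section>`)
	}
}

func main() {
	http.HandleFunc("/post", articleHandler)
	http.ListenAndServe(":8080", nil)
}
```

(One caveat: search engines generally frown on serving crawlers different content than users see, so this kind of selective omission carries its own risks.)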
Re: Omitting user comments from search
And by the way, there's a decent chance I wrote the comment involved, using an example to point out that whether a given sentence would be acceptable or not depended heavily on context, which computers are bad at. Just consider your favorite hate speech, then consider someone complaining, quite correctly, about me saying it and quoting it. Quotes of bad stuff are part of a journalist's stock in trade.
So, nobody at FB/Google is a software dev
The flags are set across a wide swath of the web in Google's case, and across the subset of the web that is FB. If something is flagged, the reason for the flag is KNOWN at that time. Proper error returns, or error logging, must show what (and hopefully where) the error was, as well as the type of error.
TechDirt article X, Hate speech flag, comments section.
Based on Google's use of Go and its penchant for ML-driven solutions, one would be surprised if they ever managed to do it correctly.
A detailed error might be difficult for their solution, which is still fn-ing wrong.
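To make that concrete, here's a rough sketch in Go of a structured flag error that carries the what, the where, and the type; every name here is hypothetical, and this is just the shape such an error return could take:

```go
package main

import "fmt"

// FlagError is a hypothetical structured error for a content flag.
// It records what was flagged, where it lives on the page, and the
// type of violation -- everything known at the moment the flag is set.
type FlagError struct {
	Page      string // e.g. "TechDirt article X"
	Section   string // e.g. "comments section"
	FlagType  string // e.g. "hate speech"
	Offending string // the specific text that tripped the filter
}

func (e *FlagError) Error() string {
	return fmt.Sprintf("%s: %s flag in %s: %q", e.Page, e.FlagType, e.Section, e.Offending)
}

// checkPage stands in for whatever scanner sets the flag; if it returns a
// *FlagError, the publisher can be told exactly what the problem was.
func checkPage(page string) error {
	return &FlagError{
		Page:      page,
		Section:   "comments section",
		FlagType:  "hate speech",
		Offending: "[the offending comment text]",
	}
}

func main() {
	if err := checkPage("TechDirt article X"); err != nil {
		fmt.Println(err) // the log line already contains the what, where, and type
	}
}
```

Logging that error reproduces the example line above almost verbatim, which is the point: all of this information exists at the moment the flag is set.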
Re: So, nobody at FB/Google is a software dev
Guess what happens when the customer does not like the product... who gets yelled at? Never gets old, does it?
Re: Re: So, nobody at FB/Google is a software dev
But I suspect that the end user is often left out of these setups as a stakeholder. Remember, users don't write checks to Google.
Gab itself was a byproduct of Twitter censorship. There is, of course, always USENET for those who truly want free speech. USENET's sharp decline in the past decade or two shows that this is simply not a big priority for a public that seems to want to be spoonfed its information, "fake news" or not.
Is there any American who could be trusted with absolute censorship power?
Re:
Have an Article 13 vote.
Re:
When did Twitter stop people from posting on other websites?
Typo
I think that should be "already"