Why We Filed A Comment With Facebook's Oversight Board
from the less-is-more dept
Back when Facebook's Oversight Board was just getting organized, a colleague suggested I represent people before it as part of my legal practice. As a solo lawyer, I felt my entrepreneurial ears perk up at the possibility of future business opportunities. But the rest of me felt extremely uncomfortable with the proposition. I defend free speech, but I am a lawyer, and I defend it using law. If Facebook removes you or your content, that is an entirely lawful choice for it to make. It may or may not be a good decision, but there is nothing for law to defend you from. So it didn't seem a good use of my legal training to spend my time taking issue with how a private entity made the moderation decisions it was entirely within its legal rights to make.
It also worried me that people were regarding Facebook's Oversight Board as some sort of lawmaking body, and I was hesitant to use my lawyering skills to somehow validate and perpetuate that myth. No matter how successful the Board turns out to be, it is still limited in its authority and reach, and that's a good thing. What is not good is when people expect that this review system should (a) have the weight of actual law or (b) be the system that gets to evaluate all moderation decisions on the Internet.
Yet here I am, having just written a comment for the Copia Institute in one of its cases. Not because I changed my mind about any of my previous concerns, but because that particular high-profile case seemed like a good opportunity to help reset expectations about the significance of the Oversight Board's decisions.
As people who care about the online ecosystem, we want those decisions to be as good as they can be, because they will have impact, and we want that impact to be as good as it can be. With our comment we therefore tried to provide some guidance on what a good result would look like. But whether the Board gets its decisions right or wrong, it does no good for the public, or even for the Board itself, to think its decisions mean more than they do. Nor is it necessary: the Oversight Board already has a valid and even valuable role to play. It doesn't need to be anything more than what it actually is to be useful.
It's useful because every platform makes moderation decisions. Many of these decisions are hard to make perfectly, and many are made at incredible scale and speed. Even with the best of intentions it is easy for platforms to make moderation decisions that would have been better decided the other way.
And that is why the basic idea of the Oversight Board is a good one. It's good for it to be able to provide independent review of Facebook's more consequential decisions and to recommend how to make them better in the future. Some have alleged that the Board isn't sufficiently independent, but even if that were true, it wouldn't really matter, at least as far as Facebook is concerned. What matters is that there is now an operational way to give Facebook's moderation decisions a second look, especially one informed by considerations that may not have figured in the original decision. That the Oversight Board is designed to provide such review is an innovation worth cheering.
But all the Oversight Board can do is decide what moderation decision might have been better for Facebook and its user community. It can't articulate, and it certainly can't decree, a moderation rule that could or should apply at all times on every platform anywhere, including platforms that are much different, with different reaches, different purposes, and different user communities than Facebook has. It would be impossible to come up with a universally applicable rule. And it's also not a power this Board, or any similar board, should ever have.
As we said in our comment, and have explained countless times on these pages, platforms have the right to decide what expression to allow on their systems. We obviously hope that platforms will use this right to make these decisions in a principled way that serves the public interest, and we stand ready to criticize them as vociferously as warranted when they don't. But we will always defend their legal right to make their moderation choices however perfectly or imperfectly they may make them.
What's important to remember in thinking about the Oversight Board is that this is still Facebook making moderation decisions. Not because the Board may or may not be independent from Facebook, but because Facebook's decision to defer to the Board's judgment is itself a moderation decision. It is not Facebook waiving its legal right to make moderation choices but rather Facebook exercising that very right to decide how those choices get made, and this is what it has decided. Deferring to the Board's judgment does not obviate the real-world law protecting its choice; it is a choice that real-world law pointedly allows Facebook to make (and, thanks to Section 230, even encourages it to try).
The confusion about the mandate of the Oversight Board seems to stem in part from the way the Board has been empowered and operates. In many ways it bears the hallmarks of a self-contained system of private law, and in and of itself that's fine. Private law is nothing new: arbitration, for instance, is basically a system of private law. Private law can exist alongside regular, public, democratically generated law just fine, although there are sometimes tensions, because for private law to work all the parties need to agree to abide by it instead of public law, and sometimes that consent isn't sufficiently voluntary.
But consent is not an issue here: before the Oversight Board came along, Facebook users had no legal leverage of any kind over Facebook, so this is a system of private law that Facebook has agreed can give them some. We can and should of course care that this system of private law is a good one, well-balanced and equitable, and thus far we've seen no basis for any significant concern. We instead see a lot of thoughtful people working very hard to try to get it right, open to being nudged to do better should such nudging be needed. But even if they were getting everything wrong, in the big picture it wouldn't much matter, because ultimately it is only Facebook's oversight board, inherently limited in its authority and reach to that platform.
The misapprehension that this Board can or should somehow rule over all moderation decisions on the Internet is also not helped by the decision to call it the "Oversight Board," rather than the "Facebook Oversight Board." Perhaps it could become a model for other platforms to use, and maybe, just maybe, if it really does become a fully spun-off independent, sustainable, self-contained private law system it might someday be able to supply review services to other platforms too—provided, of course, that the Board is equipped to address these platforms' own particularities and priorities, which may differ significantly from Facebook's.
But right now it is only a solution for Facebook and only set up to consider the unique nature of the Facebook platform and what Facebook and its user community want from it. It is far from a one-size-fits-all solution for Internet content moderation generally, and our comment said as much, noting that the relative merit of the moderation decision in question ultimately hinged on what Facebook wanted its platform to be.
Nevertheless, it is absolutely fine for the Board to be so limited in its mission, and far better than if its mission were any broader. Just as Facebook had the right to acquiesce to this oversight board, other platforms equally have the right, and need to have the right, to say no to it or to any other such board. It will remain important for the First Amendment to protect this discretion, regardless of how good a job this or any other board might do. While the Oversight Board can, and likely should, try to incorporate First Amendment values into its decisions to the extent it can, actual First Amendment law operates on a different axis than this system of private law ever would or could, with different interests and concerns to be balanced.
It is a mistake to think we could simply supplant all of those considerations with the judgment of this Oversight Board. No matter how thoughtful its decisions, nor how great their impact, the Oversight Board is still not a government body. Neither it nor even Facebook has the sort of power the state has, nor any of the Constitutional limitations that would check it. Facebook remains a private actor, a company with a social media platform, and Facebook's Oversight Board is simply an organization built to help it make its platform better. We should be extremely wary of expecting it to be anything other than that.
Especially because that's already plenty for it to be in order for it to be able to do some good.
Filed Under: arbitration, content moderation, private law
Companies: facebook, oversight board
Reader Comments
Private Law Oversight
How much does (sovereign, national?) law regulate private law?
Obviously public law supersedes private law, and I expect that in most cases private law is grounded in contract law, but is there any oversight or policy-setting beyond that?
For example, I think it is widely believed that in arbitration between unequal parties (think customer and corporation, or employee and employer, but not two divorcing spouses, say), the arbitrator is in the pocket of the more powerful (funding) party.
Can law do better at delegating some of the operations of law by enforcing standards on private law?
Re: Private Law Oversight
Perhaps I should say, in the final sentence:
Can law do better at delegating some of the operations of dispute settlement by enforcing standards on private law?
Re: Re: Private Law Oversight
It may also be misleading to label it "private law". Even the state DMV or the Federal Communications Commission creates regulations and rules, not law.
Re: Re: Re: Private Law Oversight
Huh? Public agency regulations are public law. The Facebook Oversight Board is not a public agency.
Re: Re: Private Law Oversight
Public law can rein in private law, but US law generally permits quite a bit. Still, while courts have been especially generous in recent years, there's nothing about this set-up that seems anywhere near the sort of close call where courts might see reason to try to check it. The point I was trying to make in the post is that there's really nothing objectionable about this arrangement.
Re: Re: Re: Private Law Oversight
I am not a lawyer.
I had not previously heard the public/private law terminology. Is it similar to the criminal/civil distinction? If not, what is different?
How much power is too much?
Many online companies and platforms are going to be watching this experiment very carefully. They will ultimately need to decide whether they want to invest in an expensive board of their own that creates rules (please, let's not call them laws) aligned with their own mission. I anticipate many companies will instead take their lead from FB on controversial content moderation, suspension, or account closure decisions. Not that it shouldn't be their right to do that. I'm concerned about the impact of an already powerful voice on the internet having even greater sway over content moderation overall. Break up Facebook? Lower the barrier to entry for governance boards? Educate the public on the importance of transparency in moderation so it becomes something companies can assign dollars to?
Re: How much power is too much?
How the hell would "breaking up" Facebook help in any way? The main social media platform is the only part that matters for that objection, and people can choose what they want. Facebooks A through Z would just converge onto a single winner. Not to mention the premise is nonsensical: it wasn't a matter of antitrust in the golden days when nearly everyone had a subscription to, say, The New York Times.
There also isn't any more or less impact on decision making; that was the whole point of the article. Seriously, people act so goddamned stupid about "Big Tech" that they fail a Turing test: no coherent thought, no novel ideas as to how what they think they want would be implemented (however ill-considered those ideas may be), just the same stupid buzzwords, soundbites, and views from another reality everywhere.
Re: Re: How much power is too much?
"How the hell "breaking up" Facebook would help in any way?"
It wouldn't, really. You could break up Facebook into its component pieces - ads, Instagram, Oculus and WhatsApp as separate entities. But, that wouldn't help with the problems people have with Facebook. Both FB and IG would continue to be large ad focussed networks because a) that's how they make their money and b) that main reason people use them is because a lot of people are there. There's no logic way to make those individual services smaller without destroying the network effect of why people use them at all, and separating the ad from the platform doesn't change the way ads are used there.
It might make sense in some ways to break up the monolithic approach from a purely business standpoint and to maybe calm some complaints with people who like to use Oculus but are unhappy that they have to agree to use Facebook accounts they have been trying to avoid (for example). But it would do nothing about things like content moderation or algorithm matching. In fact, it could be argued that it would encourage them to moderate harder, since the smaller business units would not be as protected by large warchests if they were to be held liable for something.
I don't know exactly what the solutions for these problems are (other than people growing up and realising that when they're kicked off someone's property for offending other guests, the correct response is to go elsewhere, not to try bullying their way back in through a side door). But "break them up because they're big" doesn't make any sense in these arguments on its own merits.
Re: Re: Re: How much power is too much?
In my opinion Facebook shouldn't have been allowed to buy up those entities. WhatsApp and Instagram were indirect alternatives to Facebook. Sure, they don't have exactly the same features, but that doesn't really matter; people were still using them as alternatives. You can't let the biggest player just buy up any decent competition that ever comes around.
Having them as actual separate companies would certainly help, because they would be competing with each other and so wouldn't be able to take advantage of their customers as badly without consequences.
Breaking them up afterward is kind of silly in my opinion, since the purchases were allowed in the first place. Regulators should have prevented Facebook from buying up the competition to begin with, and they shouldn't let it do that anymore.