Why We Filed A Comment With Facebook's Oversight Board
from the less-is-more dept
Back when Facebook's Oversight Board was just getting organized, a colleague suggested I represent people before it as part of my legal practice. As a solo lawyer, my entrepreneurial ears perked up at the possibility of future business opportunities. But the rest of me felt extremely uncomfortable with the proposition. I defend free speech, but I am a lawyer and I defend it using law. If Facebook removes you or your content, that is an entirely lawful choice for it to make. It may or may not be a good decision, but there is nothing for law to defend you from. So it didn't seem like a good use of my legal training to spend my time taking issue with how a private entity made the moderation decisions it was entirely within its legal rights to make.
It also worried me that people were regarding Facebook's Oversight Board as some sort of lawmaking body, and I was hesitant to use my lawyering skills to somehow validate and perpetuate that myth. No matter how successful the Board turns out to be, it is still limited in its authority and reach, and that's a good thing. What is not good is when people expect that this review system should (a) have the weight of actual law or (b) be the system that gets to evaluate all moderation decisions on the Internet.
Yet here I am, having just written a comment for the Copia Institute in one of its cases. Not because I changed my mind about any of my previous concerns, but because that particular high-profile case seemed like a good opportunity to help reset expectations about the significance of the Oversight Board's decisions.
As people who care about the online ecosystem, we want those decisions to be as good as they can be, because they will have impact, and we want that impact to be as beneficial as it can be. With our comment we therefore tried to provide some guidance on what a good result would look like. But whether the Board gets its decisions right or wrong, it does no good for the public, or even the Board itself, to think its decisions mean more than they do. Nor is it necessary: the Oversight Board already has a valid and even valuable role to play. And it doesn't need to be any more than what it actually is for it to be useful.
It's useful because every platform makes moderation decisions. Many of these decisions are hard to make perfectly, and many are made at incredible scale and speed. Even with the best of intentions it is easy for platforms to make moderation decisions that would have been better decided the other way.
And that is why the basic idea of the Oversight Board is a good one. It's good for it to be able to provide independent review of Facebook's more consequential decisions and recommend how to make them better in the future. Some have alleged that the board isn't sufficiently independent, but even if this were true, it wouldn't really matter, at least as far as Facebook is concerned. What is important is that there is some operational way to give Facebook's moderation decisions a second look, especially in a way that can be informed by additional considerations that may not have been included in the original decision. That the Oversight Board is designed to provide such review is an innovation worth cheering.
But all the Oversight Board can do is decide what moderation decision might have been better for Facebook and its user community. It can't articulate, and it certainly can't decree, a moderation rule that could or should apply at all times on every platform anywhere, including platforms that are much different, with different reaches, different purposes, and different user communities than Facebook has. It would be impossible to come up with a universally applicable rule. And it's also not a power this Board, or any similar board, should ever have.
As we said in our comment, and have explained countless times on these pages, platforms have the right to decide what expression to allow on their systems. We obviously hope that platforms will use this right to make these decisions in a principled way that serves the public interest, and we stand ready to criticize them as vociferously as warranted when they don't. But we will always defend their legal right to make their moderation choices however perfectly or imperfectly they may make them.
What's important to remember in thinking about the Oversight Board is that this is still Facebook making moderation decisions. Not because the Board may or may not be independent from Facebook, but because Facebook's decision to defer to the Board's judgment is itself a moderation decision. It is not Facebook waiving its legal right to make moderation choices but rather Facebook exercising that very right to decide how to make those choices, and this is what it has decided. Deferring to the Board's judgment does not obviate real-world law protecting its choice; it's a choice that real-world law pointedly allows Facebook to make (and, thanks to Section 230, even encourages Facebook to try).
The confusion about the mandate of the Oversight Board seems to stem in part from the way the Board has been empowered and operates. In many ways it bears the hallmarks of a self-contained system of private law, and in and of itself that's fine. Private law is nothing new. For instance, when you hear the term "arbitration," that's basically what it describes: a system of private law. Private law can exist alongside regular, public, democratically-generated law just fine, although sometimes there are tensions, because for it to work all the parties need to agree to abide by it instead of public law, and sometimes that consent isn't sufficiently voluntary.
But consent is not an issue here: before the Oversight Board came along Facebook users had no legal leverage of any kind over Facebook, so this is now a system of private law that Facebook has agreed can give them some. We can and should of course care that this system of private law is a good one, well-balanced and equitable, and thus far we've seen no basis for any significant concern. We instead see a lot of thoughtful people working very hard to try to get it right, and open to being nudged to do better should such nudging be needed. But even if they were getting everything all wrong, in the big picture it doesn't really matter either, because ultimately it is only Facebook's oversight board, inherently limited in its authority and reach to that platform.
The misapprehension that this Board can or should somehow rule over all moderation decisions on the Internet is also not helped by the decision to call it the "Oversight Board," rather than the "Facebook Oversight Board." Perhaps it could become a model for other platforms to use, and maybe, just maybe, if it really does become a fully spun-off independent, sustainable, self-contained private law system it might someday be able to supply review services to other platforms too—provided, of course, that the Board is equipped to address these platforms' own particularities and priorities, which may differ significantly from Facebook's.
But right now it is only a solution for Facebook and only set up to consider the unique nature of the Facebook platform and what Facebook and its user community want from it. It is far from a one-size-fits-all solution for Internet content moderation generally, and our comment said as much, noting that the relative merit of the moderation decision in question ultimately hinged on what Facebook wanted its platform to be.
Nevertheless, it is absolutely fine for it to be so limited in its mission, and far better than if its mission were any broader. Just as Facebook had the right to acquiesce to this oversight board, other platforms equally have the right, and need to have the right, to say no to it or any other such board. It won't stop being important for the First Amendment to protect this discretion, regardless of how good a job this or any other board might do. While the Oversight Board can, and likely should, try to incorporate First Amendment values into its decisions to the extent it can, actual First Amendment law operates on a different axis than this system of private law ever would or could, with different interests and concerns to be balanced.
It is a mistake to think we could simply supplant all of those considerations with the judgment of this Oversight Board. No matter how thoughtful its decisions, nor how great the impact of what it decides, the Oversight Board is still not a government body. Neither it nor even Facebook has the sort of power the state has, nor any of the constitutional limitations that would check it. Facebook remains a private actor, a company with a social media platform, and Facebook's Oversight Board is simply an organization built to help it make its platform better. We should be extremely wary of expecting it to be anything other than that.
Especially because being just that is already plenty for it to do some good.
Filed Under: arbitration, content moderation, private law
Companies: facebook, oversight board