Facebook's Oversight Board Can't Intervene, So Stop Asking
from the find-the-money-card dept
As Facebook employees stage a digital walk-out and make their thoughts known about the social media giant’s choice not to intervene in any way on “political posts”, especially those of President Donald Trump, some have called for the newly-created Oversight Board to step up and force a change at Facebook. While the official answer is that it can’t start yet (supposedly because laptops haven’t been handed out), the real and very simple reason the Facebook Oversight Board won’t get involved is that it can’t. It was not created to function that way, it is not staffed for something like this, and ultimately, given its relationship with Facebook, anything it said on this matter right now would be taken in an advisory capacity at best. Facebook, understandably not wanting to actually give any of its power away, played confidence games with the idea of external, independent oversight, and it’s clear that it fooled a lot of people. Let me explain.
In three-card monte, the huckster keeps shuffling three playing cards until the victim is likely to guess wrong about where the “money card” is hiding, and then flops the cards over one by one. For Facebook’s prestidigitation on content moderation, last month’s announcement of the initial 20 highly regarded experts tapped as members of its independent oversight board is the second card flop, and, predictably, the money card is not there.
The ongoing sleight of hand performed by Facebook is subtle but fundamental. The board was set up as truly independent in every way, from member selection to case selection to the board’s internal governance. In terms of its scope and structure, it is guided by previously released bylaws to primarily handle a small set of content removal cases (which come up to the board after the regular appeals process is exhausted) and to direct Facebook to change its decisions in those cases. To a much lesser extent, the Board can also provide input or recommendations about Facebook’s content moderation policies, although time and resources are not allocated for this. Facebook is not obligated in any way to follow those policy recommendations; it must simply respond within 30 days and describe any action it may take.
In the pages of the San Francisco Chronicle’s Open Forum, and elsewhere, I and others have called attention to this empty action as far back as September 2019, at the first card flop: the public release of the Board’s charter and bylaws. The project continued unabated and unchanged as friendly experts extolled the hard work of the team and preached optimism. Glaring concerns over the Board’s advisory-at-best, non-binding power were not only left unaddressed but actually dismissed, with assurances that board member selection, last month’s flop, would be where the money card turned up. Can you spot the inconsistency? It doesn’t matter if you have the smartest independent advisors if you’re not giving them the opportunity to actually impact what you do. Of course, the money card wasn’t there.
In early May, the Menlo Park-based company released the list of its Oversight Board membership, with impressive names (former heads of state, Nobel Prize laureates and subject matter experts from around the world). Because the Board is truly independent, Facebook’s role was minimal: beyond drawing up the structure and bylaws in consultation with experts from around the world (full disclosure: the author was involved in one round of consultations in mid-2019), it directly chose only the four co-chairs, who were then heavily involved in selecting the other 16 members. A lot of the chatter around this announcement focused, predictably, on who the members are, whether the board is diverse, whether it is experienced enough, and so on, while some even focused on how independent the board truly is. As the current crisis is showing, none of that matters.
As we witness the Board’s institutionalized, structural and political inability to perform oversight, it is becoming entirely clear that Facebook is not at all committed to fixing its content moderation problems in any meaningful way, and that political favor is more important than consistently applied policies. There is no best-case scenario anymore, as the Board can only fail or infect the rest of the industry. And what is a lose-lose for all of us will likely still be a win-win for Facebook.
The bad scenario is the likeliest: the Board is destined to fail. While Zuckerberg’s original ideas of transparency and openness were great on paper, the Board quickly turned into just a potential shield against loud government voices (such as Big Tech antagonist Sen. Hawley). Not only is that not working (Sen. Hawley responded to the membership list with even harsher rhetoric), but the importance placed on the optics rather than the reality of solving this problem is even more obvious now. Giving the Board few, if any, real leverage mechanisms over the company can at most build a shiny Potemkin village, not an oversight body. If we dispense with all the readily available evidence to the contrary and give Facebook the benefit of the doubt that it tried, the alternative explanations for this rickety and impotent construction are not much better. It may be that having the final say over difficult cases, the Board’s main job, is not something Facebook was comfortable doing by itself anyway (and who can blame them, given the pushback the platform gets with any high-profile decision). Or it may be a bizarre allegiance to the flawed constitutional law perspective that Facebook can build itself a Supreme Court, which makes the Board act as an appellate court of sorts, with a vague potential for creating precedent rather than truly providing oversight.
If the Board’s failure doesn’t tarnish the prospects of a legitimate private governance model for content moderation, there’s a lot to learn about how to avoid unforced errors. First, we can safely say that while corporations may be people, they are definitely not states. Creating a pseudo-judiciary without any of the accouterments of a liberal-democratic state, such as a hard-to-change constitution, co-equal branches and some sort of social contract, is a recipe for disaster. Second is a fact that theory, literature and practice have long argued: structure fundamentally dictates how this type of private governance institution will run. And with an impotent Board left mostly to bloviate after the fact, without any real means to change the policies themselves, this structure clearly points to a powerless but potentially loud “oversight” mechanism, pushed to the front as a PR stunt but unequipped to deal with the real problems of the platform. Finally, we see that even under intense pressure from numerous transpartisan groups, and with a potential openness to fixing a wicked problem, platforms are very unwilling to actually give up, even partly, their role and control in moderating content, but will gladly externalize their worst headaches. If their worst headaches were aligned with the concerns of their users, that would be great, but creating “case law” for content moderation is an exercise in futility, as the company struggles to reverse-engineer Trump-friendly positions with its long-standing processes. We don’t have lower court judges who get to dutifully decide whether something is inscribed in the board’s previous actions. We have either overworked, underpaid and scarred people making snap decisions every minute, or algorithms illiterate in irony and nuance, poised to interpret these decisions mechanically. And more to the point, we have executives deciding to provide political cover to powerful players rather than enforce their own policies, knowing full well they’re not beholden to any oversight, since even if the Board were already up and running, by the time it ruled on this particular case, if ever, the situation would no longer be of national importance.
As always, there still is a solution. The Oversight Board may be beyond salvaging, but the idea of a private governance institution, where members of the public, civil society, industry and even government officials can come together and try to reach common ground on what the issues are and what the solutions might be, should still flourish, and should not be thrown away simply because Facebook’s initial attempt was highly flawed. Through continued vigilance and genuine, honest critiques of its structure and real role in the Facebook ecosystem, the Oversight Board can, at best, register as just one experiment of many, not a defining one, and we can soldier on with more diverse, inclusive, transparent and flexible industry-wide dialogues and initiatives.
The worst-case scenario is if the Board magically coasts through without any strong challenge to its shaky legitimacy or its impotent role. The potential for this to happen is there, since there are more important things in the world to worry about than whether Facebook’s independent advisory body has any teeth. In that case, Facebook intends to, one way or another, franchise it to the rest of the industry. And that would be the third, and final, flop. However, as I hope you have figured out by now, the money card wouldn’t be there either. The money card, the card that Facebook never actually intended to give away or even show us, the power over content moderation policies, was never embedded in the structure of the board, its membership or any potential industry copycats that could legitimize it. This unexpected event allowed us to take a peek at the cards: the money card is still where it was all along, in Facebook’s back pocket.
David Morar is an Associate Researcher at the Big Data Science Lab at the West University of Timisoara, Romania.
Reader Comments
The teeth would be in public transparency and facebook's inherent accountability to its customers. The board's teeth come from publicly disclosing problems with Facebook's practices and leaving facebook to face the backlash if they ignore the advice.
They still need to balance what is good for their business against what the board recommends. It would certainly be possible to hamstring facebook completely using content moderation practices alone, and giving up final control over their content moderation practices is basically giving up control of their business.
Checkbook "apologies"
Do Something
If you can't really do something to cure the problem, then just give them a placebo. If it buys enough time, then eventually it won't be your problem anymore. I have to admit, it's a nefariously genius solution.
Re:
Thanks for the comment! I would say that currently there isn't any inherent accountability mechanism for Facebook to its customers that actually has any real effect. Public disclosure of issues with Facebook is what a lot of the people now on the board have devoted their professional careers to, and there hasn't been any major structural change to how FB works with regard to content moderation.
I totally agree and believe that is a very fair critique (your second paragraph), but the difference is that a) they promised they would, and b) they are bastardizing and improperly putting into practice a potential solution to this, which would be multistakeholder governance bodies (internal, external, industry, etc., many flavors), thus delegitimizing it further. And that's where I stand as someone who has studied this form of governance very closely, and has told the FB people (otherwise very capable, hard-working and sharp people) as much, trying to make them understand the pitfalls of creating it as such.
Re: Checkbook "apologies"
Just wanted to say, the make-up of the board is actually incredibly impressive in terms of almost every metric (I would also say that having the Vice-President of Cato, a scholar who has studied and written on free speech for a while, maybe counts as a libertarian).
Facebook will never willingly do a thing that will cost them money; the sooner people realise that the better. They would not put neutral parties in a position to set company policy and will never, ever act in the interest of the greater good until people do more than simply ignore Uncle Qbert's QAnon conspiracies and targeted political hit pieces from the latest spiritual successor to Cambridge Analytica, and start boycotting advertisers or leaving the service.
Re: Re:
Being someone who just plain stopped using their services because I didn't agree with their policies, I have a hard time swallowing that they have no meaningful accountability to their customers, but if that is truly the case I think it needs to be addressed through anti-trust mechanisms before anything else can be solved.
Realistically, I think if the board came out publicly and said something like "Look, we are seeing anti-conservative bias in your moderation... we recommend these changes here" and facebook said something like "yeah, we looked at this and we decided we aren't doing any of that," they would have both the public and the government down their throat so fast that it being "their final decision" wouldn't mean much.
Re: Checkbook "apologies"
Have you ever been right in any single one of your posts here? I can't recall one that was based on fact.
"the Blaze"
Lol, Glenn Beck's vanity project that he set up after he got fired for losing Fox too much money with his trolling is the best non-Fox example of a conservative voice you can come up with?
I tell you what, we'll take you seriously when extremist trolls are not the only examples you have of people you want us to take seriously. There are serious conservative voices out there, maybe you should start listening to them instead of the con artists?
Re: Do Something
Actually, it is a bit insane.
Do private companies have a say in what they host or not?
And when people complain about Twitter and YouTube deleting videos we're told they have every right and STFU.
So which is it?
Re: Re: Re:
Who is it that you think Facebook's customers are? You, mom, dad, Aunt Karen, Uncle Joe? You're not paying to use the service, so think again.
Regardless, this thread has the false notion that Facebook choosing to do content moderation (any way they see fit) is actually illegal in any way. It's not, and we all know why even if we choose not to accept it.
The fact that they threw together some board while also dictating how they would respond to that same board really means nothing. Enough people shouted "DO SOMETHING" so that's what Facebook did. See where moral panic gets us?
I'm not defending them, but as a company they really do get to choose how they operate. I too dislike a lot of their business practices and have privacy concerns.
Re: Re: Re: Re:
Facebook's money comes from their users. The users leave and facebook crumbles. Feel free to substitute users for customers if you don't like the semantics of the term customer here as it makes no difference.
Re: Checkbook "apologies"
Appoint a bunch of people to a board with no power, no attempt at political diversity (anyone at Fox News, the Blaze, etc.), not even libertarians or free speech advocates.
This is simply false. There are multiple people who are "free speech advocates" and certainly some libertarians in the bunch.
Must you always lie? We've pointed out your kneejerk tendencies in the past, and yet you continue. Why?
Re: Checkbook "apologies"
A large publicly traded company takes an action to clean up its image just for the PR?
Fetch me That One Guy's fainting couch!
Seriously though, it's still a good idea to criticize moves like these, because theater or not, there's a chance some good can come of this if done well.
Re:
How does one boycott an advertiser?
Re:
Whether they are allowed to do something is different from whether they should. It is very easy to acknowledge something is legal while criticizing it for being immoral.
Re: Re:
So taking free speech away from people is moral?
I'm glad you're not in charge of who can speak and who can't.
Re: Re: Re:
Nobody's having their free speech taken away. The only ones demanding such are the ones insisting that FB et al should lose the right to control what happens on their own property. Those using those sites retain their free speech rights, even the ones kicked off that property. You have a right to speech, not a right to use someone else's stuff to make it louder.
If this is still too difficult for you to understand, you're way behind on the conversation here.