Facebook Oversight Board's First Decisions... Seem To Confirm Everyone's Opinions Of The Board
from the take-a-deep-breath dept
Last week, the Oversight Board -- which is the official name that the former Facebook Oversight Board wants you to call it -- announced decisions on the first five cases it has heard. It overturned four Facebook content moderation decisions and upheld one. Following the announcement, Facebook said that (as it had promised) it had followed all of the Oversight Board's decisions and reinstated the content in the overturned cases (in one case, involving the takedown of a breast cancer ad that had been deemed to violate the "no nudity" policy, Facebook had actually reinstated the content last year, after the Board announced it was reviewing that decision). If you don't want to wade into the details, NPR's write-up of the decisions and policy recommendations is quite well done and easily digestible.
If you want a more detailed and thoughtful analysis of the decisions and what this all means, I highly recommend Evelyn Douek's detailed analysis of the key takeaways from the rulings.
What I'm going to discuss, however, is how the decisions seem to have only reinforced... absolutely everyone's opinions of the Oversight Board. I've said before that I think the Oversight Board is a worthwhile experiment, and one worth watching, but it is just one experiment. And, as such, it is bound to make mistakes and adapt over time. I can understand the reasoning behind each of the five decisions, though I'm not sure I would have ruled the same way.
What's more interesting to me, though, is how so many people are completely locked into their original view of the board, and how insistent they are that the first decisions only confirm their position. It's no secret that many people absolutely hate Facebook and view everything the company does as unquestionably evil. I'm certainly not a fan of many of the company's practices, and I don't think the Oversight Board is as important as some make it out to be, but that doesn't mean it's not worth paying attention to.
But I've tended to see a few different responses to the first rulings, which strike me as amusing, since none of the positions is really disprovable:
1. The Oversight Board is just here to rubberstamp Facebook's decisions and make it look like there's some level of review.
This narrative is somewhat contradicted by the fact that the Oversight Board overturned four decisions. However, people who hold this view retort that "well, of course the initial decisions have to do this to pretend to be independent." Which... I guess? But that seems like a lot of effort for no real purpose. To me, at least, the first five decisions are not enough to make a judgment call on this point either way. Let's see what happens over a longer time frame.
2. The Oversight Board is just a way for Facebook and Zuckerberg not to take real responsibility.
I don't see how this one is supportable. It's kind of a no-win situation either way. Every other company in the world that does content moderation has final say over its own decisions, because it's their website. Facebook is basically the first and only site so far to hand off those decisions to a third party -- and it did so after a ton of people whined that Facebook had too much power. The fact that this body is now pushing back on Facebook's decisions is at least some initial evidence that the Board might force Zuckerberg to take more responsibility. Indeed, the policy recommendations (not just the decisions directly on content moderation) suggest that the Board is taking its role as an independent watchdog over how Facebook operates somewhat seriously. But, again, it's perhaps too early to tell, and this will be a point worth watching.
3. The Oversight Board has no real power, so it doesn't matter what it does.
The thing is, while this may be technically true, I'm not sure it matters. If Facebook actually follows through on its commitment to abide by the Board's rulings, and the Board continues down the path it has set of being fairly critical of Facebook's practices, then for all intents and purposes it does have real power. Sometimes the power comes simply from Facebook feeling generally committed to following through, rather than from any kind of actual enforcement mechanism.
4. The Oversight Board is only reviewing a tiny number of cases, so who cares?
This is clearly true, but again, the question is how it will matter in the long run. At least from the initial set of decisions, it's clear that the Oversight Board is not just taking a look at the specific cases in front of it, but thinking through the larger principles at stake, and making recommendations back to Facebook about how to implement better policies. That could have a very big impact on how Facebook operates over time.
As for my take on all of this? As mentioned up top, I think this is a worthwhile experiment, though I've long doubted it would have that big of an impact on Facebook itself. I see no reason to change my opinion on that yet, but I am surprised at the thoroughness of these initial decisions and how far they go in pushing back on certain Facebook policies. I guess I'd update my opinion to say I've moved from thinking the Oversight Board had a 20% chance of having a meaningful impact to thinking it's maybe 25 to 30% likely. Some will cynically argue that this is all for show, and that the first cases had to look this way. And perhaps that's true. I guess that's why no one is forced to set their opinion in stone just yet, and we'll have plenty of time to adjust as more decisions come out.
Filed Under: appeals, breast cancer, content moderation, free speech, myanmar, nudity, review
Companies: facebook, oversight board
Reader Comments
I disagree with the rulings in the France/COVID-19 “cure” and Myanmar/hate speech cases — pretty much for reasons that stand as the opposite of the Oversight Board’s — but I do admit that I’m happy to see the Board not merely rubber-stamp Zuckerberg’s bullshit.
whatever...
As I don't have a dog in this particular horse show, all I can say is:
¯\_(ツ)_/¯
Aftermath
In this age of social media, censorship for even a few hours makes a difference for events. They may decide to reinstate the posts, but it's water under the bridge now.
Re: Aftermath
Not censorship.
Facebook is investigating Facebook, to see if Facebook did anything wrong.
Facebook will find that Facebook did nothing "wrong", but Facebook will make recommendations that Facebook will follow.
Zuckerberg
Facebook is a public corporation with stock. Its stockholders are not personally liable for what the corporation does -- that's the whole point of a corporation, public or private.
Mark Zuckerberg is a shareholder, and in that respect is also not personally liable. He's also the CEO and even in that respect is not personally liable.
There's no "real responsibility" other than the management team (day to day) which includes MZ as its head, or the BoD (long term management).
Attempting to "put this on" MZ (or JB and Amazon) ignores the whole point of a corporate structure. Could JB give his personal fortune to increase AMZ employees' pay? Sure. Is he required to? No. Could MZ effect changes to FB? Sure. Is he required to? No.
If any CEO of a publicly traded corporation unilaterally made changes which appeared to reduce shareholder value... that person would be responsible for that loss. When discussing FB, the "value" of FB (not a member for years) to the stockholders can be vastly different than the "value" of what FB does with its content.
If the argument is that MZ should change FB, then FB (Inc.) should put out PR saying roughly that "In N months we'll be focusing more on X and less on Y" and giving those stockholders an opportunity to exit that market. THEN and ONLY THEN make changes.
This might make sense except that FB makes money doing what they do. This makes money for the shareholders. This makes money for the management team. Changing this would require a reduction in the top line and the bottom line. Shareholders (including #1 MZ) wouldn't want that, as "growth" is the golden goose.
Summarizing, if "society" wants FB to change the way it does business, "society" can de-reward FB by not participating, by not viewing ads, by not allowing FB tracking on other sites. "Society" has failed to do so for over a decade, and calling on one man -- in the unenviable position of being a market maker, CEO, and shareholder -- to do so is hypocritical.
I own IBM stock. IBM bought Redhat. Redhat got CentOS. CentOS got switched to a rolling distribution by RH. Am I somehow responsible for this? Of course not. What can I do? Divest from IBM? Write a letter? If I was CEO of IBM and divested CentOS would shareholders sue me for reducing their value?
Yeah, they sure would.
E
Re:
Facebook is investigating Facebook, to see if Facebook did anything wrong.
Facebook will find that Facebook did nothing "wrong", but Facebook will make recommendations that Facebook will follow.
To be fair, that is how EVERY website views content moderation, and the interesting thing here is actually how it's DIFFERENT from what you describe. While FB did put up the money to start the Oversight Board, it only hired the initial members (and they were chosen in part because of their independence from FB); everyone since has been hired by the Board itself, not by FB. So it's NOT Facebook investigating Facebook.
Re: Zuckerberg -- failure of the market
Ehud:
You have aptly described a market failure, with some peculiar and novel features:
a) We have a corporation, so no individual is responsible for social transgressions.
b) Powerful individuals tend to get sued for repairing social transgressions when the repairs carry significant net costs to the corporation, and rewarded for growth at any cost.
c) FB is paid for advertising screens, more or less, by anonymous customers,
d) They "mine" their supply of screens from the public, willing or not, and avoid the costs of negative externalities they cause, such as radicalization and popularization of conspiracy theories.
Now, they do have to pay attention to their public image, as they have competition... and this is definitely an interesting effort. I'm not hopeful for it, though, because of context: adding "This is horrible..." to the front of your favorite "bad" content substantially changes whether it should be removed or not, and so might sharing it with a reasonable person such as yourself, who might take action to prevent whatever it is from repeating.
I also see limited success in this regard without, as with YouTube, addressing the "recommendation" algorithms, which tend to promote progressively more (but only marginally more) antisocial content to those who might then act antisocially as a result. [Note: this assumes an agreed definition of antisocial exists; see your favorite protest for proof it doesn't]
Re: whatever...
Just wait for the idiots to come out of the woodwork claiming Mike is now also shilling for Facebook.
Re: Re: Zuckerberg -- failure of the market
"We have a corporation, so no individual is responsible for social transgressions."
Are you complaining about capitalism?
"failure of the market"
I've seen this rather nebulous claim elsewhere and was unable to determine its meaning at that time; perhaps you could explain what it means within the context of your comment.
. What market? The social media market? - lol
. How is it a failure, just because you say so? What is it supposed to be doing that it is not?
" addressing the "recommendation" algorithms, that tend to promote progressively more (but only marginally) antisocial content to those that might then act antisocially as a result."
Perhaps if you were to rephrase this then others might be able to parse it.
The oversight board is a publicity stunt more than anything else. A way for Facebook to pretend they consider moderation decisions carefully and take context into account instead of hitting the delete button at the first sight of a bad word or phrase.
As a research project to (try to) make the list of forbidden words slightly less arbitrary, it may have some value. But since it's physically impossible to scale anything resembling careful consideration to the size of Facebook, it's not going to change the underlying nature of its moderation.
Whether some rando's insignificant post gets "reinstated" months later is of no real consequence to anyone. Least of all Facebook. The Trump ban is the only interesting decision they could even comment on.
Re: Re: Re: Zuckerberg -- failure of the market
No individual being responsible for the social costs of a corporation is a fact of life, like it's raining today. Or maybe it's just an axiom.
In some sort of theory, Adam Smith's invisible hand is supposed to lead to social benefits.
Definition: Market failure is when Adam Smith's hand leads to significant social costs.
Claim: Facebook's recommendation algorithms have led people to antisocial behavior, as a side effect of maximizing engagement. The content recommended may not be sufficiently antisocial to ban per se. (Think flat earth theory, which has been a sort of in-joke among engineering students.) Define antisocial however you need to, but recognize that it's like "good" content -- the definition changes with the beholder.
the Oversight Board -- which is the official name that the former Facebook Oversight Board wants you to call it
Because it's the only oversight board in existence or because it oversees everything in existence?
Everyone shall henceforth refer to this comment as "the comment".
Re:
There is this magical thing called "context". It's called "the Oversight Board" since there is only one that applies specifically to Facebook. I mean, when you refer to "the house on the hill" you aren't saying it's the only hill in existence with the only house on it.
Re: Re: Re: Re: Zuckerberg -- failure of the market
Forget corporations - there isn't even a law being broken or a legal responsibility to make /anybody/ liable here, which the would-be authoritarians keep forgetting. Nullum crimen sine lege isn't exactly a new legal principle, as hinted by its being in Latin! If lacking perfect insulation from nebulous social consequences counts as a market failure, then every government is a failed state and words have lost their meaning such that I am actually a giraffe wearing ice skates and a pricklehelm.