Facebook Ranking News Sources By Trust Is A Bad Idea... But No One At Facebook Will Read Our Untrustworthy Analysis
from the you-guys-are-doing-it-wrong-again dept
At some point I need to write a bigger piece on these kinds of things, though I've mentioned it here and there over the past couple of years. For all the complaints about how "bad stuff" is appearing on the big platforms (mainly: Facebook, YouTube, and Twitter), it's depressing how many people think the answer is "well, those platforms should stop the bad stuff." As we've discussed, this is problematic on multiple levels. First, handing over the "content policing" function to these platforms is, well, probably not such a good idea. Historically they've been really bad at it, and there's little reason to think they're going to get any better no matter how much money they throw at artificial intelligence or how many people they hire to moderate content. Second, it requires some sort of objective reality for what counts as "bad stuff." And that's impossible. One person's bad stuff is another person's good stuff. And almost any decision is going to get criticized by someone or other. It's why a bunch of foolish people have suddenly started falsely claiming that these platforms are required by law to be "neutral." (They're not.)
But, as more and more pressure is put on these platforms, eventually they feel they have little choice but to do something... and inevitably, they try to step up their content policing. The latest, as you may have heard, is that Facebook has started to rank news organizations by trust.
Facebook CEO Mark Zuckerberg said Tuesday that the company has already begun to implement a system that ranks news organizations based on trustworthiness, and promotes or suppresses its content based on that metric.
Zuckerberg said the company has gathered data on how consumers perceive news brands by asking them to identify whether they have heard of various publications and if they trust them.
“We put [that data] into the system, and it is acting as a boost or a suppression, and we’re going to dial up the intensity of that over time," he said. "We feel like we have a responsibility to further [break] down polarization and find common ground.”
But, as with the lack of an objective definition of "bad," you've got the same problem with "trust." For example, I sure don't trust "the system" that Zuckerberg mentions above to do a particularly good job of determining which news sources are trustworthy. And, again, trust is such a subjective concept that lots of people inherently trust certain sources over others -- even when those sources have long histories of being full of crap. And given how much "trust" is actually driven by confirmation bias, it's difficult to see how this solution from Facebook will do any good. Say, for example (totally hypothetically), that Facebook determines that Infowars is untrustworthy. Many people may agree that a site famous for spreading conspiracy theories and pushing sketchy "supplements" you supposedly need because of conspiracy theory x, y, or z is not particularly trustworthy. But how are the people who do like Infowars likely to react? They're not suddenly going to decide the NY Times and the Wall Street Journal are more trustworthy. They're going to see it as Facebook conspiring to suppress the truth.
Confirmation bias is a hell of a drug, and Facebook trying to push people in one direction is not going to go over well.
To reveal all of this, Zuckerberg apparently invited a bunch of news organizations to talk about it:
Zuckerberg met with a group of news media executives at the Rosewood Sand Hill hotel in Menlo Park after delivering his keynote speech at Facebook’s annual F8 developer conference Tuesday.
The meeting included representatives from BuzzFeed News, the Information, Quartz, the New York Times, CNN, the Wall Street Journal, NBC, Recode, Univision, Barron’s, the Daily Beast, the Economist, HuffPost, Insider, the Atlantic, the New York Post, and others.
We weren't invited. Does that mean Facebook doesn't view us as trustworthy? I guess so. It seems unlikely, then, that he'll care much about what we have to say, but we'll say it anyway (though you probably won't be able to read this on Facebook):
Facebook: You're Doing It Wrong.
Facebook should never be the arbiter of truth, no matter how much people push it to be. Instead, it can and should be providing tools for its users to have more control. Let them create better filters. Let them apply their own "trust" metrics, or share trust metrics that others create. Or, as we've suggested on the privacy front, open up the system to let third parties come in and offer up their own trust rankings. Will that reinforce some echo chambers and filter bubbles? Perhaps. But that's not Facebook's fault -- it's part of the nature of human beings and confirmation bias.
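To make that concrete, here's a minimal sketch (in TypeScript, with entirely hypothetical names -- nothing here reflects any actual Facebook API) of what a pluggable trust overlay could look like: the platform hands over the raw feed, and the user picks whichever scoring source they want, whether it's a list they built themselves or one maintained by a third party.

```typescript
// Hypothetical sketch: a "trust overlay" chosen by the user, not the platform.
// All names and shapes are illustrative assumptions.

interface FeedItem {
  sourceDomain: string;    // e.g. "example-news.com"
  engagementScore: number; // whatever baseline ranking the platform already computes
}

interface TrustOverlay {
  name: string;
  // Returns a multiplier: >1 boosts an item, <1 suppresses it, 1 leaves it alone.
  trustMultiplier(sourceDomain: string): number;
}

// One overlay the user builds from sources they already trust...
const myOwnList: TrustOverlay = {
  name: "my-own-list",
  trustMultiplier: (domain) =>
    ["example-news.com", "another-outlet.org"].includes(domain) ? 1.5 : 1.0,
};

// ...and one maintained by a hypothetical third party the user opts into.
const thirdPartyRatings: TrustOverlay = {
  name: "some-ratings-nonprofit",
  trustMultiplier: (domain) => (domain.endsWith(".example") ? 0.25 : 1.0),
};

// The platform's only job here: apply whichever overlay the user selected.
function rankFeed(items: FeedItem[], overlay: TrustOverlay): FeedItem[] {
  return [...items].sort(
    (a, b) =>
      b.engagementScore * overlay.trustMultiplier(b.sourceDomain) -
      a.engagementScore * overlay.trustMultiplier(a.sourceDomain)
  );
}

// Same feed, two different pictures of "trust," neither one dictated by Facebook:
// rankFeed(feed, myOwnList);
// rankFeed(feed, thirdPartyRatings);
```

The point of the sketch is simply that "trust" becomes a setting the user controls, not a dial Zuckerberg turns up over time.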
Or, hey, Facebook could take a real leap forward and move away from being a centralized silo of information and truly disrupt its own setup -- pushing the information and data out to the edges, where the users could have more control over it themselves. And not in the simplistic manner of Facebook's other "big" announcement of the week about how it'll now let users opt out of Facebook tracking them around the web (leaving out that it kinda needed to do this to deal with the GDPR in the EU). Opting out is one thing -- pushing the actual data control back to the end users and distributing it is something entirely different.
In the early days of the web, people set up their own websites and had pretty much full control over the data and what was done there. It was much more distributed. Over time we've moved more and more to this silo model, in which Facebook is the giant silo where everyone puts their content... and has to play by Facebook's rules. But with that came responsibility on Facebook's part for everything bad that anyone did on its platform. And, hey, let's face it, some people do bad stuff. The answer isn't to force Facebook to police all of the bad stuff; it should be to move back towards a system where information is more distributed, and we're not pushed toward certain content just because Facebook thinks it will lead to the most "engagement."
Push the content and the data out and focus on the thing that Facebook, at its core, has always been best at: the connection function. Connect people, but don't control all of the content. Don't feel the need to police the content. Don't feel the need to decide who's trustworthy and who isn't. Be the protocol, not the platform, and open up the system so that anyone else can provide a trust overlay, and let those overlays compete. It would take Facebook out of the business of having to decide what's good and what's bad and would give end users much more control.
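As a rough, purely hypothetical sketch of that "protocol, not platform" split (again, these interfaces are assumptions, not anything Facebook actually exposes): the platform keeps only the connection function -- who follows whom, and delivering their posts -- while ranking and trust live entirely in clients that anyone can write.

```typescript
// Hypothetical sketch of the "protocol, not platform" split. Illustrative only.

interface Post {
  author: string;
  sourceDomain: string;
  postedAt: number; // unix timestamp
}

// The part Facebook would keep: connections and delivery, with no ranking at all.
interface ConnectionProtocol {
  following(userId: string): Promise<string[]>;
  postsFrom(authors: string[]): Promise<Post[]>;
}

// The part anyone could build: a client that applies whatever scoring the user chose.
// Competing trust overlays are just different score functions.
async function buildTimeline(
  proto: ConnectionProtocol,
  userId: string,
  score: (post: Post) => number
): Promise<Post[]> {
  const authors = await proto.following(userId);
  const posts = await proto.postsFrom(authors);
  return posts.sort((a, b) => score(b) - score(a));
}

// e.g. a purely chronological client:
// buildTimeline(proto, "me", (p) => p.postedAt);
```

In a setup like that, Facebook never has to rule on anyone's trustworthiness; it just moves posts between people, and the scoring lives wherever the user wants it to.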
Facebook, of course, seems unlikely to do this. The value of the control is that it allows Facebook to capture more of the money from the attention generated on its platform. But, really, if it doesn't want to keep dealing with these headaches, this seems like the only reasonable way forward.
Filed Under: confirmation bias, fake news, mark zuckerberg, metrics, news, ranking, trust, trustworthy
Companies: facebook