Facebook Ranking News Sources By Trust Is A Bad Idea... But No One At Facebook Will Read Our Untrustworthy Analysis
from the you-guys-are-doing-it-wrong-again dept
At some point I need to write a bigger piece on these kinds of things, though I've mentioned it here and there over the past couple of years. For all the complaints about how "bad stuff" is appearing on the big platforms (mainly: Facebook, YouTube, and Twitter), it's depressing how many people think the answer is "well, those platforms should stop the bad stuff." As we've discussed, this is problematic on multiple levels. First, handing over the "content policing" function to these platforms is, well, probably not such a good idea. Historically they've been really bad at it, and there's little reason to think they're going to get any better, no matter how much money they throw at artificial intelligence or how many people they hire to moderate content. Second, it requires some sort of objective reality for what counts as "bad stuff." And that's impossible. One person's bad stuff is another person's good stuff. And almost any decision is going to get criticized by someone or other. It's why a bunch of foolish people are suddenly, and falsely, claiming that these platforms are required by law to be "neutral." (They're not.)
But, as more and more pressure is put on these platforms, eventually they feel they have little choice but to do something... and inevitably, they try to step up their content policing. The latest, as you may have heard, is that Facebook has started to rank news organizations by trust.
Facebook CEO Mark Zuckerberg said Tuesday that the company has already begun to implement a system that ranks news organizations based on trustworthiness, and promotes or suppresses its content based on that metric.
Zuckerberg said the company has gathered data on how consumers perceive news brands by asking them to identify whether they have heard of various publications and if they trust them.
“We put [that data] into the system, and it is acting as a boost or a suppression, and we’re going to dial up the intensity of that over time," he said. "We feel like we have a responsibility to further [break] down polarization and find common ground.”
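Taken at face value, the mechanism described -- a survey-derived trust number acting as a boost or suppression, with an intensity dial -- is easy to sketch. Everything below is an illustrative guess at how such a multiplier could work; the function names, numbers, and formula are all invented for this post, since Facebook's actual system is not public.

```python
# Hypothetical sketch of a survey-driven "trust multiplier" on feed
# ranking. All names, numbers, and formulas are illustrative assumptions,
# not Facebook's actual (non-public) system.

def trust_score(heard_of: int, trusts: int) -> float:
    """Fraction of surveyed users who trust a publication, among
    those who say they have heard of it."""
    return trusts / heard_of if heard_of else 0.0

def ranked_feed(stories, intensity=1.0):
    """Re-rank stories by engagement, boosted or suppressed by trust.

    `intensity` plays the role of the dial Zuckerberg says will be
    turned up over time: at 0, trust has no effect; the higher it goes,
    the more trust dominates raw engagement."""
    def score(story):
        # trust is in [0, 1]; 0.5 is neutral, above boosts, below suppresses
        return story["engagement"] * (1.0 + intensity * (story["trust"] - 0.5))
    return sorted(stories, key=score, reverse=True)
```

With `intensity=0` this is pure engagement ranking; turn the dial up and a distrusted-but-viral story drops below a trusted one. Which is exactly where the subjectivity problem bites: everything rides on where that trust number came from.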
But, as with the lack of an objective definition of "bad," you've got the same problem with "trust." For example, I sure don't trust "the system" that Zuckerberg mentions above to do a particularly good job of determining which news sources are trustworthy. And, again, trust is such a subjective concept that lots of people inherently trust certain sources over others -- even when those sources have long histories of being full of crap. And given how much "trust" is actually driven by confirmation bias, it's difficult to see how this solution from Facebook will do any good. Suppose, for example (totally hypothetically), that Facebook determines that Infowars is untrustworthy. Many people may agree that a site famous for spreading conspiracy theories and pushing sketchy "supplements" that you need because of conspiracy theory x, y or z is not particularly trustworthy. But how are the people who do like Infowars likely to react to this kind of thing? They're not suddenly going to decide the NY Times and the Wall Street Journal are more trustworthy. They're going to see it as Facebook conspiring to continue to suppress the truth.
Confirmation bias is a hell of a drug, and Facebook trying to push people in one direction is not going to go over well.
To reveal all of this, Zuckerberg apparently invited a bunch of news organizations to talk about it:
Zuckerberg met with a group of news media executives at the Rosewood Sand Hill hotel in Menlo Park after delivering his keynote speech at Facebook’s annual F8 developer conference Tuesday.
The meeting included representatives from BuzzFeed News, the Information, Quartz, the New York Times, CNN, the Wall Street Journal, NBC, Recode, Univision, Barron’s, the Daily Beast, the Economist, HuffPost, Insider, the Atlantic, the New York Post, and others.
We weren't invited. Does that mean Facebook doesn't view us as trustworthy? I guess so. So it seems unlikely that he'll care much about what we have to say, but we'll say it anyway (though you probably won't be able to read this on Facebook):
Facebook: You're Doing It Wrong.
Facebook should never be the arbiter of truth, no matter how much people push it to be. Instead, it can and should be providing tools for its users to have more control. Let them create better filters. Let them apply their own "trust" metrics, or share trust metrics that others create. Or, as we've suggested on the privacy front, open up the system to let third parties come in and offer up their own trust rankings. Will that reinforce some echo chambers and filter bubbles? Perhaps. But that's not Facebook's fault -- it's part of the nature of human beings and confirmation bias.
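To make the contrast concrete, here's a minimal sketch of what a pluggable, user-chosen trust overlay could look like. To be clear, nothing like this exists in Facebook's actual platform -- the API shape, the overlay names, and the sources are all hypothetical:

```python
# Hypothetical sketch of the "trust overlay" idea: the platform ranks
# nothing itself; each user picks a trust function, written by them or
# by a competing third party. No such Facebook API exists -- all names
# and sources here are invented.

from typing import Callable, Dict, List

TrustOverlay = Callable[[str], float]  # maps a source name to trust in [0, 1]

def apply_overlay(stories: List[Dict], overlay: TrustOverlay) -> List[Dict]:
    """Rank a user's feed with whichever trust overlay they subscribed to."""
    return sorted(stories, key=lambda s: overlay(s["source"]), reverse=True)

# Two competing third-party overlays a user could choose between
# (unknown sources default to a neutral 0.5):
def overlay_a(source: str) -> float:
    return {"Daily Bugle": 0.9, "The Gazette": 0.2}.get(source, 0.5)

def overlay_b(source: str) -> float:
    return {"Daily Bugle": 0.1, "The Gazette": 0.8}.get(source, 0.5)
```

The same feed ranked under `overlay_a` and `overlay_b` comes out in opposite orders -- and the platform never has to decide which ranking is "right," because the overlays compete for users instead.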
Or, hey, Facebook could take a real leap forward and move away from being a centralized silo of information and truly disrupt its own setup -- pushing the information and data out to the edges, where the users could have more control over it themselves. And not in the simplistic manner of Facebook's other "big" announcement of the week about how it'll now let users opt-out of Facebook tracking them around the web (leaving out that they kinda needed to do this to deal with the GDPR in the EU). Opting out is one thing -- pushing the actual data control back to the end users and distributing it is something entirely different.
In the early days of the web, people set up their own websites and had pretty much full control over the data and what was done with it. It was much more distributed. Over time we've moved more and more to this silo model, in which Facebook is the giant silo where everyone puts their content... and has to play by Facebook's rules. But with that came responsibility on Facebook's part for everything bad that anyone did on its platform. And, hey, let's face it, some people do bad stuff. The answer isn't to force Facebook to police all the bad stuff; it should be to move back toward a system where information is more distributed, and where we're not steered toward certain content just because Facebook thinks it will lead to the most "engagement."
Push the content and the data out and focus on the thing that Facebook has always been best at, at its core: the connection function. Connect people, but don't control all of the content. Don't feel the need to police the content. Don't feel the need to decide who's trustworthy and who isn't. Be the protocol, not the platform, and open up the system so that anyone else can provide a trust overlay -- and let those overlays compete. It would take Facebook out of the business of having to decide what's good and what's bad, and would give end users much more control.
Facebook, of course, seems unlikely to do this. The value of all that control is that it allows the company to capture more of the money from the attention generated on its platform. But, really, if it doesn't want to keep dealing with these headaches, this seems like the only reasonable way forward.
Thank you for reading this Techdirt post. With so many things competing for everyone’s attention these days, we really appreciate you giving us your time. We work hard every day to put quality content out there for our community.
Techdirt is one of the few remaining truly independent media outlets. We do not have a giant corporation behind us, and we rely heavily on our community to support us, in an age when advertisers are increasingly uninterested in sponsoring small, independent sites — especially a site like ours that is unwilling to pull punches in its reporting and analysis.
While other websites have resorted to paywalls, registration requirements, and increasingly annoying/intrusive advertising, we have always kept Techdirt open and available to anyone. But in order to continue doing so, we need your support. We offer a variety of ways for our readers to support us, from direct donations to special subscriptions and cool merchandise — and every little bit helps. Thank you.
–The Techdirt Team
Filed Under: confirmation bias, fake news, mark zuckerberg, metrics, news, ranking, trust, trustworthy
Companies: facebook
Reader Comments
Everything old is new again
[ link to this | view in thread ]
It's pretty simple: if it's a crime, then hide it (or remove it, if it's one of those universal things like child porn) and go after the goddamn source. Otherwise, just let people hide it and make their own 'feeds' rosy, unicorn-filled things.
If countries like Germany want to push ethereal 'fake news' or 'hate speech' laws then filter the heck out of their content but let other more mature countries have the thing censorship free.
Is it that hard?
Human moderators
MSM Trust
Most MSM outlets have a reputation for being trustworthy - which survives right up to the point where they report on something you have actual first-hand knowledge about - then you realise how bad they actually are - and you never trust them again.
It is not a matter of trusting some outlets - it is a matter of using the output of multiple sources, where you know at least roughly what their bias and expertise is - and then forming your own opinion.
Re: Everything old is new again
You kidding me?
Facebook learned everything from AOL.
They Openly Admit It.
Pretty much says it all.
[ link to this | view in thread ]
Re: They Openly Admit It.
Re: Re: They Openly Admit It.
Googling the source of that "quote" certainly does say it all.
Dismissing a story because of the messenger without bothering to check further says something about you.
Re: Re: They Openly Admit It.
In addition, you would expect them to do exactly that - because that is exactly what everyone (including the conservative media) does.
More worrying is how Facebook is kowtowing to the most illiberal regimes on the planet:
https://www.theguardian.com/technology/2017/jul/19/facebook-pakistan-blasphemy-laws-censorship
Do you EVER, for even ten seconds, worry about GOOGLE?
No, and there's a reason.
[Repeated from next story after seeing this above:
Astute enough to spot which GIANT "platform" is left out? As always here.]
Context, context....
Alex Jones posts something "dangerous" on infowars.
Techdirt writes about Alex Jones calling him a "paid troll", and links to the original.
What should stay "up"? Yes, some function has to decide what we see... but deleting bad stuff isn't it!
Re: Do you EVER, for even ten seconds, worry about GOOGLE?
"trustworthy" MSM can easily become war propaganda cheerleaders
There were many small American news outlets -- including Infowars -- that were reporting a much more accurate analysis of the Iraq invasion than any of the "trusted" American mainstream media. The well-worn line "if we only knew then what we know now" got thrown around an awful lot in the aftermath of not finding any of the claimed weapons that were supposed to justify the invasion, but the sad fact is that everything was already known "then" (before the invasion) and was all over the internet for anyone who cared to do their own research and draw their own conclusions. It was primarily only the US mainstream media that got it wrong, while the rest of the world and the "conspiracy" minded websites were almost universally correct in their skepticism and their reporting of critical facts that the MSM refused to touch.
Many foreign newspapers, including the UK's Independent and Guardian, dared to question the official war narrative, but not a single U.S.-based news outlet dared stray from the official Bush-Cheney war propaganda. The American mainstream press had completely stopped being journalists and were instead acting as cheerleaders for a bogus war that ultimately claimed millions of lives and trillions of dollars.
I can't believe 15 years have already passed, and I'm still literally seething with rage about the mainstream media leading this country over a cliff -- and then blaming it on everyone else but themselves.
The 2003 Iraq invasion is the single biggest reason why I don't trust any of the American mainstream media news outlets -- and I'm sure I never will for as long as I live.
Re: MSM Trust
Not that there hasn't been a disconnect between what shows up in MSM and most people's direct experience. Tell me about the people in your life who have died, and compare that to what you see in the media. I know some that have wrecked cars and died, but this is not news. We'll lose a few hundred to fentanyl, but this is not on the news. 20,000 or so will shoot themselves and die this year, but this is not news.
Re: Do you EVER, for even ten seconds, worry about GOOGLE?
Astute enough to spot which GIANT "platform" is left out? As always here.]
Er, YouTube is Google. The reason I included YouTube rather than Google in general is that YouTube is Google's primary user generated content platform, and thus the main platform that people are demanding have their content moderated on...
Re: Re: MSM Trust
This "getting stuff all screwed up" isn't exactly unique to traditional "main stream media". It's a function of the people in the system.
True but what is unique to MSM is that some people DO trust them.
Re: "trustworthy" MSM can easily become war propaganda cheerleaders
I have no problem with this
It's no different from a search engine ranking search results based on their best guess as to how "useful" the search result is instead of just counting the number of times the keywords show up on the page. Sometimes they'll guess well, sometimes they won't and useful results will be ranked lower than some pathetic spam.
Re: Re: Re: They Openly Admit It.
Whine all you want.
[ link to this | view in thread ]
The dog person reads the article and rates it as accurate.
The cat person reads the article and rates it as questionable.
Six months go by and both revisit the article in light of new information.
The dog person rates it as accurate and credible.
The cat person rates it as debunked and not-credible.
Therein lies the challenge with building algorithms to automate this process. Let's say you assign a credibility score based on historic human ratings of credibility by source -- how frequently that source, or author, accurately reports the news. As polarized as sources are today, the dog person and the cat person training an algorithm will come away with completely different automated processes, each reflecting their own subjective worldview.
That is where we are today.
We have automated algorithms scoring high credibility to sources reporting that the CrowdStrike analysis of the DNC servers suggests it was done by Russians.
We have Bill Binney and associated security researchers claiming that what is known about the event points to an insider at the DNC.
Huge political stakes are at play so expect both sides of that coin to fight tooth and nail to force their will over control of the algorithms that do get deployed.
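The commenter's point about training on human ratings can be shown in a few lines. This is a deliberately naive sketch with invented sources and made-up ratings: the exact same averaging pipeline, fed the dog person's labels and the cat person's labels, produces opposite "credibility" scores.

```python
# Naive sketch of a "credibility score" learned from human ratings.
# Sources and ratings are invented; the point is that the model just
# reproduces whichever raters trained it.

from collections import defaultdict

def train_credibility(ratings):
    """Average historical human ratings (1 = accurate, 0 = questionable)
    per source -- the naive scheme the comment describes."""
    totals, counts = defaultdict(float), defaultdict(int)
    for source, score in ratings:
        totals[source] += score
        counts[source] += 1
    return {s: totals[s] / counts[s] for s in totals}

# The same kinds of articles, rated by the dog person and the cat person:
dog_ratings = [("Kennel Times", 1), ("Kennel Times", 1), ("Cat Chronicle", 0)]
cat_ratings = [("Kennel Times", 0), ("Cat Chronicle", 1), ("Cat Chronicle", 1)]

dog_model = train_credibility(dog_ratings)  # {"Kennel Times": 1.0, "Cat Chronicle": 0.0}
cat_model = train_credibility(cat_ratings)  # {"Kennel Times": 0.0, "Cat Chronicle": 1.0}
# Identical pipeline, opposite "objective" credibility scores.
```

Any real system would be far more elaborate, but the elaboration doesn't remove the dependence on who supplied the labels.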
[ link to this | view in thread ]
Re: Re: "trustworthy" MSM can easily become war propaganda cheerleaders
It's interesting (but not unexpected) that even Trump's harshest critics in the press loudly applauded him when he bombed the Syrian government, both last year and this year. And what could be better for newspaper sales and cable news ratings than war against Russia?
What about country bias?
Re:
Therein lies the challenge with building algorithms to automate this process. Let's say you assign a credibility score based on historic human ratings of credibility by source -- how frequently that source, or author, accurately reports the news. As polarized as sources are today, the dog person and the cat person training an algorithm will come away with completely different automated processes, each reflecting their own subjective worldview.
Actually it is far more complicated than that.
There is no straight left-right axis anymore.
Foreign policy is even more complicated.
Compare for example George Galloway's take on the world with that of Peter Hitchens.
On the Israel vs Arab (Muslim) axis they would violently disagree.
On the Russia vs the West axis they agree (with each other) and both disagree with the western establishment view.
The fact is that most people (and news outlets) are a mixture of different biases on different topics - and they may change with time.
Generally I tend to trust the BBC and the Guardian - but on some issues I know that they have a bias and so I will look at other sources too.
Re: Re: Everything old is new again
Re:
How frequently that source, or author, accurately reports the news.
A much better criterion would be "how often do they report, or give publicity to, alternative opinions?"
Re: MSM Trust
Re: What about country bias?
Re: Do you EVER, for even ten seconds, worry about GOOGLE?
The easier thing to do is to not give you what you want because giving you what you want is an endless chore akin to filling a bottomless pit...
[ link to this | view in thread ]
That is not nearly enough for what should happen to FB!
Look at Cambridge Analytica: there the nuclear option has run its course, and rightfully so.
Therefore FB is next for extermination!
Cheers, oliver
Trust for Sale
Re: Re: Do you EVER, for even ten seconds, worry about GOOGLE?
Re:
If by "nuclear option" you mean "the founders ditched the company and started a new one", sure.